WO2022267506A1 - Image fusion method, electronic device, storage medium, and computer program product - Google Patents

Image fusion method, electronic device, storage medium, and computer program product

Info

Publication number
WO2022267506A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
exposure
dynamic range
images
high dynamic
Prior art date
Application number
PCT/CN2022/077713
Other languages
English (en)
Chinese (zh)
Inventor
乔晓磊
丁大钧
肖斌
陈珂
朱聪超
Original Assignee
荣耀终端有限公司
Priority date
Filing date
Publication date
Application filed by 荣耀终端有限公司
Publication of WO2022267506A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing

Definitions

  • the present application belongs to the technical field of image processing, and in particular relates to an image fusion method, electronic equipment, a computer-readable storage medium and a computer program product.
  • the embodiments of the present application provide an image fusion method, an electronic device, a computer-readable storage medium, and a computer program product, which can improve the clarity of a fused image.
  • In a first aspect, an embodiment of the present application provides an image fusion method, including: acquiring at least two frames of first images with different exposures captured by a first camera, and at least two frames of second images with different exposures captured by a second camera, wherein the field of view of the first camera is larger than that of the second camera and the first image includes the second image; fusing the at least two frames of first images into a first high dynamic range image, and fusing the at least two frames of second images into a second high dynamic range image; and fusing the first high dynamic range image and the second high dynamic range image to obtain a fused image.
  • the above process can restore some image details lost due to overexposure of the small field of view image by performing high dynamic range fusion processing on the small field of view image, thereby improving the clarity of the fused image.
  • In one implementation, fusing the at least two frames of second images with different exposures into the second high dynamic range image may include: using the image features of the first high dynamic range image to guide the fusion of the second images, so that the second high dynamic range image has the same or similar brightness and other image features as the first high dynamic range image in their overlapping area; the image obtained after fusing the two high dynamic range images then also has the same or similar image features as the first high dynamic range image.
  • Concretely, according to the image features of each pixel in the first high dynamic range image, the weights with which the image features of each pixel in the second high dynamic range image come from the respective second images are calculated, and the second images are then fused into the second high dynamic range image according to these weights.
  • In one implementation, calculating the weights with which the image features of each pixel in the second high dynamic range image come from the respective second images may include: according to the image features of each pixel in the first high dynamic range image and the high dynamic range fusion algorithm corresponding to the first high dynamic range image, calculating the weights with which the image features of each pixel in the first high dynamic range image come from the respective first images (that is, the fusion weight of each first image); and then, according to those weights, calculating the weights with which the image features of each pixel in the second high dynamic range image come from the respective second images.
  • Specifically, a weight reuse method or an image feature multiplexing method may be used.
  • With weight reuse, calculating the weights of the second images from the weights of the first images may include: for any target pixel in the second high dynamic range image, determining the weights with which the image features of the corresponding pixel in the first high dynamic range image come from the respective first images as the weights with which the image features of the target pixel come from the respective second images.
  • In this way, a second high dynamic range image having image features similar to those of the first high dynamic range image can be obtained by weight reuse.
  • With image feature multiplexing, calculating the weights with which the image features of each pixel in the second high dynamic range image come from the respective second images may include: for any target pixel in the second high dynamic range image, according to the image feature of the corresponding pixel of the target pixel in the first high dynamic range image and the image features of the corresponding pixels of the target pixel in the respective second images, calculating the weights with which the image feature of the target pixel comes from the respective second images, where the image feature of the target pixel is equal to the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
  • In one implementation, the at least two frames of second images with different exposures include a first exposure image and a second exposure image, the exposure of the first exposure image being greater than the exposure of the second exposure image. In that case, the weights with which the image feature of the target pixel comes from the first exposure image and the second exposure image can be calculated according to the following formulas:
  • A*X + B*Y = P, X + Y = 1
  • where X denotes the weight with which the image feature of the target pixel comes from the first exposure image; Y denotes the weight with which the image feature of the target pixel comes from the second exposure image; A denotes the image feature of the corresponding pixel of the target pixel in the first exposure image; B denotes the image feature of the corresponding pixel of the target pixel in the second exposure image; and P denotes the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
  • A, B and P are all known values, so the weights X and Y can be calculated.
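  • As a minimal illustrative sketch (not taken from the application), the per-pixel solve of these formulas can be written as follows; the function name and the handling of the degenerate case A = B are assumptions:

```python
def two_frame_weights(A: float, B: float, P: float) -> tuple[float, float]:
    """Solve A*X + B*Y = P together with X + Y = 1.

    A, B are the image features of the pixel in the first (higher) and
    second (lower) exposure images; P is the image feature of the
    corresponding pixel in the first high dynamic range image.
    """
    if A == B:
        return 0.5, 0.5  # degenerate case: any split reproduces P
    X = (P - B) / (A - B)
    return X, 1.0 - X

# With the values used in the worked example later in the description:
X, Y = two_frame_weights(120, 10, 92)  # X ≈ 0.745, Y ≈ 0.255
```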
  • In one implementation, the at least two frames of second images with different exposures include a first exposure image, a second exposure image and a third exposure image, the exposure of the first exposure image being greater than the exposure of the second exposure image, and the exposure of the second exposure image being greater than the exposure of the third exposure image. In that case, calculating the weights with which the image feature of the target pixel comes from the respective second images may include: setting any one of the weights of the first exposure image, the second exposure image and the third exposure image to a set value, and calculating the weights according to the following formulas:
  • A*X + B*Y + C*Z = P, X + Y + Z = 1
  • where X denotes the weight with which the image feature of the target pixel comes from the first exposure image; Y denotes the weight with which the image feature of the target pixel comes from the second exposure image; Z denotes the weight with which the image feature of the target pixel comes from the third exposure image; A denotes the image feature of the corresponding pixel of the target pixel in the first exposure image; B denotes the image feature of the corresponding pixel of the target pixel in the second exposure image; C denotes the image feature of the corresponding pixel of the target pixel in the third exposure image; and P denotes the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
  • A, B, C and P are all known values, but since there are three unknowns X, Y and Z, an additional restrictive condition must be introduced to calculate their values.
  • In one implementation, the restrictive condition is derived from the weights with which the image features of the corresponding pixel of the target pixel in the first high dynamic range image come from the respective first images: based on those weights, any one of the weights with which the image feature of the target pixel comes from the first exposure image, the second exposure image and the third exposure image is given a set value.
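  • A corresponding sketch for the three-unknown case, fixing X to the set value (again an illustration; the name, argument order and the assumption B ≠ C are not from the application):

```python
def three_frame_weights(A: float, B: float, C: float, P: float,
                        X: float) -> tuple[float, float]:
    """Given the set value X, solve
        A*X + B*Y + C*Z = P  and  X + Y + Z = 1
    for Y and Z (assumes B != C).
    """
    remaining_budget = 1.0 - X      # weight left over for Y and Z
    remaining_feature = P - A * X   # feature the remaining frames must supply
    Y = (remaining_feature - C * remaining_budget) / (B - C)
    return Y, remaining_budget - Y

# With the values of the worked example later in the description, fixing X at 50%:
Y, Z = three_frame_weights(120, 60, 5, 63, 0.5)  # Y ≈ 0.009, Z ≈ 0.491
```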
  • In one implementation, acquiring the at least two frames of first images with different exposures captured by the first camera, and the at least two frames of second images with different exposures captured by the second camera, builds on the preview stream of each camera. When a camera captures images, a corresponding preview stream is obtained, that is, the data stream corresponding to the photo preview interface displayed on the electronic device after the user turns on the camera.
  • The exposure corresponding to the preview stream is generally a default value set by the camera. Using the exposure corresponding to the preview stream as a baseline, the exposure can be increased by a certain percentage to capture one or more frames with a higher exposure, and decreased by a certain percentage to capture one or more frames with a lower exposure.
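  • The bracketing can be pictured with the following sketch, in which the camera object and its methods are hypothetical placeholders rather than a real device API, and the ±50% step is an arbitrary example of "a certain percentage":

```python
def capture_bracketed(camera, step: float = 0.5):
    base = camera.preview_exposure()  # reference exposure from the preview stream
    long_frame = camera.capture(exposure=base * (1.0 + step))   # higher exposure
    short_frame = camera.capture(exposure=base * (1.0 - step))  # lower exposure
    return [long_frame, short_frame]  # at least two frames with different exposures
```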
  • In a second aspect, an embodiment of the present application provides an image fusion device, including:
  • an image acquisition module configured to acquire at least two frames of first images with different exposures captured by a first camera, and at least two frames of second images with different exposures captured by a second camera, wherein the field of view of the first camera is greater than the field of view of the second camera, and the first image includes the second image;
  • a high dynamic range processing module configured to fuse the at least two frames of first images with different exposures into a first high dynamic range image, and to fuse the at least two frames of second images with different exposures into a second high dynamic range image;
  • An image fusion module configured to fuse the first high dynamic range image and the second high dynamic range image to obtain a fused image.
  • the high dynamic range processing module may include:
  • a fusion weight calculation unit configured to calculate, according to the image features of each pixel in the first high dynamic range image, the weights with which the image features of each pixel in the second high dynamic range image come from the respective second images;
  • a high dynamic range fusion unit configured to fuse the at least two frames of second images with different exposures into the second high dynamic range image according to those weights.
  • the fusion weight calculation unit may include:
  • a first fusion weight calculation subunit, used to calculate, according to the image features of each pixel in the first high dynamic range image and the high dynamic range fusion algorithm corresponding to the first high dynamic range image, the weights with which the image features of each pixel in the first high dynamic range image come from the respective first images;
  • a second fusion weight calculation subunit, used to calculate, from those weights, the weights with which the image features of each pixel in the second high dynamic range image come from the respective second images.
  • In one implementation, the second fusion weight calculation subunit can be specifically configured to: for any target pixel in the second high dynamic range image, determine the weights with which the image features of the corresponding pixel of the target pixel in the first high dynamic range image come from the respective first images as the weights with which the image features of the target pixel come from the respective second images.
  • the fusion weight calculation unit may include:
  • a third fusion weight calculation subunit, configured to: for any target pixel in the second high dynamic range image, calculate, according to the image feature of the corresponding pixel of the target pixel in the first high dynamic range image and the image features of the corresponding pixels of the target pixel in the respective second images, the weights with which the image feature of the target pixel comes from the respective second images, where the image feature of the target pixel is equal to the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
  • the at least two frames of second images with different exposures include a first exposure image and a second exposure image, the exposure of the first exposure image is greater than the exposure of the second exposure image;
  • in that case, the third fusion weight calculation subunit can be specifically configured to calculate the weights with which the image feature of the target pixel comes from the first exposure image and the second exposure image according to the following formulas:
  • A*X + B*Y = P, X + Y = 1
  • where X denotes the weight with which the image feature of the target pixel comes from the first exposure image; Y denotes the weight with which the image feature of the target pixel comes from the second exposure image; A denotes the image feature of the corresponding pixel of the target pixel in the first exposure image; B denotes the image feature of the corresponding pixel of the target pixel in the second exposure image; and P denotes the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
  • the at least two frames of second images with different exposures include a first exposure image, a second exposure image and a third exposure image, the exposure of the first exposure image is greater than the exposure of the second exposure image , the exposure amount of the second exposure image is greater than the exposure amount of the third exposure image;
  • in that case, the third fusion weight calculation subunit may specifically include:
  • a weight setting subunit, configured to set, according to the weights with which the image features of the corresponding pixel of the target pixel in the first high dynamic range image come from the respective first images, any one of the weights with which the image feature of the target pixel comes from the first exposure image, the second exposure image and the third exposure image to a set value;
  • a formula calculation subunit, used to calculate the weights with which the image feature of the target pixel comes from the first exposure image, the second exposure image and the third exposure image according to the following formulas:
  • A*X + B*Y + C*Z = P, X + Y + Z = 1
  • where X denotes the weight with which the image feature of the target pixel comes from the first exposure image; Y denotes the weight with which the image feature of the target pixel comes from the second exposure image; Z denotes the weight with which the image feature of the target pixel comes from the third exposure image; A denotes the image feature of the corresponding pixel of the target pixel in the first exposure image; B denotes the image feature of the corresponding pixel of the target pixel in the second exposure image; C denotes the image feature of the corresponding pixel of the target pixel in the third exposure image; and P denotes the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
  • the image acquisition module may include:
  • a first image capturing unit, configured to capture one or more frames of images with an exposure greater than a first reference exposure through the first camera, the first reference exposure being the exposure corresponding to the preview stream of the first camera;
  • a second image capturing unit, configured to capture one or more frames of images with an exposure less than the first reference exposure through the first camera;
  • a first image determining unit, configured to determine the one or more frames with an exposure greater than the first reference exposure and the one or more frames with an exposure less than the first reference exposure as the at least two frames of first images with different exposures;
  • a third image capturing unit, configured to capture one or more frames of images with an exposure greater than a second reference exposure through the second camera, the second reference exposure being the exposure corresponding to the preview stream of the second camera;
  • a fourth image capturing unit, configured to capture one or more frames of images with an exposure less than the second reference exposure through the second camera;
  • a second image determining unit, configured to determine the one or more frames with an exposure greater than the second reference exposure and the one or more frames with an exposure less than the second reference exposure as the at least two frames of second images with different exposures.
  • In a third aspect, an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor. When the processor executes the computer program, the electronic device implements the following image fusion method: acquiring at least two frames of first images with different exposures captured by a first camera, and at least two frames of second images with different exposures captured by a second camera; fusing the first images into a first high dynamic range image and the second images into a second high dynamic range image; and fusing the first high dynamic range image and the second high dynamic range image to obtain a fused image.
  • In one implementation, when the electronic device fuses the at least two frames of second images with different exposures into the second high dynamic range image, it calculates, according to the image features of each pixel in the first high dynamic range image, the weights with which the image features of each pixel in the second high dynamic range image come from the respective second images, which may include:
  • calculating, according to the image features of each pixel in the first high dynamic range image and the high dynamic range fusion algorithm corresponding to the first high dynamic range image, the weights with which the image features of each pixel in the first high dynamic range image come from the respective first images; and
  • calculating, from those weights, the weights with which the image features of each pixel in the second high dynamic range image come from the respective second images, for example by determining, for any target pixel in the second high dynamic range image, the weights with which the image features of the corresponding pixel in the first high dynamic range image come from the respective first images as the weights with which the image features of the target pixel come from the respective second images.
  • In another implementation, the electronic device calculates the weights as follows: for any target pixel in the second high dynamic range image, according to the image feature of the corresponding pixel of the target pixel in the first high dynamic range image and the image features of the corresponding pixels of the target pixel in the respective second images, the weights with which the image feature of the target pixel comes from the respective second images are calculated, where the image feature of the target pixel is equal to the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
  • the at least two frames of second images with different exposures include a first exposure image and a second exposure image, and the exposure of the first exposure image is greater than the exposure of the second exposure image;
  • in that case, the electronic device calculates the weights with which the image feature of the target pixel comes from the first exposure image and the second exposure image according to the following formulas:
  • A*X + B*Y = P, X + Y = 1
  • where X denotes the weight with which the image feature of the target pixel comes from the first exposure image; Y denotes the weight with which the image feature of the target pixel comes from the second exposure image; A denotes the image feature of the corresponding pixel of the target pixel in the first exposure image; B denotes the image feature of the corresponding pixel of the target pixel in the second exposure image; and P denotes the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
  • In one implementation, the at least two frames of second images with different exposures include a first exposure image, a second exposure image and a third exposure image, where the exposure of the first exposure image is greater than the exposure of the second exposure image, and the exposure of the second exposure image is greater than the exposure of the third exposure image;
  • in that case, calculating the weights with which the image feature of the target pixel comes from the respective second images may include: setting any one of the weights of the first exposure image, the second exposure image and the third exposure image to a set value, and calculating the weights according to the following formulas:
  • A*X + B*Y + C*Z = P, X + Y + Z = 1
  • where X denotes the weight with which the image feature of the target pixel comes from the first exposure image; Y denotes the weight with which the image feature of the target pixel comes from the second exposure image; Z denotes the weight with which the image feature of the target pixel comes from the third exposure image; A denotes the image feature of the corresponding pixel of the target pixel in the first exposure image; B denotes the image feature of the corresponding pixel of the target pixel in the second exposure image; C denotes the image feature of the corresponding pixel of the target pixel in the third exposure image; and P denotes the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
  • In one implementation, the electronic device acquires the at least two frames of first images with different exposures and the at least two frames of second images with different exposures based on the preview streams of the first camera and the second camera respectively, in the manner described above for the first aspect.
  • In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed, implements the image fusion method proposed in the first aspect of the embodiments of the present application.
  • In a fifth aspect, an embodiment of the present application provides a computer program product which, when run on an electronic device, causes the electronic device to execute the image fusion method proposed in the first aspect of the embodiments of the present application.
  • FIG. 1 is a hardware structural diagram of an electronic device provided by an embodiment of the present application;
  • FIG. 2 is a flow chart of an image fusion method provided by an embodiment of the present application;
  • FIG. 3 is a schematic diagram of the shooting ranges of two cameras with different fields of view adopted in the embodiment of the present application;
  • FIG. 4 is a schematic diagram of a large field of view image and a corresponding small field of view image provided by the embodiment of the present application;
  • FIG. 5 is a schematic diagram of the operation principle of an image fusion method provided in an embodiment of the present application;
  • FIG. 6 is a schematic diagram of the effect of the main-path long exposure frame, the main-path short exposure frame and the main-path high dynamic range image in FIG. 5;
  • FIG. 7 is a schematic diagram of the effect of the auxiliary-path long exposure frame, the auxiliary-path short exposure frame and the auxiliary-path high dynamic range image in FIG. 5;
  • FIG. 8 is a schematic diagram of the effect of fusing the main-path high dynamic range image in FIG. 6 and the auxiliary-path high dynamic range image in FIG. 7;
  • FIG. 9 is a structural diagram of an image fusion device provided in an embodiment of the present application;
  • FIG. 10 is a schematic diagram of an electronic device provided by an embodiment of the present application.
  • An electronic device (such as a mobile phone) can usually be provided with multiple cameras with different fields of view, such as an ordinary camera, a telephoto camera and a wide-angle camera. On this basis, multi-camera joint shooting can be used to improve photo quality, for example by fusing an image with a small field of view into an image with a large field of view to improve the clarity of the corresponding area of the large field of view image.
  • In some scenarios, high dynamic range (High-Dynamic Range, HDR) fusion processing is also performed on the large field of view image to obtain a corresponding high dynamic range image, which is then fused with the small field of view image. However, some areas of the small field of view image may be overexposed, causing a loss of image detail that affects the clarity of the fused image.
  • In view of this, this application proposes an image fusion method that performs high dynamic range fusion processing on the small field of view image as well, and then fuses the result with the high dynamic range image of the large field of view image to obtain the fused image.
  • the image fusion method proposed in this application can be applied to various electronic devices having at least two cameras with different fields of view, such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), smart home devices, etc.
  • The embodiments of the present application do not limit the specific type of the electronic device in any way.
  • FIG. 1 shows a block diagram of a partial structure of the mobile phone provided by the embodiment of the present application.
  • The mobile phone includes: a radio frequency (Radio Frequency, RF) circuit 101, a memory 102, an input unit 103, a display unit 104, a sensor 105, an audio circuit 106, a wireless fidelity (WiFi) module 107, a processor 108, a power supply 109, an ordinary camera 110, a telephoto camera 111 and other components.
  • the RF circuit 101 can be used for sending and receiving information or receiving and sending signals during a call.
  • After receiving downlink information from the base station, the RF circuit delivers it to the processor 108 for processing; in addition, uplink data is sent to the base station.
  • an RF circuit includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (Low Noise Amplifier, LNA), a duplexer, and the like.
  • the RF circuit 101 can also communicate with networks and other devices through wireless communication.
  • the above wireless communication can use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), etc.
  • the memory 102 can be used to store software programs and modules, and the processor 108 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 102 .
  • the memory 102 can mainly include a program storage area and a data storage area, wherein the program storage area can store an operating system and at least one application program required by a function (such as a sound playback function, an image playback function, etc.), and the data storage area can store data created by the use of the mobile phone (such as audio data, a phonebook, etc.).
  • the memory 102 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
  • the input unit 103 can be used to receive input numbers or character information, and generate key signal input related to user settings and function control of the mobile phone.
  • the input unit 103 may include a touch panel 1031 and other input devices 1032 .
  • the touch panel 1031, also referred to as a touch screen, can collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel 1031 with a finger, a stylus or any other suitable object or accessory), and drive the corresponding connection device according to a preset program.
  • the touch panel 1031 may include two parts, a touch detection device and a touch controller.
  • the touch detection device detects the user's touch orientation and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 108, and can receive and execute commands sent by the processor 108.
  • the touch panel 1031 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the input unit 103 may also include other input devices 1032 .
  • other input devices 1032 may include, but are not limited to, one or more of physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, joysticks, and the like.
  • the display unit 104 can be used to display information input by or provided to the user and various menus of the mobile phone.
  • the display unit 104 may include a display panel 1041.
  • the display panel 1041 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like.
  • the touch panel 1031 may cover the display panel 1041, and when the touch panel 1031 detects a touch operation on or near it, it transmits to the processor 108 to determine the type of the touch event, and then the processor 108 according to the touch event The type provides a corresponding visual output on the display panel 1041 .
  • Although in FIG. 1 the touch panel 1031 and the display panel 1041 are shown as two independent components to realize the input and output functions of the mobile phone, in some embodiments the touch panel 1031 and the display panel 1041 can be integrated to realize the input and output functions of the mobile phone.
  • the handset may also include at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 1041 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 1041 and/or the backlight when the mobile phone is moved to the ear.
  • As a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications that identify the mobile phone's posture (such as switching between horizontal and vertical screens, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer or tapping); as for other sensors that may also be configured on the mobile phone, such as a gyroscope, a barometer, a hygrometer, a thermometer and an infrared sensor, details are not repeated here.
  • the audio circuit 106, the speaker 1061, and the microphone 1062 can provide an audio interface between the user and the mobile phone.
  • the audio circuit 106 can transmit the electrical signal converted from the received audio data to the loudspeaker 1061, and the loudspeaker 1061 converts it into a sound signal for output; on the other hand, the microphone 1062 converts the collected sound signal into an electrical signal, which is received by the audio circuit 106 and converted into audio data; after being processed by the processor 108, the audio data is sent to another mobile phone through the RF circuit 101, or output to the memory 102 for further processing.
  • WiFi is a short-distance wireless transmission technology.
  • the mobile phone can help users send and receive emails, browse web pages, and access streaming media through the WiFi module 107, which provides users with wireless broadband Internet access.
  • Although FIG. 1 shows the WiFi module 107, it can be understood that it is not an essential component of the mobile phone and can be omitted as required without changing the essence of the application.
  • the processor 108 is the control center of the mobile phone; it uses various interfaces and lines to connect the various parts of the entire mobile phone, and executes the various functions of the mobile phone and processes data by running or executing software programs and/or modules stored in the memory 102 and calling data stored in the memory 102, thereby monitoring the mobile phone as a whole.
  • the processor 108 may include one or more processing units; preferably, the processor 108 may integrate an application processor and a modem processor, wherein the application processor mainly handles the operating system, user interfaces, application programs and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may also not be integrated into the processor 108.
  • the mobile phone also includes a power supply 109 (such as a battery) for supplying power to each component.
  • the power supply can be logically connected to the processor 108 through a power management device, so that functions such as charging, discharging, and power consumption management can be realized through the power management device.
  • the mobile phone also includes at least two cameras with different fields of view; for example, one of them is an ordinary camera 110, and the other is a telephoto camera 111.
  • the phone may also include other types of cameras such as infrared cameras, hyperspectral cameras, and wide-angle cameras.
  • the position of the camera on the mobile phone may be front or rear, which is not limited in this embodiment of the present application.
  • the mobile phone may also include a bluetooth module, etc., which will not be repeated here.
  • Figure 2 shows a flow chart of an image fusion method provided by an embodiment of the present application, including:
  • the electronic device acquires at least two frames of first images with different exposures captured by the first camera, and acquires at least two frames of second images with different exposures captured by the second camera;
  • the electronic device has at least two cameras with different viewing angles, namely a first camera and a second camera, wherein the viewing angle of the first camera is larger than that of the second camera.
  • the embodiment of the present application does not limit the specific types of the first camera and the second camera.
  • For example, if the first camera is an ordinary camera, the second camera can be a telephoto camera; if the first camera is a wide-angle camera, the second camera can be an ordinary camera or a telephoto camera, and so on.
  • The first camera and the second camera should be set on the same surface of the electronic device and keep the same or similar shooting angles when capturing images, so that the shooting range of the first camera covers the shooting range of the second camera, that is, the first image captured by the first camera includes the second image captured by the second camera.
  • For convenience, the first image may be referred to as the image with a large field of view, and the second image as the image with a small field of view.
  • The camera can control the exposure of the captured image by adjusting parameters such as aperture size and exposure time. Thus, by adjusting the parameters of the first camera, at least two frames of first images with different exposures can be obtained, and by adjusting the parameters of the second camera, at least two frames of second images with different exposures can be obtained.
  • In one implementation, acquiring the at least two frames of first images with different exposures captured by the first camera may include: acquiring the exposure corresponding to the preview stream of the first camera as a reference value, that is, the first reference exposure. When the camera captures images, the corresponding preview stream is obtained, that is, the data stream corresponding to the photo preview interface of the electronic device after the user turns on the camera. The exposure corresponding to the preview stream is generally a default value set by the camera.
  • Taking the exposure corresponding to the preview stream as a baseline, the exposure is increased by a certain percentage to capture one or more frames with a larger exposure (which may be called long exposure frames), and decreased by a certain percentage to capture one or more frames with a smaller exposure (which may be called short exposure frames).
  • For example, if the exposure corresponding to the preview stream of the first camera is M, a long exposure frame with an exposure greater than M and a short exposure frame with an exposure less than M can be captured; the long exposure frame and the short exposure frame are then two acquired first images with different exposures.
  • Acquiring the at least two frames of second images with different exposures captured by the second camera may be done in the same way as for the first camera. It should be noted that the number of first images and the number of second images may be the same or different, and there is no required size relationship between the exposure of each frame of the first images and the exposure of each frame of the second images.
  • the electronic device fuses the at least two frames of first images with different exposures into a first high dynamic range image, and fuses the at least two frames of second images with different exposures into a second high dynamic range image;
  • High dynamic range (High-Dynamic Range, HDR) fusion processing refers to synthesizing a high dynamic range image from multiple frames of low dynamic range images with different exposures, so as to obtain more image details and improve image clarity.
  • Among them, the low dynamic range image with a larger exposure is mainly used to restore image details in dark areas of the scene, and the low dynamic range image with a smaller exposure is mainly used to restore image details in bright areas of the scene.
  • In this way, the obtained fused image will also have the same or similar image features as the first high dynamic range image. If the two high dynamic range images were generated independently, their image features might differ greatly, whereas in conventional image fusion scenarios it is generally expected that the fused image has the same or similar image features as the first high dynamic range image.
  • Therefore, the image features of the first high dynamic range image, such as its brightness, can be used to guide the process of fusing the second images of each frame (HDR effect guidance), so that the second high dynamic range image has the same or similar brightness and other image features as the first high dynamic range image.
  • Specifically, fusing the at least two frames of second images with different exposures into the second high dynamic range image may include: according to the image features (such as RGB values or brightness values) of each pixel in the first high dynamic range image, calculating the weights with which the image features of each pixel in the second high dynamic range image come from the respective second images, that is, the fusion weight of each second image. The basic principle is to make the second high dynamic range image obtained after fusion have the same or similar image features as the first high dynamic range image.
  • After the weights are obtained, HDR fusion processing can be performed on the second images to obtain the corresponding second high dynamic range image. For example, suppose the second images are I1 and I2. If the image feature of a pixel Q in the second high dynamic range image to be generated comes from I1 with weight X and from I2 with weight Y, and the image feature of the pixel corresponding to Q is A in I1 and B in I2, then when I1 and I2 are fused into the second high dynamic range image, the image feature of Q in the second high dynamic range image is A*X + B*Y. By analogy, the image feature of each pixel in the second high dynamic range image can be calculated.
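  • A minimal sketch of this per-pixel weighted fusion, assuming single-channel images and precomputed per-pixel weight maps that sum to 1 (names are illustrative):

```python
import numpy as np

def fuse_with_weights(frames: list[np.ndarray],
                      weights: list[np.ndarray]) -> np.ndarray:
    """Per-pixel weighted fusion: feature(Q) = I1(Q)*X(Q) + I2(Q)*Y(Q) + ...

    frames  - single-channel images of identical shape (I1, I2, ...)
    weights - per-pixel weight maps of the same shape (X, Y, ...),
              assumed to sum to 1 at every pixel
    """
    stacked_frames = np.stack([f.astype(np.float32) for f in frames])
    stacked_weights = np.stack([w.astype(np.float32) for w in weights])
    return (stacked_frames * stacked_weights).sum(axis=0)
```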
  • In one implementation, calculating the weights with which the image features of each pixel in the second high dynamic range image come from the respective second images may include first calculating the fusion weight of each first image: according to the image features of each pixel in the first high dynamic range image and the high dynamic range fusion algorithm corresponding to it, the weights with which the image features of each pixel in the first high dynamic range image come from the respective first images can be calculated.
  • For example, suppose the first images are three frames of low dynamic range images: a long exposure frame, a medium exposure frame and a short exposure frame, where the exposure of the long exposure frame > the exposure of the medium exposure frame > the exposure of the short exposure frame. If the image feature of a certain pixel Q is A in the long exposure frame, B in the medium exposure frame and C in the short exposure frame, then the weight X with which the image feature of Q in the first high dynamic range image comes from the long exposure frame, the weight Y with which it comes from the medium exposure frame, and the weight Z with which it comes from the short exposure frame can be obtained from the corresponding weight formulas of the high dynamic range fusion algorithm used.
  • For any other pixel, the weights with which its image features come from each first image can be calculated in the same manner as for pixel Q above.
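  • Since the application does not fix a particular weight formula, any concrete example is an assumption; a common choice in exposure fusion is a Gaussian "well-exposedness" measure (Mertens-style), sketched below:

```python
import numpy as np

def well_exposedness_weights(frames: list[np.ndarray],
                             sigma: float = 0.2) -> list[np.ndarray]:
    """Per-pixel fusion weights from a Gaussian 'well-exposedness' measure,
    as used in Mertens-style exposure fusion (an illustrative assumption,
    not the formula of the application).

    frames: single-channel images normalized to [0, 1].
    Returns weight maps (X, Y, Z, ...) normalized to sum to 1 per pixel.
    """
    raw = [np.exp(-((f - 0.5) ** 2) / (2.0 * sigma ** 2)) for f in frames]
    total = np.sum(raw, axis=0) + 1e-12  # guard against division by zero
    return [w / total for w in raw]
```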
  • After that, the weights with which the image features of each pixel in the second high dynamic range image come from the respective second images are calculated from the weights of the first images. This may be done by weight reuse: for any target pixel in the second high dynamic range image, the weights with which the image features of the corresponding pixel in the first high dynamic range image come from the respective first images are determined as the weights with which the image features of the target pixel come from the respective second images.
  • For example, suppose the first images are a first long exposure frame, a first medium exposure frame and a first short exposure frame, and the second images are a second long exposure frame, a second medium exposure frame and a second short exposure frame. Suppose the image feature of a target pixel Q in the first high dynamic range image is P, and the image features of the pixel corresponding to Q are A in the first long exposure frame, B in the first medium exposure frame and C in the first short exposure frame, so that P = A*X + B*Y + C*Z. If the image features of the pixel corresponding to Q are D in the second long exposure frame, E in the second medium exposure frame and F in the second short exposure frame, then by weight reuse the image feature of Q in the second high dynamic range image is S = D*X + E*Y + F*Z.
  • Here, the image feature may be the value of any channel in the RGB domain, or of the Y channel (luminance) in the YUV domain.
  • In this way, a second high dynamic range image having image features similar to those of the first high dynamic range image can be obtained by weight reuse.
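  • A sketch of weight reuse under the assumption that the weight maps of the first images have already been cropped and registered onto the pixel grid of the second images (that step is not shown):

```python
import numpy as np

def weight_reuse_fusion(main_weights: list[np.ndarray],
                        aux_frames: list[np.ndarray]) -> np.ndarray:
    """Weight reuse: fuse the second (small field of view) images with the
    per-pixel weights computed for the first (large field of view) images.

    main_weights - weight maps X, Y, Z for the overlapping area, assumed
                   already cropped/registered onto the auxiliary image grid
    aux_frames   - second images D, E, F aligned with those weight maps
    Returns S = D*X + E*Y + F*Z at every pixel.
    """
    fused = np.zeros_like(aux_frames[0], dtype=np.float32)
    for frame, weight in zip(aux_frames, main_weights):
        fused += frame.astype(np.float32) * weight
    return fused
```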
  • In another implementation (image feature multiplexing), the weights with which the image features of each pixel in the second high dynamic range image come from the respective second images are calculated as follows: for any target pixel in the second high dynamic range image, according to the image feature of the corresponding pixel of the target pixel in the first high dynamic range image and the image features of the corresponding pixels of the target pixel in the respective second images, the weights with which the image feature of the target pixel comes from the respective second images are calculated, where the image feature of the target pixel is made equal to the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
  • When the difference in image features between the first images and the second images is relatively large, weight reuse is no longer appropriate for determining the fusion weight of each second image if the image features of the second high dynamic range image are required to be the same as or similar to those of the first high dynamic range image.
  • In that case, image feature multiplexing can be adopted, that is, the fusion weight of each second image is calculated by taking the same or similar image features of each pair of corresponding pixels in the two high dynamic range images as a known condition.
  • For example, suppose the second images include a first exposure image and a second exposure image, with the exposure of the first exposure image greater than that of the second exposure image. The weights with which the image feature of the target pixel comes from the first exposure image and the second exposure image can then be calculated according to the following formulas:
  • A*X + B*Y = P, X + Y = 1
  • where X denotes the weight with which the image feature of the target pixel comes from the first exposure image; Y denotes the weight with which the image feature of the target pixel comes from the second exposure image; A denotes the image feature of the corresponding pixel of the target pixel in the first exposure image; B denotes the image feature of the corresponding pixel of the target pixel in the second exposure image; and P denotes the image feature of the corresponding pixel of the target pixel in the first high dynamic range image (or another value close to that image feature).
  • A, B and P are all known values, so the weights X and Y can be calculated.
  • For example, if the image feature of the corresponding pixel of the target pixel in the first high dynamic range image is 92, its image feature in the first exposure image is 120, and its image feature in the second exposure image is 10, then 120*X + 10*Y = 92 with X + Y = 1, which gives X ≈ 0.745 and Y ≈ 0.255.
  • Each pixel can be processed in the same way to obtain its corresponding fusion weights, and the pixels are then fused according to these weights to obtain a second high dynamic range image having the same image features as the first high dynamic range image.
  • As another example, suppose the second images include a first exposure image, a second exposure image and a third exposure image, where the exposure of the first exposure image is greater than that of the second exposure image, and the exposure of the second exposure image is greater than that of the third exposure image. The weights with which the image feature of the target pixel comes from the first, second and third exposure images can then be calculated according to the following formulas:
  • A*X + B*Y + C*Z = P, X + Y + Z = 1
  • where X denotes the weight with which the image feature of the target pixel comes from the first exposure image; Y denotes the weight with which the image feature of the target pixel comes from the second exposure image; Z denotes the weight with which the image feature of the target pixel comes from the third exposure image; A denotes the image feature of the corresponding pixel of the target pixel in the first exposure image; B denotes the image feature of the corresponding pixel of the target pixel in the second exposure image; C denotes the image feature of the corresponding pixel of the target pixel in the third exposure image; and P denotes the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
  • A, B, C and P are all known values, but since there are three unknowns X, Y and Z, an additional restrictive condition must be introduced to calculate their values.
  • Specifically, according to the weights with which the image features of the corresponding pixel of the target pixel in the first high dynamic range image come from the respective first images, any one of the weights with which the image feature of the target pixel comes from the first exposure image, the second exposure image and the third exposure image can be given a set value.
  • For example, suppose the first images include a fourth exposure image, a fifth exposure image and a sixth exposure image with successively decreasing exposures, and that for the pixel corresponding to the target pixel, the image feature in the fourth exposure image is 100 (fusion weight 50%), the image feature in the fifth exposure image is 50 (fusion weight 20%), and the image feature in the sixth exposure image is 10 (fusion weight 30%), so that P = 100*0.5 + 50*0.2 + 10*0.3 = 63.
  • Suppose further that the image feature of the corresponding pixel of the target pixel is 120 in the first exposure image, 60 in the second exposure image and 5 in the third exposure image. Then, according to the above formulas, 120*X + 60*Y + 5*Z = 63 with X + Y + Z = 1.
  • Any one of X, Y and Z can be given a set value. Since the fusion weight corresponding to the target pixel in the fourth exposure image (the first image with the largest exposure) is the largest (50%), the weight X of the first exposure image (the second image with the largest exposure) can correspondingly be set to 50%; the values of the other two weights, Y ≈ 0.009 and Z ≈ 0.491, can then be calculated.
  • The image features used in this calculation may be the values of any channel in the RGB domain, or of the Y channel (luminance) in the YUV domain.
  • For the UV channels (chrominance) in the YUV domain, the values can be used directly: for example, the corresponding UV values of the target pixel in the second exposure image can be used as the UV values of the target pixel in the fused second high dynamic range image.
  • Through image feature multiplexing, the obtained second high dynamic range image can have the same or similar image features, such as brightness, as the first high dynamic range image, so as to meet the requirements of the given image fusion scene.
  • the electronic device fuses the first high dynamic range image and the second high dynamic range image to obtain a fused image.
  • After the first high dynamic range image and the second high dynamic range image are obtained, the two high dynamic range images can be fused; specifically, the second high dynamic range image is fused into the region of the first high dynamic range image corresponding to the second high dynamic range image, so as to obtain the fused image.
  • The high dynamic range image fusion process mainly includes image registration and image feature superposition; for details, reference may be made to related content on image fusion in the prior art, which is not repeated here.
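  • As a greatly simplified sketch of this final step, assuming registration has already produced the offset of the overlapping region and using overwriting in place of the feature superposition a real pipeline would apply:

```python
import numpy as np

def fuse_hdr_pair(main_hdr: np.ndarray, aux_hdr: np.ndarray,
                  top: int, left: int) -> np.ndarray:
    """Write the auxiliary high dynamic range image into its corresponding
    region of the main high dynamic range image.

    (top, left) is assumed to come from a prior image registration step;
    real pipelines blend the superimposed features rather than overwrite.
    """
    h, w = aux_hdr.shape[:2]
    fused = main_hdr.copy()
    fused[top:top + h, left:left + w] = aux_hdr  # replace overlapping region
    return fused
```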
  • the above process can restore some image details lost due to overexposure of the small field of view image by performing high dynamic range fusion processing on the small field of view image, thereby improving the clarity of the fused image.
  • FIG. 3 is a schematic diagram of shooting ranges of two cameras with different field of view angles adopted in the embodiment of the present application.
  • the camera with a larger field of view is the first camera
  • the camera with a smaller field of view is the second camera
  • the first camera and the second camera are located on the same surface of an electronic device.
  • The shooting ranges of the two cameras are shown by the two rectangular boxes in FIG. 3; obviously, the shooting range of the first camera includes the shooting range of the second camera, so the image captured by the first camera (the first image) includes the image captured by the second camera (the second image).
  • the first camera can be a camera with a larger field of view such as a wide-angle camera or an ordinary camera
  • the second camera can be a camera with a smaller field of view such as a telephoto camera
  • FIG. 4 is a schematic diagram of an image with a large field of view and a corresponding image with a small field of view provided by an embodiment of the present application.
  • In FIG. 4, the image on the left is the image with a large field of view, that is, the first image captured by the first camera mentioned above; the image on the right is the image with a small field of view, that is, the second image captured by the second camera mentioned above.
  • In related image fusion solutions, the image with a small field of view is usually fused into the image with a large field of view.
  • However, some areas of the image with a small field of view may be overexposed (such as the overexposed area marked in FIG. 4), resulting in a loss of image details that affects the clarity of the fused image.
  • the embodiment of the present application proposes an image fusion method, and a schematic diagram of its specific operation principle is shown in FIG. 5 .
  • As shown in FIG. 5, the main road (that is, the shooting channel of the main camera, which can usually be an ordinary camera with a large field of view) sets different exposure parameters according to the preview stream of the main camera, and captures a main-road long-exposure frame and a main-road short-exposure frame, where the exposure of the main-road long-exposure frame is greater than that of the main-road short-exposure frame; an HDR fusion operation is then performed on the two frames to obtain the main-road high dynamic range image.
  • Similarly, the auxiliary road (that is, the shooting channel of the auxiliary camera, which can usually be a telephoto camera with a small field of view) sets different exposure parameters according to the preview stream of the auxiliary camera, and captures an auxiliary-road long-exposure frame and an auxiliary-road short-exposure frame, where the exposure of the auxiliary-road long-exposure frame is greater than that of the auxiliary-road short-exposure frame; an HDR fusion operation is then performed on the two frames to obtain the auxiliary-road high dynamic range image. Finally, the main-road high dynamic range image and the auxiliary-road high dynamic range image are fused to obtain the fused image.
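  • The embodiment does not prescribe which HDR operator the main road uses for this fusion. Purely as an assumed baseline, a common per-pixel "well-exposedness" weighting of the long- and short-exposure frames might look like the sketch below; the Gaussian weight and its sigma are illustrative choices, not part of the disclosure:

```python
import numpy as np

def hdr_fuse_two_frames(long_frame, short_frame, sigma=0.2):
    """Fuse a long- and a short-exposure frame with Gaussian
    'well-exposedness' weights that favour mid-tone pixels."""
    lo = long_frame.astype(np.float32) / 255.0
    sh = short_frame.astype(np.float32) / 255.0
    # Pixels near mid-grey (0.5) receive the highest weight.
    w_lo = np.exp(-((lo - 0.5) ** 2) / (2.0 * sigma ** 2))
    w_sh = np.exp(-((sh - 0.5) ** 2) / (2.0 * sigma ** 2))
    fused = (w_lo * lo + w_sh * sh) / (w_lo + w_sh + 1e-6)
    return (fused * 255.0).astype(np.uint8)
```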
  • It should be noted that the HDR fusion of the auxiliary-road long-exposure frame and short-exposure frame can be guided by the image features of the main-road high dynamic range image (as indicated by the dotted line in FIG. 5), so that the obtained auxiliary-road high dynamic range image has the same or similar brightness and other image features as the main-road high dynamic range image. In other words, the main road guides the HDR effect of the auxiliary road; for the specific guidance method, refer to the related content above.
  • FIG. 6 is a schematic diagram of the effect of the main-road long-exposure frame, the main-road short-exposure frame and the main-road high dynamic range image in FIG. 5.
  • The main-road long-exposure frame and the main-road short-exposure frame are images of the same balcony scene at different exposures: the long-exposure frame has the larger exposure, so its overall brightness is brighter, while the short-exposure frame has the smaller exposure, so its overall brightness is darker.
  • After HDR fusion of the two frames, the main-road high dynamic range image is obtained; its brightness is moderate, and the image details lost in the main-road long-exposure frame and/or the main-road short-exposure frame are restored to a certain extent.
  • Similarly, FIG. 7 is a schematic diagram of the effect of the auxiliary-road long-exposure frame, the auxiliary-road short-exposure frame and the auxiliary-road high dynamic range image in FIG. 5.
  • The auxiliary-road long-exposure frame and the auxiliary-road short-exposure frame are images of the same balcony scene (the same scene as in FIG. 6) at different exposures: the long-exposure frame has the larger exposure, so its overall brightness is brighter, while the short-exposure frame has the smaller exposure, so its overall brightness is darker.
  • In addition, the main-road long-exposure frame shown in FIG. 6 contains the auxiliary-road long-exposure frame shown in FIG. 7, and the main-road short-exposure frame shown in FIG. 6 contains the auxiliary-road short-exposure frame shown in FIG. 7.
  • After HDR fusion of the two frames, the auxiliary-road high dynamic range image is obtained; its brightness is moderate, and the image details lost in the auxiliary-road long-exposure frame and/or the auxiliary-road short-exposure frame are restored to a certain extent.
  • Moreover, because the fusion is guided by the main road, the auxiliary-road high dynamic range image in FIG. 7 has image features, such as brightness, that are the same as or similar to those of the main-road high dynamic range image in FIG. 6.
  • FIG. 8 is a schematic diagram of the effect of fusing the main-road high dynamic range image in FIG. 6 with the auxiliary-road high dynamic range image in FIG. 7. Since the two high dynamic range images have the same or similar image features, the influence of the fusion on features such as brightness and color is reduced, so that the fused image retains the image features of the main-road image.
  • The right side of FIG. 8 is a schematic diagram of the fused image, in which the area within the dotted frame is the target area of the fusion; it can be seen that the definition of the target area is improved to a certain extent.
  • FIG. 9 shows a structural block diagram of an image fusion device provided in the embodiment of the present application.
  • the device includes:
  • an image acquisition module 901, configured to acquire at least two frames of first images with different exposures captured by the first camera, and acquire at least two frames of second images with different exposures captured by the second camera, wherein the angle of view of the first camera is larger than the angle of view of the second camera, and the first image includes the second image;
  • a high dynamic range processing module 902, configured to fuse the at least two frames of first images with different exposures into a first high dynamic range image, and fuse the at least two frames of second images with different exposures into a second high dynamic range image;
  • the image fusion module 903 is configured to fuse the first high dynamic range image and the second high dynamic range image to obtain a fused image.
  • the high dynamic range processing module may include:
  • a fusion weight calculation unit, configured to calculate, according to the image features of each pixel in the first high dynamic range image, the weights with which the image features of each pixel in the second high dynamic range image come from each of the second images;
  • a high dynamic range fusion unit, configured to fuse the at least two frames of second images with different exposures into the second high dynamic range image according to those weights.
  • the fusion weight calculation unit may include:
  • a first fusion weight calculation subunit, configured to calculate, according to the image features of each pixel in the first high dynamic range image and the high dynamic range fusion algorithm corresponding to the first high dynamic range image, the weights with which the image features of each pixel in the first high dynamic range image come from each of the first images;
  • a second fusion weight calculation subunit, configured to calculate, according to the weights with which the image features of each pixel in the first high dynamic range image come from each of the first images, the weights with which the image features of each pixel in the second high dynamic range image come from each of the second images.
  • Further, the second fusion weight calculation subunit can be specifically configured to: for any target pixel in the second high dynamic range image, determine the weights with which the image features of the corresponding pixel of the target pixel in the first high dynamic range image come from each of the first images as the weights with which the image features of the target pixel come from each of the second images.
  • the fusion weight calculation unit may include:
  • a third fusion weight calculation subunit, configured to: for any target pixel in the second high dynamic range image, calculate, according to the image features of the corresponding pixel of the target pixel in the first high dynamic range image and the image features of the corresponding pixels of the target pixel in each of the second images, the weights with which the image features of the target pixel come from each of the second images, wherein the image feature of the target pixel is equal to the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
  • In an embodiment, the at least two frames of second images with different exposures include a first exposure image and a second exposure image, where the exposure of the first exposure image is greater than the exposure of the second exposure image; the third fusion weight calculation subunit can be specifically configured to calculate the weights with which the image features of the target pixel come from the first exposure image and the second exposure image respectively, according to the following formulas:

    A × X + B × Y = P
    X + Y = 1

  • wherein X denotes the weight with which the image feature of the target pixel comes from the first exposure image; Y denotes the weight with which the image feature of the target pixel comes from the second exposure image; A denotes the image feature of the corresponding pixel of the target pixel in the first exposure image; B denotes the image feature of the corresponding pixel of the target pixel in the second exposure image; and P denotes the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
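  • Solving the two formulas above per pixel gives the closed form X = (P - B) / (A - B) and Y = 1 - X. The following NumPy sketch implements this solve; clipping the weights to [0, 1] is an added safeguard against division artifacts, not part of the stated formulas:

```python
import numpy as np

def guided_weights_two(A, B, P, eps=1e-6):
    """Per-pixel solve of A*X + B*Y = P with X + Y = 1:
    X = (P - B) / (A - B), Y = 1 - X."""
    A, B, P = (m.astype(np.float32) for m in (A, B, P))
    X = (P - B) / (A - B + eps)
    X = np.clip(X, 0.0, 1.0)  # safeguard: keep the weights in [0, 1]
    Y = 1.0 - X
    return X, Y

# The guided auxiliary HDR value then follows as X * A + Y * B, which
# matches P wherever the clipping did not take effect.
```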
  • In another embodiment, the at least two frames of second images with different exposures include a first exposure image, a second exposure image and a third exposure image, where the exposure of the first exposure image is greater than the exposure of the second exposure image, and the exposure of the second exposure image is greater than the exposure of the third exposure image;
  • the third fusion weight calculation subunit may specifically include:
  • a weight setting subunit, configured to set, according to the weights with which the image features of the corresponding pixel of the target pixel in the first high dynamic range image come from each of the first images, any one of the weights with which the image features of the target pixel come from the first exposure image, the second exposure image and the third exposure image as a set value;
  • a formula calculation subunit, configured to calculate the weights with which the image features of the target pixel come from the first exposure image, the second exposure image and the third exposure image respectively, according to the following formulas:

    A × X + B × Y + C × Z = P
    X + Y + Z = 1

  • wherein X denotes the weight with which the image feature of the target pixel comes from the first exposure image; Y denotes the weight with which the image feature of the target pixel comes from the second exposure image; Z denotes the weight with which the image feature of the target pixel comes from the third exposure image; A denotes the image feature of the corresponding pixel of the target pixel in the first exposure image; B denotes the image feature of the corresponding pixel of the target pixel in the second exposure image; C denotes the image feature of the corresponding pixel of the target pixel in the third exposure image; and P denotes the image feature of the corresponding pixel of the target pixel in the first high dynamic range image.
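  • With one weight (here X, for the first exposure image) fixed by the weight setting subunit, the two formulas above reduce to a per-pixel linear system in Y and Z: B*Y + C*Z = P - A*X together with Y + Z = 1 - X. A NumPy sketch of this solve follows, with clipping again added only as a safeguard:

```python
import numpy as np

def guided_weights_three(A, B, C, P, X, eps=1e-6):
    """Given a preset weight X for the first exposure image, solve
    B*Y + C*Z = P - A*X together with Y + Z = 1 - X per pixel."""
    A, B, C, P = (m.astype(np.float32) for m in (A, B, C, P))
    Y = (P - A * X - C * (1.0 - X)) / (B - C + eps)
    Y = np.clip(Y, 0.0, 1.0 - X)  # safeguard: keep Y within the remaining mass
    Z = 1.0 - X - Y
    return Y, Z
```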
  • the image acquisition module may include:
  • a first image capturing unit, configured to capture more than one frame of images with an exposure greater than a first reference exposure through the first camera, where the first reference exposure is the exposure corresponding to the preview stream of the first camera;
  • a second image capturing unit, configured to capture more than one frame of images with an exposure less than the first reference exposure through the first camera;
  • a first image determining unit, configured to determine the more than one frame of images with an exposure greater than the first reference exposure and the more than one frame of images with an exposure less than the first reference exposure as the at least two frames of first images with different exposures;
  • a third image capturing unit, configured to capture more than one frame of images with an exposure greater than a second reference exposure through the second camera, where the second reference exposure is the exposure corresponding to the preview stream of the second camera;
  • a fourth image capturing unit, configured to capture more than one frame of images with an exposure less than the second reference exposure through the second camera;
  • a second image determining unit, configured to determine the more than one frame of images with an exposure greater than the second reference exposure and the more than one frame of images with an exposure less than the second reference exposure as the at least two frames of second images with different exposures.
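  • As a toy illustration of bracketing around a preview-stream reference exposure, the frames could be spaced by a constant exposure ratio; the step factor and frame counts below are assumptions, since the embodiment only requires at least one frame above and at least one frame below the reference exposure:

```python
def bracket_exposures(ref_exposure, n_over=1, n_under=1, step=2.0):
    """Return exposure values above and below the reference exposure,
    spaced by a constant ratio (step) per frame."""
    over = [ref_exposure * step ** i for i in range(1, n_over + 1)]
    under = [ref_exposure / step ** i for i in range(1, n_under + 1)]
    return over, under

# Example: a +/-1 EV bracket around the reference exposure.
over, under = bracket_exposures(1.0)  # -> ([2.0], [0.5])
```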
  • the embodiment of the present application also provides a computer-readable storage medium, the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, various image fusion methods as proposed in the present application are implemented.
  • the embodiment of the present application also provides a computer program product, which, when the computer program product runs on the electronic device, causes the electronic device to execute each image fusion method proposed in the present application.
  • Fig. 10 is a schematic diagram of an electronic device provided by an embodiment of the present application.
  • The electronic device 100 of this embodiment includes: at least one processor 1000 (only one is shown in FIG. 10), a memory 1001, and a computer program stored in the memory 1001 and executable on the at least one processor 1000; when the processor 1000 executes the computer program, the steps in the above image fusion method embodiments are implemented.
  • The electronic device may include, but is not limited to, the processor 1000 and the memory 1001.
  • FIG. 10 is only an example of the electronic device 100 and does not constitute a limitation on the electronic device 100; it may include more or fewer components than shown, or combine certain components, or have different components; for example, it may also include input and output devices, network access devices, and so on.
  • The so-called processor 1000 may be a central processing unit (Central Processing Unit, CPU), and the processor 1000 may also be another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the memory 1001 may be an internal storage unit of the electronic device 100 in some embodiments, such as a hard disk or a memory of the electronic device 100.
  • the memory 1001 may also be an external storage device of the electronic device 100 in other embodiments, such as a plug-in hard disk equipped on the electronic device 100, a smart memory card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, flash memory card (Flash Card), etc.
  • the memory 1001 may also include both an internal storage unit of the electronic device 100 and an external storage device.
  • The memory 1001 is used to store an operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program.
  • the memory 1001 can also be used to temporarily store data that has been output or will be output.
  • It should be understood that the devices and methods disclosed in the embodiments provided in this application may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • The division of the modules or units is only a logical function division, and there may be other division methods in actual implementation; for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • Based on this understanding, all or part of the processes in the methods of the above embodiments of the present application can be implemented by a computer program instructing related hardware. The computer program can be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the foregoing method embodiments can be realized.
  • the computer program includes computer program code, and the computer program code may be in the form of source code, object code, executable file or some intermediate form.
  • The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the electronic device, a recording medium, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunication signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk or an optical disc.
  • In some jurisdictions, according to legislation and patent practice, computer-readable media may not include electrical carrier signals and telecommunication signals.

Abstract

The present application is applicable to the technical field of image processing, and provides an image fusion method, an electronic device, a computer-readable storage medium, and a computer program product. The method includes the following steps: acquiring at least two frames of first images with different exposures captured by a first camera, and acquiring at least two frames of second images with different exposures captured by a second camera, where the field of view of the first camera is larger than the field of view of the second camera, and the first images contain the second images; fusing the at least two frames of first images with different exposures into a first high dynamic range image, and fusing the at least two frames of second images with different exposures into a second high dynamic range image; and fusing the first high dynamic range image and the second high dynamic range image to obtain a fused image. Through this processing, some of the image details lost due to overexposure of the image with a small field of view can be restored, thereby improving the definition of the fused image.
PCT/CN2022/077713 2021-06-23 2022-02-24 Procédé de fusion d'image, dispositif électronique, support de stockage, et produit-programme d'ordinateur WO2022267506A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110707247.7 2021-06-23
CN202110707247.7A CN115514876B (zh) 2021-06-23 2021-06-23 图像融合方法、电子设备、存储介质及计算机程序产品

Publications (1)

Publication Number Publication Date
WO2022267506A1 true WO2022267506A1 (fr) 2022-12-29

Family

ID=84499590

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/077713 WO2022267506A1 (fr) 2021-06-23 2022-02-24 Procédé de fusion d'image, dispositif électronique, support de stockage, et produit-programme d'ordinateur

Country Status (2)

Country Link
CN (1) CN115514876B (fr)
WO (1) WO2022267506A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116528058A (zh) * 2023-05-26 2023-08-01 中国人民解放军战略支援部队航天工程大学 一种基于压缩重构的高动态成像方法和系统

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116363038A (zh) * 2023-06-02 2023-06-30 深圳英美达医疗技术有限公司 超声图像融合方法、装置、计算机设备及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170034414A1 (en) * 2015-07-31 2017-02-02 Via Alliance Semiconductor Co., Ltd. Methods for generating hdr (high dynamic range) images and apparatuses using the same
CN109863742A (zh) * 2017-01-25 2019-06-07 华为技术有限公司 图像处理方法和终端设备
CN110062159A (zh) * 2019-04-09 2019-07-26 Oppo广东移动通信有限公司 基于多帧图像的图像处理方法、装置、电子设备
CN112087580A (zh) * 2019-06-14 2020-12-15 Oppo广东移动通信有限公司 图像采集方法和装置、电子设备、计算机可读存储介质

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5188101B2 (ja) * 2007-06-01 2013-04-24 株式会社キーエンス 拡大観察装置、拡大画像撮影方法、拡大画像撮影プログラム及びコンピュータで読み取り可能な記録媒体
CN102457669B (zh) * 2010-10-15 2014-04-16 华晶科技股份有限公司 图像处理方法
CN102970549B (zh) * 2012-09-20 2015-03-18 华为技术有限公司 图像处理方法及装置
CN105933617B (zh) * 2016-05-19 2018-08-21 中国人民解放军装备学院 一种用于克服动态问题影响的高动态范围图像融合方法
CN106791377B (zh) * 2016-11-29 2019-09-27 Oppo广东移动通信有限公司 控制方法、控制装置及电子装置
US10623634B2 (en) * 2017-04-17 2020-04-14 Intel Corporation Systems and methods for 360 video capture and display based on eye tracking including gaze based warnings and eye accommodation matching
CN110276714B (zh) * 2018-03-16 2023-06-06 虹软科技股份有限公司 快速扫描式全景图图像合成方法及装置
WO2019183813A1 (fr) * 2018-03-27 2019-10-03 华为技术有限公司 Dispositif et procédé de capture d'image
CN109005342A (zh) * 2018-08-06 2018-12-14 Oppo广东移动通信有限公司 全景拍摄方法、装置和成像设备
US11128809B2 (en) * 2019-02-15 2021-09-21 Samsung Electronics Co., Ltd. System and method for compositing high dynamic range images
CN110611750B (zh) * 2019-10-31 2022-03-22 北京迈格威科技有限公司 一种夜景高动态范围图像生成方法、装置和电子设备
CN110830697A (zh) * 2019-11-27 2020-02-21 Oppo广东移动通信有限公司 控制方法、电子装置和存储介质
CN111917950B (zh) * 2020-06-30 2022-07-22 北京迈格威科技有限公司 图像处理方法、装置、电子设备及存储介质

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170034414A1 (en) * 2015-07-31 2017-02-02 Via Alliance Semiconductor Co., Ltd. Methods for generating hdr (high dynamic range) images and apparatuses using the same
CN109863742A (zh) * 2017-01-25 2019-06-07 华为技术有限公司 图像处理方法和终端设备
CN110062159A (zh) * 2019-04-09 2019-07-26 Oppo广东移动通信有限公司 基于多帧图像的图像处理方法、装置、电子设备
CN112087580A (zh) * 2019-06-14 2020-12-15 Oppo广东移动通信有限公司 图像采集方法和装置、电子设备、计算机可读存储介质

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116528058A (zh) * 2023-05-26 2023-08-01 中国人民解放军战略支援部队航天工程大学 一种基于压缩重构的高动态成像方法和系统
CN116528058B (zh) * 2023-05-26 2023-10-31 中国人民解放军战略支援部队航天工程大学 一种基于压缩重构的高动态成像方法和系统

Also Published As

Publication number Publication date
CN115514876B (zh) 2023-09-01
CN115514876A (zh) 2022-12-23

Similar Documents

Publication Publication Date Title
US10827140B2 (en) Photographing method for terminal and terminal
CN109863742B (zh) 图像处理方法和终端设备
US10810720B2 (en) Optical imaging method and apparatus
CN107302663B (zh) 一种图像亮度调整方法、终端及计算机可读存储介质
CN107038681B (zh) 图像虚化方法、装置、计算机可读存储介质和计算机设备
CN107566752B (zh) 一种拍摄方法、终端及计算机存储介质
WO2022267506A1 (fr) Procédé de fusion d'image, dispositif électronique, support de stockage, et produit-programme d'ordinateur
CN107948505B (zh) 一种全景拍摄方法及移动终端
CN108419008B (zh) 一种拍摄方法、终端及计算机可读存储介质
CN109639996B (zh) 高动态场景成像方法、移动终端及计算机可读存储介质
CN107040723B (zh) 一种基于双摄像头的成像方法、移动终端及存储介质
CN106993136B (zh) 移动终端及其基于多摄像头的图像降噪方法和装置
CN111064895B (zh) 一种虚化拍摄方法和电子设备
CN110213484B (zh) 一种拍照方法、终端设备及计算机可读存储介质
CN113179374A (zh) 图像处理方法、移动终端及存储介质
CN111447371A (zh) 一种自动曝光控制方法、终端及计算机可读存储介质
CN111885307A (zh) 一种景深拍摄方法、设备及计算机可读存储介质
CN113888452A (zh) 图像融合方法、电子设备、存储介质及计算机程序产品
CN112188082A (zh) 高动态范围图像拍摄方法、拍摄装置、终端及存储介质
CN113179369A (zh) 拍摄画面的显示方法、移动终端及存储介质
CN110177207B (zh) 逆光图像的拍摄方法、移动终端及计算机可读存储介质
US11425355B2 (en) Depth image obtaining method, image capture device, and terminal
WO2021218551A1 (fr) Procédé et appareil de photographie, dispositif terminal et support de stockage
CN114143471B (zh) 图像处理方法、系统、移动终端及计算机可读存储介质
CN115134527B (zh) 处理方法、智能终端及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22827013

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE