WO2021184496A1 - Image fusion method and apparatus, storage medium and mobile terminal - Google Patents

Image fusion method and apparatus, storage medium and mobile terminal

Info

Publication number
WO2021184496A1
WO2021184496A1
Authority
WO
WIPO (PCT)
Prior art keywords
bright
image
value
difference
bright spot
Prior art date
Application number
PCT/CN2020/087209
Other languages
English (en)
French (fr)
Inventor
朱晓璞
Original Assignee
捷开通讯(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 捷开通讯(深圳)有限公司 filed Critical 捷开通讯(深圳)有限公司
Priority to EP20925319.4A (published as EP4123574A4)
Publication of WO2021184496A1

Classifications

    • H04N 1/3871: Composing, repositioning or otherwise geometrically modifying originals, where the composed originals are of different kinds, e.g. low- and high-resolution originals
    • H04N 23/62: Control of camera parameters via user interfaces
    • H04N 23/71: Circuitry for evaluating the brightness variation in the scene
    • H04N 23/741: Compensating brightness variation by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N 23/80: Camera processing pipelines; Components thereof
    • H04N 23/84: Camera processing pipelines for processing colour signals
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/90: Dynamic range modification of images or parts thereof
    • G06T 7/10: Segmentation; Edge detection
    • G06T 2207/10016: Video; Image sequence
    • G06T 2207/10024: Color image
    • G06T 2207/20208: High dynamic range [HDR] image processing
    • G06T 2207/20221: Image fusion; Image merging

Definitions

  • This application relates to the field of Internet technology, and in particular to an image fusion method, device, storage medium and mobile terminal.
  • With the development of Internet technology, smart phones have become smaller in size and more complete in function.
  • The body of a smart phone is equipped with a camera for users to take pictures.
  • The resolution of smartphone cameras can now fully meet people's daily needs, making the smartphone the shooting device users favor most.
  • Smartphones are equipped with a High Dynamic Range (HDR) camera mode, which can effectively improve the shooting experience in scenes with a large dynamic range, especially backlit scenes.
  • Some smartphones also provide an HDR auto-start function, that is, whether to enable HDR mode is determined in real time according to the dynamic range of the environment. However, photos taken in the existing HDR mode are prone to under-exposure or over-exposure of the subject, resulting in a poor shooting effect.
  • The embodiments of the present application provide an image fusion method, device, storage medium, and mobile terminal, which can avoid under-exposure or over-exposure of the subject in images shot in HDR mode and achieve a good shooting effect.
  • the embodiment of the application provides an image fusion method, which is applied to a mobile terminal, and includes:
  • the embodiment of the present application also provides an image fusion device, which is applied to a mobile terminal, and includes:
  • an acquiring module, configured to acquire multiple preview images continuously shot when the mobile terminal starts the dynamic shooting mode;
  • a first determining module, configured to determine the subject grayscale image and the background grayscale image corresponding to each preview image;
  • a second determining module, configured to determine the brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image;
  • a fusion module, configured to fuse the multiple preview images according to the brightness adjustment parameters.
  • the first determining module is specifically configured to:
  • Image segmentation is performed on the grayscale image to obtain the corresponding subject grayscale image and background grayscale image.
  • the second determining module specifically includes:
  • a first determining unit configured to determine a first bright point set and a first dark point set in the subject grayscale image, and a second bright point set and a second dark point set in the background grayscale image;
  • a second determining unit, configured to determine a bright-point weighted value and a bright-point mean difference according to the first bright point set and the second bright point set, and to determine a dark-point weighted value and a dark-point mean difference according to the first dark point set and the second dark point set;
  • a third determining unit, configured to determine the brightness adjustment parameter of the corresponding preview image according to the bright-point weighted value, the dark-point weighted value, the bright-point mean difference, and the dark-point mean difference.
  • the second determining unit is specifically configured to:
  • the second determining unit is specifically configured to:
  • the brightness adjustment parameter includes a first brightness adjustment parameter and a second brightness adjustment parameter
  • the second determining unit is specifically configured to:
  • the fusion module is specifically used for:
  • using the first brightness adjustment parameter to adjust the brightness of the first bright point set and the second bright point set, and using the second brightness adjustment parameter to adjust the brightness of the first dark point set and the second dark point set, so as to adjust each preview image;
  • the adjusted multiple preview images are then merged into a single image to obtain a fused image.
  • An embodiment of the present application also provides a computer-readable storage medium in which a plurality of instructions are stored, and the instructions are suitable for being loaded by a processor to execute any one of the above-mentioned image fusion methods.
  • An embodiment of the present application also provides a mobile terminal, including a processor and a memory, the processor being electrically connected to the memory; the memory is used to store instructions and data, and the processor is used to execute the steps in any one of the above image fusion methods.
  • the image fusion method, device, storage medium, and mobile terminal provided in this application are applied to a mobile terminal.
  • When the mobile terminal starts the dynamic shooting mode, it acquires multiple continuously shot preview images and determines the subject grayscale image and the background grayscale image corresponding to each preview image.
  • It then determines the brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image, and fuses the multiple preview images according to the brightness adjustment parameters.
  • This avoids under-exposure or over-exposure of the subject in images captured in HDR mode and achieves a good shooting effect, which further reduces the number of user retakes and the waste of terminal resources.
  • FIG. 1 is a schematic flowchart of an image fusion method provided by an embodiment of the application.
  • FIG. 2 is a schematic diagram of another process of an image fusion method provided by an embodiment of the application.
  • FIG. 3 is a schematic diagram of an image brightness adjustment scene provided by an embodiment of the application.
  • FIG. 4 is a schematic structural diagram of an image fusion device provided by an embodiment of the application.
  • FIG. 5 is a schematic diagram of another structure of an image fusion device provided by an embodiment of the application.
  • FIG. 6 is a schematic structural diagram of a mobile terminal provided by an embodiment of the application.
  • FIG. 7 is a schematic diagram of another structure of a mobile terminal provided by an embodiment of the application.
  • the embodiments of the present application provide an image fusion method, device, storage medium, and mobile terminal.
  • Fig. 1 is a schematic flow diagram of the image fusion method provided by an embodiment of the present application.
  • the image fusion method is applied to a mobile terminal.
  • The mobile terminal can be a device with a High Dynamic Range (HDR) shooting mode, such as a smart phone, an iPad, or a smart camera. The specific process can be as follows:
  • The dynamic shooting mode is the HDR shooting mode, which captures Low Dynamic Range (LDR) images at different exposure times and synthesizes the final HDR image from the LDR images with the best detail at each exposure time, so as to better reflect the visual effect of the real environment.
  • step S102 may specifically include:
  • Image segmentation is performed on the grayscale image to obtain the corresponding subject grayscale image and background grayscale image.
  • Grayscale processing refers to converting a color image into a grayscale image.
  • Each pixel in a grayscale image is represented by a single gray value; a color whose R, G, and B components are equal represents a shade of gray.
  • For example, the RGB value of 0% gray (white) is (255, 255, 255).
  • Image segmentation refers to separating the subject from the background in the image, where the part of the grayscale image in which the subject is located is the subject grayscale image, and the part in which the background is located is the background grayscale image.
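The grayscale conversion and subject/background split described above can be sketched as follows. This is a minimal Python/NumPy illustration, not the patent's implementation: the BT.601 luminance weights and the use of a precomputed boolean subject mask are assumptions, since the patent specifies neither a conversion formula nor a segmentation algorithm.

```python
import numpy as np

def to_grayscale(rgb):
    """Convert an H x W x 3 RGB image to an H x W grayscale image.

    The BT.601 luminance weights are an assumption; the patent does not
    specify a particular grayscale conversion formula.
    """
    weights = np.array([0.299, 0.587, 0.114])
    return (rgb.astype(np.float64) @ weights).round().astype(np.uint8)

def split_subject_background(gray, subject_mask):
    """Split a grayscale image into subject and background parts.

    subject_mask is a boolean H x W array that is True where the subject
    lies; how it is obtained (e.g. portrait segmentation) is left open.
    """
    subject = np.where(subject_mask, gray, 0)
    background = np.where(subject_mask, 0, gray)
    return subject, background
```

The mask-based split simply zeroes out the other region, so the two outputs keep the original image geometry, which makes the later per-region statistics straightforward.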
  • step S103 may specifically include:
  • Pixels whose grayscale value is lower than a brightness threshold can be regarded as dark points, and pixels whose grayscale value is not lower than the brightness threshold can be regarded as bright points. All the bright points in the subject grayscale image then form the first bright point set and all its dark points form the first dark point set, while all the bright points in the background grayscale image form the second bright point set and all its dark points form the second dark point set.
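The threshold-based partition just described can be sketched as follows. The threshold value of 128 and the (x, y, gray value) row representation of a point are illustrative assumptions; the patent only states that pixels below a brightness threshold are dark points and the rest are bright points.

```python
import numpy as np

def bright_dark_points(gray, mask, threshold=128):
    """Partition the pixels selected by `mask` into bright and dark points.

    Returns two (N, 3) arrays of (x, y, gray value) rows.  Applied with
    the subject mask this yields the first bright/dark point sets, and
    with the background mask the second bright/dark point sets.
    """
    ys, xs = np.nonzero(mask)
    vals = gray[ys, xs].astype(np.int64)
    pts = np.stack([xs, ys, vals], axis=1)
    is_bright = vals >= threshold
    return pts[is_bright], pts[~is_bright]
```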
  • The bright-point weighted value and the dark-point weighted value can be calculated in the same way, and the bright-point mean difference and the dark-point mean difference can be calculated with the same formula.
  • The bright-point weighted value and the bright-point mean difference are obtained by processing the bright points.
  • The dark-point weighted value and the dark-point mean difference are obtained by processing the dark points.
  • The above step of "determining the bright-point weighted value and the bright-point mean difference according to the first bright point set and the second bright point set" may specifically include:
  • determining the bright-point weighted value according to the first bright point set, the second bright point set, the image center point, and the bright-point mean difference.
  • The average brightness of the subject bright points reflects the brightness of the subject image, and the average brightness of the background bright points reflects the brightness of the background image.
  • The difference between the subject and background bright-point averages (that is, the bright-point mean difference) reflects the brightness difference between the subject and the background.
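The bright-point mean difference can be sketched as a simple difference of averages. The sign convention (subject minus background) is an assumption; the patent only states that the difference of the averages reflects the subject/background contrast.

```python
import numpy as np

def bright_point_mean_difference(subject_vals, background_vals):
    """Bright-point mean difference: subject bright-point average minus
    background bright-point average.

    Both inputs are sequences of gray values (e.g. the third column of
    the point arrays).  The same formula applies to the dark points.
    """
    return float(np.mean(subject_vals) - np.mean(background_vals))
```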
  • The image center point can be the geometric center of the subject grayscale image, or it can be a point selected based on a key part of the subject.
  • The key part of the subject can be determined by the type of the subject.
  • For a human face, the image center point can be the center of the face; for architecture, the entire object can be regarded as the key part, and the image center point can be its geometric center.
  • the above step of "determining a bright spot weight value according to the first bright spot set, the second bright spot set, the image center point, and the average difference of the bright spot” may include:
  • The bright-point weighted value is a brightness measure over the entire preview image (both subject and background) that takes into account the distance between each bright point and the image center point, so that an image adjusted according to this weighted value can largely prevent the subject from being over-exposed or under-exposed, thereby highlighting the subject.
  • For example, suppose the first bright point set includes {a1, a2...an} and the image center point is O.
  • The distance between each first bright point and the image center point, such as the distance La1O between a1 and O, can then be calculated from the image coordinates of each first bright point in {a1, a2...an} and the image coordinates of point O.
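The patent states that the weighted value accounts for each bright point's distance to the image center point O, but gives no explicit formula for combining distance with brightness. The sketch below therefore uses a hypothetical inverse-distance weighting (points nearer the subject center count more); the actual weighting in the patent may differ.

```python
import numpy as np

def bright_point_weighted_value(points, center):
    """Distance-aware weighted brightness of a set of bright points.

    `points` is an (N, 3) array-like of (x, y, gray value) rows and
    `center` is the image center point O as (x, y).  The inverse-distance
    weight 1 / (1 + La_iO) is an assumption made for illustration.
    """
    pts = np.asarray(points, dtype=np.float64)
    # Distance La_iO between each bright point a_i and the center O.
    dists = np.hypot(pts[:, 0] - center[0], pts[:, 1] - center[1])
    w = 1.0 / (1.0 + dists)  # +1 avoids division by zero at O itself
    return float(np.sum(w * pts[:, 2]) / np.sum(w))
```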
  • the brightness adjustment parameter includes a first brightness adjustment parameter and a second brightness adjustment parameter
  • the foregoing steps 1-3 may specifically include:
  • Different difference ranges can be set in advance for the dark points (bright points), with each difference range associated with a second brightness adjustment parameter (first brightness adjustment parameter); the second brightness adjustment parameter (first brightness adjustment parameter) is then obtained by determining which difference range the actual difference falls into.
  • The first difference and the second difference can be zero, positive, or negative.
  • When the difference is zero, the exposure can be considered appropriate.
  • When the first difference or the second difference is positive, the image can be regarded as under-exposed; when it is negative, the image can be regarded as over-exposed.
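The range lookup described above can be sketched as a small table. The concrete intervals and gain values in PRESET are invented for illustration; the patent only says a parameter is preset per difference range, with positive differences (under-exposure) calling for brightening and negative ones (over-exposure) for darkening.

```python
def adjustment_parameter(difference, ranges):
    """Look up a brightness adjustment parameter by difference range.

    `ranges` maps half-open intervals [low, high) to a multiplicative
    gain.  A difference outside every interval falls back to gain 1.0.
    """
    for (low, high), gain in ranges:
        if low <= difference < high:
            return gain
    return 1.0

# Hypothetical preset table (intervals and gains are assumptions).
PRESET = [
    ((-256, -64), 0.7),  # strongly over-exposed
    ((-64, 0), 0.9),     # slightly over-exposed
    ((0, 1), 1.0),       # difference of zero: exposure appropriate
    ((1, 64), 1.1),      # slightly under-exposed
    ((64, 256), 1.3),    # strongly under-exposed
]
```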
  • For example, if a preview image is a selfie of user B, the subject is the human face and the background is the image area other than the human face.
  • the multiple preview images are merged according to the brightness adjustment parameter.
  • step S104 may specifically include:
  • The bright points and dark points are adjusted through the corresponding first brightness adjustment parameter and second brightness adjustment parameter respectively, so as to ensure that the brightness of each preview image is within an appropriate range.
  • The multiple preview images are then fused, so that the fused image has more image detail and a better image effect than each original preview image.
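The adjust-then-merge step can be sketched as follows. The multiplicative gains applied to bright and dark pixels and the per-pixel average used for merging are illustrative stand-ins: the patent describes adjusting the bright and dark point sets with the first and second brightness adjustment parameters and then merging the adjusted previews, without fixing the exact fusion rule.

```python
import numpy as np

def adjust_and_fuse(previews, params):
    """Apply per-image brightness gains and fuse the previews.

    `previews` is a list of H x W grayscale images and `params` a list of
    (bright_gain, dark_gain, threshold) triples, one per preview: pixels
    at or above the threshold are scaled by bright_gain (the first
    brightness adjustment parameter), the rest by dark_gain (the second).
    """
    adjusted = []
    for img, (bright_gain, dark_gain, threshold) in zip(previews, params):
        f = img.astype(np.float64)
        gain = np.where(f >= threshold, bright_gain, dark_gain)
        adjusted.append(np.clip(f * gain, 0, 255))
    # Per-pixel average as a simple stand-in for the fusion rule.
    return np.clip(np.mean(adjusted, axis=0), 0, 255).astype(np.uint8)
```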
  • the image fusion method provided by this application is applied to a mobile terminal.
  • When the mobile terminal starts the dynamic shooting mode, it acquires multiple continuously shot preview images, determines the subject grayscale image and the background grayscale image corresponding to each preview image, then determines the brightness adjustment parameter of the corresponding preview image according to the subject grayscale image and the background grayscale image, and fuses the multiple preview images according to the brightness adjustment parameter.
  • This avoids under-exposure or over-exposure of the subject in images shot in HDR mode and achieves a good shooting effect, which further reduces the number of user retakes and the waste of terminal resources.
  • The image fusion device can be implemented as an independent entity or integrated in a mobile terminal, which can be a smart phone, an iPad, a smart camera, or another device with a High Dynamic Range (HDR) shooting mode.
  • the image fusion device may include: an acquisition module 10, a first determination module 20, a second determination module 30, and a fusion module 40, wherein:
  • the acquiring module 10 is configured to acquire multiple preview images that are continuously shot when the mobile terminal starts the dynamic shooting mode.
  • The dynamic shooting mode is the HDR shooting mode, which captures Low Dynamic Range (LDR) images at different exposure times and synthesizes the final HDR image from the LDR images with the best detail at each exposure time, so as to better reflect the visual effect of the real environment.
  • The first determining module 20 is used to determine the subject grayscale image and the background grayscale image corresponding to each preview image.
  • the first determining module 20 is specifically used for:
  • Image segmentation is performed on the grayscale image to obtain the corresponding subject grayscale image and background grayscale image.
  • Grayscale processing refers to converting a color image into a grayscale image.
  • Each pixel in a grayscale image is represented by a single gray value; a color whose R, G, and B components are equal represents a shade of gray.
  • For example, the RGB value of 0% gray (white) is (255, 255, 255).
  • Image segmentation refers to separating the subject from the background in the image, where the part of the grayscale image in which the subject is located is the subject grayscale image, and the part in which the background is located is the background grayscale image.
  • the second determining module 30 is configured to determine the brightness adjustment parameter of the corresponding preview image according to the subject grayscale image and the background grayscale image.
  • the second determining module 30 specifically includes:
  • the first determining unit 31 is configured to determine the first bright point set and the first dark point set in the subject grayscale image, and the second bright point set and the second dark point set in the background grayscale image.
  • Pixels whose grayscale value is lower than a brightness threshold can be regarded as dark points, and pixels whose grayscale value is not lower than the brightness threshold can be regarded as bright points. All the bright points in the subject grayscale image then form the first bright point set and all its dark points form the first dark point set, while all the bright points in the background grayscale image form the second bright point set and all its dark points form the second dark point set.
  • The second determining unit 32 is configured to determine the bright-point weighted value and the bright-point mean difference according to the first bright point set and the second bright point set, and to determine the dark-point weighted value and the dark-point mean difference according to the first dark point set and the second dark point set.
  • The bright-point weighted value and the dark-point weighted value can be calculated in the same way, and the bright-point mean difference and the dark-point mean difference can be calculated with the same formula.
  • The bright-point weighted value and the bright-point mean difference are obtained by processing the bright points.
  • The dark-point weighted value and the dark-point mean difference are obtained by processing the dark points.
  • the second determining unit 32 is specifically configured to:
  • When performing the above step of "determining the bright-point weighted value and the bright-point mean difference according to the first bright point set and the second bright point set", the second determining unit 32 is specifically configured to:
  • determine the bright-point weighted value according to the first bright point set, the second bright point set, the image center point, and the bright-point mean difference.
  • The average brightness of the subject bright points reflects the brightness of the subject image, and the average brightness of the background bright points reflects the brightness of the background image.
  • The difference between the subject and background bright-point averages (that is, the bright-point mean difference) reflects the brightness difference between the subject and the background.
  • The image center point can be the geometric center of the subject grayscale image, or it can be a point selected based on a key part of the subject.
  • The key part of the subject can be determined by the type of the subject.
  • For a human face, the image center point can be the center of the face; for architecture, the entire object can be regarded as the key part, and the image center point can be its geometric center.
  • the second determining unit 32 is specifically configured to:
  • The bright-point weighted value is a brightness measure over the entire preview image (both subject and background) that takes into account the distance between each bright point and the image center point, so that an image adjusted according to this weighted value can largely prevent the subject from being over-exposed or under-exposed, thereby highlighting the subject.
  • For example, suppose the first bright point set includes {a1, a2...an} and the image center point is O.
  • The distance between each first bright point and the image center point, such as the distance La1O between a1 and O, can then be calculated from the image coordinates of each first bright point in {a1, a2...an} and the image coordinates of point O.
  • The third determining unit 33 is configured to determine the brightness adjustment parameter of the corresponding preview image according to the bright-point weighted value, the dark-point weighted value, the bright-point mean difference, and the dark-point mean difference.
  • the brightness adjustment parameter includes a first brightness adjustment parameter and a second brightness adjustment parameter
  • the third determining unit 33 is specifically configured to:
  • The first difference and the second difference can be zero, positive, or negative.
  • When the difference is zero, the exposure can be considered appropriate.
  • When the first difference or the second difference is positive, the image can be regarded as under-exposed; when it is negative, the image can be regarded as over-exposed.
  • For example, if a preview image is a selfie of user B, the subject is the human face and the background is the image area other than the human face.
  • the fusion module 40 is used for fusing the multiple preview images according to the brightness adjustment parameter.
  • the fusion module 40 is specifically used for:
  • The bright points and dark points are adjusted through the corresponding first brightness adjustment parameter and second brightness adjustment parameter respectively, so as to ensure that the brightness of each preview image is within an appropriate range.
  • The multiple preview images are then fused, so that the fused image has more image detail and a better image effect than each original preview image.
  • each of the above units can be implemented as an independent entity, or can be combined arbitrarily, and implemented as the same or several entities.
  • For details of each of the above units, please refer to the previous method embodiments, which will not be repeated here.
  • the image fusion device is applied to a mobile terminal.
  • When the mobile terminal starts the dynamic shooting mode, it acquires multiple continuously shot preview images through the acquisition module 10, the first determining module 20 determines the subject grayscale image and the background grayscale image corresponding to each preview image, the second determining module 30 determines the brightness adjustment parameter of the corresponding preview image, and the fusion module 40 fuses the multiple preview images accordingly.
  • This avoids under-exposure or over-exposure of the subject in images captured in HDR mode and achieves a good shooting effect, which further reduces the number of user retakes and the waste of terminal resources.
  • an embodiment of the present invention also provides an image fusion system, including any one of the image fusion devices provided in the embodiments of the present invention, and the image fusion device may be integrated in a mobile terminal.
  • the mobile terminal can acquire multiple preview images that are continuously shot when the dynamic shooting mode is activated;
  • the multiple preview images are merged according to the brightness adjustment parameter.
  • Since the image fusion system can include any image fusion device provided in the embodiments of the present invention, it can achieve the beneficial effects of any such image fusion device; for details, refer to the previous embodiments, which will not be repeated here.
  • the embodiment of the present application also provides a terminal device, which may be a device such as a smart phone or a smart vehicle.
  • the terminal device 200 includes a processor 201 and a memory 202. Wherein, the processor 201 and the memory 202 are electrically connected.
  • The processor 201 is the control center of the terminal device 200. It connects the various parts of the entire terminal device through various interfaces and lines, executes the various functions of the device and processes data, so as to monitor the terminal device as a whole.
  • In this embodiment, the processor 201 in the terminal device 200 loads the instructions corresponding to the processes of one or more application programs into the memory 202, and runs the application programs stored in the memory 202 so as to implement the following steps:
  • the multiple preview images are merged according to the brightness adjustment parameter.
  • FIG. 7 shows a specific structural block diagram of a terminal device provided by an embodiment of the present invention, and the terminal device may be used to implement the image fusion method provided in the above-mentioned embodiment.
  • the terminal device 300 may be a smart phone or a tablet computer.
  • The RF circuit 310 is used to receive and send electromagnetic waves and to realize the mutual conversion between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices.
  • the RF circuit 310 may include various existing circuit elements for performing these functions, for example, an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a subscriber identity module (SIM) card, a memory, and so on.
  • the RF circuit 310 can communicate with various networks such as the Internet, an intranet, and a wireless network, or communicate with other devices through a wireless network.
  • the aforementioned wireless network may include a cellular telephone network, a wireless local area network, or a metropolitan area network.
  • The above-mentioned wireless network can use various communication standards, protocols, and technologies, including but not limited to Global System for Mobile Communication (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (such as the IEEE standards IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (Wi-Max), other protocols used for mail, instant messaging, and short messages, and any other suitable communication protocol, even protocols that have not yet been developed.
  • the memory 320 can be used to store software programs and modules, such as the program instructions/modules corresponding to the automatic fill-light system and method for front-camera photography in the above-mentioned embodiments.
  • the processor 380 executes various functional applications and data processing by running the software programs and modules stored in the memory 320, thereby implementing the function of automatic fill light when taking pictures with the front camera.
  • the memory 320 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 320 may further include a memory remotely provided with respect to the processor 380, and these remote memories may be connected to the terminal device 300 through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • the input unit 330 may be used to receive inputted digital or character information, and generate keyboard, mouse, joystick, optical or trackball signal input related to user settings and function control.
  • the input unit 330 may include a touch-sensitive surface 331 and other input devices 332.
  • the touch-sensitive surface 331, also called a touch screen or a touchpad, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch-sensitive surface 331 using a finger, stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program.
  • the touch-sensitive surface 331 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 380, and can receive and execute the commands sent by the processor 380.
  • the touch-sensitive surface 331 can be realized in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the input unit 330 may also include other input devices 332.
  • the other input device 332 may include, but is not limited to, one or more of a physical keyboard, function keys (such as a volume control button, a switch button, etc.), a trackball, a mouse, and a joystick.
  • the display unit 340 may be used to display information input by the user or information provided to the user and various graphical user interfaces of the terminal device 300. These graphical user interfaces may be composed of graphics, text, icons, videos, and any combination thereof.
  • the display unit 340 may include a display panel 341.
  • the display panel 341 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like.
  • the touch-sensitive surface 331 may cover the display panel 341.
  • when the touch-sensitive surface 331 detects a touch operation on or near it, it transmits the operation to the processor 380 to determine the type of the touch event, and the processor 380 then provides a corresponding visual output on the display panel 341 according to that type.
  • although the touch-sensitive surface 331 and the display panel 341 are used as two independent components to realize the input and output functions, in some embodiments the touch-sensitive surface 331 and the display panel 341 can be integrated to realize both input and output functions.
  • the terminal device 300 may also include at least one sensor 350, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 341 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 341 and/or the backlight when the terminal device 300 is moved to the ear.
  • as a kind of motion sensor, the gravity acceleration sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary.
  • the terminal device 300 can also be configured with other sensors such as a gyroscope, barometer, hygrometer, thermometer, and infrared sensor, which will not be described in detail here.
  • the audio circuit 360, the speaker 361, and the microphone 362 can provide an audio interface between the user and the terminal device 300.
  • the audio circuit 360 can transmit the electrical signal converted from the received audio data to the speaker 361, which converts it into a sound signal for output; on the other hand, the microphone 362 converts the collected sound signal into an electrical signal, which the audio circuit 360 receives and converts into audio data; after the audio data is processed by the processor 380, it is sent via the RF circuit 310 to, for example, another terminal, or output to the memory 320 for further processing.
  • the audio circuit 360 may also include an earphone jack to provide communication between a peripheral headset and the terminal device 300.
  • the terminal device 300 can help users send and receive emails, browse webpages, and access streaming media through the transmission module 370 (for example, a Wi-Fi module), and it provides users with wireless broadband Internet access.
  • although FIG. 7 shows the transmission module 370, it is understandable that it is not an essential component of the terminal device 300 and can be omitted as needed without changing the essence of the invention.
  • the processor 380 is the control center of the terminal device 300; it uses various interfaces and lines to connect the various parts of the entire mobile phone, and performs the various functions of the terminal device 300 and processes data by running or executing software programs and/or modules stored in the memory 320 and calling data stored in the memory 320, so as to monitor the mobile phone as a whole.
  • the processor 380 may include one or more processing cores; in some embodiments, the processor 380 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may not be integrated into the processor 380.
  • the terminal device 300 also includes a power source 390 (such as a battery) for supplying power to various components.
  • the power source may be logically connected to the processor 380 through a power management system, so as to manage charging, discharging, power consumption, and other functions through the power management system.
  • the power source 390 may also include any components such as one or more DC or AC power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, and a power status indicator.
  • the terminal device 300 may also include a camera (such as a front camera, a rear camera), a Bluetooth module, etc., which will not be repeated here.
  • the display unit of the terminal device is a touch screen display, and the terminal device also includes a memory and one or more programs.
  • one or more programs are stored in the memory and configured to be executed by the one or more processors; the one or more programs include instructions for performing the following operations:
  • when the mobile terminal starts the dynamic shooting mode, acquiring multiple continuously captured preview images; determining a subject grayscale image and a background grayscale image corresponding to each preview image; determining brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image; and fusing the multiple preview images according to the brightness adjustment parameters.
  • each of the above modules can be implemented as an independent entity, or combined arbitrarily and implemented as one or several entities.
  • for the specific implementation of each of the above modules, please refer to the previous method embodiments; details are not repeated here.
  • an embodiment of the present invention provides a storage medium in which multiple instructions are stored, and the instructions can be loaded by a processor to execute the steps in any image fusion method provided in the embodiments of the present invention.
  • the storage medium may include: read-only memory (ROM, Read Only Memory), random access memory (RAM, Random Access Memory), disk or CD, etc.
  • since the instructions stored in the storage medium can execute the steps in any image fusion method provided by the embodiments of the present invention, they can achieve the beneficial effects achievable by any image fusion method provided by the embodiments of the present invention.
  • for details of the beneficial effects, refer to the previous embodiments; they are not repeated here.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

An image fusion method and device, a storage medium, and a mobile terminal. When the mobile terminal starts a dynamic shooting mode, multiple continuously captured preview images are acquired; a subject grayscale image and a background grayscale image corresponding to each preview image are determined; brightness adjustment parameters of the corresponding preview image are determined according to the subject grayscale image and the background grayscale image; and the multiple preview images are fused according to the brightness adjustment parameters.

Description

Image fusion method and device, storage medium, and mobile terminal
This application claims priority to Chinese patent application No. 202010187708.8, entitled "Image fusion method and device, storage medium and mobile terminal" and filed with the Chinese Patent Office on March 17, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of Internet technologies, and in particular to an image fusion method and device, a storage medium, and a mobile terminal.
Background
With the continuous development of terminal technology, smartphones have become smaller and more full-featured; for example, the body of a smartphone is equipped with a camera for the user to take pictures. Over recent years of development, the camera pixels of smartphones have become fully capable of meeting people's daily needs, making smartphones the shooting equipment that users like best.
Most current smartphones are equipped with a High Dynamic Range (HDR) photo mode, which can effectively improve the shooting experience in high-dynamic-range environments, especially backlit ones; some smartphones even implement an HDR auto-start function, that is, they can decide in real time whether to use the HDR mode according to the dynamic range of the environment. However, photos taken in the existing HDR mode are prone to under-exposure or over-exposure of the shooting subject, resulting in a poor shooting effect.
Technical Problem
Embodiments of the present application provide an image fusion method and device, a storage medium, and a mobile terminal, which can prevent the shooting subject in images captured in HDR mode from being under-exposed or over-exposed and achieve a good shooting effect.
Technical Solution
An embodiment of the present application provides an image fusion method, applied to a mobile terminal, including:
when the mobile terminal starts a dynamic shooting mode, acquiring multiple continuously captured preview images;
determining a subject grayscale image and a background grayscale image corresponding to each of the preview images;
determining brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image;
fusing the multiple preview images according to the brightness adjustment parameters.
An embodiment of the present application further provides an image fusion device, applied to a mobile terminal, including:
an acquisition module, configured to acquire multiple continuously captured preview images when the mobile terminal starts the dynamic shooting mode;
a first determination module, configured to determine a subject grayscale image and a background grayscale image corresponding to each of the preview images;
a second determination module, configured to determine brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image;
a fusion module, configured to fuse the multiple preview images according to the brightness adjustment parameters.
The first determination module is specifically configured to:
perform grayscale processing on each of the preview images to obtain a corresponding grayscale image;
perform image segmentation on the grayscale image to obtain the corresponding subject grayscale image and background grayscale image.
The second determination module specifically includes:
a first determination unit, configured to determine a first bright point set and a first dark point set in the subject grayscale image, and a second bright point set and a second dark point set in the background grayscale image;
a second determination unit, configured to determine a bright-point weighted value and a bright-point mean difference according to the first bright point set and the second bright point set, and determine a dark-point weighted value and a dark-point mean difference according to the first dark point set and the second dark point set;
a third determination unit, configured to determine the brightness adjustment parameters of the corresponding preview image according to the bright-point weighted value, the dark-point weighted value, the bright-point mean difference, and the dark-point mean difference.
The second determination unit is specifically configured to:
determine the average of the grayscale values of all first bright points in the first bright point set to obtain a subject bright-point mean, and determine the average of the grayscale values of all second bright points in the second bright point set to obtain a background bright-point mean;
calculate the difference between the subject bright-point mean and the background bright-point mean to obtain the bright-point mean difference;
determine an image center point in the subject grayscale image;
determine the bright-point weighted value according to the first bright point set, the second bright point set, the image center point, and the bright-point mean difference.
The second determination unit is further specifically configured to:
determine a first distance value between each first bright point in the first bright point set and the image center point, and determine a second distance value between each second bright point in the second bright point set and the image center point;
determine the weight of the corresponding first bright point according to the first distance value, and determine the weight of the corresponding second bright point according to the second distance value;
calculate the product of the weight of each first bright point and the bright-point mean difference, and calculate the product of the weight of each second bright point and the bright-point mean difference;
calculate the average of the sum of all the products to obtain the bright-point weighted value.
The brightness adjustment parameters include a first brightness adjustment parameter and a second brightness adjustment parameter, and the second determination unit is specifically configured to:
calculate the difference between the bright-point weighted value and the bright-point mean difference to obtain a first difference, and determine the first brightness adjustment parameter of the first bright point set and the second bright point set according to the first difference;
calculate the difference between the dark-point weighted value and the dark-point mean difference to obtain a second difference, and determine the second brightness adjustment parameter of the first dark point set and the second dark point set according to the second difference.
The fusion module is specifically configured to:
perform brightness adjustment on the first bright point set and the second bright point set using the first brightness adjustment parameter, and perform brightness adjustment on the first dark point set and the second dark point set using the second brightness adjustment parameter, so as to adjust the preview image;
fuse the adjusted multiple preview images into a single image to obtain a fused image.
An embodiment of the present application further provides a computer-readable storage medium storing multiple instructions, the instructions being adapted to be loaded by a processor to execute any of the above image fusion methods.
An embodiment of the present application further provides a mobile terminal, including a processor and a memory electrically connected to each other, the memory being used to store instructions and data, and the processor being used to execute the steps in any of the above image fusion methods.
Beneficial Effects
The image fusion method and device, storage medium, and mobile terminal provided by the present application are applied to a mobile terminal. When the mobile terminal starts the dynamic shooting mode, multiple continuously captured preview images are acquired, a subject grayscale image and a background grayscale image corresponding to each preview image are determined, brightness adjustment parameters of the corresponding preview image are then determined according to the subject and background grayscale images, and the multiple preview images are fused according to the brightness adjustment parameters. This prevents the shooting subject in images captured in HDR mode from being under-exposed or over-exposed, achieves a good shooting effect, further reduces the number of user retakes, and reduces the waste of terminal resources.
Brief Description of the Drawings
The technical solutions and other beneficial effects of the present application will be apparent from the following detailed description of specific embodiments of the present application with reference to the accompanying drawings.
FIG. 1 is a schematic flowchart of an image fusion method provided by an embodiment of the present application.
FIG. 2 is another schematic flowchart of the image fusion method provided by an embodiment of the present application.
FIG. 3 is a schematic diagram of an image brightness adjustment scenario provided by an embodiment of the present application.
FIG. 4 is a schematic structural diagram of an image fusion device provided by an embodiment of the present application.
FIG. 5 is another schematic structural diagram of the image fusion device provided by an embodiment of the present application.
FIG. 6 is a schematic structural diagram of a mobile terminal provided by an embodiment of the present application.
FIG. 7 is another schematic structural diagram of the mobile terminal provided by an embodiment of the present application.
Embodiments of the Invention
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings in the embodiments of the present application. Obviously, the described embodiments are only a part rather than all of the embodiments of the present application. Based on the embodiments of the present application, all other embodiments obtained by those skilled in the art without creative work fall within the protection scope of the present application.
The embodiments of the present application provide an image fusion method and device, a storage medium, and a mobile terminal.
As shown in FIG. 1, FIG. 1 is a schematic flowchart of the image fusion method provided by an embodiment of the present application. The image fusion method is applied to a mobile terminal, which may be a device with a High Dynamic Range (HDR) shooting mode such as a smartphone, an iPad, or a smart camera. The specific flow may be as follows:
S101. When the mobile terminal starts the dynamic shooting mode, acquire multiple continuously captured preview images.
The dynamic shooting mode is the HDR shooting mode, which can synthesize a final HDR image from Low Dynamic Range (LDR) images taken at different exposure times, using the LDR image with the best detail corresponding to each exposure time, so as to better reflect the visual effects of the real environment.
S102. Determine a subject grayscale image and a background grayscale image corresponding to each preview image.
For example, the above step S102 may specifically include:
performing grayscale processing on each preview image to obtain a corresponding grayscale image;
performing image segmentation on the grayscale image to obtain the corresponding subject grayscale image and background grayscale image.
Grayscale processing refers to converting a color image into a grayscale image, in which each pixel is represented by a gray value. In the RGB model, if R=G=B, the color represents a shade of gray, where the value of R=G=B is called the gray value, ranging from 0 to 255; the RGB value of 0% gray is 255,255,255.
Image segmentation refers to separating the background and the subject in an image; the part of the grayscale image where the subject is located is the subject grayscale image, and the part where the background is located is the background grayscale image.
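As a concrete illustration of the grayscale-conversion and segmentation step above, here is a minimal Python sketch. The luminance weights and the `subject_mask` input are assumptions made for illustration: the patent does not prescribe a particular conversion formula or segmentation algorithm (the mask could come from face detection, saliency segmentation, etc.).

```python
import numpy as np

def to_grayscale(rgb):
    """Convert an HxWx3 RGB image to an HxW grayscale image (0-255).

    If R == G == B for a pixel, the gray value is simply that shared value;
    otherwise the common luminance weights are used (an assumed choice).
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (0.299 * r + 0.587 * g + 0.114 * b).astype(np.uint8)

def split_subject_background(gray, subject_mask):
    """Split a grayscale image into subject and background parts.

    `subject_mask` is a boolean HxW array marking subject pixels; how it is
    produced is left open here.  Non-subject pixels are zeroed in the
    subject image and vice versa.
    """
    subject = np.where(subject_mask, gray, 0)
    background = np.where(subject_mask, 0, gray)
    return subject, background
```

Together the two functions produce the "subject grayscale image" and "background grayscale image" that the later steps operate on.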
S103. Determine brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image.
For example, referring to FIG. 2, the above step S103 may specifically include:
1-1. Determine a first bright point set and a first dark point set in the subject grayscale image, and a second bright point set and a second dark point set in the background grayscale image.
Pixels whose grayscale values are below a brightness threshold may be taken as dark points, and pixels whose grayscale values are not below the brightness threshold as bright points. All bright points in the subject grayscale image then form the first bright point set and all dark points the first dark point set; all bright points in the background grayscale image form the second bright point set and all dark points the second dark point set.
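The bright/dark classification just described can be sketched as follows; the threshold value of 128 is a hypothetical choice, since the text only speaks of "a brightness threshold":

```python
import numpy as np

def bright_dark_sets(gray, mask, threshold=128):
    """Collect (row, col, gray value) bright and dark points in a region.

    A pixel whose gray value is below `threshold` is a dark point;
    otherwise it is a bright point.  `mask` selects the region (subject
    or background) whose points are collected.
    """
    rows, cols = np.nonzero(mask)
    values = gray[rows, cols]
    bright = [(r, c, v) for r, c, v in zip(rows, cols, values) if v >= threshold]
    dark = [(r, c, v) for r, c, v in zip(rows, cols, values) if v < threshold]
    return bright, dark
```

Calling it once with the subject mask and once with the background mask yields the first and second bright/dark point sets.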
1-2. Determine a bright-point weighted value and a bright-point mean difference according to the first and second bright point sets, and determine a dark-point weighted value and a dark-point mean difference according to the first and second dark point sets.
The bright-point weighted value and the dark-point weighted value may be calculated in the same way, and the bright-point mean difference and the dark-point mean difference may use the same formula; the bright-point weighted value and bright-point mean difference are obtained by processing the bright points, while the dark-point weighted value and dark-point mean difference are obtained by processing the dark points.
For example, the above step of determining the bright-point weighted value and the bright-point mean difference according to the first and second bright point sets may specifically include:
determining the average of the grayscale values of all first bright points in the first bright point set to obtain a subject bright-point mean, and determining the average of the grayscale values of all second bright points in the second bright point set to obtain a background bright-point mean;
calculating the difference between the subject bright-point mean and the background bright-point mean to obtain the bright-point mean difference;
determining an image center point in the subject grayscale image;
determining the bright-point weighted value according to the first bright point set, the second bright point set, the image center point, and the bright-point mean difference.
The subject bright-point mean reflects the brightness of the subject image, the background bright-point mean reflects the brightness of the background image, and the difference between them (that is, the bright-point mean difference) reflects the brightness contrast between the subject and the background. The image center point may be the geometric center of the subject grayscale image, or a point selected based on a key part of the subject; the key part may depend on the subject type. For example, for a person the face is the key part and the image center point may be the center of the face; for a building the whole object may be regarded as the key part and the image center point may be the geometric center.
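The subject/background means and their difference follow directly from the description above. This is a minimal sketch, with points represented as (row, col, gray value) tuples as an assumed data layout; the same function applies unchanged to dark points:

```python
def bright_point_mean_difference(subject_bright, background_bright):
    """Bright-point mean difference: subject mean gray minus background mean gray.

    Each input is a non-empty list of (row, col, gray value) bright points.
    """
    subject_mean = sum(v for _, _, v in subject_bright) / len(subject_bright)
    background_mean = sum(v for _, _, v in background_bright) / len(background_bright)
    return subject_mean - background_mean
```

A positive result means the subject is on average brighter than the background, a negative one the reverse.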
For example, the above step of determining the bright-point weighted value according to the first bright point set, the second bright point set, the image center point, and the bright-point mean difference may include:
determining a first distance value between each first bright point in the first bright point set and the image center point, and determining a second distance value between each second bright point in the second bright point set and the image center point;
determining the weight of the corresponding first bright point according to the first distance value, and determining the weight of the corresponding second bright point according to the second distance value;
calculating the product of the weight of each first bright point and the bright-point mean difference, and calculating the product of the weight of each second bright point and the bright-point mean difference;
calculating the average of the sum of all the products to obtain the bright-point weighted value.
A weight may be set in advance for each distance value, so that only the weight corresponding to the actually calculated distance value needs to be looked up; alternatively, multiple distance-value ranges may be set in advance, each with its own weight, so that it is only necessary to determine which range the calculated distance value falls into and obtain the weight of that range. Generally, since a captured image should highlight the subject, the farther a bright or dark point is from the subject center, the smaller its weight should be, and conversely the larger. The bright-point weighted value is a brightness difference obtained by taking into account the distances between the bright points in the entire preview image (including subject and background) and the image center point, so an image adjusted according to it can largely avoid over-exposure or under-exposure of the subject and better highlight the subject.
For example, if the first bright point set includes {a1, a2...an} and the image center point is O, the distance between each first bright point and the image center point can be determined from the image coordinates of each first bright point in {a1, a2...an} and those of O, such as the distance La1O between a1 and O; the larger La1O is, the smaller the obtained weight should be.
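The distance-based weighting above can be sketched as below. The weight function 1/(1+d) is an illustrative assumption that satisfies the stated rule (farther from the center, smaller weight); the text leaves the exact weight table open:

```python
import math

def bright_point_weighted_value(points, center, mean_diff):
    """Weighted bright-point value per the distance rule described above.

    `points` combines the first and second bright point sets as (row, col)
    coordinates; `center` is the image center point; `mean_diff` is the
    bright-point mean difference.
    """
    products = []
    for r, c in points:
        distance = math.hypot(r - center[0], c - center[1])
        weight = 1.0 / (1.0 + distance)  # assumed monotone-decreasing weight
        products.append(weight * mean_diff)
    return sum(products) / len(products)  # average of the summed products
```

The same routine, fed the dark point sets and the dark-point mean difference, yields the dark-point weighted value.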
1-3. Determine the brightness adjustment parameters of the corresponding preview image according to the bright-point weighted value, the dark-point weighted value, the bright-point mean difference, and the dark-point mean difference.
For example, the brightness adjustment parameters include a first brightness adjustment parameter and a second brightness adjustment parameter, and the above step 1-3 may specifically include:
calculating the difference between the bright-point weighted value and the bright-point mean difference to obtain a first difference, and determining the first brightness adjustment parameter of the first and second bright point sets according to the first difference;
calculating the difference between the dark-point weighted value and the dark-point mean difference to obtain a second difference, and determining the second brightness adjustment parameter of the first and second dark point sets according to the second difference.
Different difference ranges may be set in advance for the dark points (bright points), each range with its own second brightness adjustment parameter (first brightness adjustment parameter); it is then only necessary to determine which range the actual difference falls into and obtain the second brightness adjustment parameter (first brightness adjustment parameter) corresponding to that range.
Specifically, the first difference and the second difference may be 0, positive, or negative. When the first or second difference is 0, the exposure may be considered appropriate; when it is positive, under-exposure may be assumed; when it is negative, over-exposure may be assumed.
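The range lookup and sign rule just described might look like the following sketch; the specific range boundaries and scaling factors are hypothetical, since the text only states that difference ranges and their parameters are preset:

```python
def brightness_adjustment_parameter(weighted_value, mean_diff):
    """Map (weighted value - mean difference) to a brightness factor.

    difference == 0 -> exposure appropriate (factor 1.0);
    difference > 0  -> under-exposed, brighten;
    difference < 0  -> over-exposed, darken.
    The factors and the +/-64 boundary are illustrative assumptions.
    """
    difference = weighted_value - mean_diff
    if difference == 0:
        return 1.0
    if difference > 0:                           # under-exposed
        return 1.3 if difference > 64 else 1.15
    return 0.7 if difference < -64 else 0.85     # over-exposed
```

Running it once on the bright-point values and once on the dark-point values gives the first and second brightness adjustment parameters.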
For example, in FIG. 3, if a preview image is a selfie of user B, the subject is the face and the background is the image area other than the face. When the selfie is determined to be properly exposed, no brightness adjustment is made; when it is under-exposed, the image brightness needs to be increased; when it is over-exposed, the image brightness needs to be decreased.
S104. Fuse the multiple preview images according to the brightness adjustment parameters.
For example, in FIG. 2, the above step S104 may specifically include:
2-1. using the first brightness adjustment parameter to adjust the brightness of the first and second bright point sets, and using the second brightness adjustment parameter to adjust the brightness of the first and second dark point sets, so as to adjust the preview image;
2-2. fusing the adjusted multiple preview images into a single image to obtain a fused image.
For a single preview image, brightness adjustment is performed on its bright points and dark points through the corresponding first and second brightness adjustment parameters respectively, ensuring that the brightness of each preview image is within a suitable range; the multiple preview images are then fused, so that the fused image has more image detail and a better image effect than each original preview image.
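Steps 2-1 and 2-2 can be sketched together. Averaging the adjusted previews is one simple fusion rule chosen here for illustration, as the text does not fix the fusion operator:

```python
import numpy as np

def adjust_and_fuse(previews, params):
    """Adjust each preview's bright/dark points, then fuse by averaging.

    `previews` is a list of HxW grayscale arrays; `params` holds one
    (bright_factor, dark_factor, threshold) triple per preview.
    """
    adjusted = []
    for img, (bright_f, dark_f, threshold) in zip(previews, params):
        out = img.astype(np.float64)
        bright = out >= threshold
        out[bright] *= bright_f    # scale bright points
        out[~bright] *= dark_f     # scale dark points
        adjusted.append(np.clip(out, 0, 255))
    return np.mean(adjusted, axis=0).astype(np.uint8)  # simple average fusion
```

In practice the fusion step could instead pick the best-exposed detail per region, as HDR pipelines commonly do; the averaging here only demonstrates the data flow.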
It can be seen from the above that the image fusion method provided by the present application is applied to a mobile terminal. When the mobile terminal starts the dynamic shooting mode, multiple continuously captured preview images are acquired, a subject grayscale image and a background grayscale image corresponding to each preview image are determined, brightness adjustment parameters of the corresponding preview image are then determined according to the subject and background grayscale images, and the multiple preview images are fused according to the brightness adjustment parameters. This prevents the shooting subject in images captured in HDR mode from being under-exposed or over-exposed, achieves a good shooting effect, further reduces the number of user retakes, and reduces the waste of terminal resources.
Based on the method described in the above embodiment, this embodiment provides a further description from the perspective of an image fusion device. The image fusion device may be implemented as an independent entity or integrated into a mobile terminal, which may be a device with a High Dynamic Range (HDR) shooting mode such as a smartphone, an iPad, or a smart camera.
Referring to FIG. 4, FIG. 4 specifically describes the image fusion device provided by an embodiment of the present application. The image fusion device may include an acquisition module 10, a first determination module 20, a second determination module 30, and a fusion module 40, wherein:
(1) Acquisition module 10
The acquisition module 10 is configured to acquire multiple continuously captured preview images when the mobile terminal starts the dynamic shooting mode.
The dynamic shooting mode is the HDR shooting mode, which can synthesize a final HDR image from Low Dynamic Range (LDR) images taken at different exposure times, using the LDR image with the best detail corresponding to each exposure time, so as to better reflect the visual effects of the real environment.
(2) First determination module 20
The first determination module 20 is configured to determine a subject grayscale image and a background grayscale image corresponding to each preview image.
The first determination module 20 is specifically configured to:
perform grayscale processing on each preview image to obtain a corresponding grayscale image;
perform image segmentation on the grayscale image to obtain the corresponding subject grayscale image and background grayscale image.
Grayscale processing refers to converting a color image into a grayscale image, in which each pixel is represented by a gray value. In the RGB model, if R=G=B, the color represents a shade of gray, where the value of R=G=B is called the gray value, ranging from 0 to 255; the RGB value of 0% gray is 255,255,255.
Image segmentation refers to separating the background and the subject in an image; the part of the grayscale image where the subject is located is the subject grayscale image, and the part where the background is located is the background grayscale image.
(3) Second determination module 30
The second determination module 30 is configured to determine brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image.
For example, referring to FIG. 5, the second determination module 30 specifically includes:
a first determination unit 31, configured to determine a first bright point set and a first dark point set in the subject grayscale image, and a second bright point set and a second dark point set in the background grayscale image.
Pixels whose grayscale values are below a brightness threshold may be taken as dark points, and pixels whose grayscale values are not below the brightness threshold as bright points; all bright points in the subject grayscale image then form the first bright point set, all dark points the first dark point set, all bright points in the background grayscale image the second bright point set, and all dark points the second dark point set.
A second determination unit 32 is configured to determine a bright-point weighted value and a bright-point mean difference according to the first and second bright point sets, and determine a dark-point weighted value and a dark-point mean difference according to the first and second dark point sets.
The bright-point weighted value and the dark-point weighted value may be calculated in the same way, and the bright-point mean difference and the dark-point mean difference may use the same formula; the bright-point weighted value and bright-point mean difference are obtained by processing the bright points, while the dark-point weighted value and dark-point mean difference are obtained by processing the dark points.
When performing the above step of determining the bright-point weighted value and the bright-point mean difference according to the first and second bright point sets, the second determination unit 32 is specifically configured to:
determine the average of the grayscale values of all first bright points in the first bright point set to obtain a subject bright-point mean, and determine the average of the grayscale values of all second bright points in the second bright point set to obtain a background bright-point mean;
calculate the difference between the subject bright-point mean and the background bright-point mean to obtain the bright-point mean difference;
determine an image center point in the subject grayscale image;
determine the bright-point weighted value according to the first bright point set, the second bright point set, the image center point, and the bright-point mean difference.
The subject bright-point mean reflects the brightness of the subject image, the background bright-point mean reflects the brightness of the background image, and the difference between them (that is, the bright-point mean difference) reflects the brightness contrast between the subject and the background. The image center point may be the geometric center of the subject grayscale image, or a point selected based on a key part of the subject; the key part may depend on the subject type. For example, for a person the face is the key part and the image center point may be the center of the face; for a building the whole object may be regarded as the key part and the image center point may be the geometric center.
For example, when performing the above step of determining the bright-point weighted value according to the first bright point set, the second bright point set, the image center point, and the bright-point mean difference, the second determination unit 32 is specifically configured to:
determine a first distance value between each first bright point in the first bright point set and the image center point, and determine a second distance value between each second bright point in the second bright point set and the image center point;
determine the weight of the corresponding first bright point according to the first distance value, and determine the weight of the corresponding second bright point according to the second distance value;
calculate the product of the weight of each first bright point and the bright-point mean difference, and calculate the product of the weight of each second bright point and the bright-point mean difference;
calculate the average of the sum of all the products to obtain the bright-point weighted value.
A weight may be set in advance for each distance value, so that only the weight corresponding to the actually calculated distance value needs to be looked up; alternatively, multiple distance-value ranges may be set in advance, each with its own weight, so that it is only necessary to determine which range the calculated distance value falls into and obtain the weight of that range. Generally, since a captured image should highlight the subject, the farther a bright or dark point is from the subject center, the smaller its weight should be, and conversely the larger. The bright-point weighted value is a brightness difference obtained by taking into account the distances between the bright points in the entire preview image (including subject and background) and the image center point, so an image adjusted according to it can largely avoid over-exposure or under-exposure of the subject and better highlight the subject.
For example, if the first bright point set includes {a1, a2...an} and the image center point is O, the distance between each first bright point and the image center point can be determined from the image coordinates of each first bright point in {a1, a2...an} and those of O, such as the distance La1O between a1 and O; the larger La1O is, the smaller the obtained weight should be.
A third determination unit 33 is configured to determine the brightness adjustment parameters of the corresponding preview image according to the bright-point weighted value, the dark-point weighted value, the bright-point mean difference, and the dark-point mean difference.
For example, the brightness adjustment parameters include a first brightness adjustment parameter and a second brightness adjustment parameter, and the third determination unit 33 is specifically configured to:
calculate the difference between the bright-point weighted value and the bright-point mean difference to obtain a first difference, and determine the first brightness adjustment parameter of the first and second bright point sets according to the first difference;
calculate the difference between the dark-point weighted value and the dark-point mean difference to obtain a second difference, and determine the second brightness adjustment parameter of the first and second dark point sets according to the second difference.
Specifically, the first difference and the second difference may be 0, positive, or negative. When the first or second difference is 0, the exposure may be considered appropriate; when it is positive, under-exposure may be assumed; when it is negative, over-exposure may be assumed.
For example, in FIG. 3, if a preview image is a selfie of user B, the subject is the face and the background is the image area other than the face. When the selfie is determined to be properly exposed, no brightness adjustment is made; when it is under-exposed, the image brightness needs to be increased; when it is over-exposed, the image brightness needs to be decreased.
(4) Fusion module 40
The fusion module 40 is configured to fuse the multiple preview images according to the brightness adjustment parameters.
The fusion module 40 is specifically configured to:
2-1. use the first brightness adjustment parameter to adjust the brightness of the first and second bright point sets, and use the second brightness adjustment parameter to adjust the brightness of the first and second dark point sets, so as to adjust the preview image;
2-2. fuse the adjusted multiple preview images into a single image to obtain a fused image.
For a single preview image, brightness adjustment is performed on its bright points and dark points through the corresponding first and second brightness adjustment parameters respectively, ensuring that the brightness of each preview image is within a suitable range; the multiple preview images are then fused, so that the fused image has more image detail and a better image effect than each original preview image.
In specific implementation, each of the above units can be implemented as an independent entity, or combined arbitrarily and implemented as one or several entities; for the specific implementation of each unit, refer to the foregoing method embodiments, which will not be repeated here.
It can be seen from the above that the image fusion device provided by this embodiment is applied to a mobile terminal. When the mobile terminal starts the dynamic shooting mode, the acquisition module 10 acquires multiple continuously captured preview images, the first determination module 20 determines a subject grayscale image and a background grayscale image corresponding to each preview image, the second determination module 30 then determines brightness adjustment parameters of the corresponding preview image according to the subject and background grayscale images, and the fusion module 40 fuses the multiple preview images according to the brightness adjustment parameters. This prevents the shooting subject in images captured in HDR mode from being under-exposed or over-exposed, achieves a good shooting effect, further reduces the number of user retakes, and reduces the waste of terminal resources.
Correspondingly, an embodiment of the present invention further provides an image fusion system, including any image fusion device provided by the embodiments of the present invention; the image fusion device may be integrated in a mobile terminal.
The mobile terminal may, when starting the dynamic shooting mode, acquire multiple continuously captured preview images;
determine a subject grayscale image and a background grayscale image corresponding to each preview image;
determine brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image;
fuse the multiple preview images according to the brightness adjustment parameters.
For the specific implementation of each of the above devices, refer to the foregoing embodiments, which will not be repeated here.
Since the image fusion system may include any image fusion device provided by the embodiments of the present invention, it can achieve the beneficial effects achievable by any such device; for details, refer to the foregoing embodiments, which will not be repeated here.
In addition, an embodiment of the present application further provides a terminal device, which may be a smartphone, a smart vehicle, or the like. As shown in FIG. 6, the terminal device 200 includes a processor 201 and a memory 202, which are electrically connected.
The processor 201 is the control center of the terminal device 200; it uses various interfaces and lines to connect the various parts of the entire terminal device, and performs the various functions of the terminal device and processes data by running or loading application programs stored in the memory 202 and calling data stored in the memory 202, so as to monitor the terminal device as a whole.
In this embodiment, the processor 201 in the terminal device 200 loads the instructions corresponding to the processes of one or more application programs into the memory 202 according to the following steps, and runs the application programs stored in the memory 202, thereby realizing various functions:
when the mobile terminal starts the dynamic shooting mode, acquiring multiple continuously captured preview images;
determining a subject grayscale image and a background grayscale image corresponding to each preview image;
determining brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image;
fusing the multiple preview images according to the brightness adjustment parameters.
FIG. 7 shows a specific structural block diagram of the terminal device provided by an embodiment of the present invention; the terminal device may be used to implement the image fusion method provided in the above embodiments. The terminal device 300 may be a smartphone or a tablet computer.
The RF circuit 310 is used to receive and send electromagnetic waves and to convert between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices. The RF circuit 310 may include various existing circuit elements for performing these functions, for example an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a subscriber identity module (SIM) card, a memory, and so on. The RF circuit 310 can communicate with various networks such as the Internet, an intranet, or a wireless network, or communicate with other devices through a wireless network. The aforementioned wireless network may include a cellular telephone network, a wireless local area network, or a metropolitan area network. The wireless network can use various communication standards, protocols, and technologies, including but not limited to Global System for Mobile Communication (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (such as the IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n standards of the Institute of Electrical and Electronics Engineers), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (Wi-Max), other protocols for mail, instant messaging, and short messages, as well as any other suitable communication protocol, even including protocols that have not yet been developed.
The memory 320 can be used to store software programs and modules, such as the program instructions/modules corresponding to the automatic fill-light system and method for front-camera photography in the above embodiments. The processor 380 executes various functional applications and data processing by running the software programs and modules stored in the memory 320, thereby implementing the function of automatic fill light when taking pictures with the front camera. The memory 320 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 320 may further include memory remotely located with respect to the processor 380, and such remote memory may be connected to the terminal device 300 through a network. Examples of such networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
The input unit 330 may be used to receive input digital or character information and to generate keyboard, mouse, joystick, optical, or trackball signal input related to user settings and function control. Specifically, the input unit 330 may include a touch-sensitive surface 331 and other input devices 332. The touch-sensitive surface 331, also called a touch screen or a touchpad, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch-sensitive surface 331 using a finger, stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch-sensitive surface 331 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 380, and can receive and execute the commands sent by the processor 380. In addition, the touch-sensitive surface 331 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch-sensitive surface 331, the input unit 330 may also include other input devices 332. Specifically, the other input devices 332 may include, but are not limited to, one or more of a physical keyboard, function keys (such as a volume control button or a switch button), a trackball, a mouse, and a joystick.
The display unit 340 may be used to display information input by the user or provided to the user and the various graphical user interfaces of the terminal device 300; these graphical user interfaces may be composed of graphics, text, icons, video, and any combination thereof. The display unit 340 may include a display panel 341, which may optionally be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. Further, the touch-sensitive surface 331 may cover the display panel 341; when the touch-sensitive surface 331 detects a touch operation on or near it, it transmits the operation to the processor 380 to determine the type of the touch event, and the processor 380 then provides a corresponding visual output on the display panel 341 according to that type. Although in FIG. 7 the touch-sensitive surface 331 and the display panel 341 are two independent components realizing the input and output functions, in some embodiments they can be integrated to realize both input and output functions.
The terminal device 300 may also include at least one sensor 350, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 341 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 341 and/or the backlight when the terminal device 300 is moved to the ear. As a kind of motion sensor, the gravity acceleration sensor can detect the magnitude of acceleration in various directions (usually three axes) and the magnitude and direction of gravity when stationary, and can be used in applications that recognize the phone's posture (such as portrait/landscape switching, related games, and magnetometer posture calibration) and in vibration-recognition functions (such as a pedometer or tapping). As for the gyroscope, barometer, hygrometer, thermometer, infrared sensor, and other sensors that the terminal device 300 can also be configured with, they will not be described in detail here.
The audio circuit 360, speaker 361, and microphone 362 can provide an audio interface between the user and the terminal device 300. The audio circuit 360 can transmit the electrical signal converted from the received audio data to the speaker 361, which converts it into a sound signal for output; on the other hand, the microphone 362 converts the collected sound signal into an electrical signal, which the audio circuit 360 receives and converts into audio data; after the audio data is processed by the processor 380, it is sent via the RF circuit 310 to, for example, another terminal, or output to the memory 320 for further processing. The audio circuit 360 may also include an earphone jack to provide communication between a peripheral headset and the terminal device 300.
Through the transmission module 370 (for example, a Wi-Fi module), the terminal device 300 can help users send and receive e-mails, browse webpages, and access streaming media; it provides users with wireless broadband Internet access. Although FIG. 7 shows the transmission module 370, it is understandable that it is not an essential component of the terminal device 300 and can be omitted as needed without changing the essence of the invention.
The processor 380 is the control center of the terminal device 300; it uses various interfaces and lines to connect the various parts of the entire mobile phone, and performs the various functions of the terminal device 300 and processes data by running or executing software programs and/or modules stored in the memory 320 and calling data stored in the memory 320, so as to monitor the mobile phone as a whole. Optionally, the processor 380 may include one or more processing cores; in some embodiments, the processor 380 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may not be integrated into the processor 380.
The terminal device 300 also includes a power source 390 (such as a battery) that supplies power to the various components. In some embodiments, the power source may be logically connected to the processor 380 through a power management system, so as to manage charging, discharging, power consumption, and other functions through the power management system. The power source 390 may also include any components such as one or more DC or AC power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, and a power status indicator.
Although not shown, the terminal device 300 may also include a camera (such as a front camera and a rear camera), a Bluetooth module, and so on, which will not be described in detail here. Specifically, in this embodiment, the display unit of the terminal device is a touch screen display, and the terminal device further includes a memory and one or more programs, wherein the one or more programs are stored in the memory and are configured to be executed by one or more processors; the one or more programs include instructions for performing the following operations:
when the mobile terminal starts the dynamic shooting mode, acquiring multiple continuously captured preview images;
determining a subject grayscale image and a background grayscale image corresponding to each preview image;
determining brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image;
fusing the multiple preview images according to the brightness adjustment parameters.
In specific implementation, each of the above modules can be implemented as an independent entity, or combined arbitrarily and implemented as one or several entities; for the specific implementation of each module, refer to the foregoing method embodiments, which will not be repeated here.
Those of ordinary skill in the art can understand that all or part of the steps in the various methods of the above embodiments can be completed by instructions, or by instructions controlling related hardware; the instructions can be stored in a computer-readable storage medium and loaded and executed by a processor. To this end, an embodiment of the present invention provides a storage medium storing multiple instructions that can be loaded by a processor to execute the steps in any image fusion method provided by the embodiments of the present invention.
The storage medium may include: read-only memory (ROM), random access memory (RAM), a magnetic disk, an optical disc, and the like.
Since the instructions stored in the storage medium can execute the steps in any image fusion method provided by the embodiments of the present invention, they can achieve the beneficial effects achievable by any such method; for details, refer to the foregoing embodiments, which will not be repeated here.
For the specific implementation of each of the above operations, refer to the foregoing embodiments, which will not be repeated here.
In summary, although the present application has been disclosed above with preferred embodiments, the preferred embodiments are not intended to limit the application. Those of ordinary skill in the art can make various modifications and refinements without departing from the spirit and scope of the application, so the protection scope of the application is subject to the scope defined by the claims.

Claims (20)

  1. An image fusion method, applied to a mobile terminal, comprising:
    when the mobile terminal starts a dynamic shooting mode, acquiring multiple continuously captured preview images;
    determining a subject grayscale image and a background grayscale image corresponding to each of the preview images;
    determining brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image; and
    fusing the multiple preview images according to the brightness adjustment parameters.
  2. The image fusion method according to claim 1, wherein the determining a subject grayscale image and a background grayscale image corresponding to each of the preview images comprises:
    performing grayscale processing on each of the preview images to obtain a corresponding grayscale image; and
    performing image segmentation on the grayscale image to obtain the corresponding subject grayscale image and background grayscale image.
  3. The image fusion method according to claim 1, wherein the determining brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image comprises:
    determining a first bright point set and a first dark point set in the subject grayscale image, and a second bright point set and a second dark point set in the background grayscale image;
    determining a bright-point weighted value and a bright-point mean difference according to the first bright point set and the second bright point set, and determining a dark-point weighted value and a dark-point mean difference according to the first dark point set and the second dark point set; and
    determining the brightness adjustment parameters of the corresponding preview image according to the bright-point weighted value, the dark-point weighted value, the bright-point mean difference, and the dark-point mean difference.
  4. The image fusion method according to claim 3, wherein the determining a first bright point set and a first dark point set in the subject grayscale image comprises:
    taking pixels in the subject grayscale image whose grayscale values are below a brightness threshold as first dark points to obtain the first dark point set; and
    taking pixels in the subject grayscale image whose grayscale values are not below the brightness threshold as first bright points to obtain the first bright point set.
  5. The image fusion method according to claim 3, wherein the determining a bright-point weighted value and a bright-point mean difference according to the first bright point set and the second bright point set comprises:
    determining the average of the grayscale values of all first bright points in the first bright point set to obtain a subject bright-point mean, and determining the average of the grayscale values of all second bright points in the second bright point set to obtain a background bright-point mean;
    calculating the difference between the subject bright-point mean and the background bright-point mean to obtain the bright-point mean difference;
    determining an image center point in the subject grayscale image; and
    determining the bright-point weighted value according to the first bright point set, the second bright point set, the image center point, and the bright-point mean difference.
  6. The image fusion method according to claim 5, wherein the determining the bright-point weighted value according to the first bright point set, the second bright point set, the image center point, and the bright-point mean difference comprises:
    determining a first distance value between each first bright point in the first bright point set and the image center point, and determining a second distance value between each second bright point in the second bright point set and the image center point;
    determining the weight of the corresponding first bright point according to the first distance value, and determining the weight of the corresponding second bright point according to the second distance value;
    calculating the product of the weight of each first bright point and the bright-point mean difference, and calculating the product of the weight of each second bright point and the bright-point mean difference; and
    calculating the average of the sum of all the products to obtain the bright-point weighted value.
  7. The image fusion method according to claim 5, wherein the brightness adjustment parameters comprise a first brightness adjustment parameter and a second brightness adjustment parameter, and the determining the brightness adjustment parameters of the corresponding preview image according to the bright-point weighted value, the dark-point weighted value, the bright-point mean difference, and the dark-point mean difference comprises:
    calculating the difference between the bright-point weighted value and the bright-point mean difference to obtain a first difference, and determining the first brightness adjustment parameter of the first bright point set and the second bright point set according to the first difference; and
    calculating the difference between the dark-point weighted value and the dark-point mean difference to obtain a second difference, and determining the second brightness adjustment parameter of the first dark point set and the second dark point set according to the second difference.
  8. The image fusion method according to claim 7, wherein the fusing the multiple preview images according to the brightness adjustment parameters comprises:
    performing brightness adjustment on the first bright point set and the second bright point set using the first brightness adjustment parameter, and performing brightness adjustment on the first dark point set and the second dark point set using the second brightness adjustment parameter, so as to adjust the preview image; and
    fusing the adjusted multiple preview images into a single image to obtain a fused image.
  9. A computer-readable storage medium, wherein multiple instructions are stored in the storage medium, and the instructions are adapted to be loaded by a processor to perform the following steps:
    when a mobile terminal starts a dynamic shooting mode, acquiring multiple continuously captured preview images;
    determining a subject grayscale image and a background grayscale image corresponding to each of the preview images;
    determining brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image; and
    fusing the multiple preview images according to the brightness adjustment parameters.
  10. The computer-readable storage medium according to claim 9, wherein the instructions are adapted to be loaded by the processor to specifically perform the steps of:
    performing grayscale processing on each of the preview images to obtain a corresponding grayscale image; and
    performing image segmentation on the grayscale image to obtain the corresponding subject grayscale image and background grayscale image.
  11. The computer-readable storage medium according to claim 9, wherein the instructions are adapted to be loaded by the processor to specifically perform the steps of:
    determining a first bright point set and a first dark point set in the subject grayscale image, and a second bright point set and a second dark point set in the background grayscale image;
    determining a bright-point weighted value and a bright-point mean difference according to the first bright point set and the second bright point set, and determining a dark-point weighted value and a dark-point mean difference according to the first dark point set and the second dark point set; and
    determining the brightness adjustment parameters of the corresponding preview image according to the bright-point weighted value, the dark-point weighted value, the bright-point mean difference, and the dark-point mean difference.
  12. The computer-readable storage medium according to claim 11, wherein the instructions are adapted to be loaded by the processor to specifically perform the steps of:
    taking pixels in the subject grayscale image whose grayscale values are below a brightness threshold as first dark points to obtain the first dark point set; and
    taking pixels in the subject grayscale image whose grayscale values are not below the brightness threshold as first bright points to obtain the first bright point set.
  13. The computer-readable storage medium according to claim 11, wherein the instructions are adapted to be loaded by the processor to specifically perform the steps of:
    determining the average of the grayscale values of all first bright points in the first bright point set to obtain a subject bright-point mean, and determining the average of the grayscale values of all second bright points in the second bright point set to obtain a background bright-point mean;
    calculating the difference between the subject bright-point mean and the background bright-point mean to obtain the bright-point mean difference;
    determining an image center point in the subject grayscale image; and
    determining the bright-point weighted value according to the first bright point set, the second bright point set, the image center point, and the bright-point mean difference.
  14. The computer-readable storage medium according to claim 13, wherein the instructions are adapted to be loaded by the processor to specifically perform the steps of:
    determining a first distance value between each first bright point in the first bright point set and the image center point, and determining a second distance value between each second bright point in the second bright point set and the image center point;
    determining the weight of the corresponding first bright point according to the first distance value, and determining the weight of the corresponding second bright point according to the second distance value;
    calculating the product of the weight of each first bright point and the bright-point mean difference, and calculating the product of the weight of each second bright point and the bright-point mean difference; and
    calculating the average of the sum of all the products to obtain the bright-point weighted value.
  15. The computer-readable storage medium according to claim 13, wherein the instructions are adapted to be loaded by the processor to specifically perform the steps of:
    calculating the difference between the bright-point weighted value and the bright-point mean difference to obtain a first difference, and determining the first brightness adjustment parameter of the first bright point set and the second bright point set according to the first difference; and
    calculating the difference between the dark-point weighted value and the dark-point mean difference to obtain a second difference, and determining the second brightness adjustment parameter of the first dark point set and the second dark point set according to the second difference.
  16. The computer-readable storage medium according to claim 15, wherein the instructions are adapted to be loaded by the processor to specifically perform the steps of:
    performing brightness adjustment on the first bright point set and the second bright point set using the first brightness adjustment parameter, and performing brightness adjustment on the first dark point set and the second dark point set using the second brightness adjustment parameter, so as to adjust the preview image; and
    fusing the adjusted multiple preview images into a single image to obtain a fused image.
  17. A mobile terminal, comprising a processor and a memory, the processor being electrically connected to the memory, the memory being used to store instructions and data, and the processor being used to perform the following steps:
    when the mobile terminal starts a dynamic shooting mode, acquiring multiple continuously captured preview images;
    determining a subject grayscale image and a background grayscale image corresponding to each of the preview images;
    determining brightness adjustment parameters of the corresponding preview image according to the subject grayscale image and the background grayscale image; and
    fusing the multiple preview images according to the brightness adjustment parameters.
  18. The mobile terminal according to claim 17, wherein the processor is specifically configured to perform:
    performing grayscale processing on each of the preview images to obtain a corresponding grayscale image; and
    performing image segmentation on the grayscale image to obtain the corresponding subject grayscale image and background grayscale image.
  19. The mobile terminal according to claim 17, wherein the processor is specifically configured to perform:
    determining a first bright point set and a first dark point set in the subject grayscale image, and a second bright point set and a second dark point set in the background grayscale image;
    determining a bright-point weighted value and a bright-point mean difference according to the first bright point set and the second bright point set, and determining a dark-point weighted value and a dark-point mean difference according to the first dark point set and the second dark point set; and
    determining the brightness adjustment parameters of the corresponding preview image according to the bright-point weighted value, the dark-point weighted value, the bright-point mean difference, and the dark-point mean difference.
  20. The mobile terminal according to claim 19, wherein the processor is specifically configured to perform:
    determining the average of the grayscale values of all first bright points in the first bright point set to obtain a subject bright-point mean, and determining the average of the grayscale values of all second bright points in the second bright point set to obtain a background bright-point mean;
    calculating the difference between the subject bright-point mean and the background bright-point mean to obtain the bright-point mean difference;
    determining an image center point in the subject grayscale image; and
    determining the bright-point weighted value according to the first bright point set, the second bright point set, the image center point, and the bright-point mean difference.
PCT/CN2020/087209 2020-03-17 2020-04-27 Image fusion method and device, storage medium and mobile terminal WO2021184496A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP20925319.4A EP4123574A4 (en) 2020-03-17 2020-04-27 IMAGE FUSION METHOD AND DEVICE, STORAGE MEDIUM AND MOBILE TERMINAL DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010187708.8 2020-03-17
CN202010187708.8A CN111372001B (zh) 2020-03-17 2020-03-17 Image fusion method and device, storage medium and mobile terminal

Publications (1)

Publication Number Publication Date
WO2021184496A1 true WO2021184496A1 (zh) 2021-09-23

Family

ID=71211890

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/087209 WO2021184496A1 (zh) 2020-03-17 2020-04-27 图像融合方法、装置、存储介质及移动终端

Country Status (3)

Country Link
EP (1) EP4123574A4 (zh)
CN (1) CN111372001B (zh)
WO (1) WO2021184496A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023220957A1 * 2022-05-18 2023-11-23 北京小米移动软件有限公司 Image processing method and device, mobile terminal and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080143739A1 (en) * 2006-12-13 2008-06-19 Harris Jerry G Method and System for Dynamic, Luminance-Based Color Contrasting in a Region of Interest in a Graphic Image
CN104978722A (zh) * 2015-07-06 2015-10-14 天津大学 基于背景建模的多曝光图像融合鬼影去除方法
CN105551061A (zh) * 2015-12-09 2016-05-04 天津大学 高动态范围图像融合中保留无鬼影运动物体处理方法
CN106973240A (zh) * 2017-03-23 2017-07-21 宁波诺丁汉大学 实现高动态范围图像高清显示的数字照相机成像方法
CN110087003A (zh) * 2019-04-30 2019-08-02 深圳市华星光电技术有限公司 多曝光图像融合方法
CN110751608A (zh) * 2019-10-23 2020-02-04 北京迈格威科技有限公司 一种夜景高动态范围图像融合方法、装置和电子设备

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4341295B2 (ja) * 2003-05-16 2009-10-07 セイコーエプソン株式会社 Determination of backlit person images
JP2005190435A (ja) * 2003-12-26 2005-07-14 Konica Minolta Photo Imaging Inc Image processing method, image processing apparatus, and image recording apparatus
JP5367640B2 (ja) * 2010-05-31 2013-12-11 パナソニック株式会社 Imaging apparatus and imaging method
JP5335851B2 (ja) * 2011-04-20 2013-11-06 シャープ株式会社 Liquid crystal display device, multi-display device, light emission amount determination method, program, and recording medium
CN104320575B (zh) * 2014-09-30 2019-01-15 百度在线网络技术(北京)有限公司 Image processing method and image processing device for a portable terminal
CN106534708B (zh) * 2015-09-15 2020-02-18 瑞昱半导体股份有限公司 Wide dynamic range imaging method
CN106161967B (zh) * 2016-09-13 2020-03-17 维沃移动通信有限公司 Panoramic shooting method for backlit scenes and mobile terminal
CN106851124B (zh) * 2017-03-09 2021-03-02 Oppo广东移动通信有限公司 Depth-of-field-based image processing method, processing device and electronic device
CN108200354B (zh) * 2018-03-06 2020-09-25 Oppo广东移动通信有限公司 Control method and device, imaging apparatus, computer device and readable storage medium
CN109544486A (zh) * 2018-10-18 2019-03-29 维沃移动通信(杭州)有限公司 Image processing method and terminal device
CN110062160B (zh) * 2019-04-09 2021-07-02 Oppo广东移动通信有限公司 Image processing method and device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080143739A1 (en) * 2006-12-13 2008-06-19 Harris Jerry G Method and System for Dynamic, Luminance-Based Color Contrasting in a Region of Interest in a Graphic Image
CN104978722A (zh) * 2015-07-06 2015-10-14 天津大学 基于背景建模的多曝光图像融合鬼影去除方法
CN105551061A (zh) * 2015-12-09 2016-05-04 天津大学 高动态范围图像融合中保留无鬼影运动物体处理方法
CN106973240A (zh) * 2017-03-23 2017-07-21 宁波诺丁汉大学 实现高动态范围图像高清显示的数字照相机成像方法
CN110087003A (zh) * 2019-04-30 2019-08-02 深圳市华星光电技术有限公司 多曝光图像融合方法
CN110751608A (zh) * 2019-10-23 2020-02-04 北京迈格威科技有限公司 一种夜景高动态范围图像融合方法、装置和电子设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4123574A4

Also Published As

Publication number Publication date
CN111372001B (zh) 2021-09-03
EP4123574A1 (en) 2023-01-25
CN111372001A (zh) 2020-07-03
EP4123574A4 (en) 2024-04-10

Similar Documents

Publication Publication Date Title
CN107093418B (zh) 一种屏幕显示方法、计算机设备及存储介质
WO2019129020A1 (zh) 一种摄像头自动调焦方法、存储设备及移动终端
WO2018137267A1 (zh) 图像处理方法和终端设备
CN108307109B (zh) 一种高动态范围图像预览方法及终端设备
CN107707827A (zh) 一种高动态图像拍摄方法及移动终端
WO2018219170A1 (zh) 控制对焦的方法、计算机设备及计算机可读存储介质
WO2016019926A1 (zh) 照片拍摄方法、装置及移动终端
CN108449541B (zh) 一种全景图像拍摄方法及移动终端
CN109639996B (zh) 高动态场景成像方法、移动终端及计算机可读存储介质
CN112449120A (zh) 高动态范围视频生成方法及装置
US11050942B2 (en) Screen fill light photographing method for mobile terminal, system and mobile terminal
CN110213484B (zh) 一种拍照方法、终端设备及计算机可读存储介质
WO2019129092A1 (zh) 一种降帧率拍照方法、移动终端及存储介质
CN111182236A (zh) 一种图像合成方法、装置、存储介质及终端设备
CN109104578B (zh) 一种图像处理方法及移动终端
CN111447371A (zh) 一种自动曝光控制方法、终端及计算机可读存储介质
WO2022266907A1 (zh) 处理方法、终端设备及存储介质
CN114037692A (zh) 图像处理方法、移动终端及存储介质
CN109561255B (zh) 终端拍照方法、装置及存储介质
WO2022267506A1 (zh) 图像融合方法、电子设备、存储介质及计算机程序产品
CN111556248B (zh) 拍摄方法、装置、存储介质及移动终端
WO2021184496A1 (zh) 图像融合方法、装置、存储介质及移动终端
CN108449560B (zh) 一种录像方法及终端
WO2022262259A1 (zh) 一种图像处理方法、装置、设备、介质和芯片
CN113301252B (zh) 图像拍摄方法、移动终端及计算机可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20925319

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 17906465

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020925319

Country of ref document: EP

Effective date: 20221017