WO2020073957A1 - Image capturing method and terminal device - Google Patents

Image capturing method and terminal device

Info

Publication number
WO2020073957A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
color information
fusion
frame
Prior art date
Application number
PCT/CN2019/110377
Other languages
English (en)
French (fr)
Inventor
张俪耀
马靖
赵智彪
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to CN201980066192.7A (CN112840642B)
Priority to RU2021112627A (RU2758595C1)
Priority to EP19871713.4A (EP3840369B1)
Priority to US17/284,117 (US11595588B2)
Publication of WO2020073957A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/90 Dynamic range modification of images or parts thereof
    • G06T 5/92 Dynamic range modification of images or parts thereof based on global image properties
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/77 Circuits for processing the brightness signal and the chrominance signal relative to each other, e.g. adjusting the phase of the brightness signal relative to the colour signal, correcting differential gain or differential phase
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/28 Indexing scheme for image data processing or generation, in general involving image processing hardware
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20172 Image enhancement details
    • G06T 2207/20208 High dynamic range [HDR] image processing

Definitions

  • The present application relates to the field of image capturing technology, and in particular, to an image capturing method and a terminal device.
  • Among the functions of a terminal device, image capturing is one of the most frequently used, and users are increasingly concerned about the imaging quality of mobile phones.
  • Embodiments of the present application provide an image capturing method and a terminal device to improve image capturing quality.
  • In a first aspect, an embodiment of the present application provides an image capturing method, which can be executed by a terminal device.
  • The method includes: in response to a user operation, the terminal device opens a camera application, starts a camera, and displays a viewfinder interface; the terminal device converts the original image collected by the camera into an RGB image; the terminal device reduces the brightness of the RGB image to less than a first brightness, or increases it to greater than a second brightness, to obtain a first image, where the first brightness is greater than the second brightness; the terminal device uses HDR technology to convert the RGB image into N frames of HDR images, where N is a positive integer, the brightness of each of the N HDR frames is different, and, when the brightness of the RGB image is reduced to less than the first brightness, the brightness of the N HDR frames is greater than the first brightness, or, when the brightness of the RGB image is increased to greater than the second brightness, the brightness of the N HDR frames is less than the second brightness; and the terminal device fuses the color information of pixels at the same position in the first image and the N frames of HDR images to obtain a final image.
  • In the embodiments of the present application, the terminal device adjusts the brightness of the RGB image and of the multi-frame HDR image by different amounts, so the details retained in the RGB image and in the HDR images differ. The terminal device then merges the color information of pixels at the same position in the RGB image and in the multi-frame HDR image, obtaining a better-quality image and improving the image shooting quality.
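  • For illustration only, the following minimal Python sketch mirrors the flow just summarized; the exposure test, the gains, the thresholds, and the fixed fusion coefficient are invented placeholders, not values from the application:

        import numpy as np

        def capture_final_image(rgb, exposure, b1=180.0, b2=80.0):
            """Toy sketch: rgb is an (H, W, 3) float array in [0, 255], b1 > b2."""
            mean = max(rgb.mean(), 1e-6)
            if exposure > 1.0:  # assumed "high exposure" test
                first = rgb * min(1.0, 0.9 * b1 / mean)                # darken below b1
                hdrs = [np.clip(rgb * g, 0, 255) for g in (1.3, 1.6)]  # N brighter frames
            else:
                first = np.clip(rgb * max(1.0, 1.1 * b2 / mean), 0, 255)  # brighten above b2
                hdrs = [rgb * g for g in (0.7, 0.5)]                      # N darker frames
            f = 0.5  # placeholder; the real coefficient comes from a fusion curve (see below)
            fused = sum((1 - f) * first + f * h for h in hdrs) / len(hdrs)
            return np.clip(fused, 0, 255)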
  • In a possible design, the terminal device fuses the color information of pixels at the same position in the first image and the N frames of HDR images to obtain the final image as follows: the terminal device determines, from multiple fusion curves according to the exposure parameters of the collected original image, a first fusion curve corresponding to those exposure parameters, where the first fusion curve indicates the correspondence between color information and the fusion coefficient; the terminal device takes the pixel at a given position on each frame of HDR image as a first pixel and executes the following for each first pixel: the terminal device determines, according to the color information of the first pixel on that HDR frame, the fusion coefficient corresponding to that color information on the first fusion curve; and, according to the fusion coefficient, the terminal device fuses the color information of the first pixel on each HDR frame with that of the second pixel on the first image to obtain the final image, where the second pixel is the pixel on the first image that the terminal device determines, according to a matching algorithm, to be at the same position as the first pixel.
  • the terminal device determines a fusion curve according to the exposure parameters of the collected original image, and then determines the fusion coefficient on the fusion curve according to the color information of the pixels on the HDR image.
  • the terminal device fuses the color information of the pixels at the same position on the RGB image and the HDR image according to the fusion coefficient to obtain a better quality image and improve the image shooting quality.
  • In a possible design, the terminal device determines the fusion coefficients corresponding to the color information on the first fusion curve as follows: according to the R color information of the first pixel on each frame of HDR image, the terminal device determines the first fusion coefficient corresponding to the R color information on the first fusion curve, obtaining N first fusion coefficients; according to the G color information of the first pixel on each frame, it determines the second fusion coefficient corresponding to the G color information, obtaining N second fusion coefficients; and according to the B color information of the first pixel on each frame, it determines the third fusion coefficient corresponding to the B color information, obtaining N third fusion coefficients. Accordingly, the terminal device fuses the R, G, and B color information of the first pixel on each HDR frame with that of the second pixel on the first image according to the N first, N second, and N third fusion coefficients, respectively.
  • In this solution, the terminal device determines the fusion coefficients separately according to the three kinds of color information of the first pixel on the HDR image, and fuses the color information of pixels at the same position on the RGB image and the HDR image according to the fusion coefficient of each kind of color information. That is, when the terminal device merges pixels at the same position on the RGB image and the HDR image, each kind of color information is fused separately. In this way, a better-quality image is obtained and the image quality is improved.
  • In a possible design, when the terminal device fuses the R color information of the first pixel on each frame of HDR image with that of the second pixel according to the N first fusion coefficients, the result may satisfy a fusion formula in which: R_src is the value of the R color information of the second pixel on the first image; R_i is the value of the R color information of the first pixel on the i-th frame of the N frames of HDR images; N is the number of HDR frames; f_i is the first fusion coefficient determined on the first fusion curve from the value of the R color information of the first pixel on the i-th HDR frame; and R_dst is the value of the R color information of the corresponding pixel on the final image.
  • That is, when the terminal device merges the R (red) color information of pixels at the same position on the RGB image and the HDR images, the result meets the above formula. In this way, a better-quality image is obtained and the image quality is improved.
  • In a possible design, when the terminal device fuses the G color information of the first pixel on each frame of HDR image with that of the second pixel according to the N second fusion coefficients, the result may satisfy a fusion formula in which: G_src is the value of the G color information of the second pixel on the first image; G_i is the value of the G color information of the first pixel on the i-th frame of the N frames of HDR images; N is the number of HDR frames; f_i is the second fusion coefficient determined on the first fusion curve from the value of the G color information of the first pixel on the i-th HDR frame; and G_dst is the value of the G color information of the corresponding pixel on the final image.
  • That is, when the terminal device merges the G (green) color information of pixels at the same position on the RGB image and the HDR images, the result meets the above formula. In this way, a better-quality image is obtained and the image quality is improved.
  • In a possible design, when the terminal device fuses the B color information of the first pixel on each frame of HDR image with that of the second pixel according to the N third fusion coefficients, the result may satisfy a fusion formula in which: B_src is the value of the B color information of the second pixel on the first image; B_i is the value of the B color information of the first pixel on the i-th frame of the N frames of HDR images; N is the number of HDR frames; f_i is the third fusion coefficient determined on the first fusion curve from the value of the B color information of the first pixel on the i-th HDR frame; and B_dst is the value of the B color information of the corresponding pixel on the final image.
  • That is, when the terminal device merges the B (blue) color information of pixels at the same position on the RGB image and the HDR images, the result meets the above formula. In this way, a better-quality image is obtained and the image quality is improved.
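  • As a concrete illustration of this per-pixel procedure, the sketch below fuses the three channels of one pixel position across N HDR frames. It assumes the N-frame rule averages the per-frame blends (1 - f_i) * src + f_i * hdr_i, which reduces to a single coefficient-weighted blend for N = 1; the curve lookup is a plain linear interpolation over sampled curve points:

        import numpy as np

        def fuse_pixel(second_px, first_pxs, curve_x, curve_y):
            """second_px: (R, G, B) of the second pixel on the first image.
            first_pxs:  list of N (R, G, B) tuples, the first pixel on each HDR frame.
            curve_x/y:  sampled first fusion curve (color value -> coefficient).
            Returns the fused (R, G, B); the averaging rule is an assumption."""
            out = np.zeros(3)
            for px in first_pxs:                      # one HDR frame at a time
                f = np.interp(px, curve_x, curve_y)   # per-channel f_i: the N first/second/third coefficients
                out += (1.0 - f) * np.asarray(second_px) + f * np.asarray(px)
            return out / len(first_pxs)

  • For example, fuse_pixel((120, 80, 60), [(200, 150, 90), (230, 180, 120)], np.array([0, 128, 255]), np.array([0.8, 0.4, 0.1])) blends one pixel position across two HDR frames using invented curve samples.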
  • In a second aspect, an embodiment of the present application provides a terminal device.
  • The terminal device includes a camera, a display screen, a memory, and a processor. The processor is used to open a camera application and start the camera in response to a user operation; the display screen is used to display a viewfinder interface of the camera application; the camera is used to collect original images; and the memory is used to store one or more computer programs. When the one or more computer programs stored in the memory are executed by the processor, the terminal device is enabled to implement the method of the first aspect or of any possible design of the first aspect.
  • An embodiment of the present application further provides a terminal device. The terminal device includes modules/units that execute the method of the first aspect or of any possible design of the first aspect; these modules/units may be implemented by hardware, or by hardware executing corresponding software.
  • An embodiment of the present application also provides a computer-readable storage medium. The computer-readable storage medium includes a computer program. When the computer program runs on a terminal device, the terminal device executes the method of the first aspect or of any possible design of the first aspect.
  • An embodiment of the present application further provides a computer program product. When the computer program product runs on a terminal device, the terminal device performs the method of the first aspect or of any possible design of the first aspect.
  • FIG. 1 is a schematic structural diagram of a mobile phone 100 provided by an embodiment of this application.
  • FIG. 2 is a schematic structural diagram of a mobile phone 100 provided by an embodiment of this application.
  • FIG. 3 is a schematic flowchart of capturing an image through a mobile phone 100 according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of the effect of the image shooting method provided by the embodiment of the present application.
  • FIG. 5 is a schematic flowchart of an image capturing method provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a fusion curve provided by an embodiment of this application.
  • FIG. 7 is a schematic flowchart of an image capturing method provided by an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a terminal device provided by an embodiment of this application.
  • FIG. 9 is a schematic structural diagram of a terminal device provided by an embodiment of the present application.
  • the exposure parameters involved in the embodiments of the present application are the parameters set by the terminal device when capturing an image.
  • the exposure parameter can be used to indicate the total amount of light received by the device when the device is shooting the scene.
  • The exposure parameters may include exposure time and/or exposure intensity.
  • The value of the exposure parameters determines the brightness of the final captured image. For example, if the exposure time is long or the exposure intensity is high, more light enters the device when capturing an image, so the captured image is brighter; if the exposure time is short or the exposure intensity is low, less light enters the device, so the captured image is darker.
  • the pixels involved in the embodiments of the present application are the smallest imaging unit on a frame of image.
  • a pixel can correspond to a coordinate point on the image.
  • a pixel can correspond to one parameter (such as gray scale), or it can correspond to a set of multiple parameters (such as gray scale, brightness, color, etc.).
  • Generally, a frame of image has three basic colors, namely red (Red, denoted by R below), green (Green, denoted by G below), and blue (Blue, denoted by B below); other colors can be formed by combining these three basic colors. Therefore, each pixel on a frame of image may contain R, G, and B color information, and the values of the R, G, and B color information differ from pixel to pixel. For example, when the values of the R, G, and B color information of a pixel are all 0, the pixel appears black, and when the values are all 255, the pixel appears white.
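  • As a small illustration (using NumPy, with invented values), a frame can be held as an (H, W, 3) array whose last axis carries the R, G, and B color information of each pixel:

        import numpy as np

        img = np.zeros((2, 2, 3), dtype=np.uint8)  # all R, G, B values 0: black pixels
        img[0, 0] = (255, 255, 255)                # all values 255: a white pixel
        img[0, 1] = (255, 0, 0)                    # pure red
        print(img[0, 0])                           # the R, G, B color information of one pixel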
  • The original image involved in the embodiments of the present application is the output image of the camera, that is, the raw data obtained by the camera by converting the light information reflected by the photographed object into a digital image signal; this raw data has not been processed.
  • the original image can be raw format data.
  • the raw format data may include object information and camera parameters.
  • The camera parameters may include the sensitivity (ISO), shutter speed, aperture value, white balance, and so on.
  • the original image is also the input image of the ISP and network neural unit, such as the neural-network processing unit (NPU) below.
  • the first image involved in the embodiment of the present application is an output image of an ISP.
  • That is, the ISP processes the original image to obtain an image in RGB or YUV format, and then adjusts the brightness of that image to obtain the first image.
  • the specific value of the ISP adjusting the brightness of the RGB format or YUV format image may be set by the user, or may be set by the mobile phone at the factory.
  • the first image is also an input image of a processor such as a graphics processing unit (GPU) below.
  • the HDR image involved in the embodiment of the present application is an output image of a network neural unit.
  • The neural network unit may obtain the HDR image based on existing high dynamic range technology; the details are not repeated in the embodiments of the present application.
  • the HDR image is also an input image of a processor (such as a GPU below).
  • It should be noted that images such as the original image, the first image, and the HDR image may refer to pictures, or may be sets of parameters (such as pixel information, color information, and brightness information).
  • the terminal device may be a portable terminal containing a device such as a camera and having an image acquisition function, such as a mobile phone, a tablet computer, and the like.
  • Exemplarily, portable terminal devices include, but are not limited to, portable terminal devices running various operating systems.
  • the above portable terminal device may also be other portable terminal devices, such as a digital camera, as long as it has an image acquisition function. It should also be understood that, in some other embodiments of the present application, the terminal device may not be a portable terminal device, but a desktop computer with an image collection function.
  • Generally, the terminal device supports multiple applications, for example one or more of the following: a camera application, an instant messaging application, a photo management application, and so on. There can be multiple instant messaging applications, such as WeChat, Tencent QQ, WhatsApp Messenger, LINE, Instagram, Kakao Talk, DingTalk, and so on. Through an instant messaging application, users can send text, voice, pictures, video files, and various other files to other contacts, or make video or audio calls with other contacts.
  • FIG. 1 shows a schematic structural diagram of a mobile phone 100.
  • The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the mobile phone 100.
  • the mobile phone 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU).
  • different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the mobile phone 100.
  • the controller can generate the operation control signal according to the instruction operation code and the timing signal to complete the control of fetching instructions and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory.
  • The memory may store instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from this memory, which avoids repeated access and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
  • the mobile phone 100 realizes a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations, and is used for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum-dot light-emitting diode (QLED), etc.
  • the mobile phone 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the mobile phone 100 can realize the image capturing function through the processor 110, the camera 193, the display screen 194, and the like.
  • the camera 193 is used to capture still images or videos.
  • The camera 193 may include a lens group and a photosensitive element such as an image sensor, where the lens group includes a plurality of lenses (convex or concave) for collecting the light signal reflected by the object to be photographed and transmitting the collected light signal to the image sensor.
  • the image sensor generates an original image of the object to be captured according to the light signal.
  • the image sensor transmits the generated original image to the processor 110.
  • the processor 110 runs the image shooting algorithm provided by the embodiment of the present application to process the original image to obtain the processed image, and the display screen 194 displays the processed image.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the mobile phone 100.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area may store an operating system, at least one function required application programs (such as sound playback function, image playback function, etc.) and so on.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the mobile phone 100 and the like.
  • The internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and so on.
  • the distance sensor 180F is used to measure the distance.
  • the mobile phone 100 can measure the distance by infrared or laser. In some embodiments, when shooting scenes, the mobile phone 100 may use the distance sensor 180F to measure distance to achieve fast focusing. In other embodiments, the mobile phone 100 may also use the distance sensor 180F to detect whether a person or object is close.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the mobile phone 100 emits infrared light outward through a light emitting diode.
  • the mobile phone 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the mobile phone 100. When insufficient reflected light is detected, the mobile phone 100 can determine that there is no object near the mobile phone 100.
  • the mobile phone 100 can use the proximity light sensor 180G to detect that the user is holding the mobile phone 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in leather-case mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense the brightness of ambient light.
  • the mobile phone 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the mobile phone 100 is in a pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • The mobile phone 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access an application lock, take a photo with a fingerprint, answer a call with a fingerprint, and so on.
  • the temperature sensor 180J is used to detect the temperature.
  • The mobile phone 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the mobile phone 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the mobile phone 100 heats the battery 142 to avoid abnormal shutdown due to low temperature. In some other embodiments, when the temperature is lower than still another threshold, the mobile phone 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • The touch sensor 180K is also known as a "touch panel".
  • the touch sensor 180K may be provided on the display screen 194, and the touch sensor 180K and the display screen 194 constitute a touch screen, also called a "touch screen”.
  • the touch sensor 180K is used to detect a touch operation acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the mobile phone 100, which is different from the location where the display screen 194 is located.
  • the mobile phone 100 can realize audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, and an application processor. For example, music playback, recording, etc.
  • the mobile phone 100 may receive key 190 input and generate key signal input related to user settings and function control of the mobile phone 100.
  • the mobile phone 100 may use the motor 191 to generate vibration prompts (such as incoming call vibration prompts).
  • the indicator 192 in the mobile phone 100 can be an indicator light, which can be used to indicate the charging state, the power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 in the mobile phone 100 is used to connect a SIM card. The SIM card can be inserted into or removed from the SIM card interface 195 to achieve contact and separation with the mobile phone 100.
  • the wireless communication function of the mobile phone 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • The mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive the electromagnetic wave from the antenna 1, filter and amplify the received electromagnetic wave, and transmit it to the modem processor for demodulation.
  • The mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves radiated through the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be transmitted into a high-frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 170A, a receiver 170B, etc.), or displays an image or video through a display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110, and may be set in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and so on.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives the electromagnetic wave via the antenna 2, frequency-modulates and filters the electromagnetic wave signal, and sends the processed signal to the processor 110.
  • the wireless communication module 160 may also receive the signal to be transmitted from the processor 110, frequency-modulate it, amplify it, and convert it to electromagnetic waves through the antenna 2 to radiate it out.
  • the following embodiments can all be implemented in a terminal device (such as a mobile phone 100, a tablet computer, etc.) having the above hardware structure.
  • The image capturing algorithm provided by the embodiments of the present application is introduced below with reference to the components related to it; see FIG. 2 for details.
  • In FIG. 2, the processor 110 integrating the ISP 110-1, the NPU 110-2, and the GPU 110-3 is taken as an example.
  • FIG. 3 is a schematic diagram of a process of capturing an image by the mobile phone 100 provided by the embodiment of the present application. As shown in Figure 3, the process includes:
  • the mobile phone 100 responds to the user operation, enters the camera application, opens the camera 193, and displays the viewfinder interface.
  • the display screen 194 of the mobile phone 100 displays a main interface, which includes icons of various application programs (such as a phone application icon, a video player icon, a music player icon, a camera application icon, a browser application icon, etc.) .
  • the user clicks the icon of the camera application in the main interface through the touch sensor 180K (not shown in FIG. 2, see FIG. 1) provided on the display screen 194 to start the camera application and turn on the camera 193.
  • the display screen 194 displays an interface of a camera application, for example, a viewfinder interface.
  • the camera 193 acquires the original image based on the set exposure parameters.
  • Among them, the process of the camera 193 collecting the original image is as follows: the lens group 193-1 in the camera 193 collects the light signal reflected by the object to be photographed and transmits the collected light signal to the image sensor 193-2.
  • the image sensor 193-2 generates an original image of the object to be captured according to the light signal.
  • Before the camera 193 collects the original image, camera parameters are usually set, such as the value of the exposure parameter (which may be set by the user or by the mobile phone 100).
  • The exposure parameters include at least one of ISO, exposure time, aperture size, shutter, light intake, and other parameters; alternatively, the exposure parameters may be other parameters calculated from ISO, aperture, and shutter that indicate the degree of exposure, as long as they can reflect the degree of exposure; the embodiments of the present application are not limited in this respect.
  • the original image collected by the camera 193 is an unprocessed original image.
  • the original image may be raw format data, and the raw format data includes information of the object to be photographed and camera parameters (such as exposure parameters).
  • The ISP 110-1 converts the original image into an RGB image.
  • Generally, if the exposure parameter is set high, the brightness of the original image collected by the camera 193 based on the set exposure parameter is high, and the ISP 110-1 can reduce the brightness of the RGB image to less than the first brightness to obtain the first image; if the exposure parameter is set low, the brightness of the original image collected by the camera 193 based on the set exposure parameter is low, and the ISP 110-1 can increase the brightness of the RGB image to greater than the second brightness to obtain the first image. The values of the first brightness and the second brightness can be set in advance, and the first brightness is greater than or equal to the second brightness.
  • As mentioned above, the original image includes information about the object to be photographed and camera parameters, and it has not been processed. The ISP 110-1 can therefore process the original image based on the RGB color mode to obtain an RGB image containing color information. Taking raw-format data collected by the camera 193 as an example, the ISP 110-1 can convert the raw-format data into RGB-format data based on the RGB color mode, where the RGB-format data is data containing color information. For the process of the ISP 110-1 converting raw-format data into RGB-format data, reference may be made to the prior art, which is not limited in the embodiments of the present application. The specific brightness value to which the ISP 110-1 adjusts the RGB image can be determined according to the user's setting; here, the first brightness and the second brightness are merely used as examples for description.
  • It should be noted that the ISP 110-1 can also convert the raw-format data into a YUV image containing color information based on the YUV color mode; this is not limited in the embodiments of the present application.
  • the ISP 110-1 sends the exposure parameters and the first image to the GPU 110-3, and sends the RGB image to the NPU 110-2.
  • The NPU 110-2 processes the obtained RGB image according to high dynamic range technology to obtain an HDR image. Specifically, when the ISP 110-1 adjusts the brightness of the RGB image to less than the first brightness, the NPU 110-2 may, based on high dynamic range technology, increase the brightness so that the HDR image is brighter than the first brightness; when the ISP 110-1 adjusts the brightness of the RGB image to greater than the second brightness, the NPU 110-2 may reduce the brightness so that the HDR image is darker than the second brightness.
  • the NPU 110-2 sends the obtained HDR image to the GPU 110-3.
  • GPU110-3 runs the code of the image shooting algorithm provided in the embodiment of the present application, determines a fusion curve corresponding to the exposure parameter according to the exposure parameter, and fuses the first image and the HDR image based on the fusion curve to obtain a final image.
  • FIG. 4 is a schematic diagram of the fusion of the first image and the HDR image by GPU110-3.
  • GPU 110-3 determines a fusion curve according to the exposure parameters, and fuses the first image and the HDR image according to the fusion curve to obtain the final image.
  • Inappropriate exposure parameters affect the details of the image. For example, with overexposure the image is too bright: details in bright parts are clear but details in dark parts are lost. With underexposure the image is too dark: details in dark parts are clear but details in bright parts are lost.
  • Generally, when the value of the exposure parameter is set high, the ISP 110-1 can reduce the brightness of the RGB image, that is, the first image is darker (in this way, the dark-part details are retained). The NPU 110-2 uses high dynamic range technology to increase the brightness of the RGB image to obtain the HDR image, restoring the highlight details (in this way, the bright-part details are retained on the HDR image). The GPU 110-3 then fuses the first image and the HDR image, that is, fuses the dark-part details of the first image with the bright-part details of the HDR image to obtain a final image with high image quality.
  • Conversely, when the value of the exposure parameter is set low, the ISP 110-1 may increase the brightness of the RGB image, that is, the first image is brighter (in this way, the bright-part details are retained). The NPU 110-2 uses high dynamic range technology to reduce the brightness of the RGB image to obtain the HDR image, restoring the dark-part details (in this way, the dark-part details are retained on the HDR image). The GPU 110-3 then fuses the first image with the HDR image, that is, fuses the bright-part details of the first image with the dark-part details of the HDR image to obtain a final image with high image quality.
  • The ISP 110-1 can compare the value of the exposure parameter of the original image with a prestored exposure parameter value: if the value of the exposure parameter of the original image is greater than the prestored value, the exposure parameter is determined to be high; if it is less than or equal to the prestored value, the exposure parameter is determined to be low.
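  • For illustration, a minimal Python sketch of this comparison and of the resulting brightness adjustment is given below; the stored threshold, the target values, and the gain rule are assumptions, not values from the application:

        import numpy as np

        STORED_EXPOSURE = 1.0  # assumed prestored exposure-parameter value

        def make_first_image(rgb, exposure, b1=180.0, b2=80.0):
            """rgb: (H, W, 3) float array in [0, 255]; b1/b2 are the first/second brightness."""
            mean = max(rgb.mean(), 1e-6)
            if exposure > STORED_EXPOSURE:        # exposure judged high: darken below b1
                gain = min(1.0, 0.9 * b1 / mean)
            else:                                 # exposure judged low: brighten above b2
                gain = max(1.0, 1.1 * b2 / mean)
            return np.clip(rgb * gain, 0.0, 255.0)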
  • the display screen 194 displays the final image in the viewfinder interface.
  • It should be noted that FIG. 2 takes the processor 110 integrating the ISP 110-1, the NPU 110-2, and the GPU 110-3 as an example. In practice, the processor 110 may integrate only one or two of them. For example, the function of the NPU 110-2 in the above embodiment (obtaining the HDR image based on the RGB image) may be performed by the GPU 110-3 or the ISP 110-1, and the function of the GPU 110-3 in the above embodiment (running the image capturing algorithm provided by the embodiment of the present application to fuse the first image and the HDR image) may be implemented by the NPU 110-2 or the ISP 110-1. In addition, the processor 110 in FIG. 2 may also integrate processors other than the ISP 110-1, the NPU 110-2, and the GPU 110-3, such as a central processing unit (CPU), with the functions of the above three units all executed by the CPU; or the processor 110 may integrate a CPU, the ISP 110-1, and the NPU 110-2, in which case the functions of the GPU 110-3 are executed by the CPU.
  • the image capturing algorithm of the embodiment of the present application can run on various processors, and the embodiment of the present application is not limited.
  • It should be noted that the flow shown in FIG. 3 takes capturing an image through the camera application of the mobile phone 100 as an example.
  • the image capturing method provided by the embodiments of the present application can also be applied to other scenes.
  • For example, a video call in the WeChat application on the mobile phone 100, or a QQ video call, also uses the camera to collect images.
  • The process in which the GPU 110-3 in the mobile phone 100 runs the image capturing method provided by the embodiment of the present application, determines the fusion curve corresponding to the exposure parameters, and fuses the first image and the HDR image based on the fusion curve to obtain the final image is the seventh step shown in FIG. 3. Specifically, as shown in FIG. 5, the GPU 110-3 runs the code of the image capturing algorithm provided by the embodiment of the present application and performs the following process:
  • S501 Determine a first fusion curve corresponding to the exposure parameter from multiple fusion curves according to the exposure parameters of the original image.
  • a plurality of fusion curves may be stored in the mobile phone 100 in advance. These fusion curves may be obtained by the designer through experiments before the mobile phone 100 leaves the factory, and stored in the mobile phone 100 (such as the internal memory 121).
  • FIG. 6 is a schematic diagram of a fusion curve provided by an embodiment of the present application.
  • As shown in FIG. 6, the horizontal axis is the color information (such as the values of the R, G, and B color information), and the vertical axis is the fusion coefficient.
  • the fusion curve can reflect the correspondence between the fusion coefficient and the value of R color information (G color information or B color information).
  • For example, if the value of the exposure parameter is 1, the GPU 110-3 determines, among the multiple fusion curves shown in FIG. 6, the first fusion curve corresponding to the value 1.
  • FIG. 6 takes three fusion curves as an example. In practical applications, the mobile phone 100 may include more fusion curves.
  • For example, when the GPU 110-3 determines that the exposure parameter is less than a first exposure parameter (for example, value 1), it selects the first fusion curve (the curve for exposure parameter 1 in FIG. 6); when it determines that the exposure parameter is greater than or equal to the first exposure parameter and less than a second exposure parameter (for example, value 2), it selects the second fusion curve (the curve for exposure parameter 2 in FIG. 6); and when it determines that the exposure parameter is greater than or equal to the second exposure parameter and less than a third exposure parameter (for example, value 3), it selects the third fusion curve (the curve for exposure parameter 3 in FIG. 6). In this way, the mobile phone 100 does not need to store many fusion curves, saving memory.
  • In other embodiments, the exposure parameters and the fusion curves may be in one-to-one correspondence, that is, each exposure parameter determines a different fusion curve.
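  • Either storage scheme reduces to a small lookup. The sketch below implements the range-based variant; only the threshold values 1, 2, and 3 come from the example above, and the curve shapes are invented sample points:

        import numpy as np

        # Hypothetical curves: sampled (color value -> fusion coefficient) pairs.
        CURVES = {
            1: (np.array([0, 128, 255]), np.array([0.9, 0.5, 0.1])),
            2: (np.array([0, 128, 255]), np.array([0.8, 0.4, 0.1])),
            3: (np.array([0, 128, 255]), np.array([0.7, 0.3, 0.1])),
        }

        def select_fusion_curve(exposure):
            if exposure < 1:
                return CURVES[1]   # first fusion curve
            if exposure < 2:
                return CURVES[2]   # second fusion curve
            return CURVES[3]       # third fusion curve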
  • S502 Determine the respective values of the R, G, and B color information of the first pixel in the HDR image.
  • As mentioned above, the ISP 110-1 processes the original image to obtain an RGB image containing color information, and then adjusts the brightness of the RGB image to obtain the first image. Therefore, reference may be made to the prior art for step S502, and details are not described in the embodiments of the present application.
  • S503 Determine a first fusion coefficient corresponding to the R color information on the first fusion curve according to the value of the R color information; determine a second fusion coefficient corresponding to the G color information on the first fusion curve according to the value of the G color information; and determine a third fusion coefficient corresponding to the B color information on the first fusion curve according to the value of the B color information.
  • For example, if the GPU 110-3 determines that the first fusion curve is the fusion curve for exposure parameter value 2 shown in FIG. 6, and the value of the R color information is 150, then the GPU 110-3 finds the value 150 on the abscissa and determines the ordinate corresponding to 150 on the first fusion curve; that ordinate is the first fusion coefficient.
  • A similar manner can be used for the G and B color information, which is not described in detail.
  • It should be noted that all three kinds of color information can use the fusion curve shown in FIG. 6, that is, the abscissa shown in FIG. 6 can represent the values of the R, G, and B color information. Alternatively, the abscissa of the fusion curve shown in FIG. 6 may give only the correspondence between the R color information and the fusion coefficient, and the designer can design similar sets of fusion curves for the G and B color information.
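  • Reading a coefficient off a sampled curve is then a one-dimensional interpolation, sketched here with the invented sample points used above:

        import numpy as np

        curve_x = np.array([0, 128, 255])     # abscissa: color information values
        curve_y = np.array([0.8, 0.4, 0.1])   # ordinate: fusion coefficients
        f1 = float(np.interp(150, curve_x, curve_y))  # ordinate at R = 150: the first fusion coefficient
        # Repeating the lookup with the G and B values yields f2 and f3.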
  • S504 Determine a second pixel point on the first image that is at the same position as the first pixel point on the HDR image.
  • the GPU 110-3 may determine the coordinate value of each pixel on the first image and the HDR image. Please continue to refer to FIG. 4 as an example.
  • FIG. 4 shows a frame of HDR image and a frame of first image.
  • The GPU 110-3 can select a pixel B (i.e., the first pixel) on the HDR image.
  • The GPU 110-3 can obtain the pixel A (i.e., the second pixel) corresponding to pixel B on the first image based on a matching algorithm of the prior art (such as a similarity matching algorithm); alternatively, the GPU 110-3 may determine the pixels at the same position (the same coordinate values) on the first image and the HDR image.
  • each pixel on a frame of image contains R, G, and B color information, and the values of the R, G, and B color information on each pixel are different. Therefore, in the embodiment of the present application, when the GPU 110-3 fuses the first image and the HDR image, the color information of each pixel on the two frames of images may be fused separately.
  • After the GPU 110-3 determines pixel A and pixel B, the R, G, and B color information of pixel A and the R, G, and B color information of pixel B can be fused separately. That is, the GPU 110-3 fuses the R color information of pixel A with the R color information of pixel B, the G color information of pixel A with the G color information of pixel B, and the B color information of pixel A with the B color information of pixel B.
  • GPU 110-3 may fuse R color information of pixel A and R color information of pixel B according to the first fusion formula and the first fusion coefficient (the first fusion coefficient determined in S503).
  • the mobile phone 100 may store a first fusion formula (a fusion formula for calculating R color information of a pixel on the final image), please refer to formula (1):
  • R_dst = (1 - f1) × R_src + f1 × R_res    formula (1)
  • Among them, R_src is the value of the R color information of a pixel (such as pixel A) on the first image; R_res represents the value of the R color information of the corresponding pixel (such as pixel B) on the HDR image; f1 is the first fusion coefficient determined on the first fusion curve from the value of the R color information of that pixel on the HDR image (step S503); and R_dst, obtained by formula (1), is the value of the R color information of the corresponding pixel (such as pixel C) on the final image.
  • Similarly, the mobile phone 100 may store a second fusion formula (a fusion formula used to calculate the G color information of a pixel on the final image); please refer to formula (2):
  • G_dst = (1 - f2) × G_src + f2 × G_res    formula (2)
  • Among them, G_src is the value of the G color information of a pixel (such as pixel A) on the first image; G_res represents the value of the G color information of the corresponding pixel (such as pixel B) on the HDR image; f2 is the second fusion coefficient determined on the first fusion curve from the value of the G color information of that pixel on the HDR image (step S503); and G_dst, obtained by formula (2), is the value of the G color information of the corresponding pixel (such as pixel C) on the final image.
  • Similarly, the mobile phone 100 may store a third fusion formula (a fusion formula used to calculate the B color information of a pixel on the fused image); please refer to formula (3):
  • B_dst = (1 - f3) × B_src + f3 × B_res    formula (3)
  • Among them, B_src is the value of the B color information of a pixel (such as pixel A) on the first image; B_res represents the value of the B color information of the corresponding pixel (such as pixel B) on the HDR image; f3 is the third fusion coefficient determined on the first fusion curve from the value of the B color information of that pixel on the HDR image (step S503); and B_dst, obtained by formula (3), is the value of the B color information of the corresponding pixel (such as pixel C) on the final image.
  • The above formulas (1)-(3) determine the R, G, and B color information of one pixel (i.e., pixel C) on the fused image. A similar method can be used for the other pixels, finally determining the R, G, and B color information of each pixel to obtain the final image.
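  • Because formulas (1)-(3) all have the same shape, they can be applied to every pixel and every channel at once. The following NumPy sketch does so, matching pixels by position and using the same (invented) curve samples for all three channels, as FIG. 6 allows:

        import numpy as np

        def fuse_images(first, hdr, curve_x, curve_y):
            """first, hdr: (H, W, 3) float arrays in [0, 255], matched by position."""
            f = np.interp(hdr, curve_x, curve_y)   # per-pixel, per-channel f1/f2/f3
            return (1.0 - f) * first + f * hdr     # formulas (1)-(3) in one expression

        rng = np.random.default_rng(0)             # stand-in images for a quick check
        first = rng.uniform(0, 255, (4, 4, 3))
        hdr = np.clip(first * 1.5, 0, 255)
        final = fuse_images(first, hdr, np.array([0, 128, 255]), np.array([0.8, 0.4, 0.1]))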
  • In this way, the GPU 110-3 fuses the first image and the HDR image to obtain the final image.
  • In other embodiments of the present application, the NPU 110-2 can make different brightness adjustments to different areas of the RGB image to obtain a multi-frame HDR image.
  • the NPU 110-2 sends the obtained multi-frame HDR image to the GPU 110-3.
  • GPU110-3 fuses the multi-frame HDR image with the first image to obtain the final image (corresponding to the seventh step shown in FIG. 3).
  • FIG. 7 is a schematic flowchart of an image shooting process provided by an embodiment of the present application.
  • GPU110-3 determines the fusion curve according to the exposure parameters, and fuses the first image and the multi-frame HDR image according to the fusion curve to obtain the final image.
  • The NPU 110-2 applies high dynamic range technology to different areas of the RGB image to obtain multi-frame HDR images. Taking a two-frame HDR image as an example, when the ISP 110-1 adjusts the brightness of the RGB image to less than the first brightness, the NPU 110-2 can, based on high dynamic range technology, raise the brightness of a first area (the whole area or a partial area) of the RGB image to a certain brightness value greater than the first brightness to obtain one frame of HDR image.
  • the NPU 110-2 may increase the brightness of the second area (the whole area or part of the area) of the RGB image to another brightness value greater than the first brightness based on the high dynamic range technology to obtain another frame of HDR image.
  • ISP110-1 adjusts the brightness of the RGB image to be greater than the second brightness
  • NPU110-2 can reduce the brightness of the first area (full area or partial area) on the RGB image to be lower than the first based on the high dynamic range technology A certain brightness value of the second brightness, to get a frame of HDR image.
  • the NPU 110-2 may reduce the brightness of the second area (full area or partial area) of the RGB image to another brightness value less than the second brightness based on the high dynamic range technology to obtain another frame of HDR image.
  • In short, the brightness of the multi-frame HDR images obtained by the NPU 110-2 can differ from frame to frame, so the details of different areas on each frame of the multi-frame HDR images obtained by the NPU 110-2 also differ. The GPU 110-3 can then fuse the first image and the multi-frame HDR images to obtain the final image.
  • The process in which the GPU 110-3 fuses the multi-frame HDR images with the first image is similar to the flow shown in FIG. 5. The differences are that, in S504, the GPU 110-3 determines, on each frame of HDR image, the second pixel (for example, pixels B1 and B2) corresponding to the first pixel (for example, pixel A) on the first image, and that in S505 the GPU 110-3 uses different fusion formulas.
  • Taking the R color information as an example, the fusion formula used to calculate the R color information of a pixel on the final image differs from the foregoing formula (1); see formula (4):

R_dst = (1 - Σ_{i=1}^{n} f_i)·R_src + Σ_{i=1}^{n} f_i·R_res^i    formula (4)

  • where R_src is the value of the R color information of a pixel (for example, pixel A) on the first image; R_res^i is the value of the R color information of the corresponding pixel on the i-th frame of HDR image (for example, R_res^1 is the value of the R color information of pixel B1 on the first frame of HDR image, R_res^2 is the value of the R color information of pixel B2 on the second frame of HDR image, and so on); n is the number of HDR images obtained by the NPU 110-2 (when n equals 1, that is, when the NPU 110-2 obtains one frame of HDR image, this is the embodiment shown in FIG. 4 above); and f_i is the fusion coefficient determined on the first fusion curve from the value of the R color information of the pixel on the i-th frame of HDR image, namely R_res^i (for example, f_1 is the fusion coefficient determined on the first fusion curve from R_res^1, the value of the R color information of pixel B1 on the first frame of HDR image; f_2 is the fusion coefficient determined on the first fusion curve from R_res^2, the value of the R color information of pixel B2 on the second frame of HDR image; and so on). The value of the R color information of a pixel (for example, pixel C) on the final image, that is, R_dst, is obtained by formula (4).
  • Taking the G color information as an example, the fusion formula used to calculate the G color information of a pixel on the final image differs from the foregoing formula (2); see formula (5):

G_dst = (1 - Σ_{i=1}^{n} f_i)·G_src + Σ_{i=1}^{n} f_i·G_res^i    formula (5)

  • where G_src is the value of the G color information of the pixel (for example, pixel A) on the first image; G_res^i is the value of the G color information of the corresponding pixel on the i-th frame of HDR image (for example, G_res^1 is the value of the G color information of pixel B1 on the first frame of HDR image, G_res^2 is the value of the G color information of pixel B2 on the second frame of HDR image, and so on); n is the number of frames of HDR images obtained by the NPU 110-2; and f_i is the fusion coefficient determined on the first fusion curve from the value of the G color information of the pixel on the i-th frame of HDR image, namely G_res^i (for example, f_1 is determined on the first fusion curve from G_res^1, the G color information of pixel B1 on the first frame; f_2 is determined from G_res^2, the G color information of pixel B2 on the second frame; and so on). The value of the G color information of a pixel (for example, pixel C) on the final image, that is, G_dst, is obtained by formula (5).
  • Taking the B color information as an example, the fusion formula used to calculate the B color information of a pixel on the final image differs from the foregoing formula (3); see formula (6):

B_dst = (1 - Σ_{i=1}^{n} f_i)·B_src + Σ_{i=1}^{n} f_i·B_res^i    formula (6)

  • where B_src is the value of the B color information of the pixel (for example, pixel A) on the first image; B_res^i is the value of the B color information of the corresponding pixel on the i-th frame of HDR image (for example, B_res^1 is the value of the B color information of pixel B1 on the first frame of HDR image, B_res^2 is the value of the B color information of pixel B2 on the second frame of HDR image, and so on); n is the number of frames of HDR images obtained by the NPU 110-2; and f_i is the fusion coefficient determined on the first fusion curve from the value of the B color information of the pixel on the i-th frame of HDR image, namely B_res^i (for example, f_1 is determined on the first fusion curve from B_res^1, the B color information of pixel B1 on the first frame; f_2 is determined from B_res^2, the B color information of pixel B2 on the second frame; and so on). The value of the B color information of a pixel (for example, pixel C) on the final image, that is, B_dst, is obtained by formula (6).
  • The above formulas (4)-(6) can be used to determine the R, G, and B color information of one pixel (that is, pixel C) on the fused image. A similar method can be applied to the other pixels to finally determine the R, G, and B color information of each pixel.
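  • Under the same assumptions as the earlier sketch (hypothetical names, NumPy interpolation standing in for the stored fusion curve), formulas (4)-(6) generalize to n HDR frames as follows; this is an illustrative sketch, not the patented implementation.

```python
import numpy as np

def fuse_multi_hdr(first_img, hdr_imgs, curve_x, curve_f):
    """Fuse the first image with n HDR frames per formulas (4)-(6).

    first_img: H x W x 3 array (the first image).
    hdr_imgs: list of n H x W x 3 arrays (the multi-frame HDR images).
    curve_x, curve_f: sample points of the first fusion curve.
    """
    weighted_sum = np.zeros_like(first_img, dtype=np.float64)
    f_total = np.zeros_like(first_img, dtype=np.float64)
    for hdr in hdr_imgs:
        # f_i is determined on the first fusion curve from the color value
        # of the co-located pixel on the i-th HDR frame (step S503).
        f_i = np.interp(hdr, curve_x, curve_f)
        weighted_sum += f_i * hdr
        f_total += f_i
    # X_dst = (1 - sum_i f_i)*X_src + sum_i f_i*X_res_i, per channel.
    return (1.0 - f_total) * first_img + weighted_sum
```

  • With n = 1 this reduces exactly to the single-frame sketch, mirroring the n = 1 remark in the text; a practical implementation would presumably also clip the result to the valid range and choose the curve so that the coefficients sum to at most 1.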
  • In the embodiments provided above, the method provided by the embodiments of this application is described from the perspective of the terminal device (the mobile phone 100) as the execution body. To implement the functions of the method, the terminal may include a hardware structure and/or a software module, and implement the functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether one of the functions is performed by a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application of the technical solution and its design constraints.
  • Based on the same concept, FIG. 8 shows a terminal device 800 provided by this application. The terminal device 800 may include a camera 801, a display screen 802, one or more processors 803, a memory 804, and one or more computer programs 805; these components may be connected by one or more communication buses 806. The camera 801 is used to capture the original image. The one or more computer programs 805 are stored in the memory 804 and configured to be executed by the one or more processors 803; the one or more computer programs 805 include instructions that may be used to perform all or part of the steps shown in FIG. 3 or FIG. 5 and the steps in the corresponding embodiments.
  • Based on the same concept, FIG. 9 shows a terminal device 900 provided by this application. The terminal device 900 may include an image acquisition unit 901, a processing unit 902, and a display unit 903. The processing unit 902 is configured to open the camera application and start the camera in response to a user operation. The display unit 903 is configured to display the viewfinder interface. The processing unit 902 is further configured to: convert the original image captured by the camera into an RGB image; reduce the brightness of the RGB image to less than the first brightness or raise it to greater than the second brightness to obtain the first image, where the first brightness is greater than the second brightness; use HDR technology to convert the RGB image into N frames of HDR images, where each of the N frames of HDR images has a different brightness, and when the brightness of the RGB image is reduced to less than the first brightness, the brightness of each of the N frames of HDR images is greater than the first brightness, or when the brightness of the RGB image is raised to greater than the second brightness, the brightness of each of the N frames of HDR images is less than the second brightness, with N being a positive integer; and fuse the color information of pixels at the same positions in the first image and the N frames of HDR images to obtain the final image. The display unit 903 is further configured to display the final image in the viewfinder interface.
  • An embodiment of the present invention further provides a computer storage medium. The storage medium may include a memory, and the memory may store a program that, when executed, causes the terminal to perform all or part of the steps performed by the terminal as described in the foregoing method embodiments shown in FIG. 3 or FIG. 5. An embodiment of the present invention also provides a computer program product that, when run on a terminal, causes the terminal to perform all or part of the steps performed by the terminal as described in the foregoing method embodiments shown in FIG. 3 or FIG. 5.
  • Computer-readable media include computer storage media and communication media, where communication media include any medium that facilitates the transfer of a computer program from one place to another. A storage medium may be any available medium that a computer can access. By way of example and not limitation, computer-readable media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disc storage, magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. In addition, any connection can appropriately be a computer-readable medium. For example, if software is transmitted from a website, server, or other remote source over a coaxial cable, optical fiber cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, optical fiber cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of the medium.
  • As used in the embodiments of this application, disks and discs include compact discs (CDs), laser discs, optical discs, digital video discs (DVDs), floppy disks, and Blu-ray discs, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the protection scope of computer-readable media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

An image capture method and a terminal device. The method includes: the terminal device enters a camera application, starts a camera, and displays a viewfinder interface; the terminal device converts the original image captured by the camera into an RGB image, and reduces the brightness of the RGB image to less than a first brightness or raises it to greater than a second brightness, to obtain a first image; the terminal device uses HDR technology to convert the RGB image into N frames of HDR images, where each of the N frames of HDR images has a different brightness, and when the brightness of the RGB image is reduced to less than the first brightness, the brightness of each of the N frames of HDR images is greater than the first brightness, or when the brightness of the RGB image is raised to greater than the second brightness, the brightness of each of the N frames of HDR images is less than the second brightness; the terminal device fuses the color information of pixels at any same position in the first image and the N frames of HDR images to obtain a final image. This approach helps improve the quality of captured images.

Description

Image capture method and terminal device
This application claims priority to Chinese Patent Application No. 201811185480.8, filed with the China National Intellectual Property Administration on October 11, 2018 and entitled "Image capture method and terminal device", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of image capture technologies, and in particular, to an image capture method and a terminal device.
Background
With the advance of terminal technologies, the functions of terminal devices keep improving. Taking mobile phones as an example, image capture is one of the functions users rely on most frequently, and users pay increasing attention to the imaging quality of mobile phones.
However, the exposure of images captured by current mobile phones is hard to control, and overexposure or underexposure occurs easily. As a result, bright details are clear but dark details are lost, or dark details are clear but bright details are lost, which ultimately degrades the imaging quality.
Summary
Embodiments of this application provide an image capture method and a terminal device, to improve image capture quality.
According to a first aspect, an embodiment of this application provides an image capture method, which may be performed by a terminal device. The method includes: in response to a user operation, the terminal device opens a camera application, starts a camera, and displays a viewfinder interface; the terminal device converts an original image captured by the camera into an RGB image; the terminal device reduces the brightness of the RGB image to less than a first brightness or raises it to greater than a second brightness to obtain a first image, where the first brightness is greater than the second brightness; the terminal device uses HDR technology to convert the RGB image into N frames of HDR images, where each of the N frames of HDR images has a different brightness, and when the brightness of the RGB image is reduced to less than the first brightness, the brightness of each of the N frames of HDR images is greater than the first brightness, or when the brightness of the RGB image is raised to greater than the second brightness, the brightness of each of the N frames of HDR images is less than the second brightness, and N is a positive integer; the terminal device fuses the color information of pixels at any same position in the first image and the N frames of HDR images to obtain a final image; and the terminal device displays the final image in the viewfinder interface.
In the embodiments of this application, the terminal device adjusts the brightness of the RGB image and of the multi-frame HDR images by different amounts, so the RGB image and the HDR images preserve different details. After the terminal device fuses the color information of pixels at the same positions in the RGB image and the multi-frame HDR images, an image of better quality is obtained, improving image capture quality.
In a possible design, the fusing, by the terminal device, of the color information of pixels at any same position in the first image and the N frames of HDR images to obtain the final image may specifically be: the terminal device determines, according to the exposure parameters used to capture the original image, a first fusion curve corresponding to the exposure parameters from multiple fusion curves, where the first fusion curve indicates the correspondence between color information and fusion coefficients; the terminal device takes the pixels at the same position on each frame of HDR image as first pixels, and performs the following for each first pixel: the terminal device determines, on the first fusion curve according to the color information of the first pixel on each frame of HDR image, the fusion coefficient corresponding to the color information; and the terminal device fuses, according to the fusion coefficient, the color information of the first pixel on each frame of HDR image and of the second pixel on the first image to obtain the final image, where the second pixel is the pixel at the same corresponding position determined by the terminal device according to a matching algorithm.
In the embodiments of this application, the terminal device determines a fusion curve according to the exposure parameters used to capture the original image, and then determines fusion coefficients on the fusion curve according to the color information of pixels on the HDR images. After the terminal device fuses the color information of pixels at the same positions in the RGB image and the HDR images according to the fusion coefficients, an image of better quality is obtained, improving image capture quality.
In a possible design, the determining, by the terminal device on the first fusion curve according to the color information of the first pixel on each frame of HDR image, of the fusion coefficient corresponding to the color information may specifically include: the terminal device determines, on the first fusion curve according to the R color information of the first pixel on each frame of HDR image, the first fusion coefficient corresponding to the R color information, to obtain N first fusion coefficients; the terminal device determines, on the first fusion curve according to the G color information of the first pixel on each frame of HDR image, the second fusion coefficient corresponding to the G color information, to obtain N second fusion coefficients; and the terminal device determines, on the first fusion curve according to the B color information of the first pixel on each frame of HDR image, the third fusion coefficient corresponding to the B color information, to obtain N third fusion coefficients. Correspondingly, the fusing, by the terminal device according to the fusion coefficients, of the color information of the first pixel on each frame of HDR image and of the second pixel on the first image may specifically include: the terminal device fuses the R color information of the first pixel on each frame of HDR image and of the second pixel according to the N first fusion coefficients; the terminal device fuses the G color information of the first pixel on each frame of HDR image and of the second pixel according to the N second fusion coefficients; and the terminal device fuses the B color information of the first pixel on each frame of HDR image and of the second pixel according to the N third fusion coefficients.
In the embodiments of this application, the terminal device determines fusion coefficients separately from the three kinds of color information of the first pixel on the HDR images, and fuses each kind of color information of pixels at the same positions in the RGB image and the HDR images using the coefficient for that kind of color information. That is, when fusing the color information of pixels at the same positions in the RGB image and the HDR images, the terminal device fuses each kind of color information of those pixels separately. In this way, an image of better quality is obtained, improving image capture quality.
In a possible design, the fusing, by the terminal device according to the N first fusion coefficients, of the R color information of the first pixel on each frame of HDR image and of the second pixel may meet the following formula:

R_dst = (1 - Σ_{i=1}^{N} f_i)·R_src + Σ_{i=1}^{N} f_i·R_res^i

where R_src is the value of the R color information of the second pixel on the first image; R_res^i is the value of the R color information of the first pixel on the i-th frame of HDR image among the N frames of HDR images; N is the number of frames of HDR images; f_i is, among the N first fusion coefficients, the first fusion coefficient determined on the first fusion curve from the value of the R color information of the first pixel on the i-th frame of HDR image; and R_dst is the value of the R color information of a pixel on the final image.
In the embodiments of this application, when the terminal device fuses the R (red) color information of pixels at the same positions in the RGB image and the HDR images, the above formula is met. In this way, an image of better quality is obtained, improving image capture quality.
In a possible design, the fusing, by the terminal device according to the N second fusion coefficients, of the G color information of the first pixel on each frame of HDR image and of the second pixel may meet the following formula:

G_dst = (1 - Σ_{i=1}^{N} f_i)·G_src + Σ_{i=1}^{N} f_i·G_res^i

where G_src is the value of the G color information of the second pixel on the first image; G_res^i is the value of the G color information of the first pixel on the i-th frame of HDR image among the N frames of HDR images; N is the number of frames of HDR images; f_i is, among the N second fusion coefficients, the second fusion coefficient determined on the first fusion curve from the value of the G color information of the first pixel on the i-th frame of HDR image; and G_dst is the value of the G color information of a pixel on the final image.
In the embodiments of this application, when the terminal device fuses the G (green) color information of pixels at the same positions in the RGB image and the HDR images, the above formula is met. In this way, an image of better quality is obtained, improving image capture quality.
In a possible design, the fusing, by the terminal device according to the N third fusion coefficients, of the B color information of the first pixel on each frame of HDR image and of the second pixel may meet the following formula:

B_dst = (1 - Σ_{i=1}^{N} f_i)·B_src + Σ_{i=1}^{N} f_i·B_res^i

where B_src is the value of the B color information of the second pixel on the first image; B_res^i is the value of the B color information of the first pixel on the i-th frame of HDR image among the N frames of HDR images; N is the number of frames of HDR images; f_i is, among the N third fusion coefficients, the third fusion coefficient determined on the first fusion curve from the value of the B color information of the first pixel on the i-th frame of HDR image; and B_dst is the value of the B color information of a pixel on the final image.
In the embodiments of this application, when the terminal device fuses the B (blue) color information of pixels at the same positions in the RGB image and the HDR images, the above formula is met. In this way, an image of better quality is obtained, improving image capture quality.
According to a second aspect, an embodiment of this application provides a terminal device. The terminal device includes a camera, a display screen, a memory, and a processor. The processor is configured to open a camera application and start the camera in response to a user operation; the display screen is configured to display the viewfinder interface of the camera application; the camera is configured to capture an original image; the memory is configured to store one or more computer programs; and when the one or more computer programs stored in the memory are executed by the processor, the terminal device is enabled to implement the method of the first aspect or any possible design of the first aspect.
According to a third aspect, an embodiment of this application further provides a terminal device, including modules/units that perform the method of the first aspect or any possible design of the first aspect; these modules/units may be implemented by hardware, or by hardware executing corresponding software.
According to a fourth aspect, an embodiment of this application further provides a computer-readable storage medium, including a computer program that, when run on a terminal, causes the terminal device to perform the method of the first aspect or any possible design of the first aspect.
According to a fifth aspect, an embodiment of this application further provides a computer program product that, when run on a terminal device, causes the terminal device to perform the method of the first aspect or any possible design of the first aspect.
Brief Description of Drawings
FIG. 1 is a schematic structural diagram of a mobile phone 100 according to an embodiment of this application;
FIG. 2 is a schematic structural diagram of a mobile phone 100 according to an embodiment of this application;
FIG. 3 is a schematic flowchart of capturing an image with a mobile phone 100 according to an embodiment of this application;
FIG. 4 is a schematic diagram of the effect of an image capture method according to an embodiment of this application;
FIG. 5 is a schematic flowchart of an image capture method according to an embodiment of this application;
FIG. 6 is a schematic diagram of fusion curves according to an embodiment of this application;
FIG. 7 is a schematic flowchart of an image capture method according to an embodiment of this application;
FIG. 8 is a schematic structural diagram of a terminal device according to an embodiment of this application;
FIG. 9 is a schematic structural diagram of a terminal device according to an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of this application are described below with reference to the accompanying drawings.
First, some terms used in the embodiments of the present invention are explained to facilitate understanding by those skilled in the art.
The exposure parameters in the embodiments of this application are parameters set by the terminal device when capturing an image. The exposure parameters can indicate the total amount of light the device receives from a scene while shooting it, and may include exposure time and/or exposure intensity.
Generally, the values of the exposure parameters determine the brightness of the final captured image. With a long exposure time or a high exposure intensity, the amount of light entering the device is large, so the captured image is bright. With a short exposure time or a low exposure intensity, the amount of light entering the device is small, so the captured image is dark.
A pixel in the embodiments of this application is the smallest imaging unit of a frame of image. A pixel can correspond to one coordinate point on the image, and can correspond to one parameter (such as grayscale) or a set of parameters (such as grayscale, brightness, and color). Taking color information as an example, a frame of image generally has three base colors, red (Red, denoted R below), green (Green, denoted G below), and blue (Blue, denoted B below), and other colors can be combined from these three. Therefore, each pixel of a frame of image can contain the three kinds of color information R, G, and B, and the values of R, G, and B differ from pixel to pixel. For example, when the R, G, and B values of a pixel are all 0, the pixel is black; when the R, G, and B values of a pixel are all 255, the pixel is white.
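As a concrete illustration of this convention (the values below are chosen for illustration only), a pixel can be modeled as a triple of channel values:

```python
# A pixel carries three color components; each channel ranges over 0-255.
black_pixel = (0, 0, 0)        # all channels at their minimum: black
white_pixel = (255, 255, 255)  # all channels at their maximum: white
orange_pixel = (255, 165, 0)   # other colors mix the three base colors
```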
The original image in the embodiments of this application is the output image of the camera, that is, the raw data obtained by the camera converting the collected light reflected by the object into a digital image signal; this raw data is unprocessed. For example, the original image may be data in raw format, which can include the information of the object and camera parameters such as sensitivity (ISO), shutter speed, aperture value, and white balance. The original image is also the input image of the ISP and of a network neural unit, such as the neural-network processing unit (NPU) described below.
The first image in the embodiments of this application is the output image of the ISP: the ISP processes the original image to obtain an image in RGB format or YUV format, and adjusts the brightness of that image to obtain the first image. The specific brightness value to which the ISP adjusts the RGB-format or YUV-format image may be set by the user or preset at delivery of the mobile phone. The first image is also the input image of a processor, such as the graphics processing unit (GPU) described below.
The HDR image in the embodiments of this application, that is, a high dynamic range (HDR) image, is the output image of the network neural unit. The neural network unit may obtain the HDR image based on the high dynamic range technology in the prior art; the high dynamic range technology is not described in detail in the embodiments of this application. The HDR image is also an input image of the processor (such as the GPU described below).
It should be noted that an "image" in the embodiments of this application, such as the original image, the first image, or the HDR image, may refer to a picture, or to a set of parameters (such as pixel information, color information, and brightness information).
"Multiple" in the embodiments of this application means two or more.
It should be noted that the term "and/or" herein merely describes an association between associated objects and indicates three possible relationships; for example, A and/or B may indicate that A exists alone, both A and B exist, or B exists alone. Unless otherwise specified, the character "/" herein generally indicates an "or" relationship between the associated objects. In the description of the embodiments of the present invention, words such as "first" and "second" are used only to distinguish between descriptions and cannot be understood as indicating or implying relative importance or order.
The following describes a terminal device, a graphical user interface (GUI) for such a terminal device, and embodiments for using such a terminal device. In some embodiments of this application, the terminal device may be a portable terminal containing a device with an image capture function, such as a camera, for example a mobile phone or a tablet computer. Exemplary embodiments of the portable terminal device include, but are not limited to, portable terminal devices running various operating systems. The portable terminal device may also be another portable terminal device, such as a digital camera, as long as it has an image capture function. It should also be understood that in some other embodiments of this application, the terminal device may not be a portable terminal device but, for example, a desktop computer having an image capture function.
Generally, the terminal device supports multiple applications, such as one or more of the following: a camera application, an instant messaging application, a photo management application, and so on. There can be many instant messaging applications, such as WeChat, Tencent QQ, WhatsApp Messenger, Line, Instagram, Kakao Talk, and DingTalk. Through an instant messaging application, a user can send information such as text, voice, pictures, video files, and various other files to other contacts, or make video or audio calls with other contacts.
Taking a mobile phone as an example of the terminal device, FIG. 1 shows a schematic structural diagram of the mobile phone 100.
The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, an optical proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in the embodiments of the present invention does not constitute a specific limitation on the mobile phone 100. In other embodiments of this application, the mobile phone 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), and so on. The different processing units may be independent devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the mobile phone 100. The controller can generate operation control signals according to instruction operation codes and timing signals, and control instruction fetching and instruction execution.
A memory may also be provided in the processor 110 to store instructions and data. In some embodiments, the memory in the processor 110 is a cache, which can hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, it can call them directly from this memory. This avoids repeated access, reduces the waiting time of the processor 110, and thereby improves system efficiency.
The mobile phone 100 implements the display function through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing that connects the display screen 194 and the application processor. The GPU performs mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like, and includes a display panel. The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Miniled, a MicroLed, a Micro-oLed, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the mobile phone 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The mobile phone 100 can implement the image capture function through the processor 110, the camera 193, the display screen 194, and the like. The camera 193 is used to capture still images or videos. Generally, the camera 193 may include photosensitive elements such as a lens group and an image sensor, where the lens group includes multiple lenses (convex or concave) for collecting the light signals reflected by the object to be captured and passing the collected light signals to the image sensor. The image sensor generates the original image of the object to be captured according to the light signals and sends the generated original image to the processor 110. The processor 110 runs the image capture algorithm provided in the embodiments of this application to process the original image, and the display screen 194 displays the processed image.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. By running the instructions stored in the internal memory 121, the processor 110 executes the various functional applications and data processing of the mobile phone 100. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store the operating system and the applications required by at least one function (such as a sound playback function or an image playback function). The data storage area can store data created during the use of the mobile phone 100 (such as audio data and a phone book). In addition, the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The distance sensor 180F is used to measure distance. The mobile phone 100 can measure distance by infrared or laser. In some embodiments, in a shooting scene, the mobile phone 100 can use the distance sensor 180F to measure distance for fast focusing. In other embodiments, the mobile phone 100 can also use the distance sensor 180F to detect whether a person or object is approaching.
The optical proximity sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The mobile phone 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the mobile phone 100; when insufficient reflected light is detected, the mobile phone 100 can determine that there is no object nearby. The mobile phone 100 can use the optical proximity sensor 180G to detect that the user is holding the phone close to the ear during a call, so as to automatically turn off the screen to save power. The optical proximity sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light brightness. The mobile phone 100 can adaptively adjust the brightness of the display screen 194 according to the sensed ambient light brightness. The ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures, and can cooperate with the optical proximity sensor 180G to detect whether the mobile phone 100 is in a pocket, to prevent accidental touches.
The fingerprint sensor 180H is used to collect fingerprints. The mobile phone 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application lock access, fingerprint photographing, fingerprint call answering, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the mobile phone 100 executes a temperature handling policy based on the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the mobile phone 100 reduces the performance of a processor located near the temperature sensor 180J, to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the mobile phone 100 heats the battery 142 to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below yet another threshold, the mobile phone 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also called a "touch panel". The touch sensor 180K can be provided on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect touch operations on or near it, and can pass the detected touch operations to the application processor to determine the type of touch event. Visual output related to the touch operation can be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be arranged on the surface of the mobile phone 100 at a position different from that of the display screen 194.
In addition, the mobile phone 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and so on. The mobile phone 100 can receive input from the button 190 and generate key signal input related to the user settings and function control of the mobile phone 100. The mobile phone 100 can use the motor 191 to generate vibration prompts (such as an incoming call vibration prompt). The indicator 192 in the mobile phone 100 may be an indicator light, which can be used to indicate the charging status and power changes, and can also be used to indicate messages, missed calls, notifications, and so on. The SIM card interface 195 in the mobile phone 100 is used to connect a SIM card. The SIM card can be inserted into or pulled out of the SIM card interface 195 to make contact with or be separated from the mobile phone 100.
The wireless communication function of the mobile phone 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and so on.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 can be used to cover a single communication frequency band or multiple communication frequency bands, and different antennas can also be multiplexed to improve antenna utilization. For example, the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antennas can be used in combination with tuning switches.
The mobile communication module 150 can provide wireless communication solutions applied to the electronic device 100, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and so on. The mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation. The mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the same device as at least some modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate the low-frequency baseband signal to be sent into a medium-to-high-frequency signal. The demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal, which the demodulator then passes to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor, which outputs a sound signal through an audio device (not limited to the speaker 170A and the receiver 170B) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite systems (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and so on. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 can also receive the signals to be sent from the processor 110, perform frequency modulation and amplification on them, and convert them into electromagnetic waves through the antenna 2 for radiation.
The following embodiments can all be implemented in a terminal device (such as the mobile phone 100 or a tablet computer) having the above hardware structure.
To facilitate the description of the image capture algorithm provided in the embodiments of this application, the algorithm is introduced below through the components related to it; see FIG. 2 for details. For the components shown in FIG. 2, refer to the related description of FIG. 1. It should be noted that FIG. 2 takes the processor 110 integrating the GPU 110-1, the ISP 110-2, and the NPU 110-3 as an example.
The following describes the process of capturing an image with the mobile phone 100 shown in FIG. 2. FIG. 3 is a schematic diagram of the process in which the mobile phone 100 captures an image according to an embodiment of this application. As shown in FIG. 3, the process includes:
Step 1: In response to a user operation, the mobile phone 100 enters the camera application, opens the camera 193, and displays the viewfinder interface.
For example, the display screen 194 of the mobile phone 100 displays a home screen that includes the icons of applications (such as a phone application icon, a video player icon, a music player icon, a camera application icon, and a browser application icon). The user taps the icon of the camera application on the home screen through the touch sensor 180K (not shown in FIG. 2; see FIG. 1) provided on the display screen 194 to start the camera application and open the camera 193. The display screen 194 displays the interface of the camera application, such as the viewfinder interface.
Step 2: The camera 193 captures an original image based on the set exposure parameters.
Specifically, still referring to FIG. 2, the process in which the camera 193 captures the original image is as follows: the lens group 193-1 in the camera 193 collects the light signals reflected by the object to be captured and passes the collected light signals to the image sensor 193-2, and the image sensor 193-2 generates the original image of the object according to the light signals.
It should be noted that before the camera 193 captures the original image, the camera parameters, such as the values of the exposure parameters (which may be user-defined or set by the mobile phone 100 itself), are usually set. The exposure parameters include at least one of parameters such as ISO, exposure time, aperture size, shutter, and amount of incoming light; alternatively, the exposure parameters may be other parameters indicating the degree of exposure that are calculated from ISO, aperture, and shutter, as long as they reflect the degree of exposure, which is not limited in the embodiments of this application.
The original image captured by the camera 193 is an unprocessed original image. For example, the original image may be raw-format data, which includes the information of the object to be captured and the camera parameters (such as the exposure parameters).
Step 3: The ISP 110-1 converts the original image into an RGB image. Generally, if the exposure parameters are set high, the brightness of the original image captured by the camera 193 based on the set exposure parameters will be high, and the ISP 110-1 can then adjust the brightness of the RGB image to less than the first brightness to obtain the first image; if the exposure parameters are set low, the brightness of the original image captured based on the set exposure parameters will be low, and the ISP 110-1 can then adjust the brightness of the RGB image to greater than the second brightness to obtain the first image. The values of the first brightness and the second brightness can be preset, and the first brightness is greater than or equal to the second brightness.
As can be seen from the foregoing, the original image includes the information of the object to be captured and the camera parameters. Since the original image captured by the camera 193 is an unprocessed image, the ISP 110-1 can process the original image based on the RGB color model to obtain an image containing color information, that is, an RGB image. Taking raw-format data captured by the camera 193 as an example, the ISP 110-1 can convert the raw-format data into RGB-format data based on the RGB color model, where the RGB-format data is data containing color information. For the process in which the ISP 110-1 converts raw-format data into RGB-format data based on the RGB color model, refer to the prior art, which is not limited in the embodiments of this application. Specifically, the brightness value to which the ISP 110-1 adjusts the RGB image can depend on the user's settings; the first brightness and the second brightness here are only examples for description.
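The raw-to-RGB conversion itself is left to the prior art. Purely for orientation, the following deliberately naive sketch turns an RGGB Bayer mosaic into an RGB image; it is a toy stand-in, not the actual pipeline of the ISP 110-1, and the RGGB layout and half-resolution output are assumptions introduced here.

```python
import numpy as np

def naive_raw_to_rgb(raw):
    """Toy demosaic of an RGGB Bayer mosaic (H and W assumed even).

    Each 2x2 Bayer cell [[R, G], [G, B]] becomes one RGB pixel, so the
    output is half resolution; a real ISP interpolates to full resolution
    and also applies white balance, noise reduction, and so on.
    """
    r = raw[0::2, 0::2].astype(np.float32)
    g = (raw[0::2, 1::2].astype(np.float32) + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2].astype(np.float32)
    return np.stack([r, g, b], axis=-1)
```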
Of course, the ISP 110-1 can also convert the raw-format data into a YUV image containing color information based on the YUV color model, which is not limited in the embodiments of this application.
Step 4: The ISP 110-1 sends the exposure parameters and the first image to the GPU 110-3, and sends the RGB image to the NPU 110-2.
Step 5: The NPU 110-2 processes the obtained RGB image according to the high dynamic range technology to obtain an HDR image. Specifically, when the ISP 110-1 adjusts the brightness of the RGB image to less than the first brightness, the NPU 110-2 can raise the brightness of the HDR image to greater than the first brightness based on the high dynamic range technology. When the ISP 110-1 adjusts the brightness of the RGB image to greater than the second brightness, the NPU 110-2 can reduce the brightness of the HDR image to less than the second brightness based on the high dynamic range technology.
For the high dynamic range technology, refer to the prior art, which is not limited in the embodiments of this application.
Step 6: The NPU 110-2 sends the obtained HDR image to the GPU 110-3.
Step 7: The GPU 110-3 runs the code of the image capture algorithm provided in the embodiments of this application, determines the fusion curve corresponding to the exposure parameters, and fuses the first image and the HDR image based on the fusion curve to obtain the final image.
FIG. 4 is a schematic diagram of the GPU 110-3 fusing the first image and the HDR image. As shown in FIG. 4, the GPU 110-3 determines the fusion curve according to the exposure parameters, and fuses the first image and the HDR image according to the fusion curve to obtain the final image. It should be noted that, as described above, improper adjustment of the exposure parameters affects the detail information of the image: when overexposed, the image is too bright, so bright details are clear but dark details are lost; when underexposed, the image is too dark, so dark details are clear but bright details are lost.
Therefore, in some embodiments of this application, when the exposure parameters are set high, the ISP 110-1 can reduce the brightness of the RGB image, that is, the first image has low brightness (in this case, the dark details on the first image are preserved). The NPU 110-2 raises the brightness of the RGB image using the high dynamic range technology to obtain the HDR image, so as to recover the bright details on the HDR image (in this case, the bright details on the HDR image are preserved). The GPU 110-3 fuses the first image and the HDR image, that is, fuses the dark details on the first image with the bright details on the HDR image, to obtain a final image of higher quality.
In other embodiments of this application, when the exposure parameters are set low, the ISP 110-1 can raise the brightness of the RGB image, that is, the first image has high brightness (in this case, the bright details on the first image are preserved). The NPU 110-2 reduces the brightness of the RGB image using the high dynamic range technology to obtain the HDR image, so as to recover the dark details on the HDR image (in this case, the dark details on the HDR image are preserved). The GPU 110-3 fuses the first image and the HDR image, that is, fuses the bright details on the first image with the dark details on the HDR image, to obtain a final image of higher quality.
The ISP 110-1 can compare the values of the exposure parameters in the original image with pre-stored values of the exposure parameters. If the values of the exposure parameters in the original image are greater than the pre-stored values, the exposure parameters are determined to be high; if the values are less than or equal to the pre-stored values, the exposure parameters are determined to be low.
Other details of the specific implementation of step 7 are described later.
Step 8: The display screen 194 displays the final image in the viewfinder interface.
It should be noted that FIG. 2 takes the processor 110 integrating the GPU 110-1, the ISP 110-2, and the NPU 110-3 as an example. In practical applications, the processor 110 may integrate only one or two of the GPU 110-1, the ISP 110-2, and the NPU 110-3. Assuming the processor 110 integrates only the GPU 110-1 and the ISP 110-2, the functions of the NPU 110-3 in the above embodiment (obtaining the HDR image based on the original image) can be performed by the GPU 110-1 or the ISP 110-2. Assuming the processor 110 integrates only the NPU 110-3 and the ISP 110-2, the functions of the GPU 110-1 in the above embodiment (running the image capture algorithm provided in the embodiments of this application to fuse the first image and the HDR image) can be performed by the NPU 110-3 or the ISP 110-2. In addition, the processor 110 in FIG. 2 may also integrate only processors other than the GPU 110-1, the ISP 110-2, and the NPU 110-3, such as a central processing unit (CPU), in which case the functions of the GPU 110-1, the ISP 110-2, and the NPU 110-3 are all performed by the CPU; or the processor 110 in FIG. 2 may integrate a CPU, the ISP 110-2, and the NPU 110-3, in which case the functions of the GPU 110-1 are performed by the CPU. In short, the image capture algorithm of the embodiments of this application can run on various types of processors, which is not limited in the embodiments of this application.
The flow shown in FIG. 3 takes capturing an image through the camera application of the mobile phone 100 as an example. In fact, the image capture method provided in the embodiments of this application is also applicable to other scenarios in which a camera is used to capture images, such as WeChat video calls or QQ video calls on the mobile phone 100.
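Taken together, steps 1 through 8 can be summarized in a short sketch. This is illustrative only: every name is a hypothetical stand-in for the ISP/NPU/GPU stages described above, the gain values are made up, and naive_raw_to_rgb, npu_region_hdr, and fuse_multi_hdr refer to the toy helpers sketched elsewhere in this text.

```python
import numpy as np

def shoot_image(raw, exposure, stored_exposure, curve_x, curve_f):
    """Sketch of the FIG. 3 data flow under the stated assumptions."""
    rgb = naive_raw_to_rgb(raw)                    # step 3 (ISP 110-1)
    full = np.ones(rgb.shape[:2], dtype=bool)      # a whole-area mask
    if exposure > stored_exposure:                 # exposure set high
        first_img = np.clip(rgb * 0.6, 0, 255)     # darken: keep dark detail
        hdr_imgs = npu_region_hdr(rgb, [full, full], [1.4, 1.8])  # step 5
    else:                                          # exposure set low
        first_img = np.clip(rgb * 1.5, 0, 255)     # brighten: keep bright detail
        hdr_imgs = npu_region_hdr(rgb, [full, full], [0.7, 0.5])  # step 5
    # Step 7 (GPU 110-3): fuse with the curve selected from the exposure value.
    return fuse_multi_hdr(first_img, hdr_imgs, curve_x, curve_f)
```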
The following describes the process in which the GPU 110-3 in the mobile phone 100 runs the image capture method provided in the embodiments of this application, determines the fusion curve corresponding to the exposure parameters, and fuses the first image and the HDR image based on the fusion curve to obtain the final image, that is, the process of step 7 shown in FIG. 3. Specifically, referring to FIG. 5, the GPU 110-3 runs the code of the image capture algorithm provided in the embodiments of this application and performs the following process:
S501: Determine, from multiple fusion curves according to the exposure parameters of the original image, a first fusion curve corresponding to the exposure parameters.
As an example, multiple fusion curves can be pre-stored in the mobile phone 100. These fusion curves can be obtained by designers through experiments before delivery of the mobile phone 100 and stored in the mobile phone 100 (for example, in the internal memory 121).
FIG. 6 is a schematic diagram of the fusion curves provided by an embodiment of this application. As shown in FIG. 6, the horizontal axis is the color information (such as the values of the R, G, and B color information) and the vertical axis is the fusion coefficient. Taking one of the fusion curves as an example, the fusion curve can reflect the correspondence between the fusion coefficient and the value of the R color information (or the G or B color information).
Assuming the value of the exposure parameters is value 1, the GPU 110-3 determines, according to value 1, the first fusion curve corresponding to value 1 among the multiple fusion curves shown in FIG. 6.
It should be noted that FIG. 6 takes three fusion curves as an example; in practical applications, the mobile phone 100 can contain more fusion curves.
As an example, when the GPU 110-3 determines that the exposure parameters are less than first exposure parameters (for example, the first exposure parameters take value 1), it selects the first fusion curve (such as the fusion curve for exposure parameter value 1 in FIG. 6); when the GPU 110-3 determines that the exposure parameters are greater than or equal to the first exposure parameters and less than second exposure parameters (for example, the second exposure parameters take value 2), it selects the second fusion curve (such as the fusion curve for exposure parameter value 2 in FIG. 6); and when the GPU 110-3 determines that the exposure parameters are greater than or equal to the second exposure parameters and less than third exposure parameters (for example, the third exposure parameters take value 3), it selects the third fusion curve (such as the fusion curve for exposure parameter value 3 in FIG. 6). In this way, the mobile phone 100 does not need to store many fusion curves, which saves memory.
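A minimal sketch of this threshold-based selection follows, assuming the thresholds take the example values 1, 2, and 3 and that the stored curves are available in a mapping; both assumptions are introduced here for illustration.

```python
def select_fusion_curve(exposure, curves):
    """Pick a fusion curve by exposure thresholds, as in the example.

    curves: {1: curve1, 2: curve2, 3: curve3}, a hypothetical stand-in
    for the three stored curves of FIG. 6.
    """
    if exposure < 1:
        return curves[1]   # first fusion curve
    elif exposure < 2:
        return curves[2]   # second fusion curve
    else:
        return curves[3]   # third fusion curve
```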
As another example, the exposure parameters and the fusion curves are in one-to-one correspondence, that is, each exposure parameter value can determine a different fusion curve. In this way, the mobile phone 100 stores more fusion curves, and the fusion curve determined according to the exposure parameters is more accurate.
S502: Determine the respective values of the R, G, and B color information of the first pixel in the HDR image.
As can be seen from the foregoing, the ISP 110-1 processes the original image to obtain an RGB image containing color information, and then adjusts the brightness of the RGB image to obtain the first image. Therefore, for step S502, refer to the prior art, which is not described in detail in the embodiments of this application.
S503: Determine, on the first fusion curve according to the value of the R color information, the first fusion coefficient corresponding to the R color information; determine, on the first fusion curve according to the value of the G color information, the second fusion coefficient corresponding to the G color information; and determine, on the first fusion curve according to the value of the B color information, the third fusion coefficient corresponding to the B color information.
For example, assume that the value of the R color information is 150 and that, in S501, the GPU 110-3 determined that the first fusion curve is the fusion curve for exposure parameter value 2 shown in FIG. 6. The GPU 110-3 then finds the value 150 of the R color information on the horizontal axis and determines the vertical coordinate corresponding to the value 150 on the first fusion curve; this vertical coordinate is the first fusion coefficient. A similar approach can be used for the G and B color information, which is not repeated here.
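With the curve stored as sample points, the lookup in S503 reduces to one-dimensional interpolation. The sample values below are hypothetical, chosen only to make the example concrete:

```python
import numpy as np

# Hypothetical samples of the first fusion curve
# (horizontal axis: color value 0-255; vertical axis: fusion coefficient).
curve_x = np.array([0, 64, 128, 192, 255], dtype=np.float32)
curve_f = np.array([0.9, 0.7, 0.5, 0.3, 0.1], dtype=np.float32)

# S503: locate the R value 150 on the horizontal axis and read off the
# corresponding vertical coordinate, the first fusion coefficient f1.
f1 = float(np.interp(150, curve_x, curve_f))  # about 0.43 for these samples
```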
It should be noted that the fusion curves shown in FIG. 6 can be used to determine the fusion coefficients for all of the R, G, and B color information, that is, the horizontal axis in FIG. 6 can represent the values of the three kinds of color information. Alternatively, the horizontal axis of the fusion curves in FIG. 6 may represent only the correspondence between the R color information and the fusion coefficient, and designers can design a similar set of fusion curves, like those in FIG. 6, for each of the G and B color information.
S504: Determine, on the first image, the second pixel at the same position as the first pixel on the HDR image.
Since both the first image and the HDR image are obtained based on the original image, and the coordinate points of the original image can be determined on the imaging plane, the GPU 110-3 can determine the coordinate values of each pixel on the first image and the HDR image. Still taking FIG. 4 as an example, which shows one frame of HDR image and one frame of first image, the GPU 110-3 can select a pixel B (that is, the first pixel) on the HDR image, and the GPU 110-3 can obtain the pixel A (that is, the second pixel) corresponding to pixel B on the first image according to a matching algorithm in the prior art (such as a similarity matching algorithm); alternatively, the GPU 110-3 can determine the pixels at the same position (with the same coordinate values) on the first image and the HDR image.
S505: Fuse the R color information of the first pixel and the second pixel according to the first fusion coefficient and the first fusion formula; fuse the G color information of the first pixel and the second pixel according to the second fusion coefficient and the second fusion formula; and fuse the B color information of the first pixel and the second pixel according to the third fusion coefficient and the third fusion formula, to obtain the final image.
As can be seen from the foregoing, each pixel of a frame of image contains the three kinds of color information R, G, and B, and the values of the R, G, and B color information differ from pixel to pixel. Therefore, in the embodiments of this application, when fusing the first image and the HDR image, the GPU 110-3 can fuse the color information of each pixel on the two frames of images separately.
Continuing with FIG. 4 as an example, after determining pixel A and pixel B, the GPU 110-3 can fuse the R, G, and B color information of pixel A with the R, G, and B color information of pixel B respectively. Specifically, the GPU 110-3 fuses the R color information of pixel A with the R color information of pixel B, fuses the G color information of pixel A with the G color information of pixel B, and fuses the B color information of pixel A with the B color information of pixel B.
Taking the R color information as an example, the GPU 110-3 can fuse the R color information of pixel A and the R color information of pixel B according to the first fusion formula and the first fusion coefficient (the first fusion coefficient determined in S503).
As an example, the mobile phone 100 can store the first fusion formula (the fusion formula used to calculate the R color information of a pixel on the final image); see formula (1):

R_dst = (1 - f1)·R_src + f1·R_res    formula (1)

where R_src is the value of the R color information of a pixel (such as pixel A) on the first image; R_res is the value of the R color information of a pixel (such as pixel B) on the HDR image; and f1 is the first fusion coefficient determined on the first fusion curve from the value of the R color information of the pixel (such as pixel B) on the HDR image (step S503). The value of the R color information of a pixel (such as pixel C) on the final image, that is, R_dst, is obtained by formula (1).
As an example, the mobile phone 100 can store the second fusion formula (the fusion formula used to calculate the G color information of a pixel on the final image); see formula (2):

G_dst = (1 - f2)·G_src + f2·G_res    formula (2)

where G_src is the value of the G color information of a pixel (such as pixel A) on the first image; G_res is the value of the G color information of a pixel (such as pixel B) on the HDR image; and f2 is the second fusion coefficient determined on the first fusion curve from the value of the G color information of the pixel (such as pixel B) on the HDR image (step S503). The value of the G color information of a pixel (such as pixel C) on the final image, that is, G_dst, is obtained by formula (2).
As an example, the mobile phone 100 can store the third fusion formula (the fusion formula used to calculate the B color information of a pixel on the fused image); see formula (3):

B_dst = (1 - f3)·B_src + f3·B_res    formula (3)

where B_src is the value of the B color information of a pixel (such as pixel A) on the first image; B_res is the value of the B color information of a pixel (such as pixel B) on the HDR image; and f3 is the third fusion coefficient determined on the first fusion curve from the value of the B color information of the pixel (such as pixel B) on the HDR image (step S503). The value of the B color information of a pixel (such as pixel C) on the final image, that is, B_dst, is obtained by formula (3).
The above formulas (1)-(3) can be used to determine the RGB color information of one pixel (that is, pixel C) on the fused image. A similar method can be applied to the other pixels to finally determine the RGB color information of each pixel, thereby obtaining the final image.
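As a quick worked instance of formula (1) with made-up numbers: if R_src = 40, R_res = 200, and the curve gives f1 = 0.3, then R_dst = (1 - 0.3)·40 + 0.3·200 = 28 + 60 = 88; the fused value lies between the dark first image and the bright HDR image.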
It should be noted that the execution order of S501-S505 in the embodiment shown in FIG. 5 is not limited in the embodiments of this application.
In the above embodiment, the GPU 110-3 fuses the first image and one HDR image to obtain the final image. Another embodiment is described below. In this embodiment, the NPU 110-2 can apply different brightness adjustments to different areas of the RGB image to obtain multi-frame HDR images; that is, in step 5 of the flow shown in FIG. 3, the NPU 110-2 adjusts the brightness of different areas of the RGB image differently to obtain multi-frame HDR images. The NPU 110-2 sends the obtained multi-frame HDR images to the GPU 110-3, and the GPU 110-3 fuses the multi-frame HDR images with the first image to obtain the final image (corresponding to step 7 shown in FIG. 3).
FIG. 7 is a schematic flowchart of an image capture process provided by an embodiment of this application. As shown in FIG. 7, the GPU 110-3 determines the fusion curve according to the exposure parameters, and fuses the first image and the multi-frame HDR images according to the fusion curve to obtain the final image. The NPU 110-2 applies the high dynamic range technology to different areas of the RGB image respectively to obtain the multi-frame HDR images. Taking two frames of HDR images as an example: when the ISP 110-1 adjusts the brightness of the RGB image to less than the first brightness, the NPU 110-2 can, based on the high dynamic range technology, raise the brightness of a first area (the whole area or a partial area) of the RGB image to a certain brightness value greater than the first brightness to obtain one frame of HDR image, and the NPU 110-2 can raise the brightness of a second area (the whole area or a partial area) of the RGB image to another brightness value greater than the first brightness to obtain another frame of HDR image. When the ISP 110-1 adjusts the brightness of the RGB image to greater than the second brightness, the NPU 110-2 can, based on the high dynamic range technology, reduce the brightness of the first area (the whole area or a partial area) of the RGB image to a certain brightness value less than the second brightness to obtain one frame of HDR image, and the NPU 110-2 can reduce the brightness of the second area (the whole area or a partial area) of the RGB image to another brightness value less than the second brightness to obtain another frame of HDR image. In short, the brightness of the multi-frame HDR images obtained by the NPU 110-2 can differ. In that case, the details of different areas on each frame of the multi-frame HDR images obtained by the NPU 110-2 differ. The GPU 110-3 can fuse the first image and the multi-frame HDR images to obtain the final image.
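A minimal sketch of this region-wise adjustment follows, assuming the regions are given as boolean masks and the brightness targets as simple gain factors; both are hypothetical stand-ins for the NPU 110-2's actual high dynamic range processing.

```python
import numpy as np

def npu_region_hdr(rgb, region_masks, gains):
    """Sketch: one HDR frame per region-specific brightness adjustment.

    region_masks: boolean H x W masks (a whole area or a partial area).
    gains: per-frame brightness factors (>1 raises, <1 lowers brightness).
    """
    frames = []
    for mask, gain in zip(region_masks, gains):
        frame = rgb.astype(np.float32).copy()
        frame[mask] = np.clip(frame[mask] * gain, 0.0, 255.0)
        frames.append(frame)  # each frame keeps detail in its own region
    return frames  # multi-frame HDR images with differing brightness
```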
It should be noted that the process in which the GPU 110-3 fuses the multi-frame HDR images with the first image is similar to the flow shown in FIG. 5. The differences are that in S504 the GPU 110-3 determines, on each frame of HDR image, the second pixel (such as pixels B1 and B2) corresponding to the first pixel (such as pixel A) on the first image, and that in S505 the GPU 110-3 uses different fusion formulas.
Taking the R color information as an example, the fusion formula used to calculate the R color information of a pixel on the final image differs from the foregoing formula (1); see formula (4):

R_dst = (1 - Σ_{i=1}^{n} f_i)·R_src + Σ_{i=1}^{n} f_i·R_res^i    formula (4)

where R_src is the value of the R color information of a pixel (such as pixel A) on the first image; R_res^i is the value of the R color information of the pixel on the i-th frame of HDR image (for example, R_res^1 is the value of the R color information of pixel B1 on the first frame of HDR image, R_res^2 is the value of the R color information of pixel B2 on the second frame of HDR image, and so on); n is the number of HDR images obtained by the NPU 110-2 (when n equals 1, that is, when the NPU 110-2 obtains one frame of HDR image, this is the embodiment shown in FIG. 4 above); and f_i is the fusion coefficient determined on the first fusion curve from the value of the R color information of the pixel on the i-th frame of HDR image, namely R_res^i (for example, f_1 is the fusion coefficient determined on the first fusion curve from R_res^1, the value of the R color information of pixel B1 on the first frame of HDR image; f_2 is the fusion coefficient determined on the first fusion curve from R_res^2, the value of the R color information of pixel B2 on the second frame of HDR image; and so on).
The value of the R color information of a pixel (such as pixel C) on the final image, that is, R_dst, is obtained by formula (4).
Taking the G color information as an example, the fusion formula used to calculate the G color information of a pixel on the final image differs from the foregoing formula (2); see formula (5):

G_dst = (1 - Σ_{i=1}^{n} f_i)·G_src + Σ_{i=1}^{n} f_i·G_res^i    formula (5)

where G_src is the value of the G color information of the pixel (such as pixel A) on the first image; in formula (5), G_res^i is the value of the G color information of the pixel on the i-th frame of HDR image (for example, G_res^1 is the value of the G color information of pixel B1 on the first frame of HDR image, G_res^2 is the value of the G color information of pixel B2 on the second frame of HDR image, and so on); n is the number of frames of HDR images obtained by the NPU 110-2; and f_i is the fusion coefficient determined on the first fusion curve from the value of the G color information of the pixel on the i-th frame of HDR image, namely G_res^i (for example, f_1 is the fusion coefficient determined on the first fusion curve from G_res^1, the value of the G color information of pixel B1 on the first frame of HDR image; f_2 is the fusion coefficient determined on the first fusion curve from G_res^2, the G color information of pixel B2 on the second frame of HDR image; and so on).
The value of the G color information of a pixel (such as pixel C) on the final image, that is, G_dst, is obtained by formula (5).
Taking the B color information as an example, the fusion formula used to calculate the B color information of a pixel on the final image differs from the foregoing formula (3); see formula (6):

B_dst = (1 - Σ_{i=1}^{n} f_i)·B_src + Σ_{i=1}^{n} f_i·B_res^i    formula (6)

where B_src is the value of the B color information of the pixel (such as pixel A) on the first image; B_res^i is the value of the B color information of the pixel on the i-th frame of HDR image (for example, B_res^1 is the value of the B color information of pixel B1 on the first frame of HDR image, B_res^2 is the value of the B color information of pixel B2 on the second frame of HDR image, and so on); n is the number of frames of HDR images obtained by the NPU 110-2; and f_i is the fusion coefficient determined on the first fusion curve from the value of the B color information of the pixel on the i-th frame of HDR image, namely B_res^i (for example, f_1 is the fusion coefficient determined on the first fusion curve from B_res^1, the value of the B color information of pixel B1 on the first frame of HDR image; f_2 is the fusion coefficient determined on the first fusion curve from B_res^2, the value of the B color information of pixel B2 on the second frame of HDR image; and so on).
The value of the B color information of a pixel (such as pixel C) on the final image, that is, B_dst, is obtained by formula (6).
The above formulas (4)-(6) can be used to determine the R, G, and B color information of one pixel (that is, pixel C) on the fused image. A similar method can be applied to the other pixels to finally determine the R, G, and B color information of each pixel.
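As a quick worked check of formula (4) with made-up numbers and n = 2: with R_src = 40, R_res^1 = 180, R_res^2 = 220, f_1 = 0.2, and f_2 = 0.1, formula (4) gives R_dst = (1 - 0.3)·40 + 0.2·180 + 0.1·220 = 28 + 36 + 22 = 86; with n = 1 the formula collapses back to formula (1).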
The various implementations of this application can be combined arbitrarily to achieve different technical effects.
In the above embodiments provided by this application, the method provided by the embodiments of this application is described from the perspective of the terminal device (the mobile phone 100) as the execution body. To implement the functions in the method provided by the above embodiments of this application, the terminal may include a hardware structure and/or a software module, and implement the above functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether one of the above functions is performed by a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application of the technical solution and the design constraints.
Based on the same concept, FIG. 8 shows a terminal device 800 provided by this application. As shown in FIG. 8, the terminal device 800 may include a camera 801, a display screen 802, one or more processors 803, a memory 804, and one or more computer programs 805; these components may be connected by one or more communication buses 806.
The camera 801 is used to capture the original image. The one or more computer programs 805 are stored in the memory 804 and configured to be executed by the one or more processors 803; the one or more computer programs 805 include instructions, and the instructions can be used to perform all or part of the steps shown in FIG. 3 or FIG. 5 and the steps in the corresponding embodiments.
Based on the same concept, FIG. 9 shows a terminal device 900 provided by this application. As shown in FIG. 9, the terminal device 900 may include an image acquisition unit 901, a processing unit 902, and a display unit 903.
The processing unit 902 is configured to open the camera application and start the camera in response to a user operation.
The display unit 903 is configured to display the viewfinder interface.
The processing unit 902 is further configured to: convert the original image captured by the camera into an RGB image; reduce the brightness of the RGB image to less than the first brightness or raise it to greater than the second brightness to obtain the first image, where the first brightness is greater than the second brightness; use HDR technology to convert the RGB image into N frames of HDR images, where each of the N frames of HDR images has a different brightness, and when the brightness of the RGB image is reduced to less than the first brightness, the brightness of each of the N frames of HDR images is greater than the first brightness, or when the brightness of the RGB image is raised to greater than the second brightness, the brightness of each of the N frames of HDR images is less than the second brightness, with N being a positive integer; and fuse the color information of pixels at any same position in the first image and the N frames of HDR images to obtain the final image.
The display unit 903 is further configured to display the final image in the viewfinder interface.
An embodiment of the present invention further provides a computer storage medium. The storage medium may include a memory, and the memory may store a program that, when executed, causes the terminal to perform all or part of the steps performed by the terminal as described in the foregoing method embodiments shown in FIG. 3 or FIG. 5.
An embodiment of the present invention further provides a computer program product that, when run on a terminal, causes the terminal to perform all or part of the steps performed by the terminal as described in the foregoing method embodiments shown in FIG. 3 or FIG. 5.
Through the description of the above implementations, those skilled in the art can clearly understand that the embodiments of this application can be implemented in hardware, in firmware, or in a combination thereof. When implemented in software, the above functions can be stored in a computer-readable medium or transmitted as one or more instructions or code on a computer-readable medium. Computer-readable media include computer storage media and communication media, where communication media include any medium that facilitates the transfer of a computer program from one place to another. A storage medium can be any available medium that a computer can access. By way of example and not limitation: a computer-readable medium can include a RAM, a ROM, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. In addition, any connection can appropriately be a computer-readable medium. For example, if software is transmitted from a website, server, or other remote source using a coaxial cable, optical fiber cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, optical fiber cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of the medium. As used in the embodiments of this application, disks and discs include compact discs (CDs), laser discs, optical discs, digital video discs (DVDs), floppy disks, and Blu-ray discs, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. The above combinations should also be included within the protection scope of computer-readable media.
In summary, the above are only embodiments of this application and are not intended to limit the protection scope of this application. Any modification, equivalent replacement, improvement, and the like made according to the disclosure of this application shall fall within the protection scope of this application.

Claims (14)

  1. An image capture method, wherein the method comprises:
    in response to a user operation, a terminal device opens a camera application, starts a camera, and displays a viewfinder interface;
    the terminal device converts an original image captured by the camera into an RGB image;
    the terminal device reduces the brightness of the RGB image to less than a first brightness or raises it to greater than a second brightness to obtain a first image, wherein the first brightness is greater than the second brightness;
    the terminal device uses HDR technology to convert the RGB image into N frames of HDR images, wherein each of the N frames of HDR images has a different brightness, and when the brightness of the RGB image is reduced to less than the first brightness, the brightness of each of the N frames of HDR images is greater than the first brightness, or when the brightness of the RGB image is raised to greater than the second brightness, the brightness of each of the N frames of HDR images is less than the second brightness; and N is a positive integer;
    the terminal device fuses the color information of pixels at any same position in the first image and the N frames of HDR images to obtain a final image; and
    the terminal device displays the final image in the viewfinder interface.
  2. The method according to claim 1, wherein the fusing, by the terminal device, of the color information of pixels at any same position in the first image and the N frames of HDR images to obtain the final image comprises:
    the terminal device determines, according to the exposure parameters used to capture the original image, a first fusion curve corresponding to the exposure parameters from multiple fusion curves, wherein the first fusion curve indicates the correspondence between color information and fusion coefficients;
    the terminal device takes the pixels at the same position on each frame of HDR image as first pixels and performs the following for each first pixel:
    the terminal device determines, on the first fusion curve according to the color information of the first pixel on each frame of HDR image, the fusion coefficient corresponding to the color information; and
    the terminal device fuses, according to the fusion coefficient, the color information of the first pixel on each frame of HDR image and of a second pixel on the first image to obtain the final image, wherein the second pixel is the pixel on the first image at the same position as the first pixel.
  3. The method according to claim 2, wherein the determining, by the terminal device on the first fusion curve according to the color information of the first pixel on each frame of HDR image, of the fusion coefficient corresponding to the color information comprises:
    the terminal device determines, on the first fusion curve according to the R color information of the first pixel on each frame of HDR image, a first fusion coefficient corresponding to the R color information, to obtain N first fusion coefficients;
    the terminal device determines, on the first fusion curve according to the G color information of the first pixel on each frame of HDR image, a second fusion coefficient corresponding to the G color information, to obtain N second fusion coefficients;
    the terminal device determines, on the first fusion curve according to the B color information of the first pixel on each frame of HDR image, a third fusion coefficient corresponding to the B color information, to obtain N third fusion coefficients; and
    the fusing, by the terminal device according to the fusion coefficients, of the color information of the first pixel on each frame of HDR image and of the second pixel on the first image comprises:
    the terminal device fuses the R color information of the first pixel on each frame of HDR image and of the second pixel according to the N first fusion coefficients;
    the terminal device fuses the G color information of the first pixel on each frame of HDR image and of the second pixel according to the N second fusion coefficients; and
    the terminal device fuses the B color information of the first pixel on each frame of HDR image and of the second pixel according to the N third fusion coefficients.
  4. The method according to claim 3, wherein the fusing, by the terminal device according to the N first fusion coefficients, of the R color information of the first pixel on each frame of HDR image and of the second pixel meets the following formula:

    R_dst = (1 - Σ_{i=1}^{N} f_i)·R_src + Σ_{i=1}^{N} f_i·R_res^i

    wherein R_src is the value of the R color information of the second pixel on the first image; R_res^i is the value of the R color information of the first pixel on the i-th frame of HDR image among the N frames of HDR images; N is the number of frames of HDR images; f_i is, among the N first fusion coefficients, the first fusion coefficient determined on the first fusion curve from the value of the R color information of the first pixel on the i-th frame of HDR image; and R_dst is the value of the R color information of a pixel on the final image.
  5. The method according to claim 3 or 4, wherein the fusing, by the terminal device according to the N second fusion coefficients, of the G color information of the first pixel on each frame of HDR image and of the second pixel meets the following formula:

    G_dst = (1 - Σ_{i=1}^{N} f_i)·G_src + Σ_{i=1}^{N} f_i·G_res^i

    wherein G_src is the value of the G color information of the second pixel on the first image; G_res^i is the value of the G color information of the first pixel on the i-th frame of HDR image among the N frames of HDR images; N is the number of frames of HDR images; f_i is, among the N second fusion coefficients, the second fusion coefficient determined on the first fusion curve from the value of the G color information of the first pixel on the i-th frame of HDR image; and G_dst is the value of the G color information of a pixel on the final image.
  6. The method according to any one of claims 3 to 5, wherein the fusing, by the terminal device according to the N third fusion coefficients, of the B color information of the first pixel on each frame of HDR image and of the second pixel meets the following formula:

    B_dst = (1 - Σ_{i=1}^{N} f_i)·B_src + Σ_{i=1}^{N} f_i·B_res^i

    wherein B_src is the value of the B color information of the second pixel on the first image; B_res^i is the value of the B color information of the first pixel on the i-th frame of HDR image among the N frames of HDR images; N is the number of frames of HDR images; f_i is, among the N third fusion coefficients, the third fusion coefficient determined on the first fusion curve from the value of the B color information of the first pixel on the i-th frame of HDR image; and B_dst is the value of the B color information of a pixel on the final image.
  7. A terminal device, comprising a camera, a display screen, a memory, and a processor, wherein:
    the processor is configured to open a camera application and start the camera in response to a user operation;
    the display screen is configured to display the viewfinder interface of the camera application;
    the camera is configured to capture an original image;
    the memory is configured to store one or more computer programs; and
    the processor invokes the one or more computer programs stored in the memory to perform:
    converting the original image captured by the camera into an RGB image;
    reducing the brightness of the RGB image to less than a first brightness or raising it to greater than a second brightness to obtain a first image, wherein the first brightness is greater than the second brightness;
    using HDR technology to convert the RGB image into N frames of HDR images, wherein each of the N frames of HDR images has a different brightness, and when the brightness of the RGB image is reduced to less than the first brightness, the brightness of each of the N frames of HDR images is greater than the first brightness, or when the brightness of the RGB image is raised to greater than the second brightness, the brightness of each of the N frames of HDR images is less than the second brightness; and N is a positive integer;
    fusing the color information of pixels at any same position in the first image and the N frames of HDR images to obtain a final image; and
    displaying the final image in the viewfinder interface displayed on the display screen.
  8. The terminal device according to claim 7, wherein when fusing the color information of pixels at any same position in the first image and the N frames of HDR images to obtain the final image, the processor specifically performs:
    determining, according to the exposure parameters used to capture the original image, a first fusion curve corresponding to the exposure parameters from multiple fusion curves, wherein the first fusion curve indicates the correspondence between color information and fusion coefficients;
    taking the pixels at the same position on each frame of HDR image as first pixels and performing the following for each first pixel:
    determining, on the first fusion curve according to the color information of the first pixel on each frame of HDR image, the fusion coefficient corresponding to the color information; and
    fusing, according to the fusion coefficient, the color information of the first pixel on each frame of HDR image and of a second pixel on the first image to obtain the final image, wherein the second pixel is the pixel on the first image at the same position as the first pixel.
  9. The terminal device according to claim 8, wherein when determining, on the first fusion curve according to the color information of the first pixel on each frame of HDR image, the fusion coefficient corresponding to the color information, the processor specifically performs:
    determining, on the first fusion curve according to the R color information of the first pixel on each frame of HDR image, a first fusion coefficient corresponding to the R color information, to obtain N first fusion coefficients;
    determining, on the first fusion curve according to the G color information of the first pixel on each frame of HDR image, a second fusion coefficient corresponding to the G color information, to obtain N second fusion coefficients; and
    determining, on the first fusion curve according to the B color information of the first pixel on each frame of HDR image, a third fusion coefficient corresponding to the B color information, to obtain N third fusion coefficients; and
    wherein when fusing, according to the fusion coefficients, the color information of the first pixel on each frame of HDR image and of the second pixel on the first image, the processor specifically performs:
    fusing the R color information of the first pixel on each frame of HDR image and of the second pixel according to the N first fusion coefficients;
    fusing the G color information of the first pixel on each frame of HDR image and of the second pixel according to the N second fusion coefficients; and
    fusing the B color information of the first pixel on each frame of HDR image and of the second pixel according to the N third fusion coefficients.
  10. The terminal device according to claim 9, wherein the fusing, by the processor according to the N first fusion coefficients, of the R color information of the first pixel on each frame of HDR image and of the second pixel meets the following formula:

    R_dst = (1 - Σ_{i=1}^{N} f_i)·R_src + Σ_{i=1}^{N} f_i·R_res^i

    wherein R_src is the value of the R color information of the second pixel on the first image; R_res^i is the value of the R color information of the first pixel on the i-th frame of HDR image among the N frames of HDR images; N is the number of frames of HDR images; f_i is, among the N first fusion coefficients, the first fusion coefficient determined on the first fusion curve from the value of the R color information of the first pixel on the i-th frame of HDR image; and R_dst is the value of the R color information of a pixel on the final image.
  11. The terminal device according to claim 9 or 10, wherein the fusing, by the processor according to the N second fusion coefficients, of the G color information of the first pixel on each frame of HDR image and of the second pixel meets the following formula:

    G_dst = (1 - Σ_{i=1}^{N} f_i)·G_src + Σ_{i=1}^{N} f_i·G_res^i

    wherein G_src is the value of the G color information of the second pixel on the first image; G_res^i is the value of the G color information of the first pixel on the i-th frame of HDR image among the N frames of HDR images; N is the number of frames of HDR images; f_i is, among the N second fusion coefficients, the second fusion coefficient determined on the first fusion curve from the value of the G color information of the first pixel on the i-th frame of HDR image; and G_dst is the value of the G color information of a pixel on the final image.
  12. The terminal device according to any one of claims 9 to 11, wherein the fusing, by the processor according to the N third fusion coefficients, of the B color information of the first pixel on each frame of HDR image and of the second pixel meets the following formula:

    B_dst = (1 - Σ_{i=1}^{N} f_i)·B_src + Σ_{i=1}^{N} f_i·B_res^i

    wherein B_src is the value of the B color information of the second pixel on the first image; B_res^i is the value of the B color information of the first pixel on the i-th frame of HDR image among the N frames of HDR images; N is the number of frames of HDR images; f_i is, among the N third fusion coefficients, the third fusion coefficient determined on the first fusion curve from the value of the B color information of the first pixel on the i-th frame of HDR image; and B_dst is the value of the B color information of a pixel on the final image.
  13. A computer storage medium, wherein the computer-readable storage medium comprises a computer program that, when run on a terminal, causes the terminal to perform the method according to any one of claims 1 to 6.
  14. A computer program product containing instructions that, when run on a computer, cause the computer to perform the method according to any one of claims 1 to 6.
PCT/CN2019/110377 2018-10-11 2019-10-10 Image capture method and terminal device WO2020073957A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201980066192.7A 2018-10-11 2019-10-10 Image capture method and terminal device
RU2021112627A 2018-10-11 2019-10-10 Image capture method and terminal device
EP19871713.4A EP3840369B1 (en) 2018-10-11 2019-10-10 Image capturing method and terminal device
US17/284,117 US11595588B2 (en) 2018-10-11 2019-10-10 Image capturing method and terminal device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811185480.8 2018-10-11
CN201811185480.8A 2018-10-11 Image capture method and terminal device

Publications (1)

Publication Number Publication Date
WO2020073957A1 true WO2020073957A1 (zh) 2020-04-16

Family

ID=70163944

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/110377 WO (zh) Image capture method and terminal device 2018-10-11 2019-10-10

Country Status (5)

Country Link
US (1) US11595588B2 (zh)
EP (1) EP3840369B1 (zh)
CN (2) CN111050143B (zh)
RU (1) RU2758595C1 (zh)
WO (1) WO2020073957A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111050143B (zh) * 2018-10-11 2021-09-21 Huawei Technologies Co., Ltd. Image capture method and terminal device
CN111885312B (zh) * 2020-07-27 2021-07-09 Spreadtrum Communications (Shanghai) Co., Ltd. Imaging method and system for HDR images, electronic device, and storage medium
US11689822B2 (en) 2020-09-04 2023-06-27 Altek Semiconductor Corp. Dual sensor imaging system and privacy protection imaging method thereof
CN115767290B (zh) * 2022-09-28 2023-09-29 Honor Device Co., Ltd. Image processing method and electronic device
CN116452437B (zh) * 2023-03-20 2023-11-14 Honor Device Co., Ltd. High dynamic range image processing method and electronic device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104320575A (zh) * 2014-09-30 2015-01-28 Baidu Online Network Technology (Beijing) Co., Ltd. Image processing method and image processing apparatus for a portable terminal
US20150130967A1 (en) * 2013-11-13 2015-05-14 Nvidia Corporation Adaptive dynamic range imaging
CN104869297A (zh) * 2015-06-15 2015-08-26 Lenovo (Beijing) Co., Ltd. Image processing method and electronic device
CN105872393A (zh) * 2015-12-08 2016-08-17 Leshi Mobile Intelligent Information Technology (Beijing) Co., Ltd. Method and apparatus for generating a high dynamic range image
CN106920221A (zh) * 2017-03-10 2017-07-04 Chongqing University of Posts and Telecommunications Exposure fusion method balancing brightness distribution and detail presentation
CN108307109A (zh) * 2018-01-16 2018-07-20 Vivo Mobile Communication Co., Ltd. High dynamic range image preview method and terminal device

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6204881B1 (en) 1993-10-10 2001-03-20 Canon Kabushiki Kaisha Image data processing apparatus which can combine a plurality of images at different exposures into an image with a wider dynamic range
US6753876B2 (en) 2001-12-21 2004-06-22 General Electric Company Method for high dynamic range image construction based on multiple images with multiple illumination intensities
JP4424403B2 (ja) * 2007-09-28 2010-03-03 Sony Corporation Imaging apparatus, imaging method, and imaging program
CN101394487B (zh) 2008-10-27 2011-09-14 Huawei Technologies Co., Ltd. Method and system for synthesizing images
US8406569B2 (en) * 2009-01-19 2013-03-26 Sharp Laboratories Of America, Inc. Methods and systems for enhanced dynamic range images and video from multiple exposures
US8525900B2 (en) 2009-04-23 2013-09-03 Csr Technology Inc. Multiple exposure high dynamic range image capture
US8570396B2 (en) * 2009-04-23 2013-10-29 Csr Technology Inc. Multiple exposure high dynamic range image capture
KR101633893B1 (ko) * 2010-01-15 2016-06-28 Samsung Electronics Co., Ltd. Image synthesis apparatus and method for synthesizing multi-exposure images
US9204113B1 (en) 2010-06-28 2015-12-01 Ambarella, Inc. Method and/or apparatus for implementing high dynamic range image processing in a video processing system
CN102457669B (zh) * 2010-10-15 2014-04-16 Altek Corporation Image processing method
JP5802520B2 (ja) * 2011-11-11 2015-10-28 Hitachi Industry & Control Solutions, Ltd. Imaging apparatus
CN103247036B (zh) * 2012-02-10 2016-05-18 Ricoh Company, Ltd. Multi-exposure image fusion method and apparatus
US9167174B1 (en) 2014-11-05 2015-10-20 Duelight Llc Systems and methods for high-dynamic range images
CN102970549B (zh) * 2012-09-20 2015-03-18 Huawei Technologies Co., Ltd. Image processing method and apparatus
US10255888B2 (en) * 2012-12-05 2019-04-09 Texas Instruments Incorporated Merging multiple exposures to generate a high dynamic range image
US9363446B2 (en) * 2013-04-15 2016-06-07 Htc Corporation Automatic exposure control for sequential images
JP6278729B2 (ja) * 2014-02-19 2018-02-14 Canon Inc. Imaging apparatus, control method therefor, and program
JP2015207861A (ja) * 2014-04-18 2015-11-19 Konica Minolta, Inc. Imaging apparatus and imaging method
US9344638B2 (en) * 2014-05-30 2016-05-17 Apple Inc. Constant bracket high dynamic range (cHDR) operations
WO2016020189A1 (en) 2014-08-08 2016-02-11 Koninklijke Philips N.V. Methods and apparatuses for encoding hdr images
JP2017028490A (ja) * 2015-07-22 2017-02-02 Renesas Electronics Corporation Imaging sensor and sensor module
CN105578068B (zh) * 2015-12-21 2018-09-04 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Method and apparatus for generating a high dynamic range image, and mobile terminal
CN105704349B (zh) * 2016-04-01 2019-03-15 Chengdu Zhenxin Technology Co., Ltd. Single-frame wide dynamic range enhancement method based on separate adjustment of bright and dark areas
CN105933617B (zh) * 2016-05-19 2018-08-21 Equipment Academy of the Chinese People's Liberation Army High dynamic range image fusion method for overcoming the influence of dynamic problems
CN108335279B (zh) * 2017-01-20 2022-05-17 Microsoft Technology Licensing, LLC Image fusion and HDR imaging
CN106791475B (zh) * 2017-01-23 2019-08-27 Shanghai Xingxin Microelectronics Technology Co., Ltd. Exposure adjustment method and applicable vehicle-mounted camera apparatus
CN107205120B (zh) * 2017-06-30 2019-04-09 Vivo Mobile Communication Co., Ltd. Image processing method and mobile terminal
CN107451979B (zh) * 2017-08-08 2022-11-01 Tencent Technology (Shenzhen) Co., Ltd. Image processing method, apparatus, and storage medium
CN107800971B (zh) * 2017-10-27 2019-08-20 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Automatic exposure control processing method, apparatus, and device for panoramic shooting
CN108391059A (zh) 2018-03-23 2018-08-10 Huawei Technologies Co., Ltd. Image processing method and apparatus
CN111050143B (zh) 2018-10-11 2021-09-21 Huawei Technologies Co., Ltd. Image capture method and terminal device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150130967A1 (en) * 2013-11-13 2015-05-14 Nvidia Corporation Adaptive dynamic range imaging
CN104320575A (zh) * 2014-09-30 2015-01-28 Baidu Online Network Technology (Beijing) Co., Ltd. Image processing method and image processing apparatus for a portable terminal
CN104869297A (zh) * 2015-06-15 2015-08-26 Lenovo (Beijing) Co., Ltd. Image processing method and electronic device
CN105872393A (zh) * 2015-12-08 2016-08-17 Leshi Mobile Intelligent Information Technology (Beijing) Co., Ltd. Method and apparatus for generating a high dynamic range image
CN106920221A (zh) * 2017-03-10 2017-07-04 Chongqing University of Posts and Telecommunications Exposure fusion method balancing brightness distribution and detail presentation
CN108307109A (zh) * 2018-01-16 2018-07-20 Vivo Mobile Communication Co., Ltd. High dynamic range image preview method and terminal device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3840369A4

Also Published As

Publication number Publication date
US20210385368A1 (en) 2021-12-09
EP3840369B1 (en) 2024-02-14
RU2758595C1 (ru) 2021-11-01
CN111050143B (zh) 2021-09-21
EP3840369A1 (en) 2021-06-23
EP3840369A4 (en) 2021-10-06
CN112840642A (zh) 2021-05-25
US11595588B2 (en) 2023-02-28
CN112840642B (zh) 2022-06-10
CN111050143A (zh) 2020-04-21

Similar Documents

Publication Publication Date Title
WO2020073957A1 (zh) Image capture method and terminal device
WO2021052232A1 (zh) Shooting method and device for time-lapse photography
CN109729279B (zh) Image capture method and terminal device
CN114092364B (zh) Image processing method and related device
CN113810601B (zh) Image processing method and apparatus for a terminal, and terminal device
CN113810600B (zh) Image processing method and apparatus for a terminal, and terminal device
WO2021057277A1 (zh) Method for taking photos in dark light and electronic device
WO2021043045A1 (zh) Method and device for configuring network configuration information
CN113452898B (zh) Photographing method and apparatus
WO2020015144A1 (zh) Photographing method and electronic device
CN113660408B (zh) Anti-shake method and apparatus for video shooting
US20240137659A1 (en) Point light source image detection method and electronic device
CN115604572B (zh) Image acquisition method, electronic device, and computer-readable storage medium
US20240119566A1 (en) Image processing method and apparatus, and electronic device
CN114422682A (zh) Shooting method, electronic device, and readable storage medium
CN105407295A (zh) Mobile terminal photographing apparatus and method
CN115767290A (zh) Image processing method and electronic device
CN115706869A (zh) Image processing method and apparatus for a terminal, and terminal device
WO2020077544A1 (zh) Object recognition method and terminal device
CN115631250A (zh) Image processing method and electronic device
CN115705663B (zh) Image processing method and electronic device
CN117119314B (zh) Image processing method and related electronic device
CN117714890B (zh) Exposure compensation method, electronic device, and storage medium
WO2024078275A1 (zh) Image processing method and apparatus, electronic device, and storage medium
CN115696067B (zh) Image processing method for a terminal, terminal device, and computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19871713

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 19871713.4

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2019871713

Country of ref document: EP

Effective date: 20210318

NENP Non-entry into the national phase

Ref country code: DE