CN109993722B - Image processing method, image processing device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN109993722B
CN109993722B (application CN201910280476.8A)
Authority
CN
China
Prior art keywords
image
raw
images
electronic device
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910280476.8A
Other languages
Chinese (zh)
Other versions
CN109993722A (en)
Inventor
黄杰文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910280476.8A
Publication of CN109993722A
Application granted
Publication of CN109993722B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 - Image enhancement or restoration
    • G06T 5/70 - Denoising; Smoothing
    • G06T 5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/90 - Dynamic range modification of images or parts thereof
    • G06T 5/92 - Dynamic range modification of images or parts thereof based on global image properties
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20172 - Image enhancement details
    • G06T 2207/20208 - High dynamic range [HDR] image processing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 - Camera processing pipelines; Components thereof
    • H04N 5/00 - Details of television systems
    • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/268 - Signal distribution or switching
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT]
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses an image processing method, comprising the following steps: acquiring a RAW image packet, where the RAW image packet includes at least two frames of RAW images acquired in a target scene with different exposure times; unpacking the RAW image packet to obtain the at least two frames of RAW images; synthesizing the at least two frames of RAW images to obtain a RAW composite image with a high dynamic range; acquiring an operation parameter value of the electronic device, where the operation parameter value represents the computing capability of the electronic device; determining, according to the operation parameter value, whether to store the RAW composite image into a preset image cache queue in the target scene, or to convert the RAW composite image into a YUV composite image before storing it into the preset image cache queue; when a photographing instruction is received, obtaining multiple frames of historical cached images of the target scene from the preset image cache queue and performing multi-frame noise reduction on them to obtain a noise-reduced image; and responding to the photographing instruction according to the noise-reduced image.

Description

Image processing method, image processing device, storage medium and electronic equipment
Technical Field
The present application belongs to the field of image technologies, and in particular, to an image processing method, an image processing apparatus, a storage medium, and an electronic device.
Background
Due to hardware limitations, current electronic devices can only capture scenes with a relatively small brightness range; if the bright and dark areas of a scene differ too greatly in luminance, the captured image tends to lose detail in the bright and/or dark regions. That is, the imaging effect of images processed by such electronic devices is poor.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, a storage medium and an electronic device, which can improve the imaging effect of an image.
An embodiment of the present application provides an image processing method, including:
acquiring a RAW image packet, wherein the RAW image packet comprises at least two frames of RAW images, and the at least two frames of RAW images are images acquired in a target scene and have different exposure times;
unpacking the RAW image packet to obtain the at least two frames of RAW images;
synthesizing the at least two frames of RAW images to obtain a RAW synthesized image with a high dynamic range;
acquiring an operation parameter value of an electronic device, wherein the operation parameter value is used for representing the computing capacity of the electronic device;
determining, according to the operation parameter value, whether to store the RAW composite image into a preset image cache queue in the target scene, or to convert the RAW composite image into a YUV composite image and then store the YUV composite image into the preset image cache queue;
when a photographing instruction is received, acquiring a multi-frame historical cache image of a target scene from the preset image cache queue, and performing multi-frame noise reduction processing on the multi-frame historical cache image to obtain a noise reduction image;
and responding to the photographing instruction according to the noise reduction image.
An embodiment of the present application provides an image processing apparatus, including:
a first acquisition module, configured to acquire a RAW image packet, wherein the RAW image packet includes at least two frames of RAW images, the at least two frames of RAW images being images acquired in a target scene with different exposure times;
the unpacking module is used for unpacking the RAW image packet to obtain the at least two frames of RAW images;
the synthesis module is used for synthesizing the at least two frames of RAW images to obtain a RAW synthesized image with a high dynamic range;
the second acquisition module is used for acquiring an operation parameter value of the electronic equipment, wherein the operation parameter value is used for representing the computing capacity of the electronic equipment;
the buffer module is used for determining to store the RAW synthetic image into a preset image buffer queue under the target scene according to the operation parameter value, or converting the RAW synthetic image into a YUV synthetic image and then storing the YUV synthetic image into a preset image buffer queue;
the processing module is used for acquiring a multi-frame historical cache image of a target scene from the preset image cache queue when a photographing instruction is received, and performing multi-frame noise reduction processing on the multi-frame historical cache image to obtain a noise reduction image;
and the response module is used for responding to the photographing instruction according to the noise reduction image.
An embodiment of the present application provides a storage medium, on which a computer program is stored, which, when executed on a computer, causes the computer to execute an image processing method provided by an embodiment of the present application.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor is configured to execute the image processing method provided in the embodiment of the present application by calling a computer program stored in the memory.
In this embodiment, the electronic device may first synthesize RAW images with different exposure times to obtain a RAW synthesized image with a high dynamic range, and store the RAW synthesized image or a YUV synthesized image obtained by converting the RAW synthesized image into a preset image buffer queue. When photographing, the electronic device may obtain multiple frames of images with a high dynamic range from the preset image cache queue to perform multiple frame noise reduction, so as to obtain a noise-reduced image, and respond to a photographing instruction according to the noise-reduced image. Therefore, the photographed image obtained by the present embodiment has a high dynamic range effect and is less noisy, that is, the present embodiment can improve the imaging effect of the image.
Drawings
The technical solutions and advantages of the present application will be apparent from the following detailed description of specific embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application.
Fig. 2 is a schematic diagram of a RAW image packet according to an embodiment of the present application.
Fig. 3 is another schematic flowchart of an image processing method according to an embodiment of the present application.
Fig. 4 to fig. 6 are schematic scene diagrams of an image processing method according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 9 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 10 is a schematic structural diagram of an image processing circuit according to an embodiment of the present application.
Detailed Description
Referring to the drawings, wherein like reference numbers refer to like elements, the principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
It can be understood that the execution subject of the embodiment of the present application may be an electronic device such as a smartphone or a tablet computer.
Referring to fig. 1, fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present disclosure, where the flowchart may include:
in 101, a RAW image packet is acquired, the RAW image packet including at least two frames of RAW images, the at least two frames of RAW images being images acquired in a target scene and having different exposure times.
The image processing method provided by this embodiment can be applied to an electronic device with a camera module. The camera module consists of a lens and an image sensor: the lens collects external light and delivers it to the image sensor, which senses the light signal from the lens and converts it into digitized RAW image data. RAW is an unprocessed and uncompressed format, sometimes called a "digital negative". The image sensor of the camera module of the electronic device can have a first operating mode and a second operating mode.
In the first operating mode, the image sensor generates at least two RAW images with different exposure times within one frame time and outputs them in the form of a RAW image packet. For example, if the frame rate of the image sensor is 30 fps, the image sensor generates two RAW images with different exposure times within each one-thirtieth of a second and outputs them as a RAW image packet. Referring to fig. 2, the RAW image packet output by the image sensor operating in the first operating mode may include data of two RAW images, where the exposure time of one RAW image is twice that of the other. Of course, the ratio of the two exposure times may also take other values, such as 3:1, 4:1, or 3:2, which is not limited in this embodiment. For example, the image sensor generates two RAW images with different exposure times, a first RAW image L1 and a second RAW image S1, in the first one-thirtieth of a second. That is, after the image sensor performs a long exposure to obtain the L1 frame, it does not output the L1 frame immediately; instead, the L1 frame is stored in a buffer of the image sensor, the charge accumulated on the sensor is cleared, and a short exposure is then performed to obtain the S1 frame. After reading out the S1 frame, the electronic device may pack the data of the L1 and S1 frames into a RAW image packet and output it. Likewise, the image sensor generates a first RAW image L2 and a second RAW image S2 in the second one-thirtieth of a second, L3 and S3 in the third, L4 and S4 in the fourth, and so on.
In this embodiment, the images with a long exposure time in the RAW image packet are collectively referred to as a first RAW image, and the images with a short exposure time are collectively referred to as a second RAW image. In the present embodiment, the exposure time of all the first RAW images may be the same, for example, T1. The exposure time of all the second RAW images may be the same, e.g., T2.
It is understood that, since the shooting time interval is short, it can be considered that the RAW images in the RAW image packet are images shot in the same scene. For example, if the RAW image packet includes data of a first RAW image and a second RAW image, the first RAW image and the second RAW image may be considered as images captured in the same scene (e.g., a target scene).
The second operating mode is the normal operating mode. In this mode, the image sensor generates a single RAW image within one frame time rather than a RAW image packet. For example, in the normal operating mode with a frame rate of 60 fps, the image sensor generates and outputs one frame of RAW image every one-sixtieth of a second.
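The two operating modes described above can be sketched in code. This is an illustrative simulation only: the `RawFrame` type, function names, and the exposure model (pixel value proportional to exposure time, clipped at 255) are assumptions, not part of the patent.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class RawFrame:
    exposure_ms: float
    data: List[int]  # flattened pixel values


def capture_packet(scene: List[int], long_ms: float, ratio: float = 2.0) -> List[RawFrame]:
    """First operating mode: one long and one short exposure per frame period,
    packed together and output as a single RAW image packet (a 2:1 ratio by default)."""
    short_ms = long_ms / ratio
    long_frame = RawFrame(long_ms, [min(255, int(p * long_ms)) for p in scene])
    short_frame = RawFrame(short_ms, [min(255, int(p * short_ms)) for p in scene])
    return [long_frame, short_frame]  # the "RAW image packet"


def capture_single(scene: List[int], exposure_ms: float) -> RawFrame:
    """Second (normal) operating mode: one RAW frame per frame period."""
    return RawFrame(exposure_ms, [min(255, int(p * exposure_ms)) for p in scene])


packet = capture_packet([3, 10, 40], long_ms=20.0)
```

In the first mode the long frame saturates in bright areas while the short frame retains highlight detail, which is what the later HDR synthesis exploits.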
In this embodiment, the electronic device may first acquire a RAW image packet through the image sensor operating in the first operating mode. The optical signal from the current scene is converged on the image sensor after passing through the lens of the camera, the image sensor performs alternate exposure with different exposure time, and the RAW image packet including at least two frames of RAW images is continuously output. It is understood that at least two frames of RAW images included in the RAW image packet are acquired under the same scene and have different exposure times.
For example, when the user operates the electronic device to start the camera application, the electronic device enables the image sensor and operates the image sensor in the first operating mode. If a user operates a camera of the electronic device to align to a certain scene, the electronic device will continuously acquire a RAW image packet in the scene through an image sensor working in a first working mode, where the RAW image packet includes at least two frames of RAW images with different exposure times.
At 102, the RAW image packet is unpacked to obtain at least two frames of RAW images.
In 103, the at least two frames of RAW images are subjected to a synthesis process to obtain a RAW synthesized image with a high dynamic range.
For example, 102 and 103 may include:
after obtaining the RAW image packet, the electronic device may perform unpacking processing on the RAW image packet, so as to obtain at least two frames of RAW images.
After unpacking the RAW image packet, the electronic device may perform HDR synthesis processing on at least two frames of RAW images obtained by unpacking, so as to obtain a RAW synthesized image with a high dynamic range.
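The unpack-and-synthesize steps (102 and 103) might be sketched as follows. The patent does not disclose a specific HDR merge algorithm, so this uses one common scheme (fall back to the exposure-scaled short frame wherever the long frame is saturated); the function names and the saturation threshold are assumptions.

```python
def unpack(packet):
    """Unpack a RAW image packet into its long- and short-exposure frames."""
    long_frame, short_frame = packet
    return long_frame, short_frame


def hdr_merge(long_px, short_px, long_exp, short_exp, sat=250):
    """Per-pixel HDR merge: use the long exposure where it is not saturated,
    otherwise use the short exposure scaled up by the exposure ratio."""
    gain = long_exp / short_exp
    out = []
    for l, s in zip(long_px, short_px):
        out.append(float(l) if l < sat else s * gain)
    return out
```

Note the merged values may exceed 255: that extended range is exactly the "high dynamic range" of the RAW composite image, which is later tone-mapped or converted for display.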
At 104, an operating parameter value of the electronic device is obtained, the operating parameter value being indicative of a computing power of the electronic device.
For example, after obtaining a RAW composite image with a high dynamic range, the electronic device may obtain an operation parameter value thereof, where the operation parameter value may be a numerical value of a parameter representing a computing capability (processing capability) of the electronic device.
In 105, according to the operation parameter value, it is determined whether to store the RAW composite image into a preset image cache queue in the target scene, or to convert the RAW composite image into a YUV composite image before storing it into the preset image cache queue.
For example, after obtaining the operation parameter value, the electronic device may determine, according to the operation parameter value, that the RAW composite image is stored in a preset image buffer queue in the target scene, or that the RAW composite image is converted into a YUV composite image and then stored in the preset image buffer queue.
For example, according to the operation parameter value, the electronic device determines that the RAW composite image is stored in the preset image buffer queue in the target scene, and then the electronic device may store the RAW composite image in the preset image buffer queue.
Or, according to the operation parameter value, the electronic equipment determines that the RAW synthetic image is converted into the YUV synthetic image in the target scene and then stores the YUV synthetic image into a preset image cache queue. Then, the electronic device may convert the RAW composite image into YUV format to obtain a YUV composite image, and then store the YUV composite image in a preset image buffer queue.
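The patent does not specify which RAW-to-YUV conversion the electronic device uses. A common choice is the BT.601 transform, sketched here for a single demosaiced full-range RGB pixel; the coefficients are a standard assumption, not taken from the patent.

```python
def rgb_to_yuv(r, g, b):
    """Full-range BT.601 RGB -> YUV conversion for one pixel
    (one common matrix; the actual ISP matrix may differ)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma
    u = 0.492 * (b - y)                    # blue-difference chroma
    v = 0.877 * (r - y)                    # red-difference chroma
    return y, u, v
```

For a grey pixel (r = g = b) the chroma terms vanish, which is a quick sanity check on the coefficients.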
It should be noted that the result determined by the electronic device in 105 (i.e., whether the RAW composite image is stored in the preset image cache queue directly, or converted into a YUV composite image first) remains valid within the target scene, that is, as long as the scene does not change.
For example, after the user opens the camera application and points the camera at scene A, the electronic device quickly and continuously acquires RAW image packets. If the electronic device determines, according to the operation parameter value, that RAW composite images are to be stored in the preset image cache queue, then for every RAW image packet acquired while the camera remains aimed at scene A, the resulting RAW composite image is stored in the preset image cache queue. For example, while the camera is aimed at scene A, after the electronic device acquires the first RAW image packet it determines from its operating parameters that RAW composite images should be cached directly, and the RAW composite images of all subsequently acquired packets of scene A are then stored in the preset image cache queue. If the electronic device acquires 10 RAW image packets in scene A, the RAW composite images of all 10 packets are stored in the preset image cache queue.
Then, for example, the user moves the camera to aim it at scene B, and the electronic device again determines, according to the operating parameters, whether in scene B to store RAW composite images in the preset image cache queue or to convert them into YUV composite images first. Suppose it determines that RAW composite images should be converted into YUV composite images before being stored. Then, after the first RAW image packet of scene B is acquired and this determination is made, the RAW composite image of every subsequently acquired packet of scene B is converted into a YUV composite image and stored in the preset image cache queue. If the electronic device acquires 15 RAW image packets in scene B, the 15 corresponding composite images are converted to YUV and stored in the preset image cache queue.
That is, each time the scene changes, the electronic device needs to determine anew, according to the operating parameters, whether to store the RAW composite image in the preset image cache queue or to convert it into a YUV composite image first. For example, if after scene B the user aims the camera back at scene A, the electronic device must again make this determination; it may, for instance, now determine that the RAW composite image should be converted into a YUV composite image before being stored, and so on.
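The per-scene caching decision described above could be modeled as follows. The queue length, the 0.4 capability threshold, and the `FrameCache` API are hypothetical, chosen only to illustrate that the decision is re-evaluated whenever the scene changes.

```python
from collections import deque


class FrameCache:
    """Illustrative sketch of the per-scene caching decision: when the
    operation parameter value (computing-capability score) is high enough,
    RAW composites are cached as-is; otherwise they are converted to YUV first."""

    def __init__(self, maxlen=8, threshold=0.4):
        self.queue = deque(maxlen=maxlen)  # the "preset image cache queue"
        self.threshold = threshold
        self.scene = None
        self.cache_raw = None

    def store(self, scene_id, raw_composite, capability):
        if scene_id != self.scene:  # scene changed: re-decide and start fresh
            self.scene = scene_id
            self.cache_raw = capability > self.threshold
            self.queue.clear()
        if self.cache_raw:
            self.queue.append(("RAW", raw_composite))
        else:
            # stands in for the RAW -> YUV conversion step
            self.queue.append(("YUV", raw_composite))
```

Using a bounded `deque` means old frames are evicted automatically, matching the idea of a rolling cache of recent composites per scene.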
In 106, when a photographing instruction is received, obtaining multiple frames of historical cache images of the target scene from a preset image cache queue, and performing multiple frames of noise reduction processing on the multiple frames of historical cache images to obtain noise-reduced images.
For example, after storing the RAW composite image or the YUV composite image in the preset image cache queue, when receiving a photographing instruction for photographing a target scene, the electronic device may obtain a multi-frame history cache image of the target scene from the preset image cache queue, and perform multi-frame noise reduction processing on the multi-frame history cache image to obtain a noise-reduced image.
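A minimal form of the multi-frame noise reduction in 106 is temporal averaging of the cached frames of the same scene, sketched below. The patent does not limit the noise-reduction algorithm to plain averaging, and frame alignment / ghost rejection are omitted here for brevity.

```python
def multi_frame_denoise(frames):
    """Average several cached frames of the same scene pixel-by-pixel.
    For zero-mean noise, averaging N frames reduces the noise standard
    deviation by a factor of sqrt(N)."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]
```

For example, three cached frames of the same static scene average out the per-frame sensor noise while preserving the common signal.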
In 107, a photographing instruction is responded according to the noise-reduced image.
For example, after obtaining the noise-reduced image, the electronic device may respond to a photographing instruction according to the noise-reduced image. For example, the electronic device may display the noise-reduced image as an image taken in the target scene on an interface of a camera application for viewing by a user.
For example, if the user opens the corresponding application and points the camera at scene a, the electronic device may enable the image sensor and operate it in the first operating mode. At this time, the electronic device may start to quickly acquire the RAW image packet. After the electronic device acquires the first RAW image packet, it may perform unpacking processing on the RAW image packet to obtain at least two frames of RAW images, and perform synthesis processing on the at least two frames of RAW images to obtain a RAW synthesized image with a high dynamic range. The electronic device may then obtain its operating parameter value, which may be used to indicate the current computing power of the electronic device. And the electronic device can determine whether to store the RAW composite image into a preset image buffer queue or to convert the RAW composite image into a YUV composite image and then store the YUV composite image into the preset image buffer queue in the scene a according to the operation parameter value. For example, if the electronic device determines that the RAW composite image is stored in the preset image buffer queue in the scene a, the RAW composite image of the RAW image packet acquired in the scene a is stored in the preset image buffer queue. For example, in scene a, the electronic device acquires 10 RAW image packets, and the RAW composite images of the 10 RAW image packets are stored in the preset image buffer queue. When receiving a command for photographing the scene a, the electronic device may obtain a multi-frame historical RAW composite image related to the scene a from a preset image cache queue, perform multi-frame denoising processing on the multi-frame historical RAW composite image to obtain a denoised image, and respond to the photographing command in the scene a according to the denoised image.
Then, for example, the user moves the camera to aim it at scene B, so the electronic device quickly acquires RAW image packets of scene B. After acquiring the first RAW image packet in scene B, the electronic device may unpack it to obtain at least two frames of RAW images and synthesize them into a RAW composite image with a high dynamic range. The electronic device may then obtain its operation parameter value and determine, according to that value, whether in scene B to store the RAW composite image in the preset image cache queue or to convert it into a YUV composite image first. For example, if the electronic device determines that in scene B the RAW composite image is converted into a YUV composite image and then stored, the YUV composite images obtained in scene B are stored in the preset image cache queue. If the electronic device acquires 15 RAW image packets in scene B, the YUV composite images of those 15 packets are stored in the preset image cache queue. When receiving an instruction to photograph scene B, the electronic device may obtain multiple frames of historical YUV composite images of scene B from the preset image cache queue, perform multi-frame noise reduction on these historical YUV composite images to obtain a noise-reduced image, and respond to the photographing instruction in scene B according to the noise-reduced image.
It can be understood that, in this embodiment, the electronic device may first synthesize RAW images with different exposure times to obtain a RAW synthesized image with a high dynamic range, and store the RAW synthesized image or a YUV synthesized image obtained by converting the RAW synthesized image into a preset image buffer queue. When the photographing is performed, the electronic device can acquire multiple frames of images with high dynamic ranges from the preset image cache queue to perform multi-frame noise reduction to obtain noise-reduced images, and respond to the photographing instruction according to the noise-reduced images. Therefore, the photographed image obtained by the present embodiment has a high dynamic range effect and is less noisy, that is, the present embodiment can improve the imaging effect of the image.
Referring to fig. 3, fig. 3 is another schematic flow chart of an image processing method according to an embodiment of the present application, where the flow chart may include:
in 201, the electronic device acquires a RAW image packet, wherein the RAW image packet comprises a first RAW image and a second RAW image which are sequentially exposed, the first RAW image and the second RAW image are images acquired under a target scene, and the exposure time of the first RAW image is longer than that of the second RAW image.
In 202, the electronic device unpacks the RAW image packet to obtain a first RAW image and a second RAW image.
At 203, the electronic device performs a synthesis process on the first RAW image and the second RAW image to obtain a RAW synthesized image with a high dynamic range.
For example, 201, 202, and 203 may include:
when the user operates the electronic device to start the camera application, the electronic device enables the image sensor and enables the image sensor to work in a first working mode. If a user operates a camera of the electronic device to aim at a certain scene (for example, the electronic device determines the certain scene as a target scene), the electronic device continuously acquires a RAW image packet in the target scene through an image sensor operating in a first operating mode, where the RAW image packet includes a first RAW image and a second RAW image that are sequentially exposed. Wherein the exposure time of the first RAW image is longer than that of the second RAW image.
Then, the electronic device may perform unpacking processing on the acquired RAW image packet to obtain a first RAW image and a second RAW image, and perform synthesis processing on the first RAW image and the second RAW image to obtain a RAW synthesized image with a high dynamic range.
That is, each time a RAW image packet is acquired, the electronic device may unpack it to obtain the first RAW image and the second RAW image included therein. In this embodiment, the RAW image with the longer exposure time in each RAW image packet is referred to as the first RAW image, and the RAW image with the shorter exposure time is referred to as the second RAW image.
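The embodiment does not specify the fusion algorithm used in 203. As a hedged illustration only, the following Python sketch fuses a long-exposure and a short-exposure frame with a simple saturation mask and an assumed exposure-time ratio; both the mask threshold and the ratio are illustrative assumptions, not values from this embodiment:

```python
import numpy as np

def synthesize_hdr(long_exp, short_exp, saturation=250, exposure_ratio=4.0):
    """Fuse a long-exposure and a short-exposure RAW frame.

    Where the long frame is saturated (blown highlights), substitute the
    short frame scaled by the assumed exposure-time ratio; elsewhere keep
    the better-exposed long frame.  The result exceeds the 8-bit range,
    i.e. it has a higher dynamic range than either input.
    """
    long_f = long_exp.astype(np.float64)
    short_f = short_exp.astype(np.float64)
    saturated = long_f >= saturation          # blown pixels in the long frame
    return np.where(saturated, short_f * exposure_ratio, long_f)

# The long frame clips at 255 in the highlight; the short frame keeps detail.
long_frame = np.array([[100, 255], [120, 255]], dtype=np.uint8)
short_frame = np.array([[25, 70], [30, 80]], dtype=np.uint8)
hdr = synthesize_hdr(long_frame, short_frame)
```

Well-exposed pixels pass through unchanged, while clipped highlights are reconstructed from the short exposure at values above 255, which is what gives the composite its extended dynamic range.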
At 204, the electronic device obtains an operating parameter value that is indicative of a computing capability of the electronic device.
For example, after obtaining the first RAW composite image in the target scene a, the electronic device may obtain an operation parameter value thereof, where the operation parameter value may be a numerical value of a parameter representing a computing capability of the electronic device.
In one embodiment, the step in 204 of obtaining, by the electronic device, an operation parameter value indicating the computing capability of the electronic device may include:
the electronic equipment obtains the ratio of the residual operation memory capacity to the total operation memory capacity, and the ratio is used for representing the computing capacity of the electronic equipment.
For example, when the computing capability of the electronic device needs to be determined, the electronic device may obtain the ratio of its remaining operating memory capacity to its total operating memory capacity and use this ratio to represent its computing capability. A larger ratio indicates a stronger computing capability of the electronic device.
For example, if the remaining operating memory capacity of the electronic device is 3 GB and the total operating memory capacity is 4 GB, the ratio of the remaining operating memory capacity to the total operating memory capacity is 75%. The electronic device may, for example, set a preset threshold of 40%. When the ratio of the remaining operating memory capacity to the total operating memory capacity is greater than the preset threshold of 40%, the current computing capability of the electronic device may be considered strong. When the ratio is less than or equal to the preset threshold of 40%, the current computing capability of the electronic device may be considered weak.
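The ratio computation above can be sketched as follows; the 3 GB / 4 GB figures and the 40% threshold mirror the example in this embodiment:

```python
PRESET_THRESHOLD = 0.40  # the 40% preset threshold from the example

def memory_ratio(remaining_gb, total_gb):
    """Ratio of remaining to total operating memory, used as a proxy
    for the device's current computing capability."""
    return remaining_gb / total_gb

ratio = memory_ratio(3.0, 4.0)            # 3 GB free out of 4 GB total
strong_capability = ratio > PRESET_THRESHOLD
```

With 3 GB of 4 GB free, the ratio is 75%, above the 40% threshold, so the device is classified as having strong computing capability.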
In 205, if the operation parameter value indicates that the computing capability of the electronic device is higher than the preset threshold, the electronic device determines to store the RAW composite image in a preset image buffer queue in the target scene; if the operation parameter value indicates that the computing capability of the electronic device is not higher than the preset threshold, the electronic device determines to convert the RAW composite image into a YUV composite image in the target scene and then store the YUV composite image in the preset image cache queue.
For example, after obtaining an operation parameter value of the electronic device, the electronic device may determine, according to the operation parameter value, that the RAW composite image is stored in a preset image buffer queue in a target scene, or that the RAW composite image is converted into a YUV composite image and then stored in the preset image buffer queue.
For example, in this embodiment, if the operation parameter value indicates that the computing capability of the electronic device is higher than the preset threshold, the electronic device may determine that the RAW composite image is stored in the preset image buffer queue in the target scene. If the operation parameter value indicates that the computing capability of the electronic device is not higher than the preset threshold value, the electronic device may determine that the RAW composite image is converted into the YUV composite image in the target scene and store the YUV composite image in the preset image cache queue.
In one embodiment, the preset image buffer queue may be a fixed-length queue. For example, the preset image buffer queue may buffer 30 frames of images, that is, the 30 most recently acquired frames. For example, in the order in which they were stored, the 1st frame image, the 2nd frame image, ..., and the 30th frame image are buffered in the preset image buffer queue. Then, when the 31st frame image is to be buffered, the preset image buffer queue removes the oldest stored 1st frame image and stores the 31st frame image.
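The fixed-length queue behavior described above matches a bounded double-ended queue. A minimal sketch using the Python standard library, with the 30-frame length taken from the example:

```python
from collections import deque

QUEUE_LENGTH = 30                         # queue length from the example
image_queue = deque(maxlen=QUEUE_LENGTH)  # evicts oldest entry when full

# Buffer frames 1..31; appending frame 31 evicts the oldest frame (frame 1).
for frame_id in range(1, 32):
    image_queue.append(frame_id)
```

A `deque` with `maxlen` set discards from the opposite end automatically on append, which is exactly the "remove the oldest, store the newest" behavior of the preset image buffer queue.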
At 206, when the photographing instruction is received, the electronic device determines a target number according to the operating parameter value, the target number being greater than or equal to 2.
In 207, the electronic device obtains a target number of historical cache images from a preset image cache queue, where the historical cache images are images of a target scene.
For example, 206 and 207 may include:
after storing the RAW composite image or the YUV composite image in the preset image buffer queue, when receiving a photographing instruction for photographing a target scene, the electronic device may determine a target number according to the operation parameter value obtained in 204. Wherein the target number may have a value greater than or equal to 2.
After the target number is determined, the electronic device may obtain multiple frames of historical cache images of the target scene from the preset image cache queue.
For example, if the electronic device determines that the target number is 4 according to the operation parameter value, the electronic device may obtain 4 frames of historical cache images of the target scene from a preset image cache queue.
In one embodiment, the electronic device may preset at least two values, for example, 4 and 2. Wherein, when the operation parameter value indicates that the computing capability of the electronic device is higher than the preset threshold, the electronic device may determine the value 4 as the target number. When the operating parameter value indicates that the computing power of the electronic device is not higher than a preset threshold, the electronic device may determine the value 2 as the target number, and so on.
For example, the electronic device may preset three values, for example, 4, 3, and 2. When the ratio of the remaining operating memory capacity to the total operating memory capacity is in [75%, 100%], the electronic device may determine the value 4 as the target number. When the ratio is in [40%, 75%), the electronic device may determine the value 3 as the target number. When the ratio is less than 40%, the electronic device may determine the value 2 as the target number.
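The mapping from the memory ratio to the target number can be sketched as follows; the interval boundaries follow the three-value example above:

```python
def target_number(ratio):
    """Map the memory ratio to the number of frames used for
    multi-frame noise reduction (the 'target number')."""
    if ratio >= 0.75:      # ratio in [75%, 100%]
        return 4
    if ratio >= 0.40:      # ratio in [40%, 75%)
        return 3
    return 2               # ratio below 40%
```

A stronger device (higher ratio) denoises with more frames for better quality, while a weaker device uses fewer frames to keep the processing load manageable.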
At 208, the electronic device performs multi-frame noise reduction processing on the multi-frame history buffer image to obtain a noise-reduced image.
For example, after obtaining multiple frames of history buffer images of a target scene, the electronic device may perform multiple frames of noise reduction processing on the multiple frames of history buffer images, so as to obtain a noise-reduced image.
The following describes multi-frame noise reduction by taking images P1, P2, P3, and P4 as examples. When performing multi-frame noise reduction on the images P1, P2, P3, and P4, the electronic device may first determine a base frame from among them, for example, image P1, and then align the four images. Then, the electronic device may calculate an average pixel value for each pixel position based on the aligned images. For example, if the pixel values at a certain position in the four images P1, P2, P3, and P4 are 101, 102, 103, and 102 in sequence, the average pixel value at that position is 102. The electronic device may then change the pixel value of the base frame image P1 at that position from 101 to 102. Similarly, by changing the pixel value at each position of the base frame image P1 to the corresponding average pixel value, the noise-reduced image can be obtained.
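The averaging step can be sketched as follows; frames are assumed to be already aligned, and the 101/102/103/102 values reproduce the example above (here filled across whole frames for brevity):

```python
import numpy as np

def multi_frame_denoise(frames):
    """Pixel-wise average of already-aligned frames; each position of the
    base frame is replaced by the average over all frames at that position."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Pixel values 101, 102, 103, 102 at the same position average to 102.
p1 = np.full((2, 2), 101.0); p2 = np.full((2, 2), 102.0)
p3 = np.full((2, 2), 103.0); p4 = np.full((2, 2), 102.0)
denoised = multi_frame_denoise([p1, p2, p3, p4])
```

Because zero-mean sensor noise varies between frames while the scene content does not, averaging N aligned frames suppresses the noise without blurring the image itself.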
At 209, the electronic device performs image optimization on the noise-reduced image to obtain a processed image.
At 210, the electronic device responds to the photographing instruction according to the processed image, so as to display the processed image as a photographed image.
For example, 209 and 210 may include:
After obtaining the noise-reduced image, the electronic device may perform an optimization process, such as image sharpening, on the noise-reduced image to obtain a processed image. Then, the electronic device can respond to the photographing instruction according to the processed image, so as to display the processed image as the photographed image.
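The embodiment does not specify the sharpening algorithm. As one common possibility, here is an unsharp-masking sketch; the 3x3 box blur is an illustrative choice, not something prescribed by this embodiment:

```python
import numpy as np

def sharpen(image, amount=1.0):
    """Unsharp masking: subtract a blurred copy from the image and add
    the difference back, boosting local contrast at edges."""
    padded = np.pad(image.astype(np.float64), 1, mode="edge")
    h, w = image.shape
    # 3x3 box blur computed by summing the nine shifted neighborhoods.
    blurred = sum(padded[i:i + h, j:j + w]
                  for i in range(3) for j in range(3)) / 9.0
    return image + amount * (image - blurred)

flat = np.full((4, 4), 10.0)              # uniform region: left unchanged
edge = np.array([[0.0, 0.0, 10.0, 10.0]]) # a step edge: contrast boosted
sharpened = sharpen(edge)
```

Uniform areas equal their own blur, so they pass through untouched; only pixels near edges are pushed apart, which is why unsharp masking sharpens without amplifying flat-region noise as much as a plain high-pass filter would.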
In one embodiment, after 203, that is, after the electronic device obtains the RAW composite image with high dynamic range, the RAW composite image may be subjected to a beautification process such as YUV format conversion, image sharpening, image denoising, and the like, so as to obtain an image after the beautification process. The beautified image may be used for preview or video recording.
For example, after obtaining the RAW composite image, the electronic device may obtain a resolution of the display screen, perform downsampling on the RAW composite image according to the resolution, and perform beautification processing, such as YUV format conversion, image sharpening, and image denoising, on the downsampled image for previewing.
For another example, after obtaining the RAW composite image, the electronic device may obtain a video resolution of a video, perform downsampling processing on the RAW composite image according to the video resolution of the video, and store the downsampled image as one frame of a video corresponding to the video after performing beautification processing such as YUV format conversion, image sharpening, and image denoising.
For example, the video resolution of the video recording may be preset by the user, including but not limited to 1080P, 2K, and 4K, etc. For example, if the resolution of the recorded video is set to 2K by the user in advance, the electronic device may perform down-sampling processing on the RAW composite image (for example, the resolution is 4K) after the RAW composite image is obtained by compositing, so as to obtain a high dynamic range image with the resolution of 2K.
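The downsampling step can be illustrated with a simple 2x2 block average; a real pipeline may use a better resampling filter, and the "4K" and "2K" sizes here are taken colloquially as 3840x2160 and 1920x1080:

```python
import numpy as np

def downsample_by_two(image):
    """Halve each dimension by averaging non-overlapping 2x2 blocks
    (a simple box filter)."""
    h, w = image.shape
    return image.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A 4K-sized frame (2160 x 3840) downsamples to a 2K-sized one (1080 x 1920).
frame_4k = np.zeros((2160, 3840))
frame_2k = downsample_by_two(frame_4k)

block = np.array([[1.0, 2.0],
                  [3.0, 4.0]])            # one 2x2 block averages to 2.5
```

Averaging blocks rather than simply dropping every other pixel acts as a crude low-pass filter, reducing the aliasing that plain decimation would introduce.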
Referring to fig. 4 to 6, fig. 4 to 6 are schematic scene diagrams of an image processing method according to an embodiment of the present disclosure.
For example, the user opens the camera application, at which point the electronic device enters the preview interface of the camera. The electronic device may enable the image sensor and operate it in the first operating mode. In the first operating mode, the image sensor sequentially generates two frames of RAW images with different exposure times within one frame period and outputs them in the form of a RAW image packet.
For example, as shown in fig. 4, the user points the camera to a scene a, which includes a person, and the electronic device may start to quickly acquire a RAW image packet regarding the scene a. After the electronic device obtains the first RAW image packet, it may perform unpacking processing on the RAW image packet to obtain two frames of RAW images, and perform synthesis processing on the two frames of RAW images, thereby obtaining a RAW synthesized image with a high dynamic range.
The electronic device may then obtain its operating parameter value, which may be used to indicate the current computing power of the electronic device. And the electronic device can determine whether to store the RAW composite image into a preset image buffer queue or to convert the RAW composite image into a YUV composite image and then store the YUV composite image into the preset image buffer queue in the scene a according to the operation parameter value. For example, if the current computing capability of the electronic device is strong, it may be determined that the RAW composite image is stored in the preset image cache queue in the scene a, and then the RAW composite image of the RAW image packet acquired in the scene a is stored in the preset image cache queue. For example, in scene a, the electronic device acquires 10 RAW image packets, and the RAW composite images of the 10 RAW image packets are stored in the preset image buffer queue.
While in the preview interface of the camera application, each time the electronic device acquires a frame of RAW composite image, the electronic device may display the RAW composite image on the preview interface for the user to preview after performing processing such as YUV format conversion, image sharpening, and noise reduction on the RAW composite image.
After the preview, for example, when the user presses the photographing button as shown in fig. 4, the electronic device receives an instruction to photograph scene a. At this time, the electronic device may obtain 4 frames of historical RAW composite images of scene a from the preset image buffer queue and perform multi-frame noise reduction on them to obtain a noise-reduced image. For example, as shown in fig. 5, the 4 frames of RAW composite images of scene a acquired from the preset image buffer queue are P1, P2, P3, and P4, respectively. The electronic device may determine image P1 as the base frame and use images P2, P3, and P4 to perform multi-frame noise reduction on P1. The electronic device may first align images P1, P2, P3, and P4, and then calculate an average pixel value for each pixel position based on the aligned images. For example, if the pixel values at position L in the four images P1, P2, P3, and P4 are 101, 102, 103, and 102 in sequence, the average pixel value at that position is 102. The electronic device may then change the pixel value of the base frame image P1 at position L from 101 to 102. Similarly, by changing the pixel value at each position of the base frame image P1 to the corresponding average pixel value, the noise-reduced image can be obtained.
After obtaining the noise-reduced image, the electronic device may perform image sharpening and other processing on it to obtain a beautified noise-reduced image. Then, the electronic device can send the beautified noise-reduced image to a JPEG encoder; after processing by the JPEG encoder, the image is output to the camera application interface for display, so that the user can view the photographed image.
As another example, the user then points the camera at a scene B including two persons. As shown in fig. 6, the electronic device may start to quickly acquire RAW image packets of scene B. After obtaining the first RAW image packet in scene B, the electronic device may unpack it to obtain two frames of RAW images and synthesize the two frames to obtain a RAW synthesized image with a high dynamic range.
The electronic device may then obtain its operating parameter value, which may be used to indicate the current computing power of the electronic device. And the electronic device can determine whether to store the RAW composite image into a preset image buffer queue or to store the RAW composite image into the preset image buffer queue after converting the RAW composite image into the YUV composite image according to the operation parameter value. For example, if the current computing capability of the electronic device is weak, it may be determined that the RAW composite image is converted into a YUV composite image in the scene B and then stored in the preset image buffer queue, and then the RAW composite image obtained from the RAW image packet in the scene B is stored in the preset image buffer queue after being converted into a YUV composite image in a YUV format. For example, in scene B, the electronic device acquires 15 RAW image packets, and the YUV composite images of the 15 RAW image packets are stored in the preset image buffer queue.
Likewise, while in the preview interface of the camera application, each time the electronic device acquires a frame of RAW composite image with respect to scene B, the electronic device may display the RAW composite image on the preview interface for the user to preview after performing processing such as YUV format conversion, image sharpening, and noise reduction on the RAW composite image.
After the preview, for example, the user presses a photograph button, as shown in fig. 6, the electronic device may receive an instruction to photograph scene B. At this time, the electronic device may obtain 2 frames of historical YUV synthetic images about the scene B from the preset image buffer queue, and perform multi-frame noise reduction processing on the 2 frames of historical YUV synthetic images to obtain a noise-reduced image. For example, the electronic device obtains 2 frames of YUV composite images for scene B from the preset image buffer queue as P5 and P6, respectively. The electronic device may perform multi-frame noise reduction on the images P5 and P6 to obtain a noise-reduced image.
After obtaining the noise-reduced image, the electronic device may perform image sharpening and other processing on it to obtain a beautified noise-reduced image. Then, the electronic device can send the beautified noise-reduced image to a JPEG encoder; after processing by the JPEG encoder, the image is output to the camera application interface for display, so that the user can view the photographed image.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure. The image processing apparatus 300 may include: a first obtaining module 301, an unpacking module 302, a synthesizing module 303, a second obtaining module 304, a caching module 305, a processing module 306, and a responding module 307.
A first obtaining module 301, configured to obtain a RAW image packet, where the RAW image packet includes at least two frames of RAW images, and the at least two frames of RAW images are images obtained in a target scene and have different exposure times.
An unpacking module 302, configured to unpack the RAW image packet to obtain the at least two frames of RAW images.
A synthesizing module 303, configured to perform synthesizing processing on the at least two frames of RAW images to obtain a RAW synthesized image with a high dynamic range.
A second obtaining module 304, configured to obtain an operation parameter value of the electronic device, where the operation parameter value is used to represent a computing capability of the electronic device.
The buffer module 305 is configured to determine, according to the operation parameter value, to store the RAW composite image in a preset image buffer queue in the target scene, or to store the RAW composite image in a preset image buffer queue after converting the RAW composite image into a YUV composite image.
And the processing module 306 is configured to, when a photographing instruction is received, acquire a multi-frame historical cache image of the target scene from the preset image cache queue, and perform multi-frame noise reduction processing on the multi-frame historical cache image to obtain a noise reduction image.
A response module 307, configured to respond to the photographing instruction according to the noise-reduced image.
In one embodiment, the caching module 305 may be configured to:
if the operation parameter value indicates that the computing capability of the electronic equipment is higher than a preset threshold value, determining that the RAW synthetic image is stored in a preset image cache queue under the target scene;
and if the operation parameter value indicates that the computing capacity of the electronic equipment is not higher than a preset threshold value, determining that the RAW synthetic image is converted into a YUV synthetic image in the target scene and then stored in the preset image cache queue.
In one embodiment, the second obtaining module 304 may be configured to:
and acquiring the ratio of the residual operation memory capacity of the electronic equipment to the total operation memory capacity, wherein the ratio is used for representing the computing capacity of the electronic equipment.
In one embodiment, the response module 307 may be configured to:
performing image optimization processing on the noise reduction image to obtain a processed image;
and responding to the photographing instruction according to the processed image so as to display the processed image as a photographed image.
In one embodiment, the first obtaining module 301 may be configured to: acquiring a RAW image packet, wherein the RAW image packet comprises a first RAW image and a second RAW image which are sequentially exposed, the first RAW image and the second RAW image are images acquired under a target scene, and the exposure time of the first RAW image is longer than that of the second RAW image.
Then, the unpacking module 302 may be configured to: and unpacking the RAW image packet to obtain the first RAW image and the second RAW image.
The synthesis module 303 may be configured to: and synthesizing the first RAW image and the second RAW image to obtain a RAW synthesized image with a high dynamic range.
In one embodiment, the processing module 306 may be configured to:
when a photographing instruction is received, determining a target number according to the operation parameter value, wherein the target number is greater than or equal to 2;
and obtaining the historical cache images of the target number from the preset image cache queue, wherein the historical cache images are the images of the target scene.
In an embodiment, the preset image buffer queue is a fixed-length queue.
The embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed on a computer, the computer is caused to execute the flow in the image processing method provided by the embodiment.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor is configured to execute the flow in the image processing method provided in this embodiment by calling the computer program stored in the memory.
For example, the electronic device may be a mobile terminal such as a tablet computer or a smart phone. Referring to fig. 8, fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
The electronic device 400 may include a camera module 401, a memory 402, a processor 403, and the like. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 8 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The camera module 401 may include a lens and an image sensor. The lens collects an external light signal and provides it to the image sensor; the image sensor senses the light signal from the lens and converts it into digitized RAW image data. RAW is an unprocessed and uncompressed format that can be visually referred to as a "digital negative". The image sensor of the camera module of the electronic device can have a first operating mode and a second operating mode.
The memory 402 may be used to store applications and data. The memory 402 stores applications containing executable code. The application programs may constitute various functional modules. The processor 403 executes various functional applications and data processing by running an application program stored in the memory 402.
The processor 403 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing an application program stored in the memory 402 and calling data stored in the memory 402, thereby integrally monitoring the electronic device.
In this embodiment, the processor 403 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and the processor 403 runs the application programs stored in the memory 402, so as to execute:
acquiring a RAW image packet, wherein the RAW image packet comprises at least two frames of RAW images, and the at least two frames of RAW images are images acquired in a target scene and have different exposure times;
unpacking the RAW image packet to obtain the at least two frames of RAW images;
synthesizing the at least two frames of RAW images to obtain a RAW synthesized image with a high dynamic range;
acquiring an operation parameter value of an electronic device, wherein the operation parameter value is used for representing the computing capacity of the electronic device;
according to the operation parameter value, determining to store the RAW synthetic image into a preset image cache queue in the target scene, or converting the RAW synthetic image into a YUV synthetic image and then storing the YUV synthetic image into a preset image cache queue;
when a photographing instruction is received, acquiring a multi-frame historical cache image of a target scene from the preset image cache queue, and performing multi-frame noise reduction processing on the multi-frame historical cache image to obtain a noise reduction image;
and responding to the photographing instruction according to the noise reduction image.
Referring to fig. 9, the electronic device 500 may include a camera module 501, a memory 502, a processor 503, a touch screen 504, a speaker 505, a microphone 506, and so on.
The camera module 501 may include image processing circuitry, which may be implemented using hardware and/or software components and may include various processing units that define an image signal processing (ISP) pipeline. The image processing circuit may include at least: a camera, an image signal processor (ISP processor), control logic, an image memory, and a display. The camera may include one or more lenses and an image sensor. The image sensor may include a color filter array (e.g., a Bayer filter). The image sensor may acquire the light intensity and wavelength information captured by each of its imaging pixels and provide a set of raw image data that can be processed by the image signal processor.
The image signal processor may process the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the image signal processor may perform one or more image processing operations on the raw image data, gathering statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision. The raw image data can be stored in an image memory after being processed by an image signal processor. The image signal processor may also receive image data from an image memory.
The image Memory may be part of a Memory device, a storage device, or a separate dedicated Memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
When image data is received from the image memory, the image signal processor may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to an image memory for additional processing before being displayed. The image signal processor may also receive processed data from the image memory and perform image data processing on the processed data in the raw domain and in the RGB and YCbCr color spaces. The processed image data may be output to a display for viewing by a user and/or further processed by a Graphics Processing Unit (GPU). Further, the output of the image signal processor may also be sent to an image memory, and the display may read image data from the image memory. In one embodiment, the image memory may be configured to implement one or more frame buffers.
The statistical data determined by the image signal processor may be sent to the control logic. For example, the statistical data may include statistical information of the image sensor such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, lens shading correction, and the like.
The control logic may include a processor and/or microcontroller that executes one or more routines (e.g., firmware). One or more routines may determine camera control parameters and ISP control parameters based on the received statistical data. For example, the control parameters of the camera may include camera flash control parameters, control parameters of the lens (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), etc.
Referring to fig. 10, fig. 10 is a schematic structural diagram of an image processing circuit in the present embodiment. As shown in fig. 10, only aspects of the image processing technique related to the embodiment of the present invention are shown for convenience of explanation.
For example, the image processing circuit may include: a camera, an image signal processor, control logic, an image memory, and a display. The camera may include one or more lenses and an image sensor. In some embodiments, the camera may be either a telephoto camera or a wide-angle camera.
The first image acquired by the camera is transmitted to the image signal processor for processing. After the image signal processor processes the first image, statistical data of the first image (such as the brightness, contrast, and colors of the image) may be sent to the control logic. The control logic may determine the control parameters of the camera according to the statistical data, so that the camera can perform operations such as auto-focusing and auto-exposure. The first image can be stored in the image memory after being processed by the image signal processor. The image signal processor may also read the image stored in the image memory for processing. In addition, after being processed by the image signal processor, the first image can be sent directly to the display for display. The display may also read the image in the image memory for display.
In addition, although not shown in the figure, the electronic device may further include a CPU and a power supply module. The CPU is connected to the control logic, the image signal processor, the image memory, and the display, and is used for global control. The power supply module is used to supply power to each of these modules.
The memory 502 stores application programs containing executable code. The application programs may constitute various functional modules. The processor 503 executes various functional applications and performs data processing by running the application programs stored in the memory 502.
The processor 503 is a control center of the electronic device, connects various parts of the whole electronic device by using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing an application program stored in the memory 502 and calling the data stored in the memory 502, thereby performing overall monitoring of the electronic device.
The touch screen display 504 may be used to receive user touch control operations for the electronic device. The speaker 505 may play an audio signal. The microphone 506 may be used to pick up sound signals.
In this embodiment, the processor 503 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 502 according to the following instructions, and the processor 503 runs the application programs stored in the memory 502, thereby executing:
acquiring a RAW image packet, wherein the RAW image packet comprises at least two frames of RAW images, and the at least two frames of RAW images are images acquired in a target scene and have different exposure times;
unpacking the RAW image packet to obtain the at least two frames of RAW images;
synthesizing the at least two frames of RAW images to obtain a RAW synthesized image with a high dynamic range;
acquiring an operation parameter value of an electronic device, wherein the operation parameter value is used for representing the computing capability of the electronic device;
according to the operation parameter value, determining to store the RAW synthetic image into a preset image cache queue in the target scene, or converting the RAW synthetic image into a YUV synthetic image and then storing the YUV synthetic image into a preset image cache queue;
when a photographing instruction is received, obtaining multi-frame historical cache images of a target scene from the preset image cache queue, and carrying out multi-frame noise reduction processing on the multi-frame historical cache images to obtain noise reduction images;
and responding to the photographing instruction according to the noise reduction image.
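The sequence of steps executed by the processor 503 above can be sketched as a short illustrative routine. This is a minimal sketch under assumptions, not the disclosed implementation: the queue length, the threshold value, and the helper names (`unpack`, `merge_hdr`, `raw_to_yuv`, `capability`) are hypothetical placeholders.

```python
from collections import deque

CAPABILITY_THRESHOLD = 0.5  # assumed preset threshold value
preset_image_cache_queue = deque(maxlen=8)  # assumed fixed-length queue

def cache_composite_frame(raw_packet, unpack, merge_hdr, raw_to_yuv, capability):
    """One preview-loop iteration: unpack, HDR-merge, then cache RAW or YUV."""
    frames = unpack(raw_packet)   # at least two RAW frames with different exposure times
    hdr_raw = merge_hdr(frames)   # RAW composite image with a high dynamic range
    if capability() > CAPABILITY_THRESHOLD:
        preset_image_cache_queue.append(hdr_raw)              # capable device: cache RAW
    else:
        preset_image_cache_queue.append(raw_to_yuv(hdr_raw))  # otherwise: convert to YUV first
    return preset_image_cache_queue[-1]
```

On a photographing instruction, the most recent frames would then be read back from `preset_image_cache_queue` for multi-frame noise reduction.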
In an embodiment, when the processor 503 determines, according to the operation parameter value, to store the RAW composite image in the preset image cache queue in the target scene or to convert the RAW composite image into a YUV composite image and then store it in the preset image cache queue, the following steps may be performed: if the operation parameter value indicates that the computing capability of the electronic device is higher than a preset threshold value, determining that the RAW composite image is stored in the preset image cache queue in the target scene; and if the operation parameter value indicates that the computing capability of the electronic device is not higher than the preset threshold value, determining that the RAW composite image is converted into a YUV composite image in the target scene and then stored in the preset image cache queue.
In one embodiment, when the processor 503 performs the obtaining of an operation parameter value of the electronic device, the operation parameter value being used to represent the computing capability of the electronic device, it may perform: acquiring a ratio of the residual operation memory capacity of the electronic device to the total operation memory capacity, wherein the ratio is used for representing the computing capability of the electronic device.
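A ratio of this kind can be computed and compared with the preset threshold as follows. The function names and the threshold value 0.5 are illustrative assumptions, and how the residual and total operation memory capacities are queried is platform-specific.

```python
def operation_parameter_value(residual_memory_bytes, total_memory_bytes):
    """Ratio of residual operation memory capacity to total operation memory capacity."""
    if total_memory_bytes <= 0:
        raise ValueError("total operation memory capacity must be positive")
    return residual_memory_bytes / total_memory_bytes  # value in [0, 1]

def is_high_computing_capability(ratio, preset_threshold=0.5):
    """True when the ratio indicates computing capability above the preset threshold."""
    return ratio > preset_threshold
```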
In one embodiment, when the processor 503 responds to the photographing instruction according to the noise-reduced image, it may perform: performing image optimization processing on the noise-reduced image to obtain a processed image; and responding to the photographing instruction according to the processed image, so that the processed image is displayed as a photographed image.
In one embodiment, when the processor 503 performs the acquiring of the RAW image packet, where the RAW image packet includes at least two frames of RAW images and the at least two frames of RAW images are images acquired in a target scene and have different exposure times, it may perform: acquiring a RAW image packet, wherein the RAW image packet includes a first RAW image and a second RAW image that are sequentially exposed, the first RAW image and the second RAW image are images acquired in the target scene, and the exposure time of the first RAW image is longer than that of the second RAW image.
Then, when the processor 503 performs an unpacking process on the RAW image packet to obtain the at least two frames of RAW images, it may perform: and unpacking the RAW image packet to obtain the first RAW image and the second RAW image.
When the processor 503 performs the synthesizing process on the at least two frames of RAW images to obtain a RAW composite image with a high dynamic range, it may perform: synthesizing the first RAW image and the second RAW image to obtain a RAW composite image with a high dynamic range.
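The patent does not specify the fusion algorithm for the two differently exposed frames, so the following is only one common scheme offered as an illustration: keep the long-exposure (first) frame where it is not clipped, and substitute the brightness-normalized short-exposure (second) frame where the long exposure saturates.

```python
def merge_two_exposures(first_raw, second_raw, exposure_ratio, clip_level=1023):
    """Per-pixel merge of a long-exposure and a short-exposure RAW frame.

    Where first_raw (the longer exposure) is below the clipping level, its
    well-exposed value is kept; where it saturates, second_raw is scaled by
    the exposure-time ratio to recover highlight detail.
    """
    merged = []
    for long_px, short_px in zip(first_raw, second_raw):
        if long_px < clip_level:
            merged.append(float(long_px))
        else:
            merged.append(short_px * exposure_ratio)
    return merged
```

A real implementation would also blend smoothly near the clipping level and operate on the Bayer pattern, which this per-pixel sketch omits.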
In one embodiment, when the processor 503 performs the obtaining of the multi-frame historical cache images of the target scene from the preset image cache queue upon receiving the photographing instruction, the following steps may be performed: when a photographing instruction is received, determining a target number according to the operation parameter value, wherein the target number is greater than or equal to 2; and obtaining the target number of historical cache images from the preset image cache queue, wherein the historical cache images are images of the target scene.
In an embodiment, the preset image buffer queue is a fixed-length queue.
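A fixed-length queue of this kind, together with retrieval of a target number (≥ 2) of the most recent historical cache images, can be sketched with `collections.deque`. The class name and the queue length of 8 are assumed values for illustration only.

```python
from collections import deque

class PresetImageCacheQueue:
    """Fixed-length cache holding the most recent composite frames."""

    def __init__(self, maxlen=8):
        self._frames = deque(maxlen=maxlen)  # oldest frame dropped automatically when full

    def push(self, frame):
        self._frames.append(frame)

    def latest(self, target_number):
        """Return the target number of most recent historical cache images."""
        if target_number < 2:
            raise ValueError("multi-frame noise reduction needs at least 2 frames")
        return list(self._frames)[-target_number:]
```

Because `deque(maxlen=...)` discards the oldest entry on overflow, the queue always holds the freshest frames of the target scene without explicit eviction logic.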
In the above embodiments, the descriptions of the embodiments have respective emphasis, and parts that are not described in detail in a certain embodiment may refer to the above detailed description of the image processing method, and are not described herein again.
The image processing apparatus provided in the embodiment of the present application and the image processing method in the above embodiment belong to the same concept, and any method provided in the embodiment of the image processing method may be run on the image processing apparatus, and a specific implementation process thereof is described in the embodiment of the image processing method in detail, and is not described herein again.
It should be noted that, for the image processing method described in the embodiment of the present application, it can be understood by those skilled in the art that all or part of the process of implementing the image processing method described in the embodiment of the present application can be completed by controlling the relevant hardware through a computer program, where the computer program can be stored in a computer-readable storage medium, such as a memory, and executed by at least one processor, and during the execution, the process of the embodiment of the image processing method can be included. The storage medium may be a magnetic disk, an optical disk, a Read Only Memory (ROM), a Random Access Memory (RAM), or the like.
In the image processing apparatus according to the embodiment of the present application, each functional module may be integrated into one processing chip, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium, such as a read-only memory, a magnetic or optical disk, or the like.
The foregoing detailed description has provided an image processing method, an image processing apparatus, a storage medium, and an electronic device according to embodiments of the present application, and specific examples are applied herein to explain the principles and implementations of the present application, and the descriptions of the foregoing embodiments are only used to help understand the method and the core ideas of the present application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (9)

1. An image processing method, comprising:
acquiring a RAW image packet, wherein the RAW image packet comprises at least two frames of RAW images, and the at least two frames of RAW images are images acquired in a target scene and have different exposure times;
unpacking the RAW image packet to obtain the at least two frames of RAW images;
synthesizing the at least two frames of RAW images to obtain a RAW synthesized image with a high dynamic range;
acquiring an operation parameter value of an electronic device, wherein the operation parameter value is used for representing the computing capability of the electronic device;
if the operation parameter value indicates that the computing capability of the electronic device is higher than a preset threshold value, determining that the RAW composite image is stored in a preset image cache queue in the target scene; if the operation parameter value indicates that the computing capability of the electronic device is not higher than the preset threshold value, determining that the RAW composite image is converted into a YUV composite image in the target scene and then storing the YUV composite image into the preset image cache queue;
when a photographing instruction is received, obtaining multi-frame historical cache images of a target scene from the preset image cache queue, and carrying out multi-frame noise reduction processing on the multi-frame historical cache images to obtain noise reduction images;
and responding to the photographing instruction according to the noise reduction image.
2. The image processing method according to claim 1, wherein the obtaining of an operation parameter value of an electronic device, the operation parameter value being indicative of a computing capability of the electronic device, comprises:
and acquiring the ratio of the residual operation memory capacity of the electronic equipment to the total operation memory capacity, wherein the ratio is used for representing the computing capacity of the electronic equipment.
3. The image processing method according to claim 1, wherein responding to the photographing instruction according to the noise-reduced image comprises:
performing image optimization processing on the noise reduction image to obtain a processed image;
and responding to the photographing instruction according to the processed image so as to display the processed image as a photographed image.
4. The image processing method according to claim 1, wherein the acquiring a RAW image packet, the RAW image packet including at least two frames of RAW images, the at least two frames of RAW images being images acquired in a target scene and having different exposure times, comprises: acquiring a RAW image packet, wherein the RAW image packet comprises a first RAW image and a second RAW image which are sequentially exposed, the first RAW image and the second RAW image are images acquired under a target scene, and the exposure time of the first RAW image is longer than that of the second RAW image;
unpacking the RAW image packet to obtain the at least two frames of RAW images, including: unpacking the RAW image packet to obtain the first RAW image and the second RAW image;
synthesizing the at least two frames of RAW images to obtain a RAW synthesized image with a high dynamic range, comprising: and synthesizing the first RAW image and the second RAW image to obtain a RAW synthesized image with a high dynamic range.
5. The image processing method according to claim 1, wherein when the photographing instruction is received, acquiring a multi-frame historical cache image of the target scene from the preset image cache queue comprises:
when a photographing instruction is received, determining a target number according to the operation parameter value, wherein the target number is greater than or equal to 2;
and obtaining the historical cache images of the target number from the preset image cache queue, wherein the historical cache images are images of the target scene.
6. The image processing method according to claim 1, wherein the preset image buffer queue is a fixed-length queue.
7. An image processing apparatus characterized by comprising:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a RAW image packet, the RAW image packet comprises at least two frames of RAW images, and the at least two frames of RAW images are images acquired in a target scene and have different exposure time;
the unpacking module is used for unpacking the RAW image packet to obtain the at least two frames of RAW images;
the synthesis module is used for synthesizing the at least two frames of RAW images to obtain a RAW synthesized image with a high dynamic range;
the second acquisition module is used for acquiring an operation parameter value of the electronic equipment, wherein the operation parameter value is used for representing the computing capacity of the electronic equipment;
the cache module is used for determining that the RAW composite image is stored in a preset image cache queue in the target scene if the operation parameter value indicates that the computing capability of the electronic device is higher than a preset threshold value; and if the operation parameter value indicates that the computing capability of the electronic device is not higher than the preset threshold value, determining that the RAW composite image is converted into a YUV composite image in the target scene and then storing the YUV composite image into the preset image cache queue;
the processing module is used for acquiring a multi-frame historical cache image of a target scene from the preset image cache queue when a photographing instruction is received, and performing multi-frame noise reduction processing on the multi-frame historical cache image to obtain a noise reduction image;
and the response module is used for responding to the photographing instruction according to the noise-reduced image.
8. A storage medium having stored thereon a computer program, characterized in that the computer program, when executed on a computer, causes the computer to execute the method according to any of claims 1 to 6.
9. An electronic device comprising a memory, a processor, wherein the processor is configured to perform the method of any of claims 1 to 6 by invoking a computer program stored in the memory.
CN201910280476.8A 2019-04-09 2019-04-09 Image processing method, image processing device, storage medium and electronic equipment Active CN109993722B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910280476.8A CN109993722B (en) 2019-04-09 2019-04-09 Image processing method, image processing device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN109993722A CN109993722A (en) 2019-07-09
CN109993722B true CN109993722B (en) 2023-04-18

Family

ID=67132528

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910280476.8A Active CN109993722B (en) 2019-04-09 2019-04-09 Image processing method, image processing device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN109993722B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110969587A (en) * 2019-11-29 2020-04-07 联想(北京)有限公司 Image acquisition method and device and electronic equipment
CN111385475B (en) * 2020-03-11 2021-09-10 Oppo广东移动通信有限公司 Image acquisition method, photographing device, electronic equipment and readable storage medium
CN112003996B (en) * 2020-08-12 2023-04-18 Oppo广东移动通信有限公司 Video generation method, terminal and computer storage medium
CN111988526B (en) * 2020-08-27 2021-07-27 Oppo(重庆)智能科技有限公司 Mobile terminal and image data processing method
CN114979500B (en) * 2021-02-26 2023-08-08 Oppo广东移动通信有限公司 Image processing method, image processing apparatus, electronic device, and readable storage medium
CN113038004B (en) * 2021-02-26 2022-09-23 展讯通信(天津)有限公司 Multi-window image previewing method and device, computer equipment and storage medium
CN116029914B (en) * 2022-07-27 2023-10-20 荣耀终端有限公司 Image processing method and electronic equipment
CN116389898B (en) * 2023-02-27 2024-03-19 荣耀终端有限公司 Image processing method, device and storage medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102833471A (en) * 2011-06-15 2012-12-19 奥林巴斯映像株式会社 Imaging device and imaging method
JP2015082675A (en) * 2013-10-21 2015-04-27 三星テクウィン株式会社Samsung Techwin Co., Ltd Image processing device and image processing method
CN105376473A (en) * 2014-08-25 2016-03-02 中兴通讯股份有限公司 Photographing method, device and equipment
JP2017112462A (en) * 2015-12-15 2017-06-22 キヤノン株式会社 Imaging device and control method, program therefor and storage medium
CN107222669A (en) * 2017-06-30 2017-09-29 维沃移动通信有限公司 The method and mobile terminal of a kind of shooting
WO2017215501A1 (en) * 2016-06-15 2017-12-21 深圳市万普拉斯科技有限公司 Method and device for image noise reduction processing and computer storage medium
CN108198189A (en) * 2017-12-28 2018-06-22 广东欧珀移动通信有限公司 Acquisition methods, device, storage medium and the electronic equipment of picture clarity
CN108353130A (en) * 2015-11-24 2018-07-31 索尼公司 Image processor, image processing method and program
CN108401110A (en) * 2018-03-18 2018-08-14 广东欧珀移动通信有限公司 Acquisition methods, device, storage medium and the electronic equipment of image
CN108419023A (en) * 2018-03-26 2018-08-17 华为技术有限公司 A kind of method and relevant device generating high dynamic range images
CN108520493A (en) * 2018-03-30 2018-09-11 广东欧珀移动通信有限公司 Processing method, device, storage medium and the electronic equipment that image is replaced
CN109167930A (en) * 2018-10-11 2019-01-08 Oppo广东移动通信有限公司 Image display method, device, electronic equipment and computer readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6604297B2 (en) * 2016-10-03 2019-11-13 株式会社デンソー Imaging device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Noise processing for high-dynamic-range image synthesis based on multiple exposures; Liu Zong et al.; Electronic Science and Technology; 2016-11-15 (No. 11); full text *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant