CN110121882B - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN110121882B
Authority
CN
China
Prior art keywords
exposure
frame
adjustment
camera
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201780081683.XA
Other languages
Chinese (zh)
Other versions
CN110121882A (en)
Inventor
王军
杜成
敖欢欢
徐荣跃
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN110121882A
Application granted
Publication of CN110121882B


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

An image processing method and device are provided for reducing motion blur. The method comprises the following steps: while a terminal previews image frames captured by a camera, when the photographed target object moves relative to the terminal, the terminal performs a first adjustment on the camera's initial exposure parameters in the preview state, the first adjustment comprising reducing the initial exposure time and increasing the initial exposure gain; after receiving a shooting instruction, the terminal performs a second adjustment on the first-adjusted exposure parameters, generates a first exposure frame according to the first-adjusted exposure parameters, and generates at least two second exposure frames according to the second-adjusted exposure parameters, the second adjustment comprising reducing the first-adjusted exposure time and increasing the first-adjusted exposure gain; and the terminal fuses the first exposure frame with the at least two second exposure frames and outputs the fused image.

Description

Image processing method and device
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus.
Background
Some terminals with a photographing function exhibit varying degrees of motion blur when photographing moving objects, and if the photographed target moves quickly, severe smearing can occur. Yet motion scenes such as human movement, object movement, and swaying flowers and grass are exactly the scenes users care about when photographing. Professional photographers often carry expensive large-aperture lenses to improve the sharpness of moving subjects, and set appropriate aperture values and shutter speeds to capture them. For the mobile phone user who takes pictures anytime and anywhere, however: first, the aperture of a mobile phone camera is fixed and cannot be widened, so the camera cannot support pairing an adjusted aperture with a high shutter speed; second, even if the aperture of the mobile phone camera were adjustable, an ordinary user lacks the prior knowledge to set appropriate camera parameters such as aperture and shutter in time for the shot.
In recent years, the post-processing computing capability available for mobile phone images has improved remarkably, while improving mobile phone camera hardware remains costly and difficult, so software approaches have been developed and applied. The existing software method for resisting motion blur during photographing mainly uses a strategy of reducing the exposure time based on motion detection: the method first performs motion detection and, when camera or object motion is detected, automatically increases the shutter speed, i.e., reduces the exposure time. In the imaging stage, motion blur weakens in proportion to the reduction in exposure time, but reducing the exposure time also lowers the overall brightness of the image; the exposure gain therefore has to be increased in the same proportion to keep the overall brightness unchanged.
However, this method of proportionally reducing the exposure time and increasing the exposure gain causes excessive image noise in dark scenes and lowers overall image quality. If the reduction in exposure time is limited to keep the noise level acceptable, the ability to resist motion blur is weakened. In short, reducing the exposure time and increasing the exposure gain constrain each other, leaving the terminal with insufficient capability to reduce motion blur.
Disclosure of Invention
The present application provides an image processing method and apparatus to solve the problem that, when motion blur is reduced by shortening the exposure time, the extent of that shortening is limited by the need to balance the noise level, leaving the terminal with insufficient motion blur reduction capability.
In one aspect, an image processing method is provided. The method includes: while the terminal previews image frames captured by a camera, when the photographed target object moves relative to the terminal, adjusting the camera's initial exposure parameters in the preview state (referred to as the first adjustment), which includes reducing the initial exposure time and increasing the initial exposure gain; after receiving a shooting instruction, the terminal adjusts the first-adjusted exposure parameters again (referred to as the second adjustment), which includes reducing the first-adjusted exposure time and increasing the first-adjusted exposure gain; the terminal generates a first exposure frame according to the first-adjusted exposure parameters and at least two second exposure frames according to the second-adjusted exposure parameters, fuses the first exposure frame with the at least two second exposure frames, and outputs the fused image. Reducing the exposure time twice in this way yields a higher effective shutter speed in the imaging stage. Reducing the exposure time already during preview effectively shortens the shooting delay; reducing it again after the shooting command is issued reduces motion blur effectively; and applying temporal multi-frame denoising to the multiple short frames in image post-processing reduces the noise introduced by the increased exposure gain. The method therefore has stronger resistance to motion blur and a better anti-blur effect.
When the terminal is in the preview state, the camera generates initial exposure parameters under automatic exposure; the initial exposure parameters include an initial exposure time and an initial exposure gain. A minimal numeric illustration of the two adjustments follows.
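The sketch below illustrates the two-stage adjustment in its simplest form; the halving scale and the starting automatic-exposure values are illustrative assumptions, not values taken from the patent.

```python
# Hypothetical illustration of the two-stage exposure adjustment. The halving
# scale and the example AE values are assumptions, not values from the patent.
def adjust(exposure_time_us, gain, scale=0.5):
    # Reduce the exposure time and raise the gain in the same proportion,
    # keeping overall image brightness unchanged.
    return exposure_time_us * scale, gain / scale

t0, g0 = 33000.0, 100.0    # assumed AE output in preview: 33 ms, gain 100
t1, g1 = adjust(t0, g0)    # first adjustment (preview stage): 16.5 ms, gain 200
t2, g2 = adjust(t1, g1)    # second adjustment (after the shooting instruction)
print(t2, g2)              # 8250.0 400.0: a quarter of the time, four times the gain
```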
In a possible design, the terminal performs the second adjustment on the first-adjusted exposure parameters only when a certain condition is met; when the condition is not met, it performs only the first adjustment and outputs frames directly according to the first-adjusted exposure parameters, without the second adjustment. The condition may be any one of the following, or some other condition: the rate of relative motion between the target object and the terminal is greater than a set rate threshold; the initial exposure gain of the camera in the preview state is greater than a set gain threshold; or the luminance value (LV) of the camera in the preview state is less than a set brightness threshold. Choosing whether to perform the second adjustment according to the exposure gain or the LV value in this way allows different motion-blur-reduction strategies to be selected flexibly, handling scenes of different ambient brightness in a targeted manner.
In one possible design, if the terminal determines that the rate of relative motion between the target object and the terminal is not greater than the set rate threshold, it performs only the first adjustment on the camera's initial exposure parameters in the preview state; the imaging process then generates exposure frames using the first-adjusted exposure parameters and outputs an image from them, i.e., without the second adjustment.
In one possible design, the terminal determines, from a preset relationship between motion rate and exposure-time reduction ratio, the reduction ratio corresponding to the relative motion rate and the corresponding increase in exposure gain, and performs the first adjustment on the camera's initial exposure parameters in the preview state accordingly. Tying the exposure-time reduction ratio to the motion rate in this way allows an appropriate ratio to be chosen for the current motion rate and prevents an excessively short exposure time from introducing a high noise level.
In one possible design, a maximum exposure gain threshold is set: when the exposure parameters are first adjusted in the preview state, the increased exposure gain must remain below this threshold. Limiting the upper bound of the exposure gain in this way controls the noise level of the preview image.
In a possible design, before fusing the first exposure frame with the at least two second exposure frames, the terminal performs temporal multi-frame denoising and fusion on the at least two second exposure frames to obtain a single short frame. Because the second exposure frames are generated with a shorter exposure time and a proportionally larger exposure gain, they carry more noise; the temporal multi-frame denoising suppresses this noise before the subsequent fusion.
In a possible design, the terminal uses the short frame as a reference frame, performs image registration and ghost detection between the first exposure frame and the short frame, and removes ghosting from the registered first exposure frame according to the ghost detection result to obtain a de-ghosted long frame; it then performs frequency-domain fusion of the long frame and the short frame, taking the long frame's ghost region as the fusion reference according to the ghost detection result. Specifically, within the ghost region the short frame's pixel fusion weight is greater than the long frame's, while in other regions the long frame's pixel fusion weight is greater than the short frame's. The ghost region is the motion region, i.e., the partial region where the target object moves relative to the terminal; the non-motion region can be regarded as stationary or approximately stationary relative to the terminal. For example, the motion region may be swaying flowers and grass, and the non-motion region a background such as the sky or the ground. This preserves the brightness information of the long exposure frame, makes both the motion and non-motion regions of the captured image clearer, and effectively solves the problem of insufficient motion-blur-reduction capability caused by limiting the exposure-time reduction to balance the noise level.
In a second aspect, an image processing apparatus is provided, having the function of implementing the method in the first aspect or any one of its possible designs. The function may be implemented by hardware, or by hardware executing corresponding software; the hardware or software includes one or more modules corresponding to the above function.
In one possible design, the device may be a chip or an integrated circuit.
In one possible design, the apparatus includes a camera for capturing image frames and a processor for executing a set of programs, and when the programs are executed, the apparatus may perform the method described in any one of the possible designs of the first aspect and the first aspect.
In one possible design, the apparatus further includes a memory to store code executed by the processor.
In a third aspect, a computer storage medium is provided, storing a computer program that includes instructions for performing the method in the first aspect or any one of its possible designs.
In a fourth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method as set forth in the first aspect and any one of the possible designs of the first aspect.
Drawings
Fig. 1 is a schematic diagram of a terminal hardware structure in an embodiment of the present application;
Fig. 2 is a flowchart illustrating an image processing method according to an embodiment of the present application;
Fig. 3a is a diagram illustrating the relationship between the exposure time reduction ratio and the exposure gain according to an embodiment of the present application;
Fig. 3b is a diagram illustrating the relationship between the motion rate level and the exposure time reduction according to an embodiment of the present application;
Fig. 4 is a schematic diagram of multi-frame short-frame noise reduction and long/short-frame fusion in an embodiment of the present application;
Fig. 5 is a schematic diagram illustrating the process of removing ghosting from a long exposure frame according to an embodiment of the present application;
Fig. 6 is a schematic flow chart of the frequency-domain fusion of long and short frames in an embodiment of the present application;
Fig. 7 is a schematic flowchart of an image processing method in an application scenario according to an embodiment of the present application;
Fig. 8 is a first schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
Fig. 9 is a second schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Detailed Description
The embodiments of the present application will be described in detail below with reference to the accompanying drawings.
The terminal according to the embodiments of the present application may be any electronic device having a photographing function, including but not limited to a personal computer, a server computer, a handheld or laptop device, a mobile phone, a tablet computer, a personal digital assistant, a media player, a consumer electronic device, a mini-computer, and a mainframe computer.
Fig. 1 is a schematic diagram of a hardware structure of a terminal to which the embodiments of the present application apply. As shown in fig. 1, the terminal 100 includes a display device 110, a processor 120, and a memory 130. The memory 130 may be used to store software programs and data, and the processor 120 executes the various functional applications and data processing of the terminal 100 by running the software programs and data stored in the memory 130. The memory 130 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, application programs required for at least one function, and the like, while the data storage area may store data created according to the use of the terminal 100, such as audio data, a phonebook, and exchangeable image file format (EXIF) data. Further, the memory 130 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. The processor 120 is the control center of the terminal 100: it connects the various parts of the terminal using various interfaces and lines, and performs the various functions of the terminal 100 and processes data by running or executing the software programs and/or data stored in the memory 130, thereby monitoring the terminal as a whole. The processor 120 may include one or more general-purpose processors, and may further include one or more digital signal processors (DSPs) for performing the operations of the technical solutions provided by the embodiments of the present application. Specifically, the processor 120 may be a central processing unit (CPU), a network processor (NP), or a combination of a CPU and an NP. The processor 120 may further include a hardware chip, which may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof; the PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), generic array logic (GAL), or any combination thereof. The memory 130 may include volatile memory, such as random-access memory (RAM); it may also include non-volatile memory, such as flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); it may also comprise a combination of these types of memory.
The terminal 100 may further include an input device 140 for receiving input numeric or character information or contact/contactless touch and gesture operations, and for generating signal inputs related to user settings and function control of the terminal 100. The input device 140 may include a touch panel 141. The touch panel 141, also called a touch screen, can collect touch operations performed on or near it by the user and drive the corresponding connected device according to a preset program. The touch panel 141 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 141, the input device 140 may include other input devices 142, which may include but are not limited to one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick. The display device 110 includes a display panel 111 for displaying information input by or provided to the user and the various menu interfaces of the terminal 100. Optionally, the display panel 111 may be configured in the form of a liquid crystal display (LCD) or an organic light-emitting diode (OLED) display.
In addition to the above, the terminal 100 may further include a power supply 150 for supplying power to other modules and a camera 160 for taking a picture or video. The terminal 100 can also include one or more sensors 170, such as an acceleration sensor, a light sensor, a GPS sensor, an infrared sensor, a laser sensor, a position sensor, or a lens pointing angle sensor, among others. The terminal 100 may further include a Radio Frequency (RF) circuit 180 for performing network communication with a wireless network device, and a WiFi module 190 for performing WiFi communication with other devices.
The image processing method provided by the embodiment of the present application may be executed by the terminal 100 based on the hardware structure of the terminal shown in fig. 1.
After the camera is opened, the terminal enters a preview state. While the terminal previews the image frames captured by the camera, the photographed target object may be in motion, for example a moving person, a moving object, or swaying flowers and grass. Conversely, the photographed target object may be stationary while the terminal is moving, for example when the user takes pictures while holding the terminal on a moving train, or while walking. In either case, the photographed target object and the terminal can be considered to be in relative motion, and an image shot in such an application scenario may suffer from motion blur. The method provided by the embodiments of the present application helps to solve the motion blur problem in these application scenarios.
The following describes a flow of an image processing method provided in an embodiment of the present application in further detail with reference to the accompanying drawings.
As shown in fig. 2, a flow of an image processing method provided in an embodiment of the present application is as follows.
Step 201: while previewing the image frames captured by the camera, when the photographed target object moves relative to the terminal, the terminal performs a first adjustment on the camera's initial exposure parameters in the preview state.
Wherein the first adjustment comprises decreasing an initial exposure time and increasing an initial exposure gain. In practical applications, when the terminal is in a preview state, the camera generates initial exposure parameters under Automatic Exposure (AE). The initial exposure parameters include an initial exposure time and an initial exposure gain.
Step 202: after receiving the shooting instruction, the terminal performs a second adjustment on the first-adjusted exposure parameters, generates a first exposure frame according to the first-adjusted exposure parameters, and generates at least two second exposure frames according to the second-adjusted exposure parameters; the second adjustment includes reducing the first-adjusted exposure time and increasing the first-adjusted exposure gain.
Since the second adjusted exposure time is shorter than the first adjusted exposure time, in the embodiment of the present application, the first exposure frame generated according to the first adjusted exposure parameter may also be referred to as a long exposure frame, and the second exposure frame generated according to the second adjusted exposure parameter may also be referred to as a short exposure frame.
Step 203: the terminal fuses the first exposure frame with the at least two second exposure frames and outputs the fused image.
In the embodiments of the present application, when a certain condition is satisfied the exposure parameters are adjusted twice. For distinction, the adjustment made to the camera's initial exposure parameters in the preview state is called the first adjustment, and the adjustment made to the first-adjusted exposure parameters after a user-issued shooting instruction is received is called the second adjustment. The exposure parameters involved in the embodiments of the present application include at least an exposure time and an exposure gain. The first adjustment reduces the exposure time and increases the exposure gain relative to the camera's automatic-exposure parameters in the preview state; the second adjustment reduces the exposure time again and increases the exposure gain again relative to the first-adjusted parameters. Optionally, in both adjustments the proportion by which the exposure time is reduced equals the proportion by which the exposure gain is increased, or the gain-increase proportion is derived from the time-reduction proportion. Shooting in the embodiments of the present application may be, but is not limited to, taking photographs or recording video with a camera.
It can be seen that, in the embodiments of the present application, a shorter exposure time is obtained by reducing the exposure time twice; since motion blur decreases in proportion to the exposure time, the shorter exposure time reduces motion blur more effectively. To counter the noise introduced by the shorter exposure time, the embodiments adopt multi-frame fusion: in the imaging stage, a long exposure frame is generated from the first-adjusted exposure parameters and short exposure frames from the second-adjusted parameters. Fusing the long and short exposure frames retains the advantages of each and effectively outputs an image with a clearer motion region.
The following describes the image processing method shown in fig. 2, and how its beneficial effects are achieved, in more detail.
In step 202, the series of actions performed by the terminal takes place between the moment the terminal receives the user-triggered shooting instruction and the moment the image is output. The terminal generates at least one long exposure frame and at least two short exposure frames in the imaging process. Optionally, the terminal may also generate four or five short exposure frames, but considering factors such as the amount of computation, three short exposure frames is one possible implementation.
Specifically, after starting the camera, the terminal performs motion detection on the previewed image frames to determine whether the photographed target object moves relative to the terminal. Optionally, any existing motion detection method may be applied. Typically, such a method down-samples two consecutive image frames and divides each down-sampled frame into a grid, so that one frame contains multiple image grid blocks. Whether the content of corresponding grid blocks has moved is judged by pattern-matching the blocks of the two frames and computing their correlation: a high correlation indicates no motion, while a low correlation indicates that the content of the two blocks has changed relative to each other, i.e., there is motion. Denoting the two frames as the current frame and the adjacent frame, the most similar grid block in the adjacent frame is found for each block of the current frame, the motion rate is measured by the distance between the matched blocks, and the overall motion rate of the target in the image is obtained by averaging the rates of all grid blocks. A sketch of this procedure is given below.
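A minimal block-matching sketch of this detection is shown below, assuming OpenCV and BGR input frames; the block size, search radius, and correlation cutoff are illustrative assumptions, and the patent does not prescribe a particular detector.

```python
import cv2
import numpy as np

def estimate_motion_rate(prev_bgr, curr_bgr, block=32, search=8):
    # Down-sample and convert both frames to grayscale, as described above.
    prev = cv2.pyrDown(cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY))
    curr = cv2.pyrDown(cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY))
    h, w = curr.shape
    displacements = []
    # Mesh the frame into blocks and match each block in a small search window.
    for y in range(search, h - block - search, block):
        for x in range(search, w - block - search, block):
            tmpl = curr[y:y + block, x:x + block]
            region = prev[y - search:y + block + search,
                          x - search:x + block + search]
            res = cv2.matchTemplate(region, tmpl, cv2.TM_CCOEFF_NORMED)
            _, score, _, loc = cv2.minMaxLoc(res)
            if score > 0.5:  # keep only reliable matches; cutoff is an assumption
                dx, dy = loc[0] - search, loc[1] - search
                displacements.append(float(np.hypot(dx, dy)))
    # Average the per-block displacements to get an overall motion rate (pixels).
    return float(np.mean(displacements)) if displacements else 0.0
```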
If the photographed target object and the terminal are detected to be stationary or approximately stationary relative to each other, the current scene is likely a still image, and the photograph is taken with the normal AE-converged exposure parameters. If the photographed target object moves relative to the terminal, the exposure parameters need to be adjusted. In one possible implementation, if the terminal determines that the rate of relative motion is greater than a set rate threshold, it performs the second adjustment on the first-adjusted exposure parameters; if the rate is not greater than the threshold, it performs only the first adjustment on the camera's initial exposure parameters in the preview state, generates exposure frames in the imaging process using the first-adjusted parameters, and outputs an image from them, i.e., without the second adjustment. The rate threshold is set empirically.
Alternatively, the condition for whether the second adjustment is performed may be equivalently replaced by either of the following two conditions, or by other conditions.
Condition 1: when the initial exposure gain of the camera in the preview state is greater than a set gain threshold, the second adjustment is performed on the first-adjusted exposure parameters; when the exposure gain during preview is determined to be not greater than the gain threshold, the imaging process generates exposure frames using the first-adjusted parameters and outputs an image from them, without the second adjustment. The gain threshold is set empirically, for example to 800.
Condition 2: when the luminance value (LV) of the camera in the preview state is less than a set brightness threshold, the second adjustment is performed on the first-adjusted exposure parameters; when the LV during preview is determined to be not less than the brightness threshold, the imaging process generates exposure frames using the first-adjusted parameters and outputs an image from them, without the second adjustment. LV can be regarded as the ambient brightness while the terminal previews the image frames captured by the camera. The brightness threshold is set empirically, for example to 40 or 20.
Choosing whether to perform the second adjustment according to the exposure gain or the LV value allows different motion-blur-reduction strategies to be selected flexibly, handling scenes of different ambient brightness in a targeted way. For example, in a bright scene, typically only the first adjustment of the camera's initial exposure parameters in the preview state is made and frames are output according to the first-adjusted parameters; in a dark scene, the first adjustment is followed by the second adjustment, combined with long/short-frame fusion in image post-processing. Of course, the first and second adjustments combined with long/short-frame fusion may also be chosen in most brightness environments. A sketch of the alternative triggers follows.
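The three alternative triggers can be sketched as follows; a terminal would typically use one of them. The gain and LV thresholds are the example values quoted above, while the rate threshold is an illustrative assumption.

```python
RATE_THRESHOLD = 2.0   # illustrative assumption; the patent only calls it a set threshold
GAIN_THRESHOLD = 800   # example gain threshold quoted in the description
LV_THRESHOLD = 40      # example brightness (LV) threshold quoted in the description

def second_adjustment_by_rate(motion_rate):
    # Trigger: relative motion rate exceeds the set rate threshold.
    return motion_rate > RATE_THRESHOLD

def second_adjustment_by_gain(initial_gain):
    # Trigger: initial exposure gain in the preview state exceeds the gain threshold.
    return initial_gain > GAIN_THRESHOLD

def second_adjustment_by_lv(lv):
    # Trigger: luminance value (LV) in the preview state falls below the threshold.
    return lv < LV_THRESHOLD
```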
Optionally, when the terminal adjusts the camera's initial exposure parameters in the preview state, the exposure-time reduction ratio may be tied to the rate of relative motion between the target object and the terminal obtained from motion detection. Specifically, the terminal determines the exposure-time reduction ratio corresponding to the relative motion rate from a preset relationship between motion rate and reduction ratio, and determines the exposure gain increase from that ratio; generally, the gain is increased in the same proportion as the time is reduced. The camera's initial exposure parameters in the preview state are then adjusted according to the determined ratio and gain value. The relationship between motion rate and reduction ratio may be linear, i.e., as the motion rate increases, the exposure-time reduction ratio and the exposure gain increase. However, since increasing the exposure gain introduces noise, a maximum exposure gain threshold is set in the embodiment of the present application, as shown in fig. 3a: when the exposure parameters of the camera in the preview state are adjusted, the increased exposure gain must remain below this threshold. If increasing the gain in the same proportion as the time reduction would exceed the maximum gain threshold, the exposure-time reduction ratio must be backed off, at least far enough that the increased gain stays below the threshold. The maximum exposure gain threshold may be, for example, 700 or 1000. Limiting the upper bound of the exposure gain in this way controls the noise level of the preview image.
If the condition for whether to perform the second adjustment is whether the rate of relative motion is greater than the set rate threshold, then the relative motion rate and the exposure-time reduction ratio satisfy a fixed relationship: approximately, the reduction ratio first grows linearly with the motion rate and then remains constant. For example, fig. 3b illustrates the first adjustment of the exposure parameters by rate level in the preview state. Suppose the rate of relative motion is divided into several levels, with different levels corresponding to different exposure-time reduction ratios. When the rate level is 0, a still image is being shot, and imaging uses the normal AE-converged exposure parameters. When the rate level is 1, the exposure time is reduced by ratio 1, the exposure gain is increased in the same proportion, and imaging uses the adjusted parameters. When the rate level is 2, the exposure time is reduced by ratio 2, the gain is increased in the same proportion, and imaging uses the adjusted parameters. When the rate level is 3 or higher, the motion rate is greater than the set rate threshold; in the preview stage the exposure time is still reduced by ratio 2 and the gain increased proportionally, and imaging uses the adjusted parameters. A sketch of this mapping, combined with the gain cap, follows.
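The rate-level mapping of fig. 3b combined with the gain cap of fig. 3a might look like the following sketch; the table entries and the cap value are illustrative assumptions, not values fixed by the patent.

```python
# Sketch of the preview-stage (first) adjustment. RATE_TO_SCALE and MAX_GAIN
# are illustrative assumptions.
RATE_TO_SCALE = {0: 1.0, 1: 0.5, 2: 0.25}   # rate level -> exposure-time scale
MAX_GAIN = 1000                             # example maximum exposure gain threshold

def first_adjust(exposure_time, gain, rate_level):
    # Levels of 3 and above reuse the level-2 scale, matching the
    # "linear first, then constant" shape described in the text.
    scale = RATE_TO_SCALE[min(rate_level, 2)]
    if gain / scale > MAX_GAIN:
        # Back off the time reduction so the raised gain stays under the cap;
        # if the gain already exceeds the cap, make no reduction at all.
        scale = min(1.0, gain / MAX_GAIN)
    return exposure_time * scale, gain / scale
```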
The following focuses on how the at least two short exposure frames and the at least one long exposure frame are fused in the image post-processing stage. For ease of explanation, take the generation of three short exposure frames and one long exposure frame in the imaging stage as an example: the three short exposure frames and the long exposure frame are fused, and the fused image is output.
As shown in fig. 4, the fusion process is roughly divided as follows: after the image signal processor (ISP) of the terminal outputs three short exposure frames and one long exposure frame, step 401 and step 402 are performed, and the fusion result, i.e., the fused image, is finally output.
Step 401: perform temporal multi-frame denoising and fusion on the three short exposure frames; the single frame obtained from this processing is called the short frame.
The short frame here refers to the result of the noise reduction and fusion performed on the three short exposure frames; in the special case described below, it may simply be one of the three short exposure frames (Frame0).
Step 402: fuse the short frame obtained in step 401 with the long exposure frame.
Specifically, with the ISO sensitivity and the exposure time fixed, the same scene is photographed at least twice. The noise fluctuates randomly from frame to frame while the signal is fixed, so if the noise variance of each frame is σ², then after N frames are averaged the noise variance of the resulting frame is reduced to σ²/N. Each halving of the noise standard deviation improves the signal-to-noise ratio by 6 dB, so in theory 4-frame temporal noise reduction yields a 6 dB improvement. The embodiment of the present application uses three-frame short-exposure fusion for noise reduction; the three short exposure frames are denoted Frame0, Frame1, and Frame2.
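Assuming the per-frame noise terms are independent with variance σ², the averaging argument can be written out as follows (a standard derivation restating the claim above):

```latex
\[
\operatorname{Var}\!\Big(\tfrac{1}{N}\textstyle\sum_{i=1}^{N} n_i\Big)
  \;=\; \tfrac{1}{N^2}\textstyle\sum_{i=1}^{N}\operatorname{Var}(n_i)
  \;=\; \tfrac{\sigma^2}{N},
\qquad
\Delta\mathrm{SNR} \;=\; 20\log_{10}\sqrt{N}\ \mathrm{dB},
\]
% so the noise standard deviation falls by a factor of \sqrt{N}; for N = 4 it
% halves, and the signal-to-noise ratio improves by 20\log_{10}2 \approx 6 dB.
```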
If handheld shake occurs or an object in the scene moves while the multiple frames are being captured, the temporal averaging may be misaligned and introduce ghosting or blur, so image registration and ghost detection must be added before the temporal averaging of the multiple frames.
Thus, step 401 may include the following specific implementation steps: image registration, ghost detection, and temporal multi-frame noise-reduction fusion. Specifically:
Image registration: feature extraction is performed on the input reference Frame0 and the frame to be registered, Frame1, yielding a series of feature points for each frame, and a feature description is computed for each feature point. The feature points of the two frames are matched according to their descriptions to obtain a series of matched feature point pairs. Solving over the matched pairs yields the transformation between the two frames, a projective transformation represented by a 3x3 matrix H. Transforming Frame1 by the matrix H gives an image aligned with Frame0, which can be called the registration result of Frame1. The image aligning Frame2 with Frame0 is obtained in the same way and can be called the registration result of Frame2. A sketch follows.
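A registration sketch assuming OpenCV's ORB features is given below; the patent does not name a specific feature detector, matcher, or solver, so those choices are assumptions.

```python
import cv2
import numpy as np

def register(ref, frame):
    # Extract feature points and descriptors for both frames.
    orb = cv2.ORB_create(nfeatures=1000)
    kp_ref, des_ref = orb.detectAndCompute(ref, None)
    kp_frm, des_frm = orb.detectAndCompute(frame, None)
    # Match feature points by their descriptions to get feature point pairs.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_frm, des_ref), key=lambda m: m.distance)[:200]
    src = np.float32([kp_frm[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # Solve the matched pairs for the 3x3 projective transformation H.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = ref.shape[:2]
    # Warp the frame so it is aligned with the reference (registration result).
    return cv2.warpPerspective(frame, H, (w, h))
```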
Ghost detection: the registration results of Frame1 and Frame2 are each subtracted from the reference Frame0 to obtain a per-pixel difference map (diff map). Gaussian smoothing is applied to the diff map to remove the influence of noise. Each value of the diff map is compared with a corresponding ghost threshold to decide whether the pixel is a ghost point. Erosion and dilation are used to eliminate isolated points (noise), yielding the final ghost mask. A sketch follows.
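A sketch of the diff-map chain, with an assumed threshold and kernel sizes:

```python
import cv2
import numpy as np

def ghost_mask(ref, registered, thresh=25):
    # Per-pixel difference between the reference and a registered frame.
    diff = cv2.absdiff(ref, registered)
    if diff.ndim == 3:
        diff = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    # Gaussian smoothing removes the influence of noise on the diff map.
    diff = cv2.GaussianBlur(diff, (5, 5), 0)
    # Compare against the ghost threshold to mark ghost points.
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    # Erosion then dilation eliminates isolated noise points.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.erode(mask, kernel)
    mask = cv2.dilate(mask, kernel)
    return mask   # the final ghost mask (255 = ghost pixel)
```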
Temporal multi-frame noise-reduction fusion: it is determined whether the area of the detected ghost region is less than 1/3 of the full image. If not, the data of Frame0 is output directly, i.e., the fusion is skipped and Frame0 serves as the short frame in the subsequent long/short-frame fusion. If so, each pixel is weighted according to the ghost mask to remove ghosting and the fusion result is output; within the ghost mask region, the pixel fusion weight of the reference frame is greater than that of the corresponding pixels of the other registered frames, i.e., in the ghost region the reference frame's pixels are taken as the fusion result. A sketch follows.
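The mask-weighted temporal fusion, including the 1/3-area fallback, can be sketched as follows; 3-channel frames are assumed, and the simple binary weighting is one interpretation of the weighting described above.

```python
import numpy as np

def temporal_fuse(ref, registered, masks):
    # `registered` are the registration results of Frame1/Frame2 against the
    # reference `ref` (Frame0); `masks[i]` is the ghost mask (255 = ghost).
    ghost_any = np.zeros(ref.shape[:2], dtype=bool)
    for m in masks:
        ghost_any |= m > 0
    if ghost_any.mean() >= 1.0 / 3.0:
        # Ghost area too large: skip fusion and use Frame0 as the short frame.
        return ref.copy()
    acc = ref.astype(np.float32)
    weight = np.ones(ref.shape[:2], dtype=np.float32)
    for frame, m in zip(registered, masks):
        w = (m == 0).astype(np.float32)   # ghost pixels keep the reference only
        acc += frame.astype(np.float32) * w[..., None]
        weight += w
    return (acc / weight[..., None]).astype(ref.dtype)
```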
Step 402 may include the following specific implementation steps: image registration, ghost detection, and de-ghosting between the short frame obtained in step 401 and the long exposure frame, followed by frequency-domain fusion of the de-ghosted long frame with the short frame.
Similarly, to avoid fusion errors caused by ghosting in motion regions between the long exposure frame and the short frame, the same image registration and ghost detection methods used in the multi-frame short-frame noise reduction are applied before the long/short-frame fusion to remove the influence of ghosting. Using the ghost mask as a weight, the registration result of the long exposure frame is fused with the short frame obtained from the multi-frame noise-reduction fusion, yielding a long exposure frame with the ghosting removed. For convenience of description, the resulting de-ghosted long exposure frame is referred to below as the long frame. According to the ghost detection result, frequency-domain fusion of the long frame and the short frame is then performed with the long frame's ghost region as the fusion reference: within the ghost region, the short frame's pixel fusion weight is greater than that of the corresponding long-frame pixels, and in non-ghost regions the long frame's pixel fusion weight is greater than that of the corresponding short-frame pixels.
Fig. 5 is a schematic flow chart of removing the ghosting in the long exposure frame. The inputs are the short frame obtained in step 401 and the registered long exposure frame. Using the detected ghost mask as a weight, the two input images are fused by weighting; the morphological dilation and smoothing operations remove holes in the ghost mask and smooth its edges, improving the image fusion result. The output is the de-ghosted long exposure frame, referred to in this embodiment as the long frame. A sketch follows.
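A sketch of the weighted blend of fig. 5, with assumed dilation and smoothing kernel sizes and 3-channel frames:

```python
import cv2
import numpy as np

def deghost_long(short_frame, long_registered, ghost_mask):
    # Fill holes in the mask and smooth its edges (morphological dilation
    # followed by Gaussian smoothing), then blend: short-frame pixels win in
    # ghost (motion) regions, long-frame pixels elsewhere.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (9, 9))
    m = cv2.dilate(ghost_mask, kernel)
    m = cv2.GaussianBlur(m, (21, 21), 0).astype(np.float32) / 255.0
    w = m[..., None]                                  # 3-channel frames assumed
    out = (short_frame.astype(np.float32) * w
           + long_registered.astype(np.float32) * (1.0 - w))
    return out.astype(long_registered.dtype)          # the de-ghosted "long frame"
```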
It should be noted that when the degree of relative motion between the photographed target object and the terminal is large, or a macro scene is being shot, the ratio of the ghost region's area to the whole image may exceed a set threshold (for example, 10%). This indicates that the area exceeds the maximum ghost extent the algorithm can handle; in that case, to avoid fusion misalignment, the long/short-frame frequency-domain fusion is skipped and the long exposure frame is output directly.
In image processing, the frequency domain reflects the intensity of the gray-scale variation of the image in the spatial domain, that is, how fast the gray levels change, or equivalently the image gradient. The edges of an image are abrupt, rapidly changing parts and therefore appear as high-frequency components in the frequency domain; image noise is mostly high-frequency; and the smoothly varying parts of the image are low-frequency components. In other words, the Fourier transform offers a free conversion from the spatial domain to the frequency domain for observing an image, converting it from a gray-level distribution to a frequency distribution. The direct relationship between image and frequency is: low frequencies mostly correspond to the smooth contours of the image, middle frequencies to details such as edges and texture, and high frequencies mostly to noise, while the bright central spot of the Fourier spectrogram represents the mean gray level of the image. Using this relationship, this embodiment performs frequency-domain fusion of the long frame and the short frame with the long frame's ghost region as the fusion reference according to the ghost detection result, thereby simultaneously retaining the long frame's brightness and non-motion-region detail and the short frame's motion-region sharpness.
As shown in fig. 6, a schematic flow chart of the frequency-domain fusion of the long frame and the short frame is given. The inputs are the short frame from the multi-frame noise-reduction fusion and the de-ghosted long frame.
Step 601: down-sample the multi-frame-denoised short frame and the de-ghosted long frame separately to reduce the amount of computation.
Step 602: up-sample the down-sampled long frame and compare it with the original to compute an error map of the detail lost in down-sampling; this map is used after image fusion to restore the lost detail.
Step 603: apply a fast Fourier transform to each of the two down-sampled inputs to obtain their Fourier spectra and compute the corresponding magnitudes.
Step 604: fuse the Fourier spectra of the two inputs using the magnitudes as weights.
During fusion, the bright spot of the long frame's Fourier spectrogram must be protected: the values in a 10x10 region centered on the bright spot are assigned to the fused Fourier spectrum, preserving the long frame's average brightness information.
Step 605: apply an inverse Fourier transform to the fused spectrum to obtain the fused image.
Step 606: add the fused image to the error map computed from the long-frame down-sampling, restoring the loss caused by down-sampling.
The final fusion result is then output. A sketch of steps 601 to 606 follows.
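Steps 601 to 606 can be sketched as follows for single-channel 8-bit frames with even dimensions; the pyramid down-sampling factor and the magnitude-ratio weighting are interpretation choices, and only the 10x10 centre protection is taken directly from the text.

```python
import cv2
import numpy as np

def freq_fuse(short_frame, long_frame):
    # Step 601: down-sample both inputs to reduce computation.
    S = cv2.pyrDown(short_frame).astype(np.float32)
    L = cv2.pyrDown(long_frame).astype(np.float32)
    # Step 602: error map of the detail the long frame loses in down-sampling.
    size = (long_frame.shape[1], long_frame.shape[0])
    err = long_frame.astype(np.float32) - cv2.pyrUp(L, dstsize=size)
    # Step 603: FFT of each input and the corresponding magnitudes.
    Fs = np.fft.fftshift(np.fft.fft2(S))
    Fl = np.fft.fftshift(np.fft.fft2(L))
    As, Al = np.abs(Fs), np.abs(Fl)
    # Step 604: blend the spectra with the magnitudes as weights.
    w = As / (As + Al + 1e-8)
    F = w * Fs + (1.0 - w) * Fl
    # Protect the long frame's bright centre spot (10x10 region), preserving
    # its average brightness as described above.
    cy, cx = F.shape[0] // 2, F.shape[1] // 2
    F[cy - 5:cy + 5, cx - 5:cx + 5] = Fl[cy - 5:cy + 5, cx - 5:cx + 5]
    # Step 605: inverse FFT of the fused spectrum.
    fused = np.real(np.fft.ifft2(np.fft.ifftshift(F))).astype(np.float32)
    # Step 606: up-sample and add back the error map to restore lost detail.
    up = cv2.pyrUp(fused, dstsize=size)
    return np.clip(up + err, 0, 255).astype(np.uint8)
```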
In summary, in the embodiment of the present application, the exposure time is reduced for the first time in the preview state and for the second time in the imaging process, and multi-frame short-frame noise-reduction fusion and long/short-frame frequency-domain fusion are performed in the image post-processing stage, improving the terminal's ability to reduce motion blur when shooting.
As shown in fig. 7, the image processing method is described in further detail below with reference to a specific application scenario; assume a photograph is being taken with the camera.
Step 701: the terminal receives an instruction to open the camera, starts the camera, and enters the preview state.
Step 702: down-sample the preview image data.
Step 703: perform motion detection on the down-sampled image data.
Motion detection is performed by analyzing the preview images of consecutive frames, and the detection result is output immediately; for example, the motion state and a rate level can be output, where a higher motion rate yields a higher rate level. Rate level 0 indicates stationary (no motion), and rate levels 1, 2, and 3 indicate increasing motion rates.
If no motion is detected, steps 704 to 705 are executed; if motion is detected, step 706 is executed.
Step 704: when the camera application issues the photographing command, take the photograph with the normal AE-converged exposure parameters, i.e., the Sensor outputs frames according to the normal exposure parameters.
Step 705: output and save the image.
Step 706: determine whether the motion rate level is greater than the set rate threshold, whether the exposure gain is greater than the set gain threshold, or whether the LV value is less than the set brightness threshold; if so, execute steps 710 to 714, otherwise execute steps 707 to 709.
Step 707: adjust the exposure parameters during preview.
The exposure time is reduced by a preset proportion based on the motion rate level, and the exposure gain is increased by the same proportion to keep the overall image brightness unchanged.
Step 708: when the camera application issues the photographing command, the Sensor outputs frames according to the adjusted exposure parameters.
Step 709: output and save the image.
Step 710: adjust the exposure parameters during preview.
The exposure time is reduced by a preset proportion based on the motion rate level, and the exposure gain is increased by the same proportion to keep the overall image brightness unchanged.
Step 711: when the camera application issues the photographing command, first reduce the exposure time and increase the exposure gain once more by a set proportion, starting from the adjusted preview exposure parameters.
Step 712: the Sensor generates three short exposure frames according to the further reduced exposure time and increased exposure gain, and one long exposure frame according to the first-adjusted preview exposure parameters; image post-processing is performed after ISP processing.
Step 713: in image post-processing, perform short-frame multi-frame noise reduction and long/short-frame fusion, retaining the detail of the short frames' motion region and of the long frame's non-motion region.
Step 714: output and save the final fused image.
In summary, in the image processing method provided by this embodiment of the present application, the exposure time is reduced twice in the imaging stage to obtain a higher effective shutter speed: one long exposure frame is generated with the parameters from the first exposure-time reduction, and at least two short exposure frames are generated with the parameters from the second reduction. The long and short exposure frames are fused in the image post-processing stage: within the ghost region, the short-frame pixel fusion weight is greater than the long-frame weight, and in other regions the long-frame weight is greater than the short-frame weight. The ghost region is the motion region, i.e., the partial region where the target object moves relative to the terminal; the non-motion region can be regarded as stationary or approximately stationary relative to the terminal, for example swaying flowers and grass as the motion region and a background such as the sky or the ground as the non-motion region. The method retains the brightness information of the long exposure frame, so that both the motion and non-motion regions of the captured image are clearer, and it effectively solves the problem of insufficient motion-blur reduction caused by limiting the exposure-time reduction to balance the noise level. Reducing the exposure time during preview effectively shortens the shooting delay; reducing it again after the shooting command is issued reduces motion blur more effectively; and the temporal multi-frame denoising of the short frames in image post-processing reduces the noise introduced by the increased exposure gain. The method therefore has stronger resistance to motion blur and a better anti-blur effect.
Based on the same inventive concept as the image processing method shown in fig. 2, as shown in fig. 8, an embodiment of the present application further provides an image processing apparatus 800, the image processing apparatus 800 being configured to execute the image processing method shown in fig. 2, the image processing apparatus 800 including:
an adjusting unit 801, configured to perform a first adjustment on an initial exposure parameter of the camera in a preview state when there is a relative motion between a photographed target object and the device in a process of previewing an image frame captured by the camera, where the first adjustment includes decreasing an initial exposure time and increasing an initial exposure gain.
The adjusting unit 801 is further configured to perform a second adjustment on the first adjusted exposure parameter after receiving the shooting instruction, where the second adjustment includes decreasing the first adjusted exposure time and increasing the first adjusted exposure gain.
A generating unit 802, configured to generate a first exposure frame according to the first adjusted exposure parameter, and generate at least two second exposure frames according to the second adjusted exposure parameter.
A fusion unit 803, configured to fuse the first exposure frame generated by the generation unit 802 and the at least two second exposure frames, and output a fused image.
Optionally, when performing the second adjustment on the first-adjusted exposure parameters, the adjusting unit 801 is configured to: perform the second adjustment when the rate of relative motion between the target object and the apparatus is greater than a set rate threshold; or when the initial exposure gain of the camera in the preview state is greater than a set gain threshold; or when the luminance value LV of the camera in the preview state is less than a set brightness threshold.
Optionally, when performing the first adjustment on the initial exposure parameter of the camera in the preview state, the adjusting unit 801 is configured to: determining a proportional value of exposure time reduction corresponding to the relative movement rate and a value of exposure gain increase according to the relation between a preset movement rate and the proportional value of exposure time reduction; and carrying out first adjustment on the initial exposure parameters of the camera in the preview state according to the determined proportional value for reducing the exposure time and the value for increasing the exposure gain.
Optionally, before the first exposure frame and the at least two second exposure frames are fused, the fusion unit 803 is further configured to perform time-domain multi-frame denoising fusion processing on the at least two second exposure frames to obtain a short frame.
Optionally, when the first exposure frame and the at least two second exposure frames are fused, the fusion unit 803 is configured to perform image registration and ghost detection on the first exposure frame and the short frame by using the short frame as a reference frame, and perform, according to a result of the ghost detection, a ghost-removing process on the first exposure frame after the image registration to obtain a long frame after the ghost is removed; and according to the result of the ghost detection, performing frequency domain fusion on the long frame and the short frame by taking the ghost area of the long frame as a fusion reference.
Based on the image processing method shown in fig. 2, as shown in fig. 9, an embodiment of the present application further provides another image processing apparatus 900, which includes a camera 901 and a processor 902. The camera 901 is used to capture image frames, and the processor 902 is used to execute a set of code which, when executed, enables the image processing apparatus to perform the image processing method shown in fig. 2; the parts identical to the method are not described again here. The image processing apparatus 900 may be the terminal 100 shown in fig. 1, which may likewise be used to execute the image processing method shown in fig. 2, with the camera 160 performing the functions of the camera 901 and the processor 120 performing the functions of the processor 902: the camera 160 captures the image frames, and the processor 120 executes the details of the image processing method shown in fig. 2. The functional units of fig. 8 (the adjusting unit 801, the generating unit 802, and the fusion unit 803) may be implemented by the processor 902 in the image processing apparatus 900, i.e., also by the processor 120 in the terminal 100 shown in fig. 1.
An embodiment of the present application provides a computer storage medium storing a computer program including instructions for executing the image processing method shown in fig. 2.
The present application provides a computer program product containing instructions which, when run on a computer, cause the computer to execute the image processing method shown in fig. 2.
Any of the image processing apparatuses provided in the embodiments of the present application may also be implemented as a system chip.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the embodiments of the present application without departing from the spirit and scope of the embodiments of the present application. Thus, if such modifications and variations of the embodiments of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to encompass such modifications and variations.

Claims (16)

1. An image processing method, comprising:
performing, by a terminal, a first adjustment on initial exposure parameters of a camera in a preview state when a photographed target object moves relative to the terminal while the terminal previews image frames acquired through the camera, wherein the first adjustment comprises reducing an initial exposure time and increasing an initial exposure gain;
performing, by the terminal after receiving a shooting instruction, a second adjustment on the first adjusted exposure parameter, generating a first exposure frame according to the first adjusted exposure parameter, and generating at least two second exposure frames according to the second adjusted exposure parameter, wherein the second adjustment comprises reducing the first adjusted exposure time and increasing the first adjusted exposure gain; and
fusing, by the terminal, the first exposure frame and the at least two second exposure frames, and outputting a fused image.
2. The method of claim 1, wherein the terminal performing the second adjustment on the first adjusted exposure parameter comprises:
performing, by the terminal, the second adjustment on the first adjusted exposure parameter when the speed of the relative motion between the target object and the terminal is greater than a set speed threshold; or
performing, by the terminal, the second adjustment on the first adjusted exposure parameter when the initial exposure gain of the camera in the preview state is greater than a set gain threshold; or
performing, by the terminal, the second adjustment on the first adjusted exposure parameter when the brightness value LV of the camera in the preview state is less than a set brightness threshold.
3. The method of claim 1 or 2, wherein performing the first adjustment on the initial exposure parameters of the camera in the preview state comprises:
determining, by the terminal according to a preset relationship between movement rate and exposure-time reduction ratio, the ratio by which the exposure time is to be reduced and the value by which the exposure gain is to be increased for the current relative movement rate; and
performing the first adjustment on the initial exposure parameters of the camera in the preview state according to the determined exposure-time reduction ratio and exposure-gain increase value.
4. The method of claim 1 or 2, wherein, before the terminal fuses the first exposure frame and the at least two second exposure frames, the method further comprises:
performing, by the terminal, temporal multi-frame denoising and fusion on the at least two second exposure frames to obtain a short frame.
5. The method of claim 4, wherein the terminal fusing the first exposure frame and the at least two second exposure frames comprises:
performing, by the terminal using the short frame as a reference frame, image registration and ghost detection on the first exposure frame and the short frame, and de-ghosting the registered first exposure frame according to the ghost detection result to obtain a de-ghosted long frame; and
performing, according to the ghost detection result, frequency-domain fusion of the long frame and the short frame, using the ghost region of the long frame as the fusion reference.
6. An image processing apparatus characterized by comprising:
an adjusting unit, configured to perform a first adjustment on initial exposure parameters of a camera in a preview state when a photographed target object moves relative to the apparatus during preview of image frames acquired through the camera, wherein the first adjustment comprises reducing an initial exposure time and increasing an initial exposure gain;
the adjusting unit being further configured to perform a second adjustment on the first adjusted exposure parameter after a shooting instruction is received, wherein the second adjustment comprises reducing the first adjusted exposure time and increasing the first adjusted exposure gain;
a generating unit, configured to generate a first exposure frame according to the first adjusted exposure parameter and to generate at least two second exposure frames according to the second adjusted exposure parameter; and
a fusion unit, configured to fuse the first exposure frame generated by the generating unit with the at least two second exposure frames and to output a fused image.
7. The apparatus of claim 6, wherein, in performing the second adjustment on the first adjusted exposure parameter, the adjusting unit is configured to:
perform the second adjustment on the first adjusted exposure parameter when the speed of the relative movement between the target object and the apparatus is greater than a set speed threshold; or
perform the second adjustment on the first adjusted exposure parameter when the initial exposure gain of the camera in the preview state is greater than a set gain threshold; or
perform the second adjustment on the first adjusted exposure parameter when the brightness value LV of the camera in the preview state is less than a set brightness threshold.
8. The apparatus of claim 6 or 7, wherein, in performing the first adjustment on the initial exposure parameters of the camera in the preview state, the adjusting unit is configured to:
determine, according to a preset relationship between movement rate and exposure-time reduction ratio, the ratio by which the exposure time is to be reduced and the value by which the exposure gain is to be increased for the current relative movement rate; and
perform the first adjustment on the initial exposure parameters of the camera in the preview state according to the determined exposure-time reduction ratio and exposure-gain increase value.
9. The apparatus of claim 6 or 7, wherein, before fusing the first exposure frame and the at least two second exposure frames, the fusion unit is further configured to:
perform temporal multi-frame denoising and fusion on the at least two second exposure frames to obtain a short frame.
10. The apparatus of claim 9, wherein, in fusing the first exposure frame and the at least two second exposure frames, the fusion unit is configured to:
using the short frame as a reference frame, perform image registration and ghost detection on the first exposure frame and the short frame, and de-ghost the registered first exposure frame according to the ghost detection result to obtain a de-ghosted long frame; and
perform, according to the ghost detection result, frequency-domain fusion of the long frame and the short frame, using the ghost region of the long frame as the fusion reference.
11. An image processing apparatus comprising a camera and a processor, wherein:
the camera is configured to collect image frames; and
the processor is configured to: perform, during preview of image frames acquired through the camera, a first adjustment on initial exposure parameters of the camera in the preview state when a photographed target object moves relative to the apparatus, wherein the first adjustment comprises reducing an initial exposure time and increasing an initial exposure gain; perform a second adjustment on the first adjusted exposure parameter after a shooting instruction is received, wherein the second adjustment comprises reducing the first adjusted exposure time and increasing the first adjusted exposure gain; generate a first exposure frame according to the first adjusted exposure parameter and at least two second exposure frames according to the second adjusted exposure parameter; and fuse the generated first exposure frame with the at least two second exposure frames and output a fused image.
12. The apparatus of claim 11, wherein, in performing the second adjustment on the first adjusted exposure parameter, the processor is configured to:
perform the second adjustment on the first adjusted exposure parameter when the speed of the relative movement between the target object and the apparatus is greater than a set speed threshold; or
perform the second adjustment on the first adjusted exposure parameter when the initial exposure gain of the camera in the preview state is greater than a set gain threshold; or
perform the second adjustment on the first adjusted exposure parameter when the brightness value LV of the camera in the preview state is less than a set brightness threshold.
13. The apparatus of claim 11 or 12, wherein, in performing the first adjustment on the initial exposure parameters of the camera in the preview state, the processor is configured to:
determine, according to a preset relationship between movement rate and exposure-time reduction ratio, the ratio by which the exposure time is to be reduced and the value by which the exposure gain is to be increased for the current relative movement rate; and
perform the first adjustment on the initial exposure parameters of the camera in the preview state according to the determined exposure-time reduction ratio and exposure-gain increase value.
14. The apparatus of claim 11 or 12, wherein, before fusing the first exposure frame and the at least two second exposure frames, the processor is further configured to:
perform temporal multi-frame denoising and fusion on the at least two second exposure frames to obtain a short frame.
15. The apparatus of claim 14, wherein, in fusing the first exposure frame and the at least two second exposure frames, the processor is configured to:
using the short frame as a reference frame, perform image registration and ghost detection on the first exposure frame and the short frame, and de-ghost the registered first exposure frame according to the ghost detection result to obtain a de-ghosted long frame; and
perform, according to the ghost detection result, frequency-domain fusion of the long frame and the short frame, using the ghost region of the long frame as the fusion reference.
16. A computer storage medium, characterized in that the computer storage medium stores a computer program comprising instructions for performing the method according to any one of claims 1 to 5.
CN201780081683.XA 2017-10-13 2017-10-13 Image processing method and device Active CN110121882B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/106176 WO2019071613A1 (en) 2017-10-13 2017-10-13 Image processing method and device

Publications (2)

Publication Number Publication Date
CN110121882A (en) 2019-08-13
CN110121882B (en) 2020-09-08

Family

ID=66101215

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780081683.XA Active CN110121882B (en) 2017-10-13 2017-10-13 Image processing method and device

Country Status (2)

Country Link
CN (1) CN110121882B (en)
WO (1) WO2019071613A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023016041A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium

Families Citing this family (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110619593B (en) * 2019-07-30 2023-07-04 西安电子科技大学 Double-exposure video imaging system based on dynamic scene
CN110460773B (en) * 2019-08-16 2021-05-11 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment and computer readable storage medium
TWI727497B (en) * 2019-11-11 2021-05-11 瑞昱半導體股份有限公司 Image processing method based on sensor characteristics
CN112819699A (en) * 2019-11-15 2021-05-18 北京金山云网络技术有限公司 Video processing method and device and electronic equipment
CN113055580B (en) * 2019-12-26 2023-10-03 中兴通讯股份有限公司 Environment recognition method, shooting mode switching method, terminal and storage medium
CN111091498B (en) * 2019-12-31 2023-06-23 联想(北京)有限公司 Image processing method, device, electronic equipment and medium
CN113271414B (en) * 2020-02-14 2022-11-18 上海海思技术有限公司 Image acquisition method and device
CN111275653B (en) * 2020-02-28 2023-09-26 北京小米松果电子有限公司 Image denoising method and device
WO2021179223A1 (en) * 2020-03-11 2021-09-16 深圳市大疆创新科技有限公司 Infrared image processing method and processing device, and unmanned aerial vehicle and storage medium
CN111462021B (en) * 2020-04-27 2023-08-29 Oppo广东移动通信有限公司 Image processing method, apparatus, electronic device, and computer-readable storage medium
CN114338956A (en) * 2020-09-30 2022-04-12 北京小米移动软件有限公司 Image processing method, image processing apparatus, and storage medium
CN112399091B (en) * 2020-10-26 2023-01-20 维沃移动通信有限公司 Image processing method and device and electronic equipment
CN112532890B (en) * 2020-11-02 2022-06-07 浙江大华技术股份有限公司 Exposure control method, image pickup apparatus, and computer-readable storage medium
CN112689099B (en) * 2020-12-11 2022-03-22 北京邮电大学 Double-image-free high-dynamic-range imaging method and device for double-lens camera
CN115037915B (en) * 2021-03-05 2023-11-14 华为技术有限公司 Video processing method and processing device
CN113382169B (en) * 2021-06-18 2023-05-09 荣耀终端有限公司 Photographing method and electronic equipment
CN113298735A (en) * 2021-06-22 2021-08-24 Oppo广东移动通信有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN113592887B (en) * 2021-06-25 2022-09-02 荣耀终端有限公司 Video shooting method, electronic device and computer-readable storage medium
CN113411512B (en) * 2021-08-04 2022-06-24 红云红河烟草(集团)有限责任公司 Industrial camera automatic exposure control method for cigarette field
CN115706870B (en) * 2021-08-12 2023-12-26 荣耀终端有限公司 Video processing method, device, electronic equipment and storage medium
CN115706766B (en) * 2021-08-12 2023-12-15 荣耀终端有限公司 Video processing method, device, electronic equipment and storage medium
CN115706863B (en) * 2021-08-12 2023-11-21 荣耀终端有限公司 Video processing method, device, electronic equipment and storage medium
CN113766150B (en) * 2021-08-31 2024-03-26 北京安酷智芯科技有限公司 Noise reduction method, circuit system, electronic equipment and computer readable storage medium
CN113905185B (en) * 2021-10-27 2023-10-31 锐芯微电子股份有限公司 Image processing method and device
CN116437222B (en) * 2021-12-29 2024-04-19 荣耀终端有限公司 Image processing method and electronic equipment
CN114302068B (en) * 2022-01-06 2023-09-26 重庆紫光华山智安科技有限公司 Image shooting method and device
CN116723408B (en) * 2022-02-28 2024-05-14 荣耀终端有限公司 Exposure control method and electronic equipment
CN114630050A (en) * 2022-03-25 2022-06-14 展讯半导体(南京)有限公司 Photographing method, device, medium and terminal equipment
CN116402695A (en) * 2022-06-28 2023-07-07 上海玄戒技术有限公司 Video data processing method, device and storage medium
CN115278069A (en) * 2022-07-22 2022-11-01 北京紫光展锐通信技术有限公司 Image processing method and device, computer readable storage medium and terminal
CN117880645A (en) * 2022-10-10 2024-04-12 华为技术有限公司 Image processing method and device, electronic equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5986705A (en) * 1997-02-18 1999-11-16 Matsushita Electric Industrial Co., Ltd. Exposure control system controlling a solid state image sensing device
CN101510960A (en) * 2009-03-26 2009-08-19 北京中星微电子有限公司 Mobile phone camera shooting method and apparatus
CN101873437A (en) * 2009-09-15 2010-10-27 杭州海康威视系统技术有限公司 Method and device for regulating exposure
CN103634513A (en) * 2012-08-20 2014-03-12 佳能株式会社 Image processing apparatus and control method thereof
CN103702015A (en) * 2013-12-20 2014-04-02 华南理工大学 Exposure control method for human face image acquisition system under near-infrared condition
CN105827964A (en) * 2016-03-24 2016-08-03 维沃移动通信有限公司 Image processing method and mobile terminal
CN106972887A (en) * 2012-05-24 2017-07-21 松下电器(美国)知识产权公司 Information communicating method, information-communication device, program and recording medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4974704B2 (en) * 2007-02-22 2012-07-11 パナソニック株式会社 Imaging device

Also Published As

Publication number Publication date
CN110121882A (en) 2019-08-13
WO2019071613A1 (en) 2019-04-18

Similar Documents

Publication Publication Date Title
CN110121882B (en) Image processing method and device
CN108898567B (en) Image noise reduction method, device and system
TWI682664B (en) Method for generating an hdr image of a scene based on a tradeoff between brightness distribution and motion
US20140270487A1 (en) Method and apparatus for processing image
US10853927B2 (en) Image fusion architecture
JP6308748B2 (en) Image processing apparatus, imaging apparatus, and image processing method
CN112738411B (en) Exposure adjusting method, exposure adjusting device, electronic equipment and storage medium
US9071766B2 (en) Image capturing apparatus and control method thereof
CN109785264B (en) Image enhancement method and device and electronic equipment
CN110796041B (en) Principal identification method and apparatus, electronic device, and computer-readable storage medium
CN113344821B (en) Image noise reduction method, device, terminal and storage medium
JP2015040941A (en) Image-capturing device, control method therefor, and program
CN111953893B (en) High dynamic range image generation method, terminal device and storage medium
CN112215875A (en) Image processing method, device and electronic system
US9338354B2 (en) Motion blur estimation and restoration using light trails
CN113439286A (en) Processing image data in a composite image
CN112085686A (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN110740266B (en) Image frame selection method and device, storage medium and electronic equipment
CN107147851B (en) Photo processing method and device, computer readable storage medium and electronic equipment
US9538074B2 (en) Image processing apparatus, imaging apparatus, and image processing method
US10972676B2 (en) Image processing method and electronic device capable of optimizing hdr image by using depth information
CN113793257A (en) Image processing method and device, electronic equipment and computer readable storage medium
WO2013109192A1 (en) Method and device for image processing
CN111654618A (en) Camera focusing sensitivity control method and device
CN109727193B (en) Image blurring method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant