CN111050026A - Image noise reduction control method, terminal and computer readable storage medium - Google Patents


Info

Publication number
CN111050026A
Application number
CN201911286509.6A
Authority
CN (China)
Inventor
李蒙
Assignee
Nubia Technology Co Ltd
Prior art keywords
noise reduction, image, parameters, color difference, control method
Other languages
Chinese (zh)
Other versions
CN111050026B (en)
Legal status
Granted; Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation

Abstract

The invention discloses an image noise reduction control method, a terminal, and a computer-readable storage medium. The method comprises: acquiring a color difference parameter of a captured image and the shooting parameters used to capture the image; matching the acquired shooting parameters and color difference parameter against a preset correspondence to obtain a target noise reduction scheme matched with them, where the correspondence is a matching relationship between shooting parameters, color difference parameters, and noise reduction schemes; and controlling the terminal to reduce noise in the image using the target noise reduction scheme. The invention also discloses a terminal and a computer-readable storage medium. By applying different noise reduction schemes under different conditions, this scheme improves the noise reduction effect of the image and the user experience.

Description

Image noise reduction control method, terminal and computer readable storage medium
Technical Field
The present invention relates to the field of image data processing technologies, and in particular, to an image noise reduction control method, a terminal, and a computer-readable storage medium.
Background
As devices with image capturing functions (e.g., digital cameras and smartphones) have become widespread, more and more users capture images in daily life. Captured image data, however, contains unwanted interference information, i.e., noise. Noise control is important to the quality of the resulting picture, and the amount and distribution of noise often differ across scenes, especially in dim-light scenes. Current noise reduction methods trigger noise reduction mainly according to the exposure value. This control method is too simplistic: it cannot select a noise reduction scheme according to other shooting parameters, such as focal length or color temperature, or according to parameters of the image itself. As a result, the noise reduction effect of the picture cannot reach its best, and the user experience suffers.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: the existing image noise reduction control method triggers noise reduction using only the exposure value; this single-criterion approach yields a poor noise reduction effect and degrades the user experience. To solve this technical problem, an image noise reduction control method, a terminal, and a computer-readable storage medium are provided.
To solve the above technical problem, the present invention provides an image noise reduction control method applied to a terminal, the method comprising the following steps:
acquiring a color difference parameter of a captured image and the shooting parameters used to capture the image;
matching the acquired shooting parameters and color difference parameter against a preset correspondence to obtain a target noise reduction scheme matched with the shooting parameters and the color difference parameter, where the correspondence is a matching relationship between shooting parameters, color difference parameters, and noise reduction schemes;
controlling the terminal to reduce noise in the image using the target noise reduction scheme.
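As a rough illustration only (the patent does not specify an implementation), the three steps above amount to an interval-based table lookup. The parameter names, interval boundaries, and scheme labels below are hypothetical:

```python
def interval_index(value, boundaries):
    """Return the index of the interval that `value` falls into,
    given sorted interior boundaries dividing the selectable range."""
    idx = 0
    for b in boundaries:
        if value >= b:
            idx += 1
    return idx

# Hypothetical division of each parameter's selectable range into intervals.
EXPOSURE_BOUNDS = [100, 400]     # yields 3 exposure-value intervals
COLOR_DIFF_BOUNDS = [0.2, 0.6]   # yields 3 color-difference intervals

# Preset correspondence: (exposure interval, color-diff interval) -> scheme.
CORRESPONDENCE = {
    (e, c): f"scheme_{e}{c}" for e in range(3) for c in range(3)
}

def select_scheme(exposure_value, color_diff):
    """Match the acquired parameters against the preset correspondence
    to obtain the target noise reduction scheme."""
    key = (interval_index(exposure_value, EXPOSURE_BOUNDS),
           interval_index(color_diff, COLOR_DIFF_BOUNDS))
    return CORRESPONDENCE[key]
```

For example, `select_scheme(500, 0.1)` falls into the highest exposure interval and the lowest color-difference interval, so a scheme tuned for that combination is returned.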
Optionally, the shooting parameters include: exposure value and color temperature value;
the matching relationship between the shooting parameters, the color difference parameters, and the noise reduction schemes comprises: matching relationships between the noise reduction schemes and all combinations of the plurality of intervals into which the exposure value is divided within its selectable range, the plurality of intervals into which the color temperature value is divided within its selectable range, and the plurality of intervals into which the color difference parameter is divided within its selectable range.
Optionally, the shooting parameters include: exposure value and focal length;
the matching relationship between the shooting parameters, the color difference parameters, and the noise reduction schemes comprises: matching relationships between the noise reduction schemes and all combinations of the plurality of intervals into which the focal length is divided within its selectable range, the plurality of intervals into which the exposure value is divided within its selectable range, and the plurality of intervals into which the color difference parameter is divided within its selectable range.
Optionally, the shooting parameters include: exposure value, focal length and color temperature value;
the matching relationship between the shooting parameters, the color difference parameters, and the noise reduction schemes comprises: matching relationships between the noise reduction schemes and all combinations of the plurality of intervals into which the exposure value is divided within its selectable range, the plurality of intervals into which the focal length is divided within its selectable range, the plurality of intervals into which the color temperature value is divided within its selectable range, and the plurality of intervals into which the color difference parameter is divided within its selectable range.
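For the third variant, the correspondence covers every combination of intervals across all four parameters. A sketch of the table's shape, with hypothetical interval counts:

```python
from itertools import product

# Hypothetical interval counts for each parameter's selectable range.
N_EXPOSURE, N_FOCAL, N_COLOR_TEMP, N_COLOR_DIFF = 3, 2, 3, 3

# Every combination of intervals gets its own scheme slot, so the
# correspondence has 3 * 2 * 3 * 3 = 54 entries in this sketch.
table = {
    combo: None  # to be filled with a noise reduction scheme per combination
    for combo in product(range(N_EXPOSURE), range(N_FOCAL),
                         range(N_COLOR_TEMP), range(N_COLOR_DIFF))
}
```

The table grows multiplicatively with each added parameter, which is why each parameter's range is coarsely divided into a few intervals rather than matched on exact values.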
Optionally, the noise reduction scheme includes:
noise reduction schemes with different noise reduction coefficients; and/or
noise reduction schemes using different noise reduction methods.
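As an illustrative sketch (the patent names no specific filters), a "coefficient" could be a smoothing window radius and a "method" could be mean versus median filtering, shown here over a 1-D signal:

```python
def mean_filter(signal, radius):
    """Mean filter: the noise reduction coefficient is the window radius."""
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

def median_filter(signal, radius):
    """A different noise reduction method sharing the same interface;
    well suited to impulse noise."""
    out = []
    for i in range(len(signal)):
        window = sorted(signal[max(0, i - radius): i + radius + 1])
        out.append(window[len(window) // 2])
    return out
```

With a shared interface like this, the correspondence can store a different radius (coefficient) and/or a different filter function (method) for each interval combination.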
Optionally, when an update instruction is received, the correspondence is updated.
Optionally, the update instruction is issued by a user or a server.
Optionally, the updating includes:
deleting a correspondence, adding a new correspondence, or modifying an existing correspondence.
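The three update operations can be sketched as plain table mutations (the patent does not specify a data structure, so the dict-based shape below is an assumption):

```python
def update_correspondence(table, op, key, scheme=None):
    """Apply one update instruction to the correspondence table.

    op is one of 'delete', 'add', or 'modify'; key is an interval
    combination and scheme is the noise reduction scheme to store.
    """
    if op == "delete":
        table.pop(key, None)          # remove an existing correspondence
    elif op == "add":
        table.setdefault(key, scheme)  # add a new correspondence
    elif op == "modify":
        if key in table:
            table[key] = scheme        # modify an existing correspondence
    else:
        raise ValueError(f"unknown update op: {op}")
    return table
```

The update instruction itself may come from the user or be pushed by a server, as stated above.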
Further, the invention also provides a terminal, which comprises a processor, a memory and a communication bus;
the communication bus is used for realizing connection communication between the processor and the memory;
the processor is configured to execute one or more programs stored in the memory to implement the steps of the image noise reduction control method described above.
Further, the present invention provides a computer-readable storage medium, characterized in that the computer-readable storage medium stores one or more programs, which are executable by one or more processors to implement the steps of the image noise reduction control method as described above.
Advantageous effects
The invention provides an image noise reduction control method, a terminal, and a computer-readable storage medium. A color difference parameter of a captured image and the shooting parameters used to capture it are acquired; the acquired shooting parameters and color difference parameter are matched against a preset correspondence, where the correspondence is a matching relationship between shooting parameters, color difference parameters, and noise reduction schemes, to obtain a target noise reduction scheme; and the image is denoised using the target scheme. This solves the technical problem that the current image noise reduction control method triggers noise reduction using only the exposure value, which is too simplistic and yields a poor noise reduction effect. Because the method provided by the invention integrates multiple shooting parameters when selecting a noise reduction scheme and applies the appropriate scheme to each scene, images achieve higher clarity and the overall shooting effect is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
To illustrate the embodiments of the present invention and the technical solutions in the prior art more clearly, the drawings used in their description are briefly introduced below; those skilled in the art can derive other drawings from these without inventive effort.
Fig. 1 is a schematic diagram of the hardware structure of an optional mobile terminal for implementing various embodiments of the present invention;
Fig. 2 is an electrical schematic diagram of an optional camera for implementing various embodiments of the present invention;
Fig. 3 is a basic flowchart of an image noise reduction control method according to a first embodiment of the present invention;
Fig. 4 is a flowchart of an image noise reduction control method according to a second embodiment of the present invention;
Fig. 5 is a schematic functional block diagram of a terminal according to a fourth embodiment of the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
A mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements merely facilitate the explanation of the present invention and have no specific meaning in themselves; thus, "module" and "component" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smartphones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and navigation devices, as well as stationary terminals such as digital TVs and desktop computers. In the following, the terminal is assumed to be a mobile terminal; however, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the configuration according to the embodiments of the present invention can also be applied to stationary terminals.
Fig. 1 is a schematic diagram of a hardware structure of an optional mobile terminal for implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. Fig. 1 illustrates a mobile terminal having various components, but it should be understood that not all illustrated components are required; more or fewer components may be implemented instead. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module, a mobile communication module, a wireless internet module, a short-range communication module, and a location information module, through which a corresponding communication function is externally implemented.
The A/V input unit 120 is used to receive audio or video signals. It may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capturing apparatus in a video capture mode or image capture mode, and the processed image frames may be displayed on the display module 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110; two or more cameras 121 may be provided depending on the construction of the mobile terminal, and the electrical structure of the camera is shown in Fig. 2. The microphone 122 may receive sound (audio data) in a phone call mode, recording mode, voice recognition mode, or the like, and process it into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various noise cancellation (or suppression) algorithms to cancel noise generated while receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to a command input by a user to control various operations of the mobile terminal. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display module 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 141.
The interface unit 170 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating a user using the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter, referred to as an "identification device") may take the form of a smart card, and thus, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner.
The output unit 150 may include a display module 151, an audio output module 152, an alarm module 153, and the like.
The display module 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display module 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display module 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display module 151 and the touch pad are stacked on each other in the form of layers to form a touch screen, the display module 151 may serve as an input device and an output device. The display module 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light emitting diode) display or the like. The mobile terminal 100 may include two or more display modules (or other display devices) according to a particular desired implementation, for example, the mobile terminal may include an external display module (not shown) and an internal display module (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The alarm module 153 may provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm module 153 may signal the occurrence of an event in other ways. For example, it may provide output in the form of vibration: when a call, a message, or some other incoming communication is received, the alarm module 153 may provide a tactile output (i.e., vibration) to inform the user. Such tactile output lets the user notice events even when the mobile phone is in the user's pocket. The alarm module 153 may also signal the occurrence of an event via the display module 151 or the audio output module 152.
The memory 160 may store software programs and the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, and the like) that has been or will be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data, and the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, an electronic unit designed to perform the functions described herein, and in some cases, such embodiments may be implemented in the controller 180. For a software implementation, the implementation such as a process or a function may be implemented with a separate software module that allows performing at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, which may be stored in the memory 160 and executed by the controller 180.
Fig. 2 is an electrical schematic diagram of an alternative camera for implementing various embodiments of the present invention.
The photographing lens 1211 is composed of a plurality of optical lenses for forming an object image, wherein the photographing lens 1211 may be a single focus lens or a zoom lens. The photographing lens 1211 is movable in the optical axis direction under the control of the lens driver 1221, and the lens driver 1221 controls the focal position of the photographing lens 1211 in accordance with a control signal from the lens driving control circuit 1222. The lens drive control circuit 1222 controls the drive of the lens driver 1221 in accordance with a control command from the microcomputer 1217, and the lens drive control circuit 1222 may also control the drive in accordance with a control command from the controller 180, a processor, a microcontroller, or a microprocessor.
An image pickup device 1212 is disposed on the optical axis of the photographing lens 1211 near the position of the object image formed by the photographing lens 1211. The image pickup device 1212 is used to pick up an image of an object and acquire picked-up image data. Photodiodes constituting each pixel are two-dimensionally arranged in a matrix on the image pickup device 1212. Each photodiode generates a photoelectric conversion current corresponding to the amount of received light, and the photoelectric conversion current is charged by a capacitor connected to each photodiode. A bayer RGB color filter is disposed on the front surface of each pixel.
The image pickup device 1212 is connected to an image pickup circuit 1213, and the image pickup circuit 1213 performs charge accumulation control and image signal reading control in the image pickup device 1212, performs waveform shaping after reducing reset noise for the read image signal (analog image signal), and further performs gain improvement or the like so as to obtain an appropriate signal level.
The imaging circuit 1213 is connected to an a/D converter 1214, and the a/D converter 1214 performs analog-to-digital conversion on the analog image signal and outputs a digital image signal (hereinafter referred to as image data) to the bus 1227.
The bus 1227 is a transfer path for transferring various data read out or generated inside the camera. The a/D converter 1214 described above is connected to the bus 1227, and further connected to an image processor 1215, a JPEG processor 1216, a microcomputer 1217, an SDRAM (Synchronous Dynamic random access memory) 1218, a memory interface (hereinafter referred to as memory I/F)1219, and an LCD (Liquid Crystal Display) driver 1220.
The image processor 1215 performs various image processing such as OB subtraction processing, white balance adjustment, color matrix operation, gamma conversion, color difference signal processing, noise removal processing, synchronization processing, and edge processing on image data output from the image pickup device 1212. The JPEG processor 1216 compresses the image data read out from the SDRAM1218 in a JPEG compression method when recording the image data in the recording medium 1225. The JPEG processor 1216 decompresses JPEG image data for image reproduction display. When decompression is performed, a file recorded in the recording medium 1225 is read out, decompression processing is performed in the JPEG processor 1216, and the decompressed image data is temporarily stored in the SDRAM1218 and displayed on the LCD 1226. In the present embodiment, the JPEG system is used as the image compression/decompression system, but the compression/decompression system is not limited to this, and other compression/decompression systems such as MPEG, TIFF, and h.264 may be used.
The microcomputer 1217 functions as a control unit of the entire camera, and collectively controls various processing sequences of the camera. The microcomputer 1217 is connected to an operation unit 1223 and a flash memory 1224.
The operation unit 1223 includes, but is not limited to, physical keys or virtual keys, which may be various input buttons such as a power button, a photographing key, an editing key, a moving image button, a reproduction button, a menu button, a cross key, an OK button, a delete button, and an enlargement button, and operation controls such as various input keys, and which detect operation states of these operation controls.
The detection result is output to the microcomputer 1217. A touch panel is provided on the front surface of the LCD1226 as a display, and a touch position of the user is detected and output to the microcomputer 1217. The microcomputer 1217 executes various processing sequences corresponding to the user's operation according to the detection result of the operation position from the operation unit 1223.
The flash memory 1224 stores programs for executing various processing sequences of the microcomputer 1217. The microcomputer 1217 controls the entire camera according to the program. The flash memory 1224 stores various adjustment values of the camera, and the microcomputer 1217 reads the adjustment values and controls the camera in accordance with the adjustment values.
The SDRAM1218 is an electrically rewritable volatile memory for temporarily storing image data and the like. The SDRAM1218 temporarily stores the image data output from the a/D converter 1214 and the image data processed in the image processor 1215, JPEG processor 1216, and the like.
The memory interface 1219 is connected to the recording medium 1225, and performs control for writing and reading image data and data such as a file header added to the image data to and from the recording medium 1225. The recording medium 1225 is, for example, a recording medium such as a memory card that can be attached to and detached from the camera body, but is not limited to this, and may be a hard disk or the like that is built in the camera body.
The LCD driver 1220 is connected to the LCD 1226. Image data processed by the image processor 1215 is stored in the SDRAM 1218 and, when display is required, is read out and displayed on the LCD 1226. Alternatively, image data compressed by the JPEG processor 1216 is stored in the SDRAM 1218; when display is required, the JPEG processor 1216 reads and decompresses that data, and the decompressed image data is displayed on the LCD 1226.
The LCD 1226 is disposed on the back surface of the camera body and displays images. The display is not limited to an LCD; various other display panels, such as organic EL panels, may be used.
Based on the hardware structure of the mobile terminal and the electrical structure of the camera above, various embodiments of the image noise reduction control method, the terminal, and the computer-readable storage medium of the present invention are provided, and are described in detail below with reference to specific embodiments.
Based on the above terminal hardware structure and communication network system, the present invention provides various embodiments of the method.
First embodiment
The existing image noise reduction control method triggers noise reduction using only an exposure value; this single criterion yields a poor noise reduction effect. To solve this problem, the present embodiment provides an image noise reduction control method. Fig. 2 is a basic flow chart of the image noise reduction control method provided by this embodiment, which comprises the following steps:
and S301, acquiring color difference degree parameters in the shot image and shooting parameters adopted by the shot image.
It should be understood that, besides the shooting parameters used during shooting, some parameters of the shot image itself may also affect the noise reduction of the image. The color difference degree parameter refers to the degree of difference among the colors of the shot image. Shot images differ in this respect: some subjects span a very wide range of colors, while the colors of others change very gently, with smoother transitions.
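The patent does not give a formula for the color difference degree parameter. As a rough illustration only (the metric, the function name, and the normalization are all assumptions, not taken from the patent), one plausible measure is the spread of pixel values across the image:

```python
import numpy as np

def color_difference_degree(image_rgb):
    """Hypothetical color-difference-degree metric: mean per-channel
    standard deviation of the pixel values, normalized to [0, 1].
    A flat, gently varying subject scores low; a subject with a
    large color span scores high."""
    img = np.asarray(image_rgb, dtype=np.float64)
    return float(img.std(axis=(0, 1)).mean() / 255.0)
```

A uniformly colored frame scores 0, while a frame split between black and white scores near the top of the range, matching the intuition of "gentle transitions" versus "very large color span".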
An image is often degraded by various kinds of noise during generation and transmission, which adversely affects subsequent image processing and the visual quality of the image. The noise may be of various types, such as electrical noise, mechanical noise, and channel noise. Therefore, in order to suppress noise, improve image quality, and facilitate higher-level processing, it is necessary to perform denoising preprocessing on the image. Image noise blurs the image and may even submerge its features, making analysis difficult.
Image noise generally has the following characteristics:
The distribution and size of the noise in an image are irregular, i.e. random.
There is generally a correlation between noise and the image. For example, camera sensor noise is correlated with the signal, and is usually more visible in dark portions than in bright portions. As another example, quantization noise in a digital image is related to the image content: when the content is close to flat, quantization noise shows up as false contours, but random noise in the image may make the quantization noise less noticeable through a dithering effect. Noise is also additive: in a cascaded image transmission system, noise of the same type entering each stage adds in power, so the signal-to-noise ratio drops stage by stage.
And S302, matching the acquired shooting parameters and color difference parameters with a preset corresponding relation, and acquiring a target noise reduction scheme matched with the shooting parameters and the color difference parameters.
The corresponding relation is the matching relation between the shooting parameters, the color difference degree parameters, and the noise reduction schemes.
Optionally, the shooting parameters include an exposure value and a color temperature value. In this case, the matching relation between the shooting parameters, the color difference degree parameters, and the noise reduction schemes comprises: matching relations between the noise reduction schemes and all combination results of the plurality of intervals into which the exposure value is divided within its selectable range, the plurality of intervals into which the color temperature value is divided within its selectable range, and the plurality of intervals into which the color difference degree parameter is divided within its selectable range.
Optionally, the shooting parameters include an exposure value and a focal length. In this case, the matching relation comprises: matching relations between the noise reduction schemes and all combination results of the intervals into which the focal length, the exposure value, and the color difference degree parameter are each divided within their selectable ranges.
Optionally, the shooting parameters include an exposure value, a focal length, and a color temperature value. In this case, the matching relation comprises: matching relations between the noise reduction schemes and all combination results of the intervals into which the exposure value, the focal length, the color temperature value, and the color difference degree parameter are each divided within their selectable ranges.
It should be understood that, in actual shooting, shot images differ in their color difference degree. Specifically, for some shot images the color differences are very large, and a noise reduction scheme with a correspondingly low noise reduction coefficient can be used; for other subjects the colors are transitional and the color differences are very gentle, so a noise reduction scheme with a correspondingly high noise reduction coefficient can be used.
It is understood that the shooting parameters include at least an exposure value; other parameters that affect the noise reduction effect also belong to the shooting parameters. Under low exposure the noise is larger, and scenes with different color temperatures do not render noise points in the same way. The focal length also influences noise: near focus tends to retain more detail and definition, while far focus tends to remove more corner noise. Different noise reduction schemes should therefore be adopted for different application scenarios according to actual requirements.
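The interval-based correspondence described above can be sketched as a lookup table keyed by the interval index of each parameter. The interval boundaries and scheme labels below are purely illustrative assumptions; the patent leaves the actual partition of each selectable range to the implementer:

```python
import bisect

# Hypothetical interval boundaries (the patent does not specify them).
EXPOSURE_EDGES   = [50, 100, 200, 400, 800]  # -> 6 exposure intervals
COLOR_TEMP_EDGES = [3500, 5500]              # Kelvin -> 3 intervals
COLOR_DIFF_EDGES = [0.3]                     # degree -> 2 intervals

def match_scheme(exposure, color_temp, color_diff, schemes):
    """Map (exposure value, color temperature, color difference degree)
    onto a noise reduction scheme via the preset correspondence.
    `schemes` is a dict keyed by the interval-index triple."""
    e = bisect.bisect_right(EXPOSURE_EDGES, exposure)
    t = bisect.bisect_right(COLOR_TEMP_EDGES, color_temp)
    d = bisect.bisect_right(COLOR_DIFF_EDGES, color_diff)
    return schemes[(e, t, d)]
```

Each combination of intervals simply indexes its own scheme, which is why refining the partition directly multiplies the number of distinct noise reduction schemes.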
The noise reduction scheme includes:
noise reduction schemes with different noise reduction coefficients; and/or
Noise reduction schemes of different noise reduction methods.
The noise reduction method may use an averaging filter, an adaptive Wiener filter, a median filter, and so on.
Mean filter: an averaging filter using the neighborhood averaging method is well suited for removing grain noise from scanned images. The neighborhood averaging method strongly suppresses noise but also causes blurring, the degree of blurring being proportional to the neighborhood radius.
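A minimal neighborhood-averaging sketch (the function name and radius parameter are illustrative; `scipy.ndimage.uniform_filter` does the actual averaging):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def mean_filter(image, radius=1):
    """Arithmetic mean (neighborhood averaging) filter over a
    (2*radius+1) square window. A larger radius suppresses grain
    noise more strongly but blurs proportionally."""
    size = 2 * radius + 1
    return uniform_filter(image.astype(np.float64), size=size)
```

An isolated bright pixel is spread evenly over its window, which is exactly the trade-off described above: the impulse is attenuated, but at the cost of smearing it across the neighborhood.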
The smoothing achieved by the geometric mean filter is comparable to that of the arithmetic mean filter, but less image detail is lost in the filtering process.
The harmonic mean filter works well for "salt" noise but is not suitable for "pepper" noise. It also handles other kinds of noise, such as Gaussian noise, well.
The contraharmonic (inverse harmonic) mean filter is better suited to impulse noise, but has the disadvantage that one must know in advance whether the noise is dark or bright in order to select the correct sign of the filter order; choosing the wrong sign can have catastrophic consequences.
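The order-sign trade-off can be made concrete with a small contraharmonic mean filter (a textbook definition; window handling via edge padding is an implementation choice here):

```python
import numpy as np

def contraharmonic_filter(image, size=3, Q=1.5):
    """Contraharmonic mean filter of order Q:
    sum(w**(Q+1)) / sum(w**Q) over each window w.
    Q > 0 removes pepper (dark) noise; Q < 0 removes salt (bright)
    noise; the wrong sign amplifies the noise instead."""
    img = image.astype(np.float64)
    pad = size // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty_like(img)
    eps = 1e-12  # guard against 0 raised to a negative order
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = padded[i:i + size, j:j + size] + eps
            out[i, j] = (win ** (Q + 1)).sum() / (win ** Q).sum()
    return out
```

With Q = 1.5 a pepper pixel in a bright region is pulled back toward the surrounding value; running the same image with Q = -1.5 would instead preserve or worsen it, which is the "catastrophic" failure mode mentioned above.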
Adaptive Wiener filter: it adjusts the filter output according to the local variance of the image; the smaller the local variance, the stronger the smoothing effect of the filter. Its goal is to minimize the mean square error between the restored image and the original image. Its filtering effect is better than that of the mean filter, and it is useful for preserving edges and other high-frequency parts of the image, but it requires more computation. The Wiener filter works best on images corrupted by white noise.
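A short demonstration with SciPy's adaptive Wiener implementation (the test scene, noise level, and window size are arbitrary choices): flat regions, where local variance is small, are smoothed hard, while the high-variance edge is largely preserved, so the restored image is closer to the clean one than the noisy input is.

```python
import numpy as np
from scipy.signal import wiener

rng = np.random.default_rng(0)
clean = np.zeros((32, 32))
clean[:, 16:] = 1.0                               # a vertical step edge
noisy = clean + rng.normal(0.0, 0.1, clean.shape)  # additive white noise

# scipy.signal.wiener estimates local mean/variance per window and
# smooths adaptively; with noise=None the noise power is estimated
# as the average of the local variances.
restored = wiener(noisy, mysize=5)
```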
The median filter is a common nonlinear smoothing filter. Its basic principle is to replace the value of a point in a digital image or sequence with the median of the values in that point's neighborhood; its main effect is to replace pixels whose gray values differ greatly from those of surrounding pixels with values close to the surrounding values, thereby eliminating isolated noise points. The median filter can remove noise while protecting image edges, giving a satisfactory restoration effect, and it does not need the statistical characteristics of the image during operation, which is a great convenience. However, median filtering is not suitable for images with much fine detail, especially many points, lines, and sharp corners.
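Both properties (impulse removal and edge preservation) are easy to verify on a toy image with `scipy.ndimage.median_filter`:

```python
import numpy as np
from scipy.ndimage import median_filter

img = np.full((7, 7), 50.0)
img[3, 3] = 255.0        # isolated impulse noise point
img[:, :3] = 0.0         # a sharp step edge (left half dark)

# Each pixel is replaced by the median of its 3x3 neighborhood:
# the lone impulse vanishes, but the step edge stays sharp.
denoised = median_filter(img, size=3)
```

The impulse at (3, 3) is replaced by the neighborhood median 50, while the pixel at (3, 2), which sits on the dark side of the edge, keeps its value of 0 because the median of its neighborhood is still 0.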
And S303, controlling the target noise reduction scheme to reduce the noise of the image.
Further, when an update instruction is received, the corresponding relationship is updated, and the update instruction is issued by a user or a server.
The updating comprises the following steps:
deleting the corresponding relation, adding the corresponding relation newly, and modifying the corresponding relation.
It can be understood that if the noise reduction scene is further refined, a more appropriate noise reduction scheme is obtained correspondingly, a new correspondence relationship can be added, and the correspondence relationship can be changed or deleted according to the personalized requirements of the user, so that the user satisfaction is improved.
In some embodiments, the contrast of the image can be further adjusted after noise reduction; increasing the contrast makes the denoised image clearer.
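The patent does not say how the contrast is increased; one simple possibility (the gain value and the scaling-around-the-mean approach are assumptions) is a linear contrast stretch:

```python
import numpy as np

def boost_contrast(image, gain=1.2):
    """Increase contrast after noise reduction by scaling each pixel's
    deviation from the image mean; gain > 1 stretches the range.
    The gain of 1.2 is an arbitrary illustrative value."""
    img = image.astype(np.float64)
    mean = img.mean()
    return np.clip(mean + gain * (img - mean), 0.0, 255.0)
```

Dark pixels get darker and bright pixels brighter around the mean, which visually sharpens a softly denoised image.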
This embodiment provides an image noise reduction control method. The method acquires the color difference degree parameters of a shot image and the shooting parameters used to shoot the image, matches the acquired shooting parameters and color difference degree parameters against a preset correspondence to obtain a target noise reduction scheme matched with them, and controls the target noise reduction scheme to denoise the image. This solves the technical problem that the existing image noise reduction control method, which triggers noise reduction using only an exposure value, is too single a criterion and yields a poor noise reduction effect. With the image noise reduction control method provided by the invention, the shooting parameters (the exposure value, color temperature, focal length, and so on) and the color difference degree parameters of the shot object are considered comprehensively to control the selection of the noise reduction scheme. By using the corresponding noise reduction scheme for each scene, the image can reach higher definition and the overall shooting effect of the image is improved.
Second embodiment
The present embodiment is explained on the basis of the first embodiment, for the case where the shooting parameters include the focal length, the color temperature, and the exposure value, and the noise reduction schemes differ only in their noise reduction coefficients.
S401, acquiring the color difference degree parameters of the shot image and the focal length, color temperature, and exposure value used to shoot the image.
S402, matching the acquired focal length, color temperature, exposure value, and color difference degree parameters of the shot object against the preset correspondence, and obtaining a target noise reduction scheme matched with them.
The 3A technology of a camera refers to autofocus (AF), auto exposure (AE), and auto white balance (AWB).
The auto-focus algorithm moves the lens to maximize the image contrast by obtaining the image contrast.
The auto-exposure algorithm will automatically set the exposure value according to the available light source conditions.
The automatic white balance algorithm adjusts the fidelity of the picture color according to the light source conditions.
AF algorithm: a traditional camera achieves autofocus in a manner similar to rangefinding. The camera emits infrared rays (or other rays), determines the distance of the subject from their reflection, and then adjusts the lens assembly according to the measured result. This autofocus method is direct, fast, easy to implement, and low in cost, but it sometimes fails (for example, when glass or another object lies between the camera and the subject, or when light is insufficient), and its precision is poor, so high-end cameras generally no longer use it. Because the camera actively emits rays, this method is called active autofocus; it is also called non-TTL, because it only measures distance and does not judge through the actual imaging of the lens whether the image is in focus. Passive autofocus was developed later than active autofocus: it judges whether the lens is correctly focused from the actual imaging of the lens, generally based on contrast detection, and the detailed principle is quite complex. Since this approach works through lens imaging, it is called TTL autofocus. Because it is based on the image actually formed by the lens, it has high focusing accuracy and a low error rate, but it is technically complex, slow (except for advanced autofocus lenses using ultrasonic motors), and costly. Manual focusing is a focusing mode in which the photographer adjusts the camera lens by rotating the focusing ring by hand to make the picture sharp; it depends greatly on the photographer's ability to discriminate the image on the focusing screen, on the photographer's proficiency, and even on the photographer's eyesight.
Early single mirror reflex cameras and paraxial cameras generally used manual focus to accomplish the focusing operation. The existing quasi-professional and professional digital cameras and single lens reflex digital cameras are provided with a manual focusing function so as to be matched with different shooting requirements.
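The contrast-detection principle described above (move the lens, keep the position that maximizes image contrast) can be sketched as a simple search. The sharpness metric, the `capture_at` callback, and the position sweep are all illustrative assumptions standing in for real lens hardware:

```python
import numpy as np

def sharpness(image):
    """A simple contrast/sharpness metric: variance of horizontal
    pixel differences. Contrast-detection AF assumes this peaks at
    the in-focus lens position."""
    return float(np.diff(image.astype(np.float64), axis=1).var())

def autofocus(capture_at, lens_positions):
    """Sweep candidate lens positions and return the sharpest one.
    `capture_at(pos)` is a stand-in for driving the lens to `pos`
    and capturing a frame."""
    return max(lens_positions, key=lambda p: sharpness(capture_at(p)))
```

Real implementations refine this with hill-climbing rather than a full sweep, but the selection criterion is the same.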
AE algorithm: in program auto exposure mode, the camera automatically sets the shutter speed and aperture value according to the exposure value of the scene measured by the metering system, using the shutter/aperture exposure combinations preset by the manufacturer. In terms of operability this is equivalent to a so-called point-and-shoot camera: the operator does not adjust the shutter speed or aperture value at all, and all that is required is to focus and press the shutter release button. The electronic program shutter commonly used in point-and-shoot cameras belongs to this exposure method. In fact, only program auto exposure is a truly "fully automatic" exposure mode.
AWB algorithm: white balance (White Balance). The apparent color of an object changes with the color of the light projected on it, and photos taken under different lighting have different color temperatures. For example, a photo taken under tungsten (light bulb) illumination may look yellowish; in general, a CCD cannot automatically correct for changes in the light the way the human eye does. White balance means letting the digital camera recognize "white" regardless of the ambient light, and balancing the hues of the other colors under colored light accordingly. Color is essentially an interpretation of light: an object that appears white in normal light may not appear white in darker light, yet still reads as "white" under fluorescent light. If the white balance is adjusted for all of these situations, the other colors in the resulting picture can be restored correctly with "white" as the reference. Most commercial digital cameras today provide a white balance adjustment function. Since white balance is closely related to ambient light, the use of the flash is limited when the white balance function is active; otherwise the change in ambient light may disable the white balance or disturb the normal white balance. White balance generally has multiple modes to suit different shooting scenes, such as automatic white balance, tungsten white balance, fluorescent white balance, indoor white balance, and manual adjustment.
The image processor can obtain the focal length used to shoot the image from the AF algorithm in the 3A technology, the exposure value at the time of shooting from the AE algorithm, and the color temperature at the time of shooting from the AWB algorithm.
It will be appreciated that a change in color temperature also affects the exposure value. This can be analyzed starting from the exposure principle and the color temperature, and then mapped onto the adjustment of RAW data. Existing cameras, whether digital or film, complete automatic metering through a TTL internal metering system. Internal metering has developed over many years, and the metering element has evolved from the original cadmium sulfide cell to silicon photodiodes or better-performing gallium arsenide phosphide photodiodes; but whichever element is used, what is measured is the luminance of the light coming through the lens. The metering principle of a camera is to treat the selected metering target as 18% neutral gray: as long as the camera, according to the selected metering mode, exposes the reference target to 18% neutral gray, the metering task is complete and the camera automatically matches the "correct" parameter values. A change in color temperature necessarily changes the color tone of the entire picture, and therefore the color of the selected metering point. The direct effect is that the RGB values of that object change, which in turn changes its gray value, i.e. the gray value of the area being measured; and since this reference changes, the metering system naturally changes the exposure value.
As for the adjustment of RAW data: the RAW format is called the original format of the digital camera. It records all the original data at the time of shooting, including all exposure parameters, the metering mode used, and the selected metering point information (center-weighted, spot, or matrix metering). The algorithm built into a RAW adjustment tool is the same as the program algorithm fixed in the camera, so adjusting a picture in RAW is effectively using the camera to re-expose it. If the color temperature is adjusted, the color of the reference metering object changes immediately, its corresponding gray value (brightness value) also changes, the program naturally recalculates from the new value, and the final effect is that the exposure value changes.
In the present embodiment, the shooting parameters include an exposure value, a focal length, and a color temperature value. The matching relation between the shooting parameters, the color difference degree parameters, and the noise reduction schemes comprises: matching relations between the noise reduction schemes and all combination results of the intervals into which the exposure value, the focal length, the color temperature value, and the color difference degree parameter are each divided within their selectable ranges. For example, if the exposure value is divided into 6 intervals within its selectable range, the focal length into 3 intervals, the color temperature value into 3 intervals, and the color difference degree parameter into 2 intervals, then 6 × 3 × 3 × 2 = 108 combination results are finally obtained, and correspondingly 108 noise reduction schemes are used for the different scenes, which can meet the noise reduction requirements better. In practical applications, the intervals can be refined further.
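The number of distinct scene combinations is just the product of the interval counts of all four parameters, which is easy to verify (the interval counts are the ones from the example above):

```python
import itertools

# Interval counts from the example: exposure value 6, focal length 3,
# color temperature value 3, color difference degree 2.
interval_counts = {
    "exposure": 6,
    "focal_length": 3,
    "color_temp": 3,
    "color_diff": 2,
}

# Every combination of interval indices names one scene, and each
# scene is mapped to its own noise reduction scheme.
combinations = list(itertools.product(
    *(range(n) for n in interval_counts.values())))
```

Refining any one partition multiplies the table size accordingly, which is the cost of the finer-grained noise reduction the embodiment describes.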
For example, the smaller the exposure value, the darker the image; and the closer the image is to the telephoto state, the greater the corner noise. In such a case the overall noise is larger, and if, in addition, the color of the image changes gently, a larger noise reduction coefficient is required.
And S403, controlling the noise reduction of the image by using the target noise reduction scheme.
Further, when an update instruction is received, the corresponding relationship is updated, and the update instruction is issued by a user or a server.
The updating comprises the following steps:
deleting the corresponding relation, adding the corresponding relation newly, and modifying the corresponding relation.
It can be understood that if the noise reduction scene is further refined, a more appropriate noise reduction scheme is obtained correspondingly, a new correspondence relationship can be added, and the correspondence relationship can be changed or deleted according to the personalized requirements of the user, so that the user satisfaction is improved.
This embodiment provides an image noise reduction control method in which the correspondence between the focal length, color temperature value, and exposure value and the image noise reduction coefficient is preset. The image processor acquires the focal length, color temperature value, and exposure value at the time the image is shot, determines the image noise reduction coefficient from these values and the correspondence, and performs noise reduction processing on the image according to that coefficient. This solves the technical problem that the current image noise reduction control method triggers noise reduction using only an exposure value, which is too single a criterion and yields a poor noise reduction effect. The image noise reduction control method provided by the invention therefore controls the selection of the noise reduction coefficient by combining the exposure value, color temperature, and focal length with the color difference degree parameters of the shot object, and uses the corresponding noise reduction coefficient for each scene, so that the image can reach higher definition and the overall shooting effect of the image is improved.
Third embodiment
The present embodiment further provides a terminal, as shown in fig. 5, which includes a processor 51, a memory 52 and a communication bus 53, wherein:
the communication bus 53 is used for realizing connection communication between the processor 51 and the memory 52;
the processor 51 is configured to execute an image noise reduction control program stored in the memory 52 to implement the steps of the image noise reduction control method as exemplified in the above embodiments.
The present embodiment provides a computer-readable storage medium, which stores one or more programs, where the one or more programs are executable by one or more processors to implement the steps of the image noise reduction control method according to the first embodiment and/or the second embodiment, and are not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for causing a terminal to execute the method of the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. An image noise reduction control method is applied to a terminal, and is characterized by comprising the following steps:
acquiring color difference parameters in a shot image and shooting parameters adopted for shooting the image;
matching the acquired shooting parameters and the acquired color difference parameters with a preset corresponding relation to acquire a target noise reduction scheme matched with the shooting parameters and the color difference parameters; the corresponding relation is the matching relation of the shooting parameter, the color difference parameter and the noise reduction scheme;
controlling to denoise the image using the target denoising scheme.
2. The image noise reduction control method according to claim 1, wherein the photographing parameters comprise: an exposure value and a color temperature value;
the matching relation between the shooting parameters, the color difference degree parameters, and the noise reduction schemes comprises: matching relations between the noise reduction schemes and all combination results of the plurality of intervals into which the exposure value is divided within its selectable range, the plurality of intervals into which the color temperature value is divided within its selectable range, and the plurality of intervals into which the color difference degree parameter is divided within its selectable range.
3. The image noise reduction control method according to claim 1, wherein the photographing parameters comprise: an exposure value and a focal length;
the matching relation between the shooting parameters, the color difference degree parameters, and the noise reduction schemes comprises: matching relations between the noise reduction schemes and all combination results of the plurality of intervals into which the focal length is divided within its selectable range, the plurality of intervals into which the exposure value is divided within its selectable range, and the plurality of intervals into which the color difference degree parameter is divided within its selectable range.
4. The image noise reduction control method according to claim 1, wherein the photographing parameters comprise: an exposure value, a focal length, and a color temperature value;
the matching relation between the shooting parameters, the color difference degree parameters, and the noise reduction schemes comprises: matching relations between the noise reduction schemes and all combination results of the plurality of intervals into which the exposure value is divided within its selectable range, the plurality of intervals into which the focal length is divided within its selectable range, the plurality of intervals into which the color temperature value is divided within its selectable range, and the plurality of intervals into which the color difference degree parameter is divided within its selectable range.
5. The image noise reduction control method according to any one of claims 1 to 4, wherein the noise reduction scheme comprises:
noise reduction schemes with different noise reduction coefficients; and/or
Noise reduction schemes of different noise reduction methods.
6. The image noise reduction control method according to any one of claims 1 to 5, wherein the correspondence is updated when an update instruction is received.
7. The image noise reduction control method according to any one of claims 1 to 6, wherein the update instruction is issued by a user or a server.
8. The image noise reduction control method according to any one of claims 1 to 7, wherein the updating comprises:
deleting the corresponding relation, adding the corresponding relation newly, and modifying the corresponding relation.
9. A terminal, characterized in that the terminal comprises a processor, a memory and a communication bus;
the communication bus is used for realizing communication connection between the processor and the memory;
the processor is configured to execute one or more programs stored in the memory to implement the steps of the image noise reduction control method according to any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores one or more programs which are executable by one or more processors to implement the steps of the image noise reduction control method according to any one of claims 1 to 8.
CN201911286509.6A 2019-12-13 2019-12-13 Image noise reduction control method, terminal and computer readable storage medium Active CN111050026B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911286509.6A CN111050026B (en) 2019-12-13 2019-12-13 Image noise reduction control method, terminal and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911286509.6A CN111050026B (en) 2019-12-13 2019-12-13 Image noise reduction control method, terminal and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111050026A true CN111050026A (en) 2020-04-21
CN111050026B CN111050026B (en) 2022-08-19

Family

ID=70236357

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911286509.6A Active CN111050026B (en) 2019-12-13 2019-12-13 Image noise reduction control method, terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111050026B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000023181A * 1998-07-03 2000-01-21 Hitachi Ltd Display device for color video signal
CN101282417A * 2007-03-29 2008-10-08 Sony Corporation Method of and apparatus for image denoising
JP2010062802A * 2008-09-03 2010-03-18 Ricoh Co Ltd Imaging apparatus, imaging method, and computer readable recording medium storing program for executing the method
CN108230270A * 2017-12-28 2018-06-29 Nubia Technology Co Ltd Noise reduction method, terminal and computer readable storage medium
CN109151257A * 2018-09-20 2019-01-04 Zhejiang Dahua Technology Co Ltd Image processing method and video camera
CN110290289A * 2019-06-13 2019-09-27 Guangdong Oppo Mobile Telecommunications Corp Ltd Image denoising method, device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111050026B (en) 2022-08-19

Similar Documents

Publication Publication Date Title
KR102598109B1 (en) Electronic device and method for providing notification relative to image displayed via display and image stored in memory based on image analysis
JP6803982B2 (en) Optical imaging method and equipment
CN105959543B Reflection-removing photographing apparatus and method
US8749653B2 (en) Apparatus and method of blurring background of image in digital image processing device
US20170004603A1 (en) Image processing device, imaging device, image processing method, and image processing program
US9906732B2 (en) Image processing device, image capture device, image processing method, and program
US9699427B2 (en) Imaging device, imaging method, and image processing device
CN106131441B (en) Photographing method and device and electronic equipment
JP2012199675A (en) Image processing apparatus, image processing method, and program
CN112840642B (en) Image shooting method and terminal equipment
CN113452898B (en) Photographing method and device
US9965833B2 (en) Optical system characteristic data storing device, method, and program for image restoration filtering
CN105469357B (en) Image processing method, device and terminal
CN111741187B (en) Image processing method, device and storage medium
JP6534780B2 (en) Imaging device, imaging method, and program
CN116055897A (en) Photographing method and related equipment thereof
JP5768193B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JPWO2019064753A1 (en) Image processing equipment, imaging equipment, image processing methods, imaging methods, and programs
CN108156392B (en) Shooting method, terminal and computer readable storage medium
CN107404616B (en) Photographing method for adjusting light sensitivity of camera, mobile terminal and storage device
JP2013017218A (en) Image processing device, image processing method, and program
KR20150022531A (en) Photographing apparatus and method
CN107071293B (en) Shooting device, method and mobile terminal
CN111050026B (en) Image noise reduction control method, terminal and computer readable storage medium
CN116055855B (en) Image processing method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant