CN111147695B - Image processing method, image processor, shooting device and electronic equipment - Google Patents


Info

Publication number
CN111147695B
CN111147695B CN201911420578.1A
Authority
CN
China
Legal status
Active
Application number
CN201911420578.1A
Other languages
Chinese (zh)
Other versions
CN111147695A (en)
Inventor
李小朋
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911420578.1A priority Critical patent/CN111147695B/en
Publication of CN111147695A publication Critical patent/CN111147695A/en
Application granted granted Critical
Publication of CN111147695B publication Critical patent/CN111147695B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an image processing method, an image processor, a shooting device and electronic equipment. The image processing method includes: an algorithm post-processing module receives a current image output by an image sensor and a current photosensitive value, transmitted by a hardware abstraction module, that corresponds to the current image; a current noise reduction parameter is determined according to the current photosensitive value, wherein the range of photosensitive values is divided into a plurality of photosensitive intervals, each photosensitive interval includes a first photosensitive endpoint value and a second photosensitive endpoint value larger than the first, and the current noise reduction parameter is determined from the first noise reduction parameter corresponding to the first photosensitive endpoint value of the photosensitive interval in which the current photosensitive value falls and the second noise reduction parameter corresponding to the second photosensitive endpoint value; and noise reduction processing is performed on the current image according to the current noise reduction parameter. In the embodiments of the application, different photosensitive values use different noise reduction parameters, so the noise reduction parameters are more flexible and vary more smoothly; images can be clearer, with fewer noise points and a better effect.

Description

Image processing method, image processor, shooting device and electronic equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, an image processor, a photographing apparatus, and an electronic device.
Background
With the development of image processing technology, users' requirements for image quality are becoming higher and higher. At present, image noise reduction generally uses fixed noise reduction parameters. Because the same set of noise reduction parameters is used under different conditions, the noise reduction effect is poor: in some scenes the image is over-smoothed and details are smeared, while in others the noise is under-suppressed and the image remains grainy.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processor, a shooting device and electronic equipment.
The image processing method of the embodiment of the application comprises the following steps: the algorithm post-processing module receives a current image output by the image sensor and a current photosensitive value which is transmitted by the hardware abstraction module and corresponds to the current image; determining a current noise reduction parameter according to the current photosensitive value, wherein the photosensitive value is divided into a plurality of photosensitive intervals, each photosensitive interval comprises a first photosensitive endpoint value and a second photosensitive endpoint value which is larger than the first photosensitive endpoint value, the first photosensitive endpoint value corresponds to a first noise reduction parameter, the second photosensitive endpoint value corresponds to a second noise reduction parameter, and the current noise reduction parameter is determined by the first noise reduction parameter corresponding to the first photosensitive endpoint value of the photosensitive interval where the current photosensitive value is located and the second noise reduction parameter corresponding to the second photosensitive endpoint value; and carrying out noise reduction processing on the current image according to the current noise reduction parameter.
The image processor of the embodiment of the application comprises an algorithm post-processing module and a hardware abstraction module. The algorithm post-processing module is used for: receiving a current image output by an image sensor and a current photosensitive value which is transmitted by the hardware abstraction module and corresponds to the current image; determining a current noise reduction parameter according to the current photosensitive value, wherein the photosensitive value is divided into a plurality of photosensitive intervals, each photosensitive interval comprises a first photosensitive endpoint value and a second photosensitive endpoint value which is larger than the first photosensitive endpoint value, the first photosensitive endpoint value corresponds to a first noise reduction parameter, the second photosensitive endpoint value corresponds to a second noise reduction parameter, and the current noise reduction parameter is determined by the first noise reduction parameter corresponding to the first photosensitive endpoint value of the photosensitive interval where the current photosensitive value is located and the second noise reduction parameter corresponding to the second photosensitive endpoint value; and carrying out noise reduction processing on the current image according to the current noise reduction parameter.
The shooting device of the embodiment of the application comprises an image processor and an image sensor, wherein the image sensor is connected with the image processor. The image processor comprises an algorithm post-processing module and a hardware abstraction module. The algorithm post-processing module is used for: receiving a current image output by an image sensor and a current photosensitive value which is transmitted by the hardware abstraction module and corresponds to the current image; determining a current noise reduction parameter according to the current photosensitive value, wherein the photosensitive value is divided into a plurality of photosensitive intervals, each photosensitive interval comprises a first photosensitive endpoint value and a second photosensitive endpoint value which is larger than the first photosensitive endpoint value, the first photosensitive endpoint value corresponds to a first noise reduction parameter, the second photosensitive endpoint value corresponds to a second noise reduction parameter, and the current noise reduction parameter is determined by the first noise reduction parameter corresponding to the first photosensitive endpoint value of the photosensitive interval where the current photosensitive value is located and the second noise reduction parameter corresponding to the second photosensitive endpoint value; and carrying out noise reduction processing on the current image according to the current noise reduction parameter.
The electronic equipment of the embodiment of the application comprises a shooting device and a shell, wherein the shooting device is combined with the shell. The shooting device comprises an image processor and an image sensor, and the image sensor is connected with the image processor. The image processor comprises an algorithm post-processing module and a hardware abstraction module. The algorithm post-processing module is used for: receiving a current image output by an image sensor and a current photosensitive value which is transmitted by the hardware abstraction module and corresponds to the current image; determining a current noise reduction parameter according to the current photosensitive value, wherein the photosensitive value is divided into a plurality of photosensitive intervals, each photosensitive interval comprises a first photosensitive endpoint value and a second photosensitive endpoint value which is larger than the first photosensitive endpoint value, the first photosensitive endpoint value corresponds to a first noise reduction parameter, the second photosensitive endpoint value corresponds to a second noise reduction parameter, and the current noise reduction parameter is determined by the first noise reduction parameter corresponding to the first photosensitive endpoint value of the photosensitive interval where the current photosensitive value is located and the second noise reduction parameter corresponding to the second photosensitive endpoint value; and carrying out noise reduction processing on the current image according to the current noise reduction parameter.
In the image processing method, the image processor, the shooting device and the electronic equipment of the embodiments of the application, the range of photosensitive values is divided into a plurality of photosensitive intervals. In each photosensitive interval, the first photosensitive endpoint value corresponds to a first noise reduction parameter and the second photosensitive endpoint value corresponds to a second noise reduction parameter, and the current noise reduction parameter for the current image is determined from the first noise reduction parameter corresponding to the first photosensitive endpoint value of the photosensitive interval in which the current photosensitive value falls and the second noise reduction parameter corresponding to the second photosensitive endpoint value. Therefore, different photosensitive values use different noise reduction parameters; the noise reduction parameters are more flexible and vary more smoothly, so the image can be clearer, with fewer noise points and a better effect.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic view of a camera according to some embodiments of the present application;
FIG. 2 is a schematic diagram illustrating a correspondence between noise reduction parameters and photosensitive values according to some embodiments of the present application;
FIG. 3 is a schematic diagram illustrating a correspondence between noise reduction parameters and photosensitive values according to some embodiments of the present application;
FIG. 4 is a schematic diagram of an algorithmic post-processing module in accordance with certain embodiments of the present application;
FIG. 5 is a schematic view of a camera according to some embodiments of the present application;
FIG. 6 is a schematic structural diagram of an electronic device according to some embodiments of the present application;
FIG. 7 is a schematic structural diagram of an electronic device according to some embodiments of the present application;
FIG. 8 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 9 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 10 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 11 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 12 is a flow chart illustrating an image processing method according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
The following disclosure provides many different embodiments or examples for implementing different configurations of embodiments of the application. In order to simplify the disclosure of the embodiments of the present application, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present application.
Referring to FIG. 1, an embodiment of the present application provides a photographing apparatus 100. The photographing apparatus 100 includes an image processor 10 and an image sensor 20. The image processor 10 is connected to the image sensor 20. The image sensor 20 includes an image acquisition unit (sensor) 22 and a RAW image data unit (IFE) 24. The image acquisition unit 22 is configured to receive light to acquire image data (a RAW image), and the RAW image data unit 24 is configured to process the RAW image acquired by the image acquisition unit 22 and output the processed RAW image to the image processor 10.
The image processor 10 includes a hardware abstraction module 12, an application program module (APP) 14, and an algorithm post-processing module (Algo Process Service, APS) 16.
The hardware abstraction module 12 is configured to receive a RAW image, convert the RAW image into a YUV image, and transmit the RAW image and/or the YUV image. The hardware abstraction module 12 may be connected to the image sensor 20. Specifically, the hardware abstraction module 12 may include a buffer unit (buffer queue) 122 connected to the image sensor 20, a RAW-to-RGB processing unit (BPS) 124, and an image processing engine (IPE) 126 connected to the application module 14. The buffer unit 122 is used for buffering the RAW image from the image sensor 20 and transmitting the RAW image to the algorithm post-processing module 16 through the application module 14. The RAW-to-RGB processing unit 124 is configured to convert the RAW image from the buffer unit 122 into an RGB image. The image processing engine 126 is configured to process the RGB image (e.g., denoising and YUV post-processing) to obtain a YUV image and transmit the YUV image to the algorithm post-processing module 16 through the application module 14. The hardware abstraction module 12 may also transmit metadata of the image data. The metadata includes 3A information (automatic exposure control AE, automatic focus control AF, automatic white balance control AWB), picture information (e.g., image width and height), exposure parameters (aperture size, shutter speed, and photosensitive value), etc., and post-photographing processing of the RAW image and/or the YUV image (e.g., at least one of beauty processing, filter processing, rotation processing, watermarking processing, blurring processing, HDR processing, and multi-frame noise reduction processing) can be carried out with the assistance of the metadata. In one embodiment, the metadata includes photosensitive value (ISO) information, with which the brightness of the RAW image and/or the YUV image can be adjusted, thereby enabling post-photographing processing associated with brightness adjustment.
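The metadata described above can be pictured as a simple per-frame record. This is only an illustrative sketch: the class and field names below are assumptions for exposition, not the patent's actual data layout.

```python
from dataclasses import dataclass

# Hypothetical record of the metadata the hardware abstraction module
# transmits alongside each frame; all names here are illustrative.
@dataclass
class FrameMetadata:
    width: int             # picture information: image width
    height: int            # picture information: image height
    iso: int               # photosensitive value (ISO)
    aperture: float        # exposure parameter: aperture size (f-number)
    shutter_speed: float   # exposure parameter: shutter speed in seconds
    awb_gains: tuple       # 3A: auto white balance gains (r, g, b)

meta = FrameMetadata(width=4000, height=3000, iso=400,
                     aperture=1.8, shutter_speed=1 / 120,
                     awb_gains=(1.9, 1.0, 1.6))
```

Downstream post-photographing steps (e.g., brightness-related processing) would read fields such as `iso` from this record rather than re-deriving them from pixels.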
Because the hardware abstraction module 12 does not perform post-photographing processing on the RAW image and/or the YUV image (it may, for example, only receive the RAW image, convert the RAW image into the YUV image, and transmit the RAW image and/or the YUV image), the image processing algorithms for post-photographing processing do not need to cut into the algorithm framework of the hardware abstraction module 12 itself; they only need to be externally compatible, which reduces the design difficulty.
In the related art, an application program interface (API) establishes the hardware abstraction module as a pipeline. Since establishing a pipeline requires considerable time and memory, all the pipelines used by the camera's working modes need to be established when the camera is started, and to implement various image processing algorithms a large number of pipelines (for example, more than three) generally need to be established, which makes camera startup time-consuming and memory-intensive. The hardware abstraction module 12 of the embodiment of the present application does not perform post-photographing processing on the RAW image and/or the YUV image, and therefore only needs to establish a small number of pipelines (for example, one or two) rather than a large number, so memory can be saved and the starting speed of the camera can be increased.
The application module 14 is used to interface with the hardware abstraction module 12. The application module 14 may be configured to generate control commands according to user input and send the control commands to the image sensor 20 through the hardware abstraction module 12 to control the image sensor 20 accordingly. The application module 14 can run as a 64-bit process, and the static link library (lib) of the image processing algorithms for post-photographing processing can be configured as 64-bit to improve the operation speed. After receiving the RAW image and/or the YUV image transmitted by the hardware abstraction module 12, the application module 14 may perform post-photographing processing on the RAW image and/or the YUV image itself, or may transmit the RAW image and/or the YUV image to the algorithm post-processing module 16 for post-photographing processing. It is also possible that the application module 14 performs some post-photographing processing (e.g., beauty processing, filter processing, rotation processing, watermarking processing, blurring processing, etc.) while the algorithm post-processing module 16 performs other post-photographing processing (e.g., HDR processing, multi-frame noise reduction processing, etc.). In the embodiment of the present application, the application module 14 transmits the RAW and/or YUV images to the algorithm post-processing module 16 for post-photographing processing.
The algorithm post-processing module 16 is connected to the hardware abstraction module 12 through the application module 14. At least one image processing algorithm (for example, at least one of a beauty processing algorithm, a filter processing algorithm, a rotation processing algorithm, a watermark processing algorithm, a blurring processing algorithm, an HDR processing algorithm, and a multi-frame noise reduction processing algorithm) is stored in the algorithm post-processing module 16, and the algorithm post-processing module 16 is configured to process the RAW image and/or the YUV image by using the image processing algorithm to implement post-photographing processing. Since the post-photographing processing of the RAW image and/or the YUV image can be realized by the algorithm post-processing module 16, no flow truncation is required in the algorithm architecture of the hardware abstraction module 12; only external compatibility is required, which reduces the design difficulty. Moreover, because the post-photographing processing is realized by the algorithm post-processing module 16, the function of the algorithm post-processing module 16 is simpler and more focused, which brings benefits such as easy porting and simple extension with new image processing algorithms.
Of course, if the application module 14 performs some post-photographing processing (e.g., beauty processing, filter processing, rotation processing, watermark processing, blurring processing, etc.), and the algorithm post-processing module 16 performs other post-photographing processing (e.g., HDR processing, multi-frame noise reduction processing, etc.), at least one image processing algorithm (e.g., at least one of beauty processing algorithm, filter processing algorithm, rotation processing algorithm, watermark processing algorithm, blurring processing algorithm, HDR processing algorithm, and multi-frame noise reduction processing algorithm) may also be stored in the application module 14, and the application module 14 is further configured to process the RAW image and/or the YUV image using the image processing algorithm to implement the post-photographing processing. Since the post-photographing processing of the RAW image and/or the YUV image is realized by the application module 14 and the algorithm post-processing module 16, the process truncation is not required on the algorithm architecture of the hardware abstraction module 12, and only the external compatibility is required, so that the design difficulty is also greatly reduced.
When the algorithm post-processing module 16 processes only the RAW image (for example, the image processing algorithm operates on the RAW image), the hardware abstraction module 12 may transmit only the RAW image (in this case it may not be necessary to convert the RAW image into the YUV image); when the algorithm post-processing module 16 processes only YUV images (e.g., the image processing algorithm operates on YUV images), the hardware abstraction module 12 may transmit only YUV images; and when the algorithm post-processing module 16 processes both the RAW image and the YUV image, the hardware abstraction module 12 may transmit both the RAW image and the YUV image.
Referring to FIG. 1, in some embodiments, the image sensor 20 outputs the current image (e.g., a RAW image) to the hardware abstraction module 12 after acquiring it. The hardware abstraction module 12 is configured to receive the current image output by the image sensor 20, acquire the current photosensitive value corresponding to the current image, and then send the current image and the current photosensitive value to the application module 14. The application module 14 is used to send the current image and the current photosensitive value to the algorithm post-processing module 16. Thus, the algorithm post-processing module 16 can receive the current image output by the image sensor 20 and the current photosensitive value, transmitted by the hardware abstraction module 12, corresponding to the current image. The current photosensitive value corresponding to the current image refers to the photosensitive value at the moment the current image is captured. For example, if the image sensor 20 acquires the current image at time T1, the hardware abstraction module 12 acquires the photosensitive value at time T1, and that photosensitive value corresponds to the current image; if the image sensor 20 acquires the current image at time T2, the hardware abstraction module 12 acquires the photosensitive value at time T2, and that photosensitive value likewise corresponds to the current image.
The hardware abstraction module 12 may package the current image and the current photosensitive value together for transmission to the application module 14, or may transmit them to the application module 14 in sequence. After receiving the current image and the current photosensitive value sent by the hardware abstraction module 12, the application module 14 either packages them together and sends them to the algorithm post-processing module 16, or sends them to the algorithm post-processing module 16 in sequence.
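The packaging step above can be sketched as a minimal record pairing each frame with the photosensitive value read at its capture time, so the two stay matched as they travel through the modules. The class and field names are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass

# Hypothetical package pairing the current image with the current
# photosensitive value (ISO), as handed from the hardware abstraction
# module through the application module to the algorithm post-processing
# module. Names are illustrative only.
@dataclass(frozen=True)
class FramePackage:
    raw_image: bytes   # current image output by the image sensor
    iso: int           # photosensitive value read at capture time
    timestamp_us: int  # capture time (e.g., T1), keeping image and ISO matched

pkg = FramePackage(raw_image=b"\x00" * 16, iso=400, timestamp_us=1_000_000)
```

Bundling the ISO with the frame (rather than querying it later) guarantees the noise reduction parameter is derived from the sensitivity that actually produced the image.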
Referring to FIG. 2, after obtaining the current photosensitive value, the algorithm post-processing module 16 determines the current noise reduction parameter according to the current photosensitive value. The range of photosensitive values is divided into a plurality of photosensitive intervals, and each photosensitive interval includes a first photosensitive endpoint value and a second photosensitive endpoint value larger than the first photosensitive endpoint value. The first photosensitive endpoint value corresponds to a first noise reduction parameter, and the second photosensitive endpoint value corresponds to a second noise reduction parameter. The current noise reduction parameter is determined by the first noise reduction parameter corresponding to the first photosensitive endpoint value of the photosensitive interval in which the current photosensitive value falls and the second noise reduction parameter corresponding to the second photosensitive endpoint value. In this way, different photosensitive values use different noise reduction parameters; the noise reduction parameters are more flexible and vary more smoothly, so images can be clearer, with fewer noise points and a better effect.
Taking FIG. 2 as an example, the photosensitive value can be divided into three photosensitive intervals: a first photosensitive interval (100-400), a second photosensitive interval (400-800), and a third photosensitive interval (800-1200). The first photosensitive interval (100-400) includes a first photosensitive endpoint value 100 and a second photosensitive endpoint value 400, the second photosensitive interval (400-800) includes a first photosensitive endpoint value 400 and a second photosensitive endpoint value 800, and the third photosensitive interval (800-1200) includes a first photosensitive endpoint value 800 and a second photosensitive endpoint value 1200.
The endpoint values of adjacent photosensitive intervals may overlap. For example, the first photosensitive endpoint value of the second photosensitive interval (400-800) coincides with the second photosensitive endpoint value of the first photosensitive interval (100-400), and the first photosensitive endpoint value of the third photosensitive interval (800-1200) coincides with the second photosensitive endpoint value of the second photosensitive interval (400-800).
The difference between the second photosensitive endpoint value and the first photosensitive endpoint value may be the same or different between different photosensitive intervals. That is, the photosensitive value spans of different photosensitive intervals may be the same or different. For example, in the first photosensitive interval (100-400), the difference between the second photosensitive endpoint value and the first photosensitive endpoint value is 400-100=300; in the second photosensitive interval (400-800), the difference is 800-400=400; in the third photosensitive interval (800-1200), the difference is 1200-800=400. In this example, the difference between the second photosensitive endpoint value and the first photosensitive endpoint value differs between the first photosensitive interval and the second photosensitive interval, while it is the same between the second photosensitive interval and the third photosensitive interval.
Further, when the difference between the second photosensitive endpoint value and the first photosensitive endpoint value differs between photosensitive intervals, the smaller the photosensitive value, the larger the difference between the two endpoint values of the corresponding photosensitive interval, i.e., the larger the photosensitive value span; and the larger the photosensitive value, the smaller the difference between the two endpoint values of the corresponding photosensitive interval, i.e., the smaller the photosensitive value span.
It can be understood that a smaller photosensitive value indicates a brighter environment, in which the image contains fewer noise points; the degree of noise reduction therefore does not need to be high, the noise reduction parameter does not need to change much, and the photosensitive value span of the corresponding photosensitive interval can be set larger. A larger photosensitive value indicates a darker environment, in which the image contains more noise points; the degree of noise reduction needs to be higher, the noise reduction parameter may change considerably, and the photosensitive value span of the corresponding photosensitive interval can be set smaller. In one example, the photosensitive value can be divided into five photosensitive intervals: (100-500), (500-800), (800-1000), (1000-1150), and (1150-1200). Where the photosensitive value is smaller, the difference between the second and first photosensitive endpoint values of the corresponding interval is larger; for example, the photosensitive values in the interval (100-500) are smaller, and the difference between its endpoint values is larger. Where the photosensitive value is larger, the difference between the second and first photosensitive endpoint values of the corresponding interval is smaller; for example, the photosensitive values in the interval (1150-1200) are larger, and the difference between its endpoint values is smaller.
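The five-interval division above can be sketched as a lookup table with wider intervals at low ISO (bright scenes, few noise points) and narrower intervals at high ISO (dark scenes, heavy noise). The interval bounds come from the example in the text; the lookup function itself is an assumed implementation, not the patent's.

```python
# Interval bounds from the five-interval example; spans shrink as ISO grows.
INTERVALS = [(100, 500), (500, 800), (800, 1000), (1000, 1150), (1150, 1200)]

def find_interval(iso):
    """Return the (first_endpoint, second_endpoint) pair containing iso.

    Adjacent intervals share endpoints, so a boundary ISO (e.g., 500)
    resolves to the first matching interval.
    """
    for lo, hi in INTERVALS:
        if lo <= iso <= hi:
            return (lo, hi)
    raise ValueError(f"photosensitive value {iso} outside supported range")
```

For instance, `find_interval(300)` falls in the widest interval (100, 500), while `find_interval(1180)` falls in the narrowest, (1150, 1200).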
In each photosensitive interval, the first photosensitive endpoint value corresponds to the first noise reduction parameter, and the second photosensitive endpoint value corresponds to the second noise reduction parameter. Still taking fig. 2 as an example, in the first photosensitive interval (100-400), the first photosensitive endpoint value 100 corresponds to the first noise reduction parameter P1, and the second photosensitive endpoint value 400 corresponds to the second noise reduction parameter P2; in the second photosensitive interval (400-800), the first photosensitive endpoint value 400 corresponds to the first noise reduction parameter P2, and the second photosensitive endpoint value 800 corresponds to the second noise reduction parameter P3; in the third photosensitive interval (800-1200), the first photosensitive endpoint value 800 corresponds to the first noise reduction parameter P3, and the second photosensitive endpoint value 1200 corresponds to the second noise reduction parameter P4. Here, P1 < P2 < P3 < P4. The smaller the noise reduction parameter, the lower the degree of noise reduction; the larger the noise reduction parameter, the higher the degree of noise reduction.
For each photosensitive interval, a noise reduction parameter corresponding to a photosensitive value located between the first photosensitive endpoint value and the second photosensitive endpoint value may be determined according to the first noise reduction parameter and the second noise reduction parameter. For example, for a first photosensitive interval (100-400), the noise reduction parameter corresponding to the photosensitive value 200 between the first photosensitive endpoint value and the second photosensitive endpoint value can be determined according to the first noise reduction parameter P1 corresponding to the first photosensitive endpoint value 100 and the second noise reduction parameter P2 corresponding to the second photosensitive endpoint value 400; for the second photosensitive interval (400-800), the noise reduction parameter corresponding to the photosensitive value 600 between the first photosensitive endpoint value and the second photosensitive endpoint value can be determined according to the first noise reduction parameter P2 corresponding to the first photosensitive endpoint value 400 and the second noise reduction parameter P3 corresponding to the second photosensitive endpoint value 800; for the third photosensitive interval (800-1200), the noise reduction parameter corresponding to the photosensitive value 1000 between the first photosensitive endpoint value and the second photosensitive endpoint value can be determined according to the first noise reduction parameter P3 corresponding to the first photosensitive endpoint value 800 and the second noise reduction parameter P4 corresponding to the second photosensitive endpoint value 1200.
Further, a photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value is positively correlated with its corresponding noise reduction parameter. That is, the smaller the photosensitive value, the smaller the corresponding noise reduction parameter; the larger the photosensitive value, the larger the corresponding noise reduction parameter. For example, in the first photosensitive interval (100-400), the noise reduction parameter corresponding to the photosensitive value 200 is smaller than that corresponding to the photosensitive value 300; in the second photosensitive interval (400-800), the noise reduction parameter corresponding to the photosensitive value 500 is smaller than that corresponding to the photosensitive value 600; in the third photosensitive interval (800-1200), the noise reduction parameter corresponding to the photosensitive value 900 is smaller than that corresponding to the photosensitive value 1000. It can be understood that when the photosensitive value is smaller, the environment is brighter and the image contains fewer noise points, so the degree of noise reduction does not need to be high and the corresponding noise reduction parameter can be set smaller, to avoid severe smearing of the image; when the photosensitive value is larger, the environment is darker and the image contains more noise points, so the degree of noise reduction needs to be higher and the corresponding noise reduction parameter can be set larger, to prevent noise points from remaining. In the embodiments of the present application, the photosensitive value and the corresponding noise reduction parameter can be positively correlated over the entire range of photosensitive values.
Referring to fig. 2, as the photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value increases, the corresponding noise reduction parameter may increase linearly. In this way, the noise reduction parameter changes smoothly, the picture changes more naturally, and the effect is better. Referring to fig. 3, in other embodiments, as the photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value increases, the corresponding noise reduction parameter may instead increase exponentially. In that case, the larger the photosensitive value, the faster the corresponding noise reduction parameter increases, so that noise is removed more aggressively when the ambient brightness is darker.
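One possible curve consistent with the exponential variant just described can be sketched as follows. The patent does not fix the exact exponential form or the endpoint parameters, so the function shape and the values used here are assumptions chosen only to satisfy the stated properties (matches the endpoints, grows faster at larger photosensitive values):

```python
import math

def exponential_param(iso: float, low: float, high: float,
                      p_low: float, p_high: float) -> float:
    """One illustrative exponential mapping over a single photosensitive interval.

    Maps iso == low to p_low and iso == high to p_high; because the curve is
    convex, the parameter rises faster toward the high (darker) end.
    """
    t = (iso - low) / (high - low)  # normalized position in [0, 1]
    return p_low + (p_high - p_low) * (math.exp(t) - 1.0) / (math.e - 1.0)
```

At the midpoint of an interval this curve sits below the linear interpolation, reflecting the described behavior of reserving the steepest parameter growth for the darkest conditions.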
When the noise reduction parameter increases linearly with the photosensitive value, the photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value and the corresponding noise reduction parameter may satisfy the following condition:
output=input1+ratio*(input2-input1);
0<ratio<1;
where output is the noise reduction parameter corresponding to a photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value, input1 is the first noise reduction parameter, input2 is the second noise reduction parameter, and ratio is a noise reduction coefficient that increases linearly with the photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value.
Taking the second photosensitive interval (400-800) as an example, if the first noise reduction parameter corresponding to the first photosensitive endpoint value 400 is P2 and the second noise reduction parameter corresponding to the second photosensitive endpoint value 800 is P3, then output = P2 + ratio*(P3-P2). Since ratio increases linearly with the photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value, assume that the photosensitive value 500 corresponds to a ratio of 0.25, the photosensitive value 600 corresponds to a ratio of 0.5, and the photosensitive value 700 corresponds to a ratio of 0.75 (the specific values of ratio can be preset, for example according to empirical values). Then the noise reduction parameter corresponding to the photosensitive value 500 is P2 + 0.25*(P3-P2); the noise reduction parameter corresponding to the photosensitive value 600 is P2 + 0.5*(P3-P2); and the noise reduction parameter corresponding to the photosensitive value 700 is P2 + 0.75*(P3-P2). By analogy, the algorithm post-processing module 16 can calculate, for each photosensitive interval, the noise reduction parameter corresponding to any photosensitive value located between the first photosensitive endpoint value and the second photosensitive endpoint value.
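The per-interval linear interpolation above can be sketched directly from the formula output = input1 + ratio*(input2-input1). The interval boundaries follow fig. 2; the concrete values for P1-P4 are illustrative placeholders, since the patent leaves them unspecified:

```python
# (first endpoint, second endpoint, first parameter, second parameter)
# Endpoints follow fig. 2; the parameter values 10/20/40/80 are illustrative.
INTERVALS = [
    (100, 400, 10.0, 20.0),
    (400, 800, 20.0, 40.0),
    (800, 1200, 40.0, 80.0),
]

def noise_reduction_parameter(iso: float) -> float:
    """Linearly interpolate the noise reduction parameter for a photosensitive value."""
    for low, high, p_low, p_high in INTERVALS:
        if low <= iso <= high:
            ratio = (iso - low) / (high - low)       # 0 <= ratio <= 1
            return p_low + ratio * (p_high - p_low)  # output = input1 + ratio*(input2-input1)
    raise ValueError(f"photosensitive value {iso} outside the supported range")
```

With these placeholder parameters, a photosensitive value of 500 yields ratio 0.25, matching the worked example in the text.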
In the embodiments of the present application, the algorithm post-processing module 16 may divide the photosensitive values into a plurality of photosensitive intervals in advance (for example, before leaving the factory), and then determine, for each photosensitive interval, the noise reduction parameter corresponding to each photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value according to the first noise reduction parameter and the second noise reduction parameter. Alternatively, the algorithm post-processing module 16 may store the correspondence between photosensitive values and noise reduction parameters in advance (for example, before shipping) and determine the current noise reduction parameter according to the current photosensitive value and this correspondence. In this way, the noise reduction parameter corresponding to each photosensitive value is determined (or preset) beforehand, and when the algorithm post-processing module 16 obtains the current photosensitive value, it can determine the current noise reduction parameter directly, which saves time.
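The stored-correspondence alternative amounts to precomputing a lookup table once (e.g., before shipping) and then answering each query in constant time. A minimal sketch, with interval boundaries from fig. 2 and illustrative parameter values:

```python
def _interp(iso: int, low: int, high: int, p_low: float, p_high: float) -> float:
    return p_low + (iso - low) / (high - low) * (p_high - p_low)

# Precomputed correspondence photosensitive value -> noise reduction parameter.
# Parameter values 10/20/40/80 are illustrative; the real table would be fixed
# before shipping.
TABLE = {}
for low, high, p_low, p_high in [(100, 400, 10.0, 20.0),
                                 (400, 800, 20.0, 40.0),
                                 (800, 1200, 40.0, 80.0)]:
    for iso in range(low, high + 1):
        TABLE[iso] = _interp(iso, low, high, p_low, p_high)

def current_noise_reduction_parameter(current_iso: int) -> float:
    # O(1) lookup at capture time instead of recomputing the interpolation.
    return TABLE[current_iso]
```

The trade-off is a small amount of memory for the table in exchange for avoiding any per-capture computation, consistent with the time-saving motivation above.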
Referring to fig. 1 and 2, after determining the current noise reduction parameter according to the current photosensitive value, the post-algorithm processing module 16 may further be configured to perform noise reduction processing on the current image according to the current noise reduction parameter.
Specifically, taking a current image whose current photosensitive value is 500 as an example, the current photosensitive value 500 lies in the second photosensitive interval (400-800), so the current noise reduction parameter corresponding to the current photosensitive value 500 is determined according to the first noise reduction parameter P2 corresponding to the first photosensitive endpoint value 400 and the second noise reduction parameter P3 corresponding to the second photosensitive endpoint value 800 of the second photosensitive interval (400-800). Assuming that the ratio corresponding to the current photosensitive value 500 is 0.25, the current noise reduction parameter corresponding to the current photosensitive value 500 is P2 + 0.25*(P3-P2). The algorithm post-processing module 16 then performs noise reduction processing on the current image according to the current noise reduction parameter P2 + 0.25*(P3-P2).
If the current image comprises multiple frames, the algorithm post-processing module 16 may determine, in the manner described above, the current noise reduction parameter corresponding to the current photosensitive value of each frame, and perform noise reduction processing on each frame according to its own current noise reduction parameter. Taking four frames as an example, if the algorithm post-processing module 16 determines that the current noise reduction parameters corresponding to the current photosensitive values of the first, second, third, and fourth frames are Q1, Q2, Q3, and Q4 respectively, it performs noise reduction on the first frame according to Q1, on the second frame according to Q2, on the third frame according to Q3, and on the fourth frame according to Q4. Because the photosensitive value may differ from frame to frame, determining the noise reduction parameter of each frame separately and performing noise reduction accordingly removes the noise of each frame more effectively, for a better result.
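The per-frame scheme above is a simple mapping: each frame is denoised with the parameter derived from its own photosensitive value. A minimal sketch in which the parameter lookup and the denoising operation are passed in as callables (both are placeholders for the module's real implementations):

```python
from typing import Callable, List

def denoise_frames(frames: List[list],
                   isos: List[int],
                   param_for_iso: Callable[[int], float],
                   denoise: Callable[[list, float], list]) -> List[list]:
    """Apply noise reduction per frame; each frame (Q1..Q4 above) uses the
    parameter determined from its own photosensitive value."""
    return [denoise(frame, param_for_iso(iso))
            for frame, iso in zip(frames, isos)]
```

For example, four captured frames with four different ISO values would each receive their own Q value rather than a single shared parameter.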
When obtaining multiple frames of images subjected to noise reduction, the algorithm post-processing module 16 may further perform synthesis or superposition processing on the multiple frames of images subjected to noise reduction to obtain an image finally used for output, so as to further eliminate noise and improve image definition.
In some embodiments, the algorithm post-processing module 16 includes a queue of noise reduction processing tasks. The algorithm post-processing module 16 is configured to calculate the processing time required by each noise reduction processing task, and to calculate the total running duration required by the plurality of noise reduction processing tasks from those processing times. When the application program module 14 receives an exit command, the algorithm post-processing module 16 exits only if the elapsed time since it started processing the noise reduction processing tasks has reached the total running duration. This ensures that the algorithm post-processing module 16 exits at an accurate time: image data is not lost, and waste of system memory and power is avoided as far as possible (if the algorithm post-processing module 16 exits too early, image data may be lost; if it exits too late, it keeps occupying memory and increases power consumption).
Specifically, there may be multiple noise reduction processing tasks in the queue of the algorithm post-processing module 16, a noise reduction processing task being the noise reduction processing of a current image according to its current noise reduction parameter. The processing time required by each noise reduction processing task may be the same or different. The algorithm post-processing module 16 estimates in advance the processing time required by each noise reduction processing task, and then calculates from these the total running duration required by the plurality of tasks. For example, if the queue contains four noise reduction processing tasks in total and the algorithm post-processing module 16 estimates in advance that the first, second, third, and fourth tasks require processing times T1, T2, T3, and T4 respectively, it adds T1, T2, T3, and T4 to obtain the total running duration T required by all the noise reduction processing tasks. In one embodiment, when calculating the total running duration required by the plurality of noise reduction processing tasks, the algorithm post-processing module 16 may further add a margin time to the processing times required by the tasks to obtain the total running duration.
The margin time is an amount of time added on top of the processing times required by the plurality of noise reduction processing tasks, to ensure that the algorithm post-processing module 16 has finished all the noise reduction processing tasks in the queue. The margin time may be 5% to 15% of the sum of the processing times required by the plurality of noise reduction processing tasks. Still taking a queue of four noise reduction processing tasks as an example, if the margin time is T0, the algorithm post-processing module 16 adds the processing times T1, T2, T3, and T4 required by the first, second, third, and fourth tasks and the margin time T0 to obtain the total running duration T required by all the noise reduction processing tasks.
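The total running duration computation above can be sketched as a sum plus a fractional margin. The 10% default below is one value inside the 5%-15% range stated in the text:

```python
def total_running_duration(task_times: list, margin_fraction: float = 0.10) -> float:
    """Sum the per-task processing times and add a margin time.

    The margin is a fraction (5%-15%, per the text) of the summed processing
    times, ensuring all queued tasks finish before the module exits.
    """
    if not 0.05 <= margin_fraction <= 0.15:
        raise ValueError("margin should be 5% to 15% of the summed processing time")
    total = sum(task_times)
    return total + margin_fraction * total
```

With four tasks of 1, 2, 3, and 4 seconds and a 10% margin, the total running duration would be 11 seconds.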
After the algorithm post-processing module 16 has calculated, in the above manner, the total running duration required by all the noise reduction processing tasks in the queue, timing starts from the moment the algorithm post-processing module 16 begins to process the noise reduction processing tasks. If the application program module 14 receives an exit command input by the user, the algorithm post-processing module 16 judges whether the elapsed time since it began processing the noise reduction processing tasks (i.e., the difference between the moment the application program module 14 received the exit command and the moment the algorithm post-processing module 16 began processing the noise reduction processing tasks) has reached (i.e., is greater than or equal to) the total running duration. If so, the algorithm post-processing module 16 exits; if not, the algorithm post-processing module 16 continues to run in the background until the elapsed time reaches the total running duration.
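The exit decision above reduces to a single comparison between the elapsed time and the total running duration. A minimal sketch (timestamps in seconds; names are illustrative):

```python
def should_exit(start_time: float, now: float, total_duration: float) -> bool:
    """Exit only when the elapsed time since the first noise reduction task
    started reaches (is greater than or equal to) the total running duration."""
    return (now - start_time) >= total_duration
```

If the exit command arrives before the total running duration has elapsed, the module keeps running in the background and re-checks this condition until it holds.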
In some embodiments, the hardware abstraction module 12 may send a frame number suggestion to the application program module 14 according to the exposure information, the jitter condition of the gyroscope, the AR scene detection result (the detected scene type, such as people, animals, or scenery), and the like. For example, when the jitter detected by the gyroscope is large, the frame number suggestion sent by the hardware abstraction module 12 to the application program module 14 may be to use more frames, to better realize post-photographing processing; when the jitter detected by the gyroscope is small, the suggestion may be to use fewer frames, to reduce the amount of data transmitted. That is, the number of frames that the hardware abstraction module 12 suggests to the application program module 14 may be positively correlated with the degree of jitter detected by the gyroscope. The hardware abstraction module 12 may also send an algorithm suggestion to the application program module 14 according to the exposure information, the jitter condition of the gyroscope, the AR scene detection result, and the like. For example, when the jitter detected by the gyroscope is large, the algorithm suggestion may be multi-frame noise reduction processing, to eliminate jitter and noise; when the scene type detected by the AR scene detection result is a person, the algorithm suggestion may be beautification processing, to beautify the person; when the scene type detected is scenery, the algorithm suggestion may be HDR processing, to form a high-dynamic-range scenery image.
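The positive correlation between gyroscope jitter and the suggested frame count can be sketched as follows. The frame bounds and the jitter normalization ceiling are illustrative assumptions; the patent only states the correlation, not concrete thresholds:

```python
def suggest_frame_count(jitter: float,
                        min_frames: int = 3,
                        max_frames: int = 8,
                        jitter_ceiling: float = 1.0) -> int:
    """Suggest a frame count positively correlated with detected gyroscope jitter.

    jitter is clamped to [0, jitter_ceiling] and mapped linearly onto
    [min_frames, max_frames]; all numeric bounds here are illustrative.
    """
    level = min(max(jitter / jitter_ceiling, 0.0), 1.0)
    return min_frames + round(level * (max_frames - min_frames))
```

Low jitter thus yields the minimum frame count (less data to transmit), while strong jitter yields the maximum (more frames for multi-frame processing).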
The application program module 14 sends a data request to the hardware abstraction module 12 according to the frame number suggestion and the algorithm suggestion, the hardware abstraction module 12 transmits corresponding data to the application program module 14 according to the data request, and the application program module 14 transmits the data to the algorithm post-processing module 16 for post-processing after photographing.
After the image sensor 20 completes one shot (exposure imaging), it transmits the shot data (a RAW image) to the hardware abstraction module 12. Once the RAW image and/or the YUV image corresponding to the shot data has been received by the algorithm post-processing module 16, the image sensor 20 can perform the next shot or be turned off, and the application program module 14 can exit the application interface. Since post-photographing processing is implemented by the algorithm post-processing module 16, after the RAW image and/or the YUV image corresponding to the shot data has been transmitted to the algorithm post-processing module 16, the post-photographing processing can be completed by the algorithm post-processing module 16 alone; the image sensor 20 and the application program module 14 need not participate, so the image sensor 20 can be turned off or proceed to the next shot, and the application program module 14 can be closed or exit the application interface. In this way, the photographing apparatus 100 can achieve snapshot capture, and while the algorithm post-processing module 16 performs the post-photographing processing, the application program module 14 can be closed or its interface exited so that other operations (for example, operations unrelated to the photographing apparatus 100, such as browsing a web page, watching a video, or making a call) can be performed on the electronic device. The user therefore does not need to spend a long time waiting for the post-photographing processing to finish, which makes the electronic device more convenient to use.
The algorithm post-processing module 16 may include an encoding unit 162, and the encoding unit 162 is configured to convert the YUV image into a JPG image (or a JPEG image, etc.). Specifically, when the YUV image is processed by the post-algorithm processing module 16, the encoding unit 162 may directly encode the YUV image to form a JPG image, thereby increasing the output speed of the image. When the RAW image is processed by the post-algorithm processing module 16, the post-algorithm processing module 16 may transmit the RAW image processed to realize post-photographing processing back to the hardware abstraction module 12 through the application module 14, for example, back to the RAW to RGB processing unit 124, the RAW to RGB processing unit 124 may be configured to convert the RAW image processed by the post-algorithm processing module 16 to realize post-photographing processing and transmitted back through the application module 14 into an RGB image, the noise reduction and YUV post-processing unit 126 may convert the RGB image into a YUV image, and the YUV image may be transmitted to the encoding unit 162 of the post-algorithm processing module 16 again to convert the YUV image into a JPG image. In some embodiments, the algorithm post-processing module 16 may also transmit the RAW image processed to implement the post-photographing processing back to the buffer unit 122 through the application module 14, and the transmitted RAW image passes through the RAW to RGB processing unit 124 and the noise reduction and YUV post-processing unit 126 to form a YUV image, and then is transmitted to the encoding unit 162 to form the JPG image. After the JPG image is formed, an algorithmic post-processing module 16 may be used to transfer the JPG image to memory for storage.
Referring to fig. 4, the algorithm post-processing module 16 includes a logic processing calling layer 164, an algorithm module interface layer 166 and an algorithm processing layer 168. The logic processing call layer 164 is used to communicate with the application module 14. The algorithm module interface layer 166 is used to maintain the algorithm interface. The algorithm processing layer 168 includes at least one image processing algorithm. The algorithm module interface layer 166 is used for performing at least one of registration, logout, call and callback on the image processing algorithm of the algorithm processing layer 168 through the algorithm interface.
The logic processing calling layer 164 may include a thread queue. After the algorithm post-processing module 16 receives a post-photographing processing task for a RAW image and/or a YUV image, it may cache the task in the thread queue for processing; the thread queue can cache a plurality of post-photographing processing tasks, so that snapshot capture (i.e., a snapshot mechanism) can be implemented through the logic processing calling layer 164. The logic processing calling layer 164 may receive instructions such as initialization (init) or processing (process) transmitted from the application program module 14 and store the corresponding instructions and data in the thread queue. The logic processing calling layer 164 performs calls of specific logic (i.e., specific logic call combinations) according to the tasks in the thread queue. The logic processing calling layer 164 may also pass the thumbnail obtained by the processing back to the application program module 14 for display (i.e., thumbnail display). In the description of the embodiments of the present application, "a plurality" means two or more unless specifically defined otherwise.
The algorithm module interface layer 166 is used for calling the algorithm interface; the calling command may also be stored in the thread queue, and upon receiving a calling command from the thread queue, the algorithm processing layer 168 can parse the parameters of the command to determine the image processing algorithm to be called. When the algorithm module interface layer 166 registers an image processing algorithm, a new image processing algorithm is added to the algorithm processing layer 168; when it logs out an image processing algorithm, one of the image processing algorithms in the algorithm processing layer 168 is deleted; when it calls an image processing algorithm, one of the image processing algorithms in the algorithm processing layer 168 is invoked; and when it performs a callback on an image processing algorithm, the data and status after algorithm processing can be transmitted back to the application program module 14. A unified interface can thus be used to implement operations such as registration, logout, call, and callback of image processing algorithms. Each image processing algorithm in the algorithm processing layer 168 is independent, so these operations can be implemented on the image processing algorithms conveniently.
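The register/logout/call/callback contract described above can be sketched as a small registry keyed by algorithm name. The structure is inferred from the description only; the class and method names are illustrative, not the patent's actual interface:

```python
class AlgorithmInterface:
    """Minimal sketch of a unified interface over independent image
    processing algorithms (register / unregister / call / callback)."""

    def __init__(self):
        self._algorithms = {}

    def register(self, name, algorithm):
        # Add a new image processing algorithm to the processing layer.
        self._algorithms[name] = algorithm

    def unregister(self, name):
        # Delete (log out) an image processing algorithm.
        self._algorithms.pop(name, None)

    def call(self, name, image, callback):
        # Invoke one algorithm, then pass the processed data back (callback).
        result = self._algorithms[name](image)
        callback(result)
        return result
```

Because each algorithm is an independent entry in the registry, adding or removing one never touches the others, mirroring the independence noted in the text.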
Referring to fig. 5, in some embodiments, the image processor 10 further includes a camera service module 18. The hardware abstraction module 12 is connected to the application module 14 through the camera service module 18. The camera service module 18 encapsulates the RAW image and/or the YUV image and transmits the encapsulated RAW image and/or YUV image to the application module 14, and transmits the RAW image returned by the application module 14 to the hardware abstraction module 12. In this way, by encapsulating the image by the camera service module 18, the efficiency of image transmission can be improved, and the security of image transmission can be improved. When the image processor 10 includes the camera service module 18, the path of data (images, metadata, etc.) transmission in the image processor 10 may be adapted, i.e., data transmitted between the hardware abstraction module 12 and the application module 14 need to pass through the camera service module 18. For example, when the hardware abstraction module 12 transmits the RAW image and/or the YUV image to the application module 14, the hardware abstraction module 12 first transmits the RAW image and/or the YUV image to the camera service module 18, and the camera service module 18 encapsulates the RAW image and/or the YUV image and transmits the encapsulated RAW image and/or YUV image to the application module 14. For another example, when the hardware abstraction module 12 transmits metadata to the application program module 14, the hardware abstraction module 12 first transmits the metadata to the camera service module 18, and the camera service module 18 encapsulates the metadata and transmits the encapsulated metadata to the application program module 14. 
For another example, when the hardware abstraction module 12 transmits the frame number suggestion to the application module 14, the hardware abstraction module 12 first transmits the frame number suggestion to the camera service module 18, and the camera service module 18 encapsulates the frame number suggestion and transmits the encapsulated frame number suggestion to the application module 14. For another example, when the hardware abstraction module 12 transmits the algorithm suggestion to the application module 14, the hardware abstraction module 12 first transmits the algorithm suggestion to the camera service module 18, and the camera service module 18 encapsulates the algorithm suggestion and transmits the encapsulated algorithm suggestion to the application module 14. Of course, in some embodiments, the hardware abstraction module 12 may transmit the exposure information, the jitter of the gyroscope, the AR scene detection result, and the like to the camera service module 18, and the camera service module 18 obtains the frame number suggestion and/or the algorithm suggestion according to the exposure information, the jitter of the gyroscope, the AR scene detection result, and the like, and then transmits the frame number suggestion and/or the algorithm suggestion to the application module 14.
Referring to fig. 6 and 7, an electronic device 1000 is also provided in the embodiments of the present application. The electronic device 1000 includes the camera 100 of any of the above embodiments and a housing 200, the camera 100 being combined with the housing 200. The housing 200 can serve as a mounting carrier for the functional elements of the electronic device 1000 and can provide protection such as dust-proofing, drop-proofing, and water-proofing for functional elements such as a display screen, the camera 100, and a receiver. In one embodiment, the housing 200 includes a main body 210 and a movable bracket 220; the movable bracket 220 can move relative to the main body 210 under the drive of a driving device, for example, sliding relative to the main body 210 so as to slide into the main body 210 (the state of fig. 6) or slide out of the main body 210 (the state of fig. 7). Some functional elements may be mounted on the main body 210, while other functional elements (e.g., the camera 100) may be mounted on the movable bracket 220, so that movement of the movable bracket 220 retracts those elements into, or extends them out of, the main body 210. In another embodiment, the housing 200 has a collection window, and the camera 100 is aligned with the collection window so that the camera 100 can receive external light through the collection window to form an image.
In the description of the embodiments of the present application, it should be noted that, unless otherwise explicitly specified or limited, the term "mounted" is to be interpreted broadly: for example, as a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection, an electrical connection, or mutual communication; and it may be a direct connection, an indirect connection through an intermediate medium, an internal communication between two elements, or an interaction between two elements. Those of ordinary skill in the art can understand the specific meanings of the above terms in the embodiments of the present application according to the specific circumstances.
Referring to fig. 8, an image processing method is further provided in the present application. The image processing method comprises the following steps:
01: the algorithm post-processing module 16 receives the current image output by the image sensor 20 and the current photosensitive value, transmitted by the hardware abstraction module 12, corresponding to the current image;
02: determining a current noise reduction parameter according to the current photosensitive value, wherein the photosensitive value range is divided into a plurality of photosensitive intervals, each photosensitive interval includes a first photosensitive endpoint value and a second photosensitive endpoint value greater than the first photosensitive endpoint value, the first photosensitive endpoint value corresponds to a first noise reduction parameter, the second photosensitive endpoint value corresponds to a second noise reduction parameter, and the current noise reduction parameter is determined by the first noise reduction parameter and the second noise reduction parameter of the photosensitive interval in which the current photosensitive value falls; and
03: and performing noise reduction processing on the current image according to the current noise reduction parameter.
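The three steps above can be sketched in code. This is a minimal illustration, not the patent's actual implementation: the interval table, the linear endpoint interpolation, and the mean-filter denoiser (whose kernel size stands in for a real noise reduction parameter) are all assumptions for the sake of the example.

```python
import numpy as np

# Hypothetical interval table: (low ISO, high ISO, low param, high param).
# The patent fixes no concrete values; these are illustrative only.
ISO_INTERVALS = [
    (100, 400, 1.0, 2.0),
    (400, 1600, 2.0, 4.0),
    (1600, 6400, 4.0, 8.0),
]


def current_noise_reduction_param(iso):
    """Step 02: find the interval containing `iso` and derive the current
    parameter from the interval's two endpoint parameters (linear here)."""
    for lo, hi, p_lo, p_hi in ISO_INTERVALS:
        if lo <= iso <= hi:
            t = (iso - lo) / (hi - lo)
            return p_lo + t * (p_hi - p_lo)
    raise ValueError("photosensitive value outside all intervals")


def denoise(image, param):
    """Step 03: apply noise reduction scaled by `param`. A naive mean
    filter stands in for the real noise reduction pipeline."""
    k = max(1, int(param))          # kernel size grows with the parameter
    pad = k // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.empty_like(image, dtype=float)
    h, w = image.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = padded[y:y + k, x:x + k].mean()
    return out
```

A higher ISO lands in an interval with larger endpoint parameters, so the image is smoothed more aggressively, which is the behaviour the method describes.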
It should be noted that the explanation of the image processor 10 in the foregoing embodiment is also applicable to the image processing method in the embodiment of the present application, and is not repeated herein.
Referring to fig. 9, in some embodiments, before the algorithm post-processing module 16 receives the current image output by the image sensor 20 and the current photosensitive value corresponding to the current image transmitted by the hardware abstraction module 12 (i.e., 01), the image processing method further includes:
04: the hardware abstraction module 12 receives the current image output by the image sensor 20;
05: the hardware abstraction module 12 obtains a current photosensitive value corresponding to the current image;
06: the hardware abstraction module 12 sends the current image and the current photosensitive value to the application module 14; and
07: the application module 14 sends the current image and the current photosensitive value to the algorithm post-processing module 16.
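The hand-off in steps 04 to 07 can be sketched as a chain of three objects. The class names mirror the document's modules, but the method names (`on_frame`, `deliver`, `process`) and the metadata dictionary are illustrative assumptions, not an API defined by the document.

```python
class AlgorithmPostProcessingModule:
    """Step 01: final consumer of the frame and its photosensitive value."""
    def __init__(self):
        self.received = []

    def process(self, image, iso):
        self.received.append((image, iso))


class ApplicationModule:
    """Step 07: relays the frame and its photosensitive value onward."""
    def __init__(self, post_module):
        self.post = post_module

    def deliver(self, image, iso):
        self.post.process(image, iso)


class HardwareAbstractionModule:
    """Steps 04-06: receives the sensor frame, reads the photosensitive
    (ISO) value from its metadata, and forwards both to the app module."""
    def __init__(self, app_module):
        self.app = app_module

    def on_frame(self, image, metadata):
        iso = metadata["iso"]          # step 05
        self.app.deliver(image, iso)   # step 06
```

Keeping the image and its photosensitive value together through every hop is what later lets step 02 pick a noise reduction parameter that matches the exact capture conditions of that frame.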
Referring to fig. 10, in some embodiments, before determining the current noise reduction parameter according to the current photosensitive value (i.e., 02), the image processing method further includes:
08: the algorithm post-processing module 16 divides the photosensitive value range into a plurality of photosensitive intervals in advance; and
09: for each photosensitive interval, determining the noise reduction parameter corresponding to each photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value according to the first noise reduction parameter and the second noise reduction parameter.
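Steps 08 and 09 amount to precomputing a parameter for every photosensitive value inside each interval. A minimal sketch, assuming integer photosensitive (ISO-like) values and a linear profile within each interval; both assumptions are illustrative, since the document also allows an exponential profile:

```python
def build_param_table(intervals):
    """For each interval (lo, hi, p_lo, p_hi), tabulate the noise
    reduction parameter implied by the two endpoint parameters for
    every integer photosensitive value in [lo, hi]."""
    table = {}
    for lo, hi, p_lo, p_hi in intervals:
        for iso in range(lo, hi + 1):
            t = (iso - lo) / (hi - lo)
            table[iso] = p_lo + t * (p_hi - p_lo)
    return table
```

Precomputing trades a little memory for a constant-time lookup at capture time, which matters when every preview frame needs a parameter.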
Referring to fig. 11, in some embodiments, the algorithm post-processing module 16 stores in advance a correspondence between photosensitive values and noise reduction parameters. Determining the current noise reduction parameter according to the current photosensitive value (i.e., 02) includes:
021: determining the current noise reduction parameter according to the current photosensitive value and the correspondence.
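With such a correspondence stored in advance, step 021 reduces to a table lookup. A sketch, assuming a plain dictionary and a nearest-neighbor fallback for untabulated values; both choices are assumptions, not specified by the document:

```python
def lookup_param(table, iso):
    """Step 021: read the current noise reduction parameter from a
    pre-stored photosensitive-value -> parameter correspondence.
    Falls back to the nearest tabulated value when `iso` is absent."""
    if iso in table:
        return table[iso]
    nearest = min(table, key=lambda k: abs(k - iso))
    return table[nearest]
```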
In some embodiments, the photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value is positively correlated with the corresponding noise reduction parameter.
In some embodiments, as the photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value increases, the corresponding noise reduction parameter increases linearly; alternatively, the corresponding noise reduction parameter increases exponentially as the photosensitive value increases.
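The two profiles can be written out explicitly. Both are positively correlated with the photosensitive value whenever the second endpoint parameter exceeds the first; the function names and argument order here are illustrative:

```python
def interp_linear(iso, lo, hi, p_lo, p_hi):
    """Parameter grows linearly with the photosensitive value:
    equal ISO steps add a constant amount to the parameter."""
    t = (iso - lo) / (hi - lo)
    return p_lo + t * (p_hi - p_lo)


def interp_exponential(iso, lo, hi, p_lo, p_hi):
    """Parameter grows exponentially: equal ISO steps multiply the
    parameter by a constant factor (endpoint params must be positive)."""
    t = (iso - lo) / (hi - lo)
    return p_lo * (p_hi / p_lo) ** t
```

The exponential profile loosely matches the fact that sensor noise rises faster than linearly with gain, so it strengthens noise reduction more aggressively near the top of an interval.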
Referring to fig. 12, in some embodiments, the queue of the algorithm post-processing module 16 includes a plurality of noise reduction processing tasks, and the image processing method further includes:
010: the algorithm post-processing module 16 calculates the processing time required by each noise reduction processing task;
011: the algorithm post-processing module 16 calculates the total running time required by the plurality of noise reduction processing tasks according to the processing time required by each noise reduction processing task; and
012: when the application module 14 receives an exit command, if the time for which the algorithm post-processing module 16 has been processing the noise reduction processing tasks reaches the total running time, the algorithm post-processing module 16 exits operation.
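Steps 010 to 012 can be sketched as follows, using integer per-task processing times (e.g., milliseconds) for clarity; a real module would measure or estimate these times rather than receive them as a list:

```python
def total_running_time(task_times):
    """Steps 010-011: per-task processing times, summed into the total
    running time the queued noise reduction tasks still need."""
    return sum(task_times)


def should_exit(elapsed_since_start, task_times):
    """Step 012: after an exit command, the module may quit only once it
    has been processing for at least the queue's total running time,
    so no queued frame is dropped half-finished."""
    return elapsed_since_start >= total_running_time(task_times)
```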
In summary, in the image processing method, the image processor 10, the photographing device 100, and the electronic device 1000 of the embodiments of the present application, the photosensitive value range is divided into a plurality of photosensitive intervals. In each photosensitive interval, the first photosensitive endpoint value corresponds to a first noise reduction parameter and the second photosensitive endpoint value corresponds to a second noise reduction parameter, and the current noise reduction parameter for the current image is determined from the two endpoint parameters of the interval in which the current photosensitive value falls. Different photosensitive values therefore use different noise reduction parameters, making the parameters more flexible and their variation smoother, so that the processed image is clearer, contains less noise, and looks better.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, a system including a processing module, or another system that can fetch and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be captured electronically, for instance by optical scanning of the paper, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and stored in a computer memory.
It should be understood that portions of the embodiments of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing module, each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in hardware or as a software functional module. If implemented as a software functional module and sold or used as a stand-alone product, the integrated module may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.
In the description herein, references to the description of "certain embodiments" or the like are intended to mean that a particular feature, structure, or characteristic described in connection with the embodiment or example is included in at least one embodiment of the present application. In the present specification, the schematic representations of the above terms do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics described may be combined in any suitable manner in any one or more embodiments.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (14)

1. An image processing method, comprising:
an algorithm post-processing module receives a current image output by an image sensor and a current photosensitive value which is transmitted by a hardware abstraction module and corresponds to the current image;
determining a current noise reduction parameter according to the current photosensitive value, wherein a relation curve between the photosensitive value and the noise reduction parameter is a linear curve or an exponential curve, on the relation curve, the photosensitive value is divided into a plurality of photosensitive intervals, each photosensitive interval comprises a first photosensitive endpoint value and a second photosensitive endpoint value which is larger than the first photosensitive endpoint value, the first photosensitive endpoint value corresponds to a first noise reduction parameter, the second photosensitive endpoint value corresponds to a second noise reduction parameter, and the current noise reduction parameter is determined by the first noise reduction parameter corresponding to the first photosensitive endpoint value, the second noise reduction parameter corresponding to the second photosensitive endpoint value, and a preset noise reduction coefficient of the photosensitive interval in which the current photosensitive value is located; and
and carrying out noise reduction processing on the current image according to the current noise reduction parameter.
2. The image processing method according to claim 1, wherein before the algorithm post-processing module receives the current image output by the image sensor and the current photosensitive value corresponding to the current image transmitted by the hardware abstraction module, the image processing method further comprises:
the hardware abstraction module receives the current image output by the image sensor;
the hardware abstraction module acquires the current photosensitive value corresponding to the current image;
the hardware abstraction module sends the current image and the current photosensitive value to an application program module; and
the application program module sends the current image and the current photosensitive value to the algorithm post-processing module.
3. The image processing method according to claim 1, wherein before said determining a current noise reduction parameter from said current photosensitive value, said image processing method further comprises:
the algorithm post-processing module divides the photosensitive value into a plurality of photosensitive intervals in advance; and
and for each photosensitive interval, determining a noise reduction parameter corresponding to each photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value according to the first noise reduction parameter and the second noise reduction parameter.
4. The image processing method according to claim 1, wherein a photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value is positively correlated with a corresponding noise reduction parameter.
5. The image processing method according to claim 4, wherein as the photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value increases, the corresponding noise reduction parameter increases linearly; or
the corresponding noise reduction parameter increases exponentially as the photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value increases.
6. The image processing method according to claim 1, wherein a queue of the algorithm post-processing module includes a plurality of noise reduction processing tasks, the image processing method further comprising:
the algorithm post-processing module calculates the processing time required by each noise reduction processing task;
the algorithm post-processing module calculates the total running time required by the plurality of noise reduction processing tasks according to the processing time required by each noise reduction processing task; and
and when the application program module receives an exit command, if the time for which the algorithm post-processing module has been processing the noise reduction processing tasks reaches the total running time, the algorithm post-processing module exits operation.
7. An image processor, comprising an algorithm post-processing module and a hardware abstraction module, wherein the algorithm post-processing module is configured to:
receiving a current image output by an image sensor and a current photosensitive value which is transmitted by the hardware abstraction module and corresponds to the current image;
determining a current noise reduction parameter according to the current photosensitive value, wherein a relation curve between the photosensitive value and the noise reduction parameter is a linear curve or an exponential curve, on the relation curve, the photosensitive value is divided into a plurality of photosensitive intervals, each photosensitive interval comprises a first photosensitive end point value and a second photosensitive end point value which is larger than the first photosensitive end point value, the first photosensitive end point value corresponds to a first noise reduction parameter, the second photosensitive end point value corresponds to a second noise reduction parameter, and the current noise reduction parameter is determined by the first noise reduction parameter corresponding to the first photosensitive end point value, the second noise reduction parameter corresponding to the second photosensitive end point value, and a preset noise reduction coefficient of the photosensitive interval in which the current photosensitive value is located; and
and carrying out noise reduction processing on the current image according to the current noise reduction parameter.
8. The image processor of claim 7, further comprising an application program module, wherein, before the algorithm post-processing module receives the current image output by the image sensor and the current photosensitive value corresponding to the current image transmitted by the hardware abstraction module,
the hardware abstraction module is used for receiving the current image output by the image sensor;
the hardware abstraction module is used for acquiring the current photosensitive value corresponding to the current image;
the hardware abstraction module is used for sending the current image and the current photosensitive value to the application program module;
and the application program module is used for sending the current image and the current photosensitive value to the algorithm post-processing module.
9. The image processor of claim 7, wherein, before determining the current noise reduction parameter according to the current photosensitive value, the algorithm post-processing module is configured to:
divide the photosensitive value range into a plurality of photosensitive intervals in advance; and
for each photosensitive interval, determine a noise reduction parameter corresponding to each photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value according to the first noise reduction parameter and the second noise reduction parameter.
10. The image processor of claim 7, wherein a photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value is positively correlated with the corresponding noise reduction parameter.
11. The image processor of claim 10, wherein as the photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value increases, the corresponding noise reduction parameter increases linearly; or
the corresponding noise reduction parameter increases exponentially as the photosensitive value between the first photosensitive endpoint value and the second photosensitive endpoint value increases.
12. The image processor of claim 7, wherein the queue of the algorithm post-processing module includes a plurality of noise reduction processing tasks,
the algorithm post-processing module is used for calculating the processing time required by each noise reduction processing task;
the algorithm post-processing module is used for calculating the total running time required by the plurality of noise reduction processing tasks according to the processing time required by each noise reduction processing task;
and when the application program module receives an exit command, if the time for which the algorithm post-processing module has been processing the noise reduction processing tasks reaches the total running time, the algorithm post-processing module exits operation.
13. A camera, comprising:
the image processor of any one of claims 7 to 12; and
an image sensor connected with the image processor.
14. An electronic device, comprising:
the camera of claim 13; and
a housing, the camera being combined with the housing.
CN201911420578.1A 2019-12-31 2019-12-31 Image processing method, image processor, shooting device and electronic equipment Active CN111147695B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911420578.1A CN111147695B (en) 2019-12-31 2019-12-31 Image processing method, image processor, shooting device and electronic equipment

Publications (2)

Publication Number Publication Date
CN111147695A CN111147695A (en) 2020-05-12
CN111147695B true CN111147695B (en) 2022-05-13

Family

ID=70522871

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911420578.1A Active CN111147695B (en) 2019-12-31 2019-12-31 Image processing method, image processor, shooting device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111147695B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113781288A (en) * 2020-06-09 2021-12-10 Oppo广东移动通信有限公司 Electronic device and image processing method
CN113810593B (en) * 2020-06-15 2023-08-01 Oppo广东移动通信有限公司 Image processing method, device, storage medium and electronic equipment
CN113837937A (en) * 2020-06-24 2021-12-24 Oppo广东移动通信有限公司 Multimedia processing chip, electronic equipment image fusion method and image cutting method
CN113873142B (en) * 2020-06-30 2023-07-25 Oppo广东移动通信有限公司 Multimedia processing chip, electronic device, and moving image processing method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7983503B2 (en) * 2007-05-25 2011-07-19 Zoran Corporation Advanced noise reduction in digital cameras
JP5704069B2 (en) * 2009-06-25 2015-04-22 コニカミノルタ株式会社 Image input device
US9813632B2 (en) * 2011-05-19 2017-11-07 Foveon, Inc. Method of adjusting digital camera image processing parameters
CN104902143B (en) * 2015-05-21 2018-01-19 广东欧珀移动通信有限公司 A kind of image de-noising method and device based on resolution ratio
CN105306788B (en) * 2015-10-27 2018-05-15 广东欧珀移动通信有限公司 A kind of noise-reduction method and device of image of taking pictures
CN107452348B (en) * 2017-08-15 2020-07-28 广州视源电子科技股份有限公司 Method and system for reducing noise of display picture, computer device and readable storage medium
CN107635098B (en) * 2017-10-30 2019-09-10 Oppo广东移动通信有限公司 High dynamic range images noise remove method, device and equipment
CN109348089B (en) * 2018-11-22 2020-05-22 Oppo广东移动通信有限公司 Night scene image processing method and device, electronic equipment and storage medium
CN110290288B (en) * 2019-06-03 2022-01-04 Oppo广东移动通信有限公司 Image processor, image processing method, photographing device, and electronic apparatus



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant