CN111131716A - Image processing method and electronic device

Publication number: CN111131716A
Authority: CN (China)
Prior art keywords: target, image, region, regions, determining
Legal status: Granted, Active
Application number: CN201911424082.1A
Other languages: Chinese (zh)
Other versions: CN111131716B
Inventor: 辛佳慧 (Xin Jiahui)
Original and current assignee: Lenovo Beijing Ltd
Application filed by Lenovo Beijing Ltd; granted and published as CN111131716B

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/62 - Control of parameters via user interfaces
    • H04N 23/63 - Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631 - Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N 23/632 - Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/74 - Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure provides an image processing method, including: responding to a photographing instruction; processing the captured image obtained in response to the photographing instruction based on a photographing mode corresponding to the photographing instruction to generate a photographed picture; and displaying the photographed picture. The photographing mode corresponding to the photographing instruction at least includes: performing noise reduction processing on a target region of a target captured image, where the target region of the target captured image is a partial region of the target captured image, and the partial region of the target captured image is a region that has undergone brightness compensation processing. The present disclosure also provides an electronic device.

Description

Image processing method and electronic device
Technical Field
The present disclosure relates to an image processing method and an electronic device.
Background
Electronic apparatuses in the related art provide a variety of functions, for example, a function of processing an image. An image may contain noise for various reasons, and a noisy image gives the user a poor experience. Even though heavy noise is often confined to a partial region of an image, noise reduction processing in the related art is generally performed on the entire image. However, after such overall noise reduction, the portions of the image that contained little noise lose detail information because of the noise reduction processing.
Disclosure of Invention
One aspect of the present disclosure provides an image processing method, including: responding to a photographing instruction, processing the captured image obtained in response to the photographing instruction based on a photographing mode corresponding to the photographing instruction, generating a photographed picture, and displaying the photographed picture. The photographing mode corresponding to the photographing instruction at least includes: performing noise reduction processing on a target region of a target captured image, where the target region of the target captured image is a partial region of the target captured image, and the partial region of the target captured image is a region that has undergone brightness compensation processing.
Optionally, the photographing mode corresponding to the photographing instruction at least includes: obtaining at least one original captured image through a camera module in response to the photographing instruction, and processing the at least one original captured image based on a brightness compensation algorithm to obtain the target captured image, where the brightness of a partial region of the target captured image is higher than that of the corresponding partial region of the first original captured image.
Optionally, the method further includes: determining the target region in the target captured image. The determining the target region in the target captured image includes: processing the target captured image and the first original captured image, determining target pixel points in the target captured image whose brightness values differ from those in the first original captured image, and determining the region where the target pixel points are located in the target captured image as the target region.
Optionally, the determining the target region in the target captured image further includes: dividing the first original captured image into a plurality of first regions, dividing the target captured image into a plurality of second regions, wherein each of the plurality of first regions and each of the plurality of second regions correspond to each other, processing the plurality of first regions and the plurality of second regions, and determining at least one of the plurality of second regions as the target region.
Optionally, the processing the plurality of first regions and the plurality of second regions, and determining at least one of the plurality of second regions as the target region includes: determining a luminance value of the second region and a luminance value of the first region corresponding to the second region, and determining at least one of the plurality of second regions as the target region based on the luminance values of the second region and the corresponding luminance values of the first region.
Optionally, the determining, based on the brightness values of the second regions and the corresponding brightness values of the first regions, at least one of the plurality of second regions as the target region includes: determining a first brightness value set of a plurality of first pixel points in a current first region of the plurality of first regions, determining a second brightness value set of a plurality of second pixel points in the second region corresponding to the current first region, and determining the current second region as the target region in response to determining that the first brightness value set and the second brightness value set satisfy a preset condition.
Optionally, the first and second sets of luminance values satisfying a preset condition include at least one of: a first particular difference between a sum of elements in the first set of luminance values and a sum of elements in the second set of luminance values is greater than a first predetermined difference, and a second particular difference between an average of elements in the first set of luminance values and an average of elements in the second set of luminance values is greater than a second predetermined difference.
Optionally, when the target region includes at least two second regions, the performing noise reduction processing on the target region of the target captured image includes at least one of: and performing noise reduction processing on at least two second regions based on the same first noise reduction parameter, determining a second noise reduction parameter corresponding to each of the at least two second regions based on the first specific difference or the second specific difference corresponding to the at least two second regions, and performing noise reduction processing on the at least two second regions based on the second noise reduction parameter.
Another aspect of the present disclosure provides an electronic device including a camera module and a processor. The camera module is configured to obtain a captured image, and the processor is configured to: respond to a photographing instruction, process the captured image obtained in response to the photographing instruction based on a photographing mode corresponding to the photographing instruction, generate a photographed picture, and display the photographed picture. The photographing mode corresponding to the photographing instruction at least includes: performing noise reduction processing on a target region of a target captured image, where the target region of the target captured image is a partial region of the target captured image, and the partial region of the target captured image is a region that has undergone brightness compensation processing.
Optionally, the photographing mode corresponding to the photographing instruction at least includes: and responding to the photographing instruction to obtain at least one original acquisition image through the camera module, and processing the at least one original acquisition image based on a brightness compensation algorithm to obtain a target acquisition image, wherein the brightness of a partial area of the target acquisition image is higher than that of a partial area of the first original acquisition image.
Optionally, the processor is further configured to: determining the target region in the target captured image. The determining the target region in the target captured image comprises: processing the target collected image and the first original collected image, determining target pixel points with different brightness values from the first original collected image in the target collected image, and determining the region of the target pixel points in the target collected image as the target region.
Optionally, the determining the target region in the target captured image further includes: dividing the first original captured image into a plurality of first regions, dividing the target captured image into a plurality of second regions, wherein each of the plurality of first regions and each of the plurality of second regions correspond to each other, processing the plurality of first regions and the plurality of second regions, and determining at least one of the plurality of second regions as the target region.
Optionally, the processing the plurality of first regions and the plurality of second regions, and determining at least one of the plurality of second regions as the target region includes: determining a luminance value of the second region and a luminance value of the first region corresponding to the second region, and determining at least one of the plurality of second regions as the target region based on the luminance values of the second region and the corresponding luminance values of the first region.
Optionally, the determining, based on the brightness values of the second regions and the corresponding brightness values of the first regions, at least one of the plurality of second regions as the target region includes: determining a first brightness value set of a plurality of first pixel points in a current first region of the plurality of first regions, determining a second brightness value set of a plurality of second pixel points in the second region corresponding to the current first region, and determining the current second region as the target region in response to determining that the first brightness value set and the second brightness value set satisfy a preset condition.
Optionally, the first and second sets of luminance values satisfying a preset condition include at least one of: a first particular difference between a sum of elements in the first set of luminance values and a sum of elements in the second set of luminance values is greater than a first predetermined difference, and a second particular difference between an average of elements in the first set of luminance values and an average of elements in the second set of luminance values is greater than a second predetermined difference.
Optionally, when the target region includes at least two second regions, the performing noise reduction processing on the target region of the target captured image includes at least one of: and performing noise reduction processing on at least two second regions based on the same first noise reduction parameter, determining a second noise reduction parameter corresponding to each of the at least two second regions based on the first specific difference or the second specific difference corresponding to the at least two second regions, and performing noise reduction processing on the at least two second regions based on the second noise reduction parameter.
Another aspect of the present disclosure provides an image processing apparatus including: the device comprises a response module, a processing module and a display module. The response module is used for responding to the photographing instruction. The processing module is used for processing the acquired image obtained by responding to the photographing instruction based on the photographing mode corresponding to the photographing instruction to generate a photographing picture. The display module is used for displaying the photographed picture. Wherein, the photographing mode corresponding to the photographing instruction at least comprises: and performing noise reduction processing on a target area of a target acquisition image, wherein the target area of the target acquisition image is a partial area of the target acquisition image, and the partial area of the target acquisition image is an area subjected to brightness compensation processing.
Optionally, the photographing mode corresponding to the photographing instruction at least includes: responding to the photographing instruction to obtain at least one original acquisition image through the camera module, and processing the at least one original acquisition image based on a brightness compensation algorithm to obtain a target acquisition image; the brightness of the partial area of the target acquisition image is higher than that of the partial area of the first original acquisition image.
Optionally, the apparatus further comprises: a determination module for determining the target region in the target captured image. The determining the target region in the target captured image comprises: processing the target collected image and the first original collected image, determining target pixel points with different brightness values from the first original collected image in the target collected image, and determining the region of the target pixel points in the target collected image as the target region.
Optionally, the determining the target region in the target captured image further includes: dividing the first original captured image into a plurality of first regions, dividing the target captured image into a plurality of second regions, wherein each of the plurality of first regions and each of the plurality of second regions correspond to each other, processing the plurality of first regions and the plurality of second regions, and determining at least one of the plurality of second regions as the target region.
Optionally, the processing the plurality of first regions and the plurality of second regions, and determining at least one of the plurality of second regions as the target region includes: determining a luminance value of the second region and a luminance value of the first region corresponding to the second region, and determining at least one of the plurality of second regions as the target region based on the luminance values of the second region and the corresponding luminance values of the first region.
Optionally, the determining, based on the brightness values of the second regions and the corresponding brightness values of the first regions, at least one of the plurality of second regions as the target region includes: determining a first brightness value set of a plurality of first pixel points in a current first region of the plurality of first regions, determining a second brightness value set of a plurality of second pixel points in the second region corresponding to the current first region, and determining the current second region as the target region in response to determining that the first brightness value set and the second brightness value set satisfy a preset condition.
Optionally, the first and second sets of luminance values satisfying a preset condition include at least one of: a first particular difference between a sum of elements in the first set of luminance values and a sum of elements in the second set of luminance values is greater than a first predetermined difference, and a second particular difference between an average of elements in the first set of luminance values and an average of elements in the second set of luminance values is greater than a second predetermined difference.
Optionally, when the target region includes at least two second regions, the performing noise reduction processing on the target region of the target captured image includes at least one of: and performing noise reduction processing on at least two second regions based on the same first noise reduction parameter, determining a second noise reduction parameter corresponding to each of the at least two second regions based on the first specific difference or the second specific difference corresponding to the at least two second regions, and performing noise reduction processing on the at least two second regions based on the second noise reduction parameter.
Another aspect of the disclosure provides a non-transitory readable storage medium storing computer-executable instructions for implementing the method as above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
FIG. 1 schematically shows an application scenario of an image processing method according to an embodiment of the present disclosure;
FIG. 2 schematically shows a flow chart of an image processing method according to an embodiment of the present disclosure;
FIG. 3 schematically illustrates a schematic diagram of determining a target area according to an embodiment of the present disclosure;
FIG. 4 schematically illustrates a schematic diagram of determining a target area according to another embodiment of the present disclosure;
FIG. 5 schematically shows a block diagram of an electronic device according to an embodiment of the disclosure;
FIG. 6 schematically shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure; and
FIG. 7 schematically shows a block diagram of a computer system for implementing image processing according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable control apparatus to produce a machine, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable medium having instructions stored thereon for use by or in connection with an instruction execution system. In the context of this disclosure, a computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, the computer readable medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the computer readable medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
An embodiment of the present disclosure provides an image processing method, including: responding to a photographing instruction, processing the captured image obtained in response to the photographing instruction based on a photographing mode corresponding to the photographing instruction, generating a photographed picture, and displaying the photographed picture. The photographing mode corresponding to the photographing instruction at least includes: performing noise reduction processing on a target region of the target captured image, where the target region of the target captured image is a partial region of the target captured image, and the partial region of the target captured image is a region that has undergone brightness compensation processing.
Fig. 1 schematically shows an application scenario of an image processing method according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1, the application scenario 100 includes, for example, an electronic device 110. The electronic device 110 may be, for example, a mobile phone, a computer, a server, or the like.
The electronic device 110 may be used, for example, to obtain the captured image 120 in accordance with embodiments of the present disclosure. For example, the electronic device 110 has a photographing function, and the electronic device 110 can take a photograph based on the photographing instruction to obtain the captured image 120.
In the embodiment of the present disclosure, the target region 121 in the captured image 120 is, for example, a partial region of the captured image 120. The target area 121 is, for example, an area with lower brightness in the captured image 120, in other words, the target area 121 is, for example, a darker area in the captured image 120. The electronic device 110 may perform brightness compensation processing on the target area 121 in the captured image 120 to obtain a target captured image 130, where a brightness value of the target area 131 in the target captured image 130 is greater than a brightness value of the target area 121.
The brightness value of a region referred to in the embodiments of the present disclosure may include, for example, the gray values of the pixel points in the region, and it can be obtained from histogram statistics of those pixel points. Taking the brightness value of the target region 131 as an example: if the target region 131 includes N pixels and the gray values of 90% of the N pixels fall within a predetermined gray-value range, the average gray value of that 90% of the pixels can be calculated as the brightness value of the target region 131.
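As an illustration of this histogram-based estimate, a minimal Python sketch follows; the predetermined gray-value range, the 90% coverage ratio, and the function name are assumptions made for the example, not values fixed by the disclosure.
```python
import numpy as np

def region_brightness(gray_region, lo=16, hi=235, coverage=0.9):
    """Brightness value of a region from its gray values: if at least
    `coverage` (90% in the example) of the pixels fall inside the
    predetermined gray-value range [lo, hi], take the mean gray value of
    those pixels as the region's brightness value. The range bounds are
    illustrative assumptions."""
    pixels = np.asarray(gray_region, dtype=np.float32).ravel()
    in_range = pixels[(pixels >= lo) & (pixels <= hi)]
    if in_range.size < coverage * pixels.size:
        return None   # the histogram condition of the example is not met
    return float(in_range.mean())
```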
According to the embodiment of the present disclosure, the noise of the target region 131 subjected to the brightness compensation process is, for example, greater than the noise of the target region 121, for example, the noise in the target region 131 is more noticeable. The embodiment of the disclosure may perform noise reduction processing on the target region 131 in the target captured image 130 to obtain the final photographed picture 140, where noise of the target region 141 in the photographed picture 140 is, for example, smaller than noise of the target region 131, and for example, a noise point in the target region 141 is not visually obvious.
By performing noise reduction processing only on a local area of the image, the embodiments of the disclosure largely preserve the global information of the image. For example, during noise reduction, the areas of the image that did not receive brightness compensation are not denoised, so the information in those areas is retained more completely.
An image processing method according to an exemplary embodiment of the present disclosure is described below with reference to fig. 2 to 4 in conjunction with an application scenario of fig. 1. It should be noted that the above application scenarios are merely illustrated for the convenience of understanding the spirit and principles of the present disclosure, and the embodiments of the present disclosure are not limited in this respect. Rather, embodiments of the present disclosure may be applied to any scenario where applicable.
Fig. 2 schematically shows a flow chart of an image processing method according to an embodiment of the present disclosure.
As shown in fig. 2, the method includes operations S210 to S230.
In operation S210, a photographing instruction is responded to.
In operation S220, the collected image obtained in response to the photographing instruction is processed based on the photographing mode corresponding to the photographing instruction, and a photographed picture is generated.
In operation S230, a photographed picture is displayed.
According to an embodiment of the present disclosure, the image processing method may be performed by an electronic device, for example. The electronic device may include, for example, a camera module and a processor. The processor may, for example, respond to a photographing instruction and instruct the camera module to acquire a captured image. Then, the processor may process the captured image based on a photographing mode corresponding to the photographing instruction, and may obtain a photographed picture by processing the captured image. The electronic device can display the photographed picture so that the user can determine the processing effect of the image by viewing the photographed picture.
According to the embodiment of the disclosure, for example, the processor may acquire a plurality of captured images through the camera module in response to the photographing instruction. For example, the plurality of captured images may include at least one original captured image and a target captured image. The target collected image can be obtained by performing brightness compensation processing on at least one original collected image. Specifically, the brightness of the partial region in at least one of the original captured images may be increased, so that the partial region in the obtained target captured image is an increased brightness region.
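The overall photographing mode can be pictured with the following sketch on a grayscale numpy array; the brightness threshold, the gamma lift, and the mean-filter denoiser are stand-ins chosen for illustration, since the disclosure fixes only the order of the steps (brightness compensation of a partial region, then noise reduction restricted to that region).
```python
import numpy as np

def photograph(original, dark_threshold=60, gamma=0.6, blur=3):
    """End-to-end sketch of the photographing mode: brighten a dark partial
    region of the original captured image to obtain the target captured
    image, then denoise only that region to obtain the photographed picture.
    Threshold, gamma curve, and mean filter are illustrative assumptions."""
    img = original.astype(np.float32) / 255.0

    # 1. Brightness compensation of the dark partial region -> target captured image.
    region = img < dark_threshold / 255.0
    target = img.copy()
    target[region] = target[region] ** gamma      # gamma < 1 lifts dark values

    # 2. Noise reduction restricted to the brightened target region
    #    (a small mean filter stands in for the noise reduction algorithm).
    pad = blur // 2
    padded = np.pad(target, pad, mode="edge")
    smoothed = np.zeros_like(target)
    for dy in range(blur):
        for dx in range(blur):
            smoothed += padded[dy:dy + target.shape[0], dx:dx + target.shape[1]]
    smoothed /= blur * blur
    photo = np.where(region, smoothed, target)

    # 3. The photographed picture is then displayed and stored.
    return (np.clip(photo, 0.0, 1.0) * 255).astype(np.uint8)
```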
In the disclosed embodiments, the at least one original captured image may be a single image or a plurality of images. How the target captured image is obtained in each of these two cases is described below.
In one case, the at least one originally acquired image is, for example, one. For example, one original captured image may be acquired through a normal photographing mode of the electronic device, and the acquired one original captured image may be, for example, the first original captured image. The light-dark contrast of the first originally acquired image satisfies, for example, a light-dark contrast condition. The light and dark contrast condition may be a condition that the exposure value satisfies a threshold range, for example. Wherein exposure values below the threshold range are indicative of an underexposure of the image and exposure values above the threshold range are indicative of an overexposure of the image.
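A minimal sketch of such a light-dark contrast check follows; the exposure-value bounds and the function name are assumptions, since the disclosure only states that values below the threshold range indicate underexposure and values above it indicate overexposure.
```python
def exposure_state(exposure_value, ev_min=-1.0, ev_max=1.0):
    """Light-dark contrast condition: the exposure value must lie within a
    threshold range; below it the image counts as underexposed, above it as
    overexposed. The EV bounds are illustrative assumptions."""
    if exposure_value < ev_min:
        return "underexposed"
    if exposure_value > ev_max:
        return "overexposed"
    return "ok"
```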
Then, the brightness value of the partial region in the first originally captured image may be increased by a brightness compensation algorithm. For example, the first originally captured image includes a dark portion region and a bright portion region, and the brightness value of the dark portion region in the first originally captured image may be increased to obtain the target captured image.
In another case, the at least one originally acquired image is, for example, plural. For example, a plurality of raw captured images may be acquired by an HDR (High-Dynamic Range) photographing mode of the electronic device. In the embodiment of the present disclosure, for example, the plurality of originally captured images is exemplified as three originally captured images. The three originally captured images include, for example, an underexposed image, a normally exposed image, and an overexposed image. In which, for example, a normally exposed image is taken as the first originally captured image.
Then, the brightness value of the partial region in the first originally captured image may be increased by a brightness compensation algorithm. Specifically, for example, the first original captured image may be subjected to brightness compensation processing based on information in the underexposed image and the overexposed image, so as to increase the brightness value of the partial region in the first original captured image, thereby obtaining the target captured image.
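A possible form of this compensation step is sketched below for the multi-frame case; the disclosure does not fix the blending rule, so the dark-region threshold, the gain, and the use of only the overexposed frame here are assumptions.
```python
import numpy as np

def brightness_compensate(first_original, overexposed, dark_threshold=60, gain=0.6):
    """Hypothetical brightness compensation of the first original captured
    image (the normally exposed frame) using the overexposed frame of the
    burst. The disclosure only states that the brightness of a dark partial
    region is increased based on the other frames; the threshold, gain, and
    blending rule are assumptions."""
    normal = first_original.astype(np.float32)
    bright = overexposed.astype(np.float32)
    region = normal < dark_threshold               # the partial region to brighten
    target = normal.copy()
    # Pull dark pixel points toward the values of the overexposed frame.
    target[region] += gain * (bright[region] - normal[region])
    return np.clip(target, 0, 255).astype(np.uint8), region
```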
In the embodiment of the present disclosure, after the target captured image is acquired, noise reduction processing needs to be performed on the target region in the target captured image. The target region may be the same as or different from the partial region subjected to the luminance compensation process, for example. In the case where the target region is different from the partial region subjected to the luminance compensation processing, the target region includes, for example, at least a part of the partial region subjected to the luminance compensation processing.
The target area in different cases will be described below with reference to fig. 3 and 4, respectively. For example, fig. 3 will describe a case where the target area is the same as the partial area where the luminance compensation process is performed. Fig. 4 will describe a case where the target area is different from the partial area where the luminance compensation process is performed.
Fig. 3 schematically illustrates a schematic diagram of determining a target area according to an embodiment of the present disclosure.
As shown in fig. 3, the target area of this embodiment is, for example, the same as the partial area where the luminance compensation process is performed.
For example, the first original captured image 310 includes a plurality of pixel points, and brightness compensation processing is performed on some of those pixel points to obtain the target captured image 320. For example, the brightness values of the pixel points a, b, c, ... in the first original captured image 310 are increased, and the resulting target captured image 320 includes the pixel points a', b', c', ... with increased brightness values. The region of the first original captured image 310 where the pixel points a, b, c, ... are located is the partial region in the embodiment of the present disclosure. The region of the target captured image 320 where the pixel points a', b', c', ... are located is the target region of the embodiment of the present disclosure.
According to an embodiment of the present disclosure, determining the target region in the target captured image 320 includes, for example: processing the target captured image 320 and the first original captured image 310, determining the target pixel points in the target captured image 320 whose brightness values differ from those in the first original captured image 310, and determining the region where those target pixel points are located in the target captured image 320 as the target region.
According to the embodiment of the present disclosure, the differing pixel points can be found, for example, by comparing the target captured image 320 with the first original captured image 310. That is, the pixel points a', b', c', ... in the target captured image 320 are determined as the target pixel points, and the region where the target pixel points a', b', c', ... are located in the target captured image 320 is used as the target region.
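For this Fig. 3 case, the comparison can be sketched as follows; representing the target region as a boolean mask is an assumption made for illustration.
```python
import numpy as np

def target_pixel_region(first_original, target_captured):
    """Target region for the Fig. 3 case: the pixel points whose brightness
    values in the target captured image 320 differ from the first original
    captured image 310, returned as a boolean mask."""
    diff = target_captured.astype(np.int16) - first_original.astype(np.int16)
    return diff != 0   # True where brightness compensation changed the pixel
```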
According to the embodiment of the present disclosure, noise reduction processing is performed on the target region; for example, all target pixel points in the target region may be denoised based on the same noise reduction parameter. In one case, the noise values of all target pixel points can be reduced to a specific noise value. For example, if the noise values of the three target pixel points a', b', and c' are 5, 6, and 7, respectively, they are all reduced to the specific noise value 4. In another case, the noise values of all target pixel points may be reduced by a specific noise difference; for example, with a specific noise difference of 3, the noise values 5, 6, and 7 of the three target pixel points a', b', and c' become 2, 3, and 4, respectively.
Alternatively, different target pixel points in the target region may be denoised based on different noise reduction parameters. For example, the noise reduction parameter of each target pixel point may be determined according to the degree to which its brightness value was increased. For example, if the brightness values of the three target pixel points a', b', and c' were increased by 1, 2, and 4, respectively, the ratio of their noise reduction parameters is determined to be 1 : 2 : 4; the noise reduction values of the three target pixel points a', b', and c' may then be 1, 2, 4, or 2, 4, 8, or 3, 6, 12, and so on.
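A sketch of this proportional assignment follows; mapping the brightness increase to a noise reduction strength through a single linear factor is an assumption, since the disclosure fixes only that the parameters follow the ratio of the brightness increases.
```python
import numpy as np

def per_pixel_denoise_strength(first_original, target_captured, base=1.0):
    """Assign each target pixel a noise reduction parameter proportional to
    how much its brightness value was raised (the 1 : 2 : 4 example above).
    The linear factor `base` is an assumption."""
    delta = target_captured.astype(np.float32) - first_original.astype(np.float32)
    delta = np.clip(delta, 0.0, None)   # only brightened (target) pixel points
    return base * delta                 # 0 outside the target region
```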
According to the embodiment of the disclosure, the target pixel points whose brightness values differ between the first original captured image and the target captured image are found by comparison, and noise reduction processing is applied to those target pixel points. Noise reduction is thus targeted specifically at the pixel points that underwent brightness compensation processing, which achieves high-precision noise reduction of the image.
Fig. 4 schematically shows a schematic diagram of determining a target area according to another embodiment of the present disclosure.
As shown in fig. 4, the target area of this embodiment is different from, for example, a partial area where the luminance compensation process is performed. For example, the target region includes at least a part of a partial region in which the luminance compensation process is performed. For example, the partial area 411 subjected to the luminance compensation process is an area inside a dotted line in the first originally captured image 410.
According to an embodiment of the present disclosure, for example, the first original captured image 410 may be divided into a plurality of first regions, which include, for example, regions A to F. The target captured image 420 is divided into a plurality of second regions, which include, for example, regions A' to F'. Each of the plurality of first regions corresponds to one of the plurality of second regions; for example, the first region A corresponds to the second region A', the first region B corresponds to the second region B', and so on.
The plurality of first regions and the plurality of second regions may then be processed to determine at least one of the plurality of second regions as a target region. For example, a luminance value of the second region and a luminance value of the first region corresponding to the second region may be determined, and at least one of the plurality of second regions may be determined as the target region based on the luminance values of the second region and the corresponding luminance values of the first region.
For example, each of the plurality of first regions may be taken in turn as the current first region. Take the current first region C as an example: it includes a plurality of first pixel points, for example the pixel points e, f, and g, whose luminance values are 1, 2, and 3, respectively. Thus the first luminance value set includes, for example, the luminance values 1, 2, and 3.
Then, a second luminance value set of the plurality of second pixel points e', f', g' in the second region C' corresponding to the current first region C is determined; for example, the second luminance value set includes the luminance values 4, 5, and 3. The luminance value of the pixel point g and that of the pixel point g' are both 3, which may indicate that the pixel point g' was not subjected to brightness compensation processing.
Thereafter, the current second region C' may be determined as at least part of the target region 421 in response to determining that the first and second sets of luminance values satisfy the preset condition.
According to an embodiment of the present disclosure, the first and second luminance value sets satisfying the preset condition may include, for example: a first specific difference between the sum of the elements in the first luminance value set and the sum of the elements in the second luminance value set is greater than a first predetermined difference. The first predetermined difference may be a specific value; this embodiment takes the first predetermined difference as 5 as an example. For example, the sum of the elements 1, 2, 3 in the first luminance value set is 6, and the sum of the elements 4, 5, 3 in the second luminance value set is 12. The first specific difference between 12 and 6 is 6, which is greater than the first predetermined difference 5; this may at least indicate that at least some of the pixel points in the current second region C' have been subjected to brightness compensation processing.
Alternatively, a second specific difference between the average of the elements in the first luminance value set and the average of the elements in the second luminance value set is greater than a second predetermined difference. The second predetermined difference may be a specific value; this embodiment takes the second predetermined difference as 1 as an example. For example, the average of the elements 1, 2, 3 in the first luminance value set is 2, and the average of the elements 4, 5, 3 in the second luminance value set is 4. The second specific difference between 4 and 2 is 2, which is greater than the second predetermined difference 1; this may likewise indicate that at least some of the pixel points in the current second region C' have been subjected to brightness compensation processing.
According to the embodiment of the present disclosure, the first predetermined difference and the second predetermined difference may be set according to the actual application. Their specific values are chosen so that, as far as possible, most of the pixel points in a region determined to be the current second region have been brightened, or so that the degree to which the pixel points in that region have been brightened is relatively large.
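The region-wise determination of Fig. 4 can be sketched as follows; the grid size mirrors the six regions A to F, and the thresholds reuse the toy values 5 and 1 from the example above, all of which are illustrative assumptions.
```python
import numpy as np

def region_targets(first_original, target_captured, grid=(2, 3),
                   sum_thresh=5.0, mean_thresh=1.0):
    """Divide both images into corresponding regions (A..F and A'..F' in
    Fig. 4) and mark a second region as part of the target region when the
    first or the second specific difference exceeds its predetermined
    difference. Grid size and thresholds are illustrative assumptions."""
    h, w = first_original.shape[:2]
    rows, cols = grid
    hs, ws = h // rows, w // cols
    targets = []
    for r in range(rows):
        for c in range(cols):
            sl = (slice(r * hs, (r + 1) * hs), slice(c * ws, (c + 1) * ws))
            first_set = first_original[sl].astype(np.float32)    # first luminance value set
            second_set = target_captured[sl].astype(np.float32)  # second luminance value set
            first_diff = second_set.sum() - first_set.sum()      # first specific difference
            second_diff = second_set.mean() - first_set.mean()   # second specific difference
            if first_diff > sum_thresh or second_diff > mean_thresh:
                targets.append((r, c, first_diff, second_diff))
    return targets
```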
In the same or a similar manner as for the second region C', the second region D' may, for example, also be determined to be part of the target region 421. In the embodiment of the present disclosure, the target region 421 thus consists, for example, of the second region C' and the second region D'. It can be understood that the target region 421 includes at least part of the partial region 411. For example, the target region 421 includes the pixel points e', f', and g', of which e' and f' lie in the partial region 411. In other words, the target region 421 may also include areas outside the partial region 411, for example the area containing the pixel point g', which lies outside the partial region 411.
According to the embodiment of the present disclosure, when the target region includes at least two second regions, the noise reduction processing may, for example, be performed on the plurality of second regions based on the same noise reduction parameter, or on each of the plurality of second regions based on a different noise reduction parameter. For example, the plurality of second regions includes the second region C' and the second region D'.
For example, the noise reduction processing may be performed on the at least two second regions based on the same first noise reduction parameter. Specifically, the first noise reduction parameter may be a specific noise value, for example 2; then the noise values of all the pixel points in the second region C' and the second region D' may be reduced to the specific noise value 2. Alternatively, the first noise reduction parameter may be a specific noise difference, for example 1, and the noise values of all the pixel points in the second region C' and the second region D' may be reduced by 1. It will be appreciated that, whether the first noise reduction parameter is a specific noise value or a specific noise difference, the second region C' and the second region D' are denoised based on the same first noise reduction parameter.
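A sketch of applying the same first noise reduction parameter in either of these two readings follows; modelling the noise as an explicit per-pixel noise map is an assumption made so that the two variants can be shown side by side.
```python
import numpy as np

def apply_first_denoise_parameter(noise_map, region_mask, value=2.0, as_difference=False):
    """Apply the same first noise reduction parameter to every pixel of the
    target second regions: either reduce their noise values to a specific
    noise value, or reduce them by a specific noise difference. Treating the
    noise as an explicit per-pixel noise map is an illustrative assumption."""
    out = noise_map.astype(np.float32).copy()
    if as_difference:
        out[region_mask] = np.clip(out[region_mask] - value, 0.0, None)
    else:
        out[region_mask] = np.minimum(out[region_mask], value)
    return out
```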
According to the embodiment of the disclosure, the noise reduction processing may be further performed on the at least two second regions based on different second noise reduction parameters. The second noise reduction parameters corresponding to the different second regions may be determined based on the first specific difference or the second specific difference corresponding to the different second regions.
In the first case, the second noise reduction parameters of the different second regions may be determined based on the first specific differences corresponding to those regions, i.e., as mentioned above, the differences between the sums of the elements in the first and second luminance value sets. For example, if the first specific difference corresponding to the second region C' is 6 and the first specific difference corresponding to the second region D' is 5, the ratio of the second noise reduction parameters corresponding to the second region C' and the second region D' is 6 : 5. The second noise reduction parameter corresponding to the second region C' may then be, for example, 6 and that of the second region D' 5, or 12 and 10, and so on.
The second noise reduction parameter may be either a specific noise value or a specific noise difference. For example, taking the second noise reduction parameter corresponding to the second region C' as 6: when it is a specific noise value, the noise values of all the pixel points in the second region C' may be reduced to the noise value 6; when it is a specific noise difference, the noise values of all the pixel points in the second region C' may be reduced by 6.
In a second case, the second noise reduction parameter of the different second region may be determined based on a second specific difference value corresponding to the different second region, for example, a difference value between an average value of elements in the first set of luminance values and an average value of elements in the second set of luminance values. It is understood that the specific process of determining the second noise reduction parameter based on the second specific difference is the same as or similar to the process of determining the second noise reduction parameter based on the first specific difference, and is not repeated herein.
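Continuing the earlier region sketch, per-region second noise reduction parameters could be derived as follows; the linear scale factor and the tuple layout carried over from that sketch are assumptions.
```python
def region_denoise_parameters(targets, scale=1.0):
    """Derive a second noise reduction parameter for each target region in
    proportion to its first specific difference (the 6 : 5 example above);
    the same construction works with the second specific difference.
    Whether the result is used as a noise value or as a noise difference is
    left open, so `scale` is an assumption."""
    return [(r, c, scale * first_diff) for (r, c, first_diff, _) in targets]
```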
It is to be understood that the specific values of the luminance values or the noise values in the embodiments of the present disclosure are only schematic examples made for facilitating understanding of the embodiments of the present disclosure. The brightness value or the noise value does not represent a real gray value or a real noise value of a pixel point in an image, and a specific value of the brightness value or the noise value does not specifically limit the embodiment of the disclosure.
According to the technical solution of the embodiments of the present disclosure, the image is divided into a plurality of regions, and noise reduction processing is performed on the regions that received brightness compensation. With region-wise noise reduction, there is no need to compare the brightness value of every pixel point between the first original captured image and the target captured image, which greatly reduces the amount of computation in the image processing procedure and improves the efficiency of image noise reduction.
According to another embodiment of the present disclosure, in one scenario a photo may be taken with a mobile phone. When the camera of the phone is turned on, it captures the current image and displays it as a preview. The displayed preview image is, for example, an image that has not undergone noise reduction processing, so it may contain relatively heavy noise. After the phone takes a picture in response to the photographing instruction, the resulting photographed picture is an image that has undergone noise reduction processing, so its noise is comparatively small. The photographed picture can be stored on the phone and displayed, and when the user views it, the influence of noise on the user experience is at least reduced.
Fig. 5 schematically shows a block diagram of an electronic device according to an embodiment of the disclosure.
As shown in fig. 5, the electronic device 500 of the embodiment of the present disclosure includes, for example, a camera module 510 and a processor 520. The camera module 510 is configured to obtain a captured image, and the processor 520 is configured to: respond to a photographing instruction, process the captured image obtained in response to the photographing instruction based on a photographing mode corresponding to the photographing instruction, generate a photographed picture, and display the photographed picture. The photographing mode corresponding to the photographing instruction at least includes: performing noise reduction processing on a target region of a target captured image, where the target region of the target captured image is a partial region of the target captured image, and the partial region of the target captured image is a region that has undergone brightness compensation processing.
According to the embodiment of the present disclosure, the photographing mode corresponding to the photographing instruction at least includes: at least one original collected image is obtained through the camera module 510 in response to the photographing instruction, and the at least one original collected image is processed based on a brightness compensation algorithm to obtain a target collected image, wherein the brightness of a partial area of the target collected image is higher than that of a partial area of the first original collected image.
According to an embodiment of the present disclosure, processor 520 is further configured to perform: a target region in the target captured image is determined. Determining a target region in a target captured image comprises: processing the target collected image and the first original collected image, determining target pixel points with different brightness values from the first original collected image in the target collected image, and determining the region of the target pixel points in the target collected image as a target region.
According to an embodiment of the present disclosure, determining the target region in the target captured image further comprises: the method includes dividing a first original captured image into a plurality of first regions, dividing a target captured image into a plurality of second regions, wherein each of the plurality of first regions and each of the plurality of second regions correspond to each other, processing the plurality of first regions and the plurality of second regions, and determining at least one of the plurality of second regions as a target region.
According to an embodiment of the present disclosure, processing the plurality of first areas and the plurality of second areas, determining at least one of the plurality of second areas as the target area includes: the luminance value of the second region and the luminance value of the first region corresponding to the second region are determined, and at least one of the plurality of second regions is determined as the target region based on the luminance value of the second region and the luminance value of the corresponding first region.
According to an embodiment of the present disclosure, determining at least one of the plurality of second regions as the target region based on the luminance values of the second regions and the luminance values of the corresponding first regions includes: determining a first brightness value set of a plurality of first pixel points in a current first area of a plurality of first areas, determining a second brightness value set of a plurality of second pixel points in a second area corresponding to the current first area, and determining the current second area as a target area in response to determining that the first brightness value set and the second brightness value set meet a preset condition.
According to an embodiment of the disclosure, the first and second sets of luminance values satisfying the preset condition comprises at least one of: a first particular difference between a sum of elements in the first set of luminance values and a sum of elements in the second set of luminance values is greater than a first predetermined difference, and a second particular difference between an average of elements in the first set of luminance values and an average of elements in the second set of luminance values is greater than a second predetermined difference.
According to the embodiment of the present disclosure, when the target region includes at least two second regions, performing noise reduction processing on the target region of the target captured image includes at least one of: and performing noise reduction processing on the at least two second regions based on the same first noise reduction parameter, determining a second noise reduction parameter corresponding to each of the at least two second regions based on a first specific difference or a second specific difference corresponding to the at least two second regions, and performing noise reduction processing on the at least two second regions based on the second noise reduction parameter.
Fig. 6 schematically shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure.
As shown in fig. 6, the image processing apparatus 600 includes a response module 610, a processing module 620, and a display module 630.
The response module 610 may be configured to respond to a photograph instruction. According to the embodiment of the present disclosure, the response module 610 may, for example, perform the operation S210 described above with reference to fig. 2, which is not described herein again.
The processing module 620 may be configured to process the captured image obtained in response to the photographing instruction based on the photographing mode corresponding to the photographing instruction, and generate a photographed picture. The photographing mode corresponding to the photographing instruction at least includes: performing noise reduction processing on a target region of a target captured image, wherein the target region of the target captured image is a partial region of the target captured image, and the partial region of the target captured image is a region that has been subjected to brightness compensation processing. According to the embodiment of the present disclosure, the processing module 620 may, for example, perform operation S220 described above with reference to fig. 2, which is not described herein again.
The display module 630 may be used to display the photographed picture. According to the embodiment of the present disclosure, the display module 630 may perform, for example, the operation S230 described above with reference to fig. 2, which is not described herein again.
According to the embodiment of the present disclosure, the photographing mode corresponding to the photographing instruction at least includes: responding to the photographing instruction to obtain at least one original captured image through the camera module, and processing the at least one original captured image based on a brightness compensation algorithm to obtain the target captured image, wherein the brightness of the partial region of the target captured image is higher than that of the corresponding partial region of the first original captured image.
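The disclosure leaves the brightness compensation algorithm unspecified. The following hedged sketch fuses the original captured frames by averaging and then lifts pixels below an assumed darkness threshold by an assumed gain, merely to show one plausible way in which a partial region of the target captured image can end up brighter than the same region of the first original captured image; none of these choices are stated in the disclosure.

```python
import numpy as np

def brightness_compensate(original_frames,
                          dark_threshold: float = 60.0,
                          gain: float = 1.6) -> np.ndarray:
    """Sketch of a brightness compensation step over at least one original captured image.

    The frames (H x W x 3 uint8 arrays) are averaged to suppress noise, then
    pixels darker than `dark_threshold` are lifted by `gain`. Both values are
    assumed; the disclosure does not specify the compensation algorithm.
    """
    stack = np.stack([f.astype(np.float32) for f in original_frames], axis=0)
    fused = stack.mean(axis=0)
    luma = fused @ np.array([0.299, 0.587, 0.114])
    dark = luma < dark_threshold
    # Only the dark partial region is brightened, so the target captured image is
    # brighter than the first original captured image in that region only.
    fused[dark] = np.clip(fused[dark] * gain, 0, 255)
    return fused.astype(np.uint8)
```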
According to an embodiment of the present disclosure, the apparatus 600 further comprises, for example, a determining module configured to determine the target region in the target captured image. Determining the target region in the target captured image comprises: processing the target captured image and the first original captured image, determining target pixel points in the target captured image whose brightness values differ from those of the first original captured image, and determining the region in which the target pixel points are located in the target captured image as the target region.
According to an embodiment of the present disclosure, determining the target region in the target captured image further comprises: dividing the first original captured image into a plurality of first regions, dividing the target captured image into a plurality of second regions, wherein the plurality of first regions correspond one-to-one to the plurality of second regions, processing the plurality of first regions and the plurality of second regions, and determining at least one of the plurality of second regions as the target region.
According to an embodiment of the present disclosure, processing the plurality of first regions and the plurality of second regions and determining at least one of the plurality of second regions as the target region comprises: determining the brightness value of each second region and the brightness value of the first region corresponding to that second region, and determining at least one of the plurality of second regions as the target region based on the brightness value of the second region and the brightness value of the corresponding first region.
According to an embodiment of the present disclosure, determining at least one of the plurality of second regions as the target region based on the brightness values of the second regions and the brightness values of the corresponding first regions comprises: determining a first brightness value set of a plurality of first pixel points in a current first region of the plurality of first regions, determining a second brightness value set of a plurality of second pixel points in the second region corresponding to the current first region, and determining the current second region as the target region in response to determining that the first brightness value set and the second brightness value set satisfy a preset condition.
According to an embodiment of the disclosure, the first brightness value set and the second brightness value set satisfying the preset condition comprises at least one of the following: a first specific difference between the sum of the elements in the first brightness value set and the sum of the elements in the second brightness value set is greater than a first predetermined difference; and a second specific difference between the average of the elements in the first brightness value set and the average of the elements in the second brightness value set is greater than a second predetermined difference.
According to the embodiment of the present disclosure, when the target region includes at least two second regions, performing noise reduction processing on the target region of the target captured image includes at least one of the following: performing noise reduction processing on the at least two second regions based on the same first noise reduction parameter; and determining a second noise reduction parameter corresponding to each of the at least two second regions based on the first specific difference or the second specific difference corresponding to the at least two second regions, and performing noise reduction processing on the at least two second regions based on the second noise reduction parameters.
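Purely as an assumed, non-limiting sketch of how the response module 610, the processing module 620, and the display module 630 could be organized in software, the class below chains the helper sketches given earlier (brightness_compensate, select_target_regions, denoise_target_regions); the class name, method names, and the camera/screen interfaces are illustrative only and do not appear in the disclosure.

```python
import numpy as np

class ImageProcessingApparatus:
    """Illustrative decomposition mirroring modules 610, 620 and 630 of apparatus 600."""

    def __init__(self, camera, screen):
        self.camera = camera  # assumed camera object exposing capture() -> H x W x 3 uint8 array
        self.screen = screen  # assumed display object exposing display(image)

    def respond(self):
        # Response module 610: react to the photographing instruction by capturing a frame.
        return self.camera.capture()

    def process(self, original):
        # Processing module 620: brightness compensation followed by noise reduction
        # restricted to the compensated (target) regions.
        target = brightness_compensate([original])
        luma = np.array([0.299, 0.587, 0.114])
        regions = select_target_regions(
            original.astype(np.float32) @ luma,
            target.astype(np.float32) @ luma)
        return denoise_target_regions(target, regions)

    def show(self, picture):
        # Display module 630: present the photographed picture.
        self.screen.display(picture)
```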
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any of the response module 610, the processing module 620, and the display module 630 may be combined and implemented in one module, or any one of them may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the response module 610, the processing module 620, and the display module 630 may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware by any other reasonable manner of integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware. Alternatively, at least one of the response module 610, the processing module 620 and the display module 630 may be at least partially implemented as a computer program module, which when executed, may perform a corresponding function.
FIG. 7 schematically shows a block diagram of a computer system for implementing image processing according to an embodiment of the present disclosure. The computer system illustrated in FIG. 7 is only one example and should not impose any limitations on the scope of use or functionality of embodiments of the disclosure.
As shown in fig. 7, a computer system 700 implementing image processing includes a processor 701 and a computer-readable storage medium 702. The system 700 may perform a method according to an embodiment of the present disclosure.
In particular, the processor 701 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 701 may also include on-board memory for caching purposes. The processor 701 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
Computer-readable storage medium 702 may be, for example, any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The computer-readable storage medium 702 may comprise a computer program 703, which computer program 703 may comprise code/computer-executable instructions that, when executed by the processor 701, cause the processor 701 to perform a method according to an embodiment of the disclosure, or any variant thereof.
The computer program 703 may be configured with, for example, computer program code comprising computer program modules. For example, in an example embodiment, the code in the computer program 703 may include one or more program modules, including, for example, a module 703A, a module 703B, and so on. It should be noted that the division and the number of the modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, so that when these program modules are executed by the processor 701, the processor 701 may perform the method according to the embodiments of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, at least one of the response module 610, the processing module 620, and the display module 630 may be implemented as a computer program module described with reference to fig. 7, which, when executed by the processor 701, may implement the respective operations described above.
The present disclosure also provides a computer-readable medium, which may be embodied in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The above-mentioned computer-readable medium carries one or more programs which, when executed, implement the above-mentioned image processing method.
According to embodiments of the present disclosure, a computer readable medium may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, optical fiber cable, radio frequency signals, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that the features recited in the various embodiments and/or claims of the present disclosure may be combined and/or sub-combined in various ways, even if such combinations or sub-combinations are not expressly recited in the present disclosure. In particular, various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or sub-combinations fall within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (10)

1. An image processing method comprising:
responding to a photographing instruction;
processing the captured image obtained in response to the photographing instruction based on the photographing mode corresponding to the photographing instruction to generate a photographed picture; and
displaying the photographed picture;
wherein the photographing mode corresponding to the photographing instruction at least comprises:
performing noise reduction processing on a target region of a target captured image, wherein the target region of the target captured image is a partial region of the target captured image, and the partial region of the target captured image is a region that has been subjected to brightness compensation processing.
2. The method of claim 1, wherein the photographing mode corresponding to the photographing instruction at least comprises:
responding to the photographing instruction to obtain at least one original captured image through a camera module; and
processing the at least one original captured image based on a brightness compensation algorithm to obtain a target captured image, wherein the brightness of the partial region of the target captured image is higher than that of the corresponding partial region of the first original captured image.
3. The method of claim 2, further comprising: determining the target region in the target captured image, wherein the determining the target region in the target captured image comprises:
processing the target captured image and the first original captured image;
determining target pixel points in the target captured image whose brightness values differ from those of the first original captured image; and
determining the region in which the target pixel points are located in the target captured image as the target region.
4. The method of claim 3, wherein the determining the target region in the target captured image further comprises:
dividing the first original captured image into a plurality of first regions;
dividing the target captured image into a plurality of second regions, wherein the plurality of first regions correspond one-to-one to the plurality of second regions; and
processing the plurality of first regions and the plurality of second regions, and determining at least one of the plurality of second regions as the target region.
5. The method of claim 4, wherein the processing the plurality of first regions and the plurality of second regions and determining at least one of the plurality of second regions as the target region comprises:
determining a brightness value of the second region and a brightness value of the first region corresponding to the second region; and
determining at least one of the plurality of second regions as the target region based on the brightness values of the second regions and the brightness values of the corresponding first regions.
6. The method of claim 5, wherein the determining at least one of the plurality of second regions as the target region based on the brightness values of the second regions and the brightness values of the corresponding first regions comprises:
determining a first set of luminance values for a plurality of first pixel points in a current first region of the plurality of first regions;
determining a second brightness value set of a plurality of second pixel points in the second region corresponding to the current first region; and
determining the current second region as the target region in response to determining that the first brightness value set and the second brightness value set satisfy a preset condition.
7. The method of claim 6, wherein the first brightness value set and the second brightness value set satisfying the preset condition comprises at least one of:
a first specific difference between the sum of the elements in the first brightness value set and the sum of the elements in the second brightness value set is greater than a first predetermined difference; and
a second specific difference between the average of the elements in the first brightness value set and the average of the elements in the second brightness value set is greater than a second predetermined difference.
8. The method of claim 7, wherein when the target region includes at least two of the second regions,
the performing noise reduction processing on the target region of the target captured image comprises at least one of the following:
performing noise reduction processing on the at least two second regions based on the same first noise reduction parameter; and
determining a second noise reduction parameter corresponding to each of the at least two second regions based on the first specific difference or the second specific difference corresponding to the at least two second regions, and performing noise reduction processing on the at least two second regions based on the second noise reduction parameters.
9. An electronic device, comprising:
a camera module configured to acquire a captured image; and
a processor configured to perform:
responding to a photographing instruction;
processing the captured image obtained in response to the photographing instruction based on the photographing mode corresponding to the photographing instruction to generate a photographed picture; and
displaying the photographed picture;
wherein the photographing mode corresponding to the photographing instruction at least comprises:
performing noise reduction processing on a target region of a target captured image, wherein the target region of the target captured image is a partial region of the target captured image, and the partial region of the target captured image is a region that has been subjected to brightness compensation processing.
10. The electronic device of claim 9, wherein the photographing mode corresponding to the photographing instruction at least includes:
responding to the photographing instruction to obtain at least one original captured image through the camera module; and
processing the at least one original captured image based on a brightness compensation algorithm to obtain a target captured image, wherein the brightness of the partial region of the target captured image is higher than that of the corresponding partial region of the first original captured image.
CN201911424082.1A 2019-12-31 2019-12-31 Image processing method and electronic device Active CN111131716B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911424082.1A CN111131716B (en) 2019-12-31 2019-12-31 Image processing method and electronic device

Publications (2)

Publication Number Publication Date
CN111131716A true CN111131716A (en) 2020-05-08
CN111131716B CN111131716B (en) 2021-06-15

Family

ID=70507173

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911424082.1A Active CN111131716B (en) 2019-12-31 2019-12-31 Image processing method and electronic device

Country Status (1)

Country Link
CN (1) CN111131716B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6078686A (en) * 1996-09-30 2000-06-20 Samsung Electronics Co., Ltd. Image quality enhancement circuit and method therefor
JP2001036769A (en) * 1999-07-23 2001-02-09 Hitachi Denshi Ltd Noise reduction device
CN104050645A (en) * 2014-06-23 2014-09-17 小米科技有限责任公司 Image processing method and device
CN104349080A (en) * 2013-08-07 2015-02-11 联想(北京)有限公司 Image processing method and electronic equipment
US20150195440A1 (en) * 2012-07-30 2015-07-09 Samsung Electronics Co., Ltd. Image capture method and image capture apparatus
CN105227805A (en) * 2015-09-28 2016-01-06 广东欧珀移动通信有限公司 A kind of image processing method and mobile terminal
CN105447827A (en) * 2015-11-18 2016-03-30 广东欧珀移动通信有限公司 Image noise reduction method and system thereof
CN105898151A (en) * 2015-11-15 2016-08-24 乐视移动智能信息技术(北京)有限公司 Image processing method and device
CN108038834A (en) * 2017-12-28 2018-05-15 努比亚技术有限公司 A kind of method, terminal and computer-readable recording medium for reducing noise
CN108513043A (en) * 2017-02-27 2018-09-07 中兴通讯股份有限公司 A kind of image denoising method and terminal
CN110136085A (en) * 2019-05-17 2019-08-16 凌云光技术集团有限责任公司 A kind of noise-reduction method and device of image
CN110213462A (en) * 2019-06-13 2019-09-06 Oppo广东移动通信有限公司 Image processing method, device, electronic equipment and image processing circuit
CN110519485A (en) * 2019-09-09 2019-11-29 Oppo广东移动通信有限公司 Image processing method, device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant