CN111340894A - Image processing method and device and computer equipment - Google Patents
Image processing method and device and computer equipment
- Publication number
- CN111340894A (application CN201910926843.7A)
- Authority
- CN
- China
- Prior art keywords
- channel
- dispersion
- pixel
- pixel point
- pixel points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The disclosure provides an image processing method, an image processing apparatus, and computer equipment, and relates to the technical field of computer vision. The method comprises: acquiring a first image to be processed; for any one of a plurality of pixel points in the first image, determining a dispersion distribution rule of the pixel point according to the channel values of the pixel point; determining the dispersion type of the pixel point according to its dispersion distribution rule; and performing de-dispersion processing on the pixel point according to its dispersion type to obtain a second image. By using channel values to identify pixel points of different dispersion types and de-dispersing each type separately, the method improves the accuracy of dispersion processing.
Description
Technical Field
The present disclosure relates to the field of computer vision technologies, and in particular, to an image processing method and apparatus, and a computer device.
Background
Most light sources in nature are mixtures of light of different colors; sunlight, for example, combines the seven colors of visible light with infrared light, ultraviolet light, and so on. Light of different colors has different wavelengths, and the refractive index of an optical element in an image pickup apparatus varies with the wavelength of the light. When light passes through the optical element, dispersion therefore occurs: light of different colors is separated and spread apart. For example, when a photographed object contains regions whose brightness differs strongly, dispersion appears at the boundary between the bright region and the dark region, blurring the image. The image therefore needs de-dispersion processing.
In the related art, de-dispersion is performed by marking a dispersion region and a normal region in the image and then filtering the pixel points in the dispersion region.
However, because the degree of dispersion differs from pixel point to pixel point, applying one uniform filter to all pixel points in the dispersion region may filter at an inappropriate wavelength, so dispersion is removed incompletely and normal pixels are misjudged, and the accuracy of this related art is therefore poor.
Disclosure of Invention
In order to overcome the problem of poor dispersion processing accuracy in the related art, the present disclosure provides an image processing method, apparatus, and computer device.
In one aspect, an image processing method is provided, and the method includes:
acquiring a first image to be processed;
for any one of a plurality of pixel points in the first image, determining a dispersion distribution rule of the pixel point according to the channel values of the pixel point;
determining the dispersion type of the pixel points according to the dispersion distribution rule of the pixel points;
and according to the dispersion type of the pixel point, performing dispersion removal processing on the pixel point to obtain a second image.
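The four steps above can be sketched as a single loop over the pixel points. The helper callables below are hypothetical placeholders for the concrete rule, type, and de-dispersion procedures described in the implementation manners, not names used by the disclosure:

```python
def image_processing_method(first_image, get_rule, get_type, dedisperse):
    """Sketch of the claimed method: classify each pixel point by its
    dispersion distribution rule, then de-disperse it by its dispersion type.

    first_image: iterable of pixel points (here, dicts of channel values).
    get_rule, get_type, dedisperse: hypothetical stand-ins for the concrete
    steps (rule from channel values, type from rule, de-dispersion by type).
    """
    second_image = []
    for pixel in first_image:             # any one of a plurality of pixel points
        rule = get_rule(pixel)            # dispersion distribution rule
        dispersion_type = get_type(rule)  # e.g. high- or low-temperature
        second_image.append(dedisperse(pixel, dispersion_type))
    return second_image
```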
In a possible implementation manner, the determining a dispersion distribution rule of the pixel according to the channel value of the pixel includes:
determining a dispersion distribution rule of the pixel points according to the channel values of the G channel and the B channel of the pixel points; or,
and determining the dispersion distribution rule of the pixel points according to the channel values of the G channel and the R channel of the pixel points.
In another possible implementation manner, the method further includes:
determining, according to the channel value of the G channel, the channel value of the B channel, and the channel value of the R channel of the pixel point, a difference between the channel value of the G channel and the channel value of the B channel to obtain a first difference, and a difference between the channel value of the G channel and the channel value of the R channel to obtain a second difference;
when the first difference is smaller than the second difference, executing the step of determining the dispersion distribution rule of the pixel points according to the channel values of the G channel and the B channel of the pixel points;
and when the first difference is larger than the second difference, executing the step of determining the dispersion distribution rule of the pixel points according to the channel values of the G channel and the R channel of the pixel points.
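This pair selection can be sketched minimally as follows, assuming 8-bit R, G, B values. The function name is illustrative, and the claim leaves the equal-difference case unspecified; here it falls to the G/R branch:

```python
def select_channel_pair(pixel):
    """Decide which channel pair's dispersion rule to test for one pixel point.

    pixel: (R, G, B) channel values.  A smaller G-B gap selects the G and B
    channels; a smaller G-R gap selects the G and R channels.
    """
    r, g, b = pixel
    first_difference = g - b    # G channel value minus B channel value
    second_difference = g - r   # G channel value minus R channel value
    return 'GB' if first_difference < second_difference else 'GR'
```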
In another possible implementation manner, the determining a dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the B channel of the pixel point includes:
for any one of a plurality of designated directions, determining a plurality of pixel points in the designated direction of the pixel points;
and when the channel value of the G channel of the pixel point is larger than the channel values of the G channels of all of the plurality of pixel points in the designated direction, and the channel value of the B channel of the pixel point is larger than the channel values of the B channels of all of those pixel points, determining that the dispersion distribution rule of the pixel point is a high-temperature dispersion distribution rule.
In another possible implementation manner, the determining a dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the R channel of the pixel point includes:
for any one of a plurality of designated directions, determining a plurality of pixel points in the designated direction of the pixel points;
and when the channel value of the G channel of the pixel point is larger than the channel values of the G channels of all of the plurality of pixel points in the designated direction, and the channel value of the R channel of the pixel point is larger than the channel values of the R channels of all of those pixel points, determining that the dispersion distribution rule of the pixel point is a low-temperature dispersion distribution rule.
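The two distribution rules can be checked together, as a sketch. The data layout and return labels are assumptions; when a pixel would satisfy both rules in the same direction, this sketch reports the high-temperature rule first, a detail the disclosure does not fix:

```python
def dispersion_distribution_rule(center, directions):
    """Classify one pixel point by the two dispersion distribution rules.

    center: dict {'R': .., 'G': .., 'B': ..} for the pixel point under test.
    directions: list of lists; each inner list holds the channel dicts of the
    pixel points lying in one designated direction from the center.
    Returns 'high-temperature', 'low-temperature', or None.
    """
    for neighbours in directions:   # any one of the designated directions
        g = all(center['G'] > p['G'] for p in neighbours)
        b = all(center['B'] > p['B'] for p in neighbours)
        r = all(center['R'] > p['R'] for p in neighbours)
        if g and b:
            return 'high-temperature'   # G and B both exceed all neighbours
        if g and r:
            return 'low-temperature'    # G and R both exceed all neighbours
    return None
```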
In another possible implementation manner, the determining the dispersion type of the pixel according to the dispersion distribution rule of the pixel includes:
when the dispersion distribution rule of the pixel point is a high-temperature dispersion distribution rule, determining that the dispersion type of the pixel point is a high-temperature dispersion type;
and when the dispersion distribution rule of the pixel point is a low-temperature dispersion distribution rule, determining that the dispersion type of the pixel point is a low-temperature dispersion type.
In another possible implementation manner, the performing, according to the dispersion type of the pixel point, a dispersion removal process on the pixel point to obtain a second image includes:
processing the channel value of each channel of the pixel point according to the dispersion processing coefficient corresponding to the dispersion type of the pixel point to obtain the de-dispersion channel value of each channel of the pixel point;
and modifying the channel value of each channel of the pixel point in the first image to the corresponding de-dispersion channel value to obtain the second image.
In another possible implementation manner, before performing a de-dispersion process on the pixel point according to the dispersion type of the pixel point to obtain a second image, the method further includes:
selecting pixel points of the high-temperature dispersion type and pixel points of the low-temperature dispersion type from the plurality of pixel points of the first image according to the dispersion types of the plurality of pixel points in the first image;
generating a first dispersion matrix according to the high-temperature dispersion type pixel points, and generating a second dispersion matrix according to the low-temperature dispersion type pixel points;
traversing the first dispersion matrix and the second dispersion matrix with a traversal window;
when the number of the high-temperature dispersion type pixel points in the traversal window in the first dispersion matrix is larger than a first preset threshold value, determining all the pixel points in the traversal window as the high-temperature dispersion type pixel points;
and when the number of the low-temperature dispersion type pixel points in the traversal window in the second dispersion matrix is larger than a second preset threshold value, determining all the pixel points in the traversal window as low-temperature dispersion type pixel points.
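The traversal-window step can be sketched with a sliding window over one binary dispersion matrix. The window size and threshold are illustrative, and counting is done on the input matrix so that newly filled windows do not cascade, a detail the disclosure does not specify:

```python
import numpy as np

def fill_dispersion_window(matrix, win=3, threshold=5):
    """Mark every pixel point in a traversal window as this dispersion type
    when the count of already-marked pixel points exceeds `threshold`.

    matrix: binary dispersion matrix (1 = pixel point of this dispersion type).
    Counts are taken from the input matrix, so filled windows do not cascade.
    """
    out = matrix.copy()
    rows, cols = matrix.shape
    for i in range(rows - win + 1):
        for j in range(cols - win + 1):
            if matrix[i:i + win, j:j + win].sum() > threshold:
                out[i:i + win, j:j + win] = 1
    return out
```

The same function serves both matrices, with the first and second preset thresholds passed as `threshold`.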
In another possible implementation manner, the processing the channel value of each channel of the pixel according to the dispersion processing coefficient corresponding to the dispersion type of the pixel to obtain the de-dispersed channel value of each channel of the pixel includes:
determining a minimum channel value in a plurality of channel values of the pixel points;
for each channel of the pixel point, determining a compensation value of the channel according to the channel value of the channel and the minimum channel value, wherein the compensation value of the channel is in positive correlation with the minimum channel value and in negative correlation with the dispersion processing coefficient;
and determining a de-dispersion channel value of the channel according to the compensation value of the channel and the channel value of the channel, wherein the de-dispersion channel value is in positive correlation with the compensation value and the channel value respectively.
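The claims state only monotonic relationships, not a formula. The blend toward the minimum channel below is one illustrative function satisfying them: the compensation value grows with the minimum channel value and shrinks as the dispersion processing coefficient grows, and the de-dispersed value grows with both the compensation value and the original channel value. The formula and the coefficient are assumptions, not the disclosure's own:

```python
def dedisperse_pixel(channels, k):
    """Illustrative de-dispersion of one pixel point.

    channels: dict of channel values, e.g. {'R': .., 'G': .., 'B': ..}.
    k: dispersion processing coefficient for this dispersion type (k > 1).
    """
    v_min = min(channels.values())
    compensation = v_min / k   # rises with v_min, falls as k rises
    # De-dispersed value rises with both the channel value and the compensation;
    # the minimum channel is left unchanged, the others are pulled toward it.
    return {name: v * (1 - 1 / k) + compensation for name, v in channels.items()}
```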
In another possible implementation manner, the acquiring the first image to be processed includes:
acquiring a third image, and determining a designated pixel point from the third image;
determining a first image with the designated pixel point as the center and the detection width as the designated detection width in the third image;
The designated pixel point is a pixel point whose gray value is within a preset gray value range, whose gray value differs from the gray value of an edge pixel point of the first image by less than a first preset threshold value, and whose difference between the maximum channel value and the minimum channel value is larger than a second preset threshold value.
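The three screening conditions can be checked directly. The gray-value conversion below uses the standard Rec. 601 luma weights, and the range and threshold values are illustrative assumptions:

```python
def is_designated_pixel(pixel, edge_gray, gray_range=(80, 220), t1=30, t2=40):
    """Check the three conditions for a designated pixel point.

    pixel: (R, G, B) channel values; edge_gray: gray value of an edge pixel
    point of the candidate first image.  gray_range, t1 (first preset
    threshold) and t2 (second preset threshold) are illustrative values.
    """
    r, g, b = pixel
    gray = 0.299 * r + 0.587 * g + 0.114 * b         # Rec. 601 gray value
    in_range = gray_range[0] <= gray <= gray_range[1]
    near_edge_gray = abs(gray - edge_gray) < t1
    channel_spread = max(pixel) - min(pixel) > t2    # max minus min channel
    return in_range and near_edge_gray and channel_spread
```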
In another aspect, there is provided an image processing apparatus, the apparatus including:
the acquisition module is used for acquiring a first image to be processed;
the first determining module is used for determining the dispersion distribution rule of any one of a plurality of pixel points in the first image according to the channel value of the pixel point;
the second determining module is used for determining the dispersion type of the pixel point according to the dispersion distribution rule of the pixel point;
and the processing module is used for performing dispersion removal processing on the pixel points according to the dispersion types of the pixel points to obtain a second image.
In a possible implementation manner, the first determining module is further configured to determine a dispersion distribution rule of the pixel point according to a channel value of a G channel and a channel value of a B channel of the pixel point; or determining the dispersion distribution rule of the pixel points according to the channel values of the G channel and the R channel of the pixel points.
In another possible implementation manner, the apparatus further includes:
a third determining module, configured to determine, according to the channel value of the G channel, the channel value of the B channel, and the channel value of the R channel of the pixel point, a difference between the channel value of the G channel and the channel value of the B channel to obtain a first difference, and a difference between the channel value of the G channel and the channel value of the R channel to obtain a second difference;
the first determining module is further configured to determine a dispersion distribution rule of the pixel point according to a channel value of a G channel and a channel value of a B channel of the pixel point when the first difference is smaller than the second difference;
the first determining module is further configured to determine a dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the R channel of the pixel point when the first difference is greater than the second difference.
In another possible implementation manner, the first determining module is further configured to determine, for any one of a plurality of specified directions, a plurality of pixel points in the specified direction of the pixel point; and when the channel value of the G channel of the pixel point is larger than the channel values of the G channels of all of the plurality of pixel points in the specified direction, and the channel value of the B channel of the pixel point is larger than the channel values of the B channels of all of those pixel points, determine that the dispersion distribution rule of the pixel point is a high-temperature dispersion distribution rule.
In another possible implementation manner, the first determining module is further configured to determine, for any one of a plurality of specified directions, a plurality of pixel points in the specified direction of the pixel points; and when the channel values of the G channels of the pixel points are all larger than the channel values of the G channels of the plurality of pixel points in the appointed direction, and the channel values of the R channels of the pixel points are all larger than the channel values of the R channels of the plurality of pixel points in the appointed direction, determining that the dispersion distribution rule of the pixel points is a low-temperature dispersion distribution rule.
In another possible implementation manner, the second determining module is further configured to determine that the dispersion type of the pixel is a high-temperature dispersion type when the dispersion distribution rule of the pixel is a high-temperature dispersion distribution rule; and when the dispersion distribution rule of the pixel point is a low-temperature dispersion distribution rule, determining that the dispersion type of the pixel point is a low-temperature dispersion type.
In another possible implementation manner, the processing module is further configured to process the channel value of each channel of the pixel point according to the dispersion processing coefficient corresponding to the dispersion type of the pixel point to obtain a de-dispersion channel value of each channel of the pixel point, and to modify the channel value of each channel of the pixel point in the first image to the corresponding de-dispersion channel value to obtain the second image.
In another possible implementation manner, the apparatus further includes:
the selection module is used for selecting a pixel point with a high-temperature dispersion type and a pixel point with a low-temperature dispersion type from the plurality of pixel points of the first image according to the dispersion types of the plurality of pixel points in the first image;
the generating module is used for generating a first dispersion matrix according to the high-temperature dispersion type pixel points and generating a second dispersion matrix according to the low-temperature dispersion type pixel points;
the traversal module is used for traversing the first dispersion matrix and the second dispersion matrix through a traversal window;
a fourth determining module, configured to determine, when the number of high-temperature dispersion-type pixel points in the traversal window in the first dispersion matrix is greater than a first preset threshold, all the pixel points in the traversal window as the high-temperature dispersion-type pixel points;
and the fifth determining module is used for determining all the pixel points in the traversal window as the pixel points of the low-temperature dispersion type when the number of the pixel points of the low-temperature dispersion type in the traversal window in the second dispersion matrix is greater than a second preset threshold value.
In another possible implementation manner, the processing module is further configured to determine a minimum channel value of the plurality of channel values of the pixel point; for each channel of the pixel point, determining a compensation value of the channel according to the channel value of the channel and the minimum channel value, wherein the compensation value of the channel is in positive correlation with the minimum channel value and in negative correlation with the dispersion processing coefficient; and determining a de-dispersion channel value of the channel according to the compensation value of the channel and the channel value of the channel, wherein the de-dispersion channel value is in positive correlation with the compensation value and the channel value respectively.
In another possible implementation manner, the obtaining module is further configured to obtain a third image, determine a designated pixel point from the third image, and determine, in the third image, a first image centered on the designated pixel point and having the designated detection width. The designated pixel point is a pixel point whose gray value is within a preset gray value range, whose gray value differs from the gray value of an edge pixel point of the first image by less than a first preset threshold value, and whose difference between the maximum channel value and the minimum channel value is larger than a second preset threshold value.
In another aspect, a computer device is provided, the computer device comprising:
at least one processor; and
at least one memory;
the at least one memory stores one or more programs configured to be executed by the at least one processor, the one or more programs including instructions for performing the image processing method according to the first aspect of an embodiment of the disclosure.
In another aspect, a computer-readable storage medium applied to a terminal is provided, and the computer-readable storage medium stores at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the steps in the image processing method according to the first aspect of the embodiments of the disclosure.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
In the embodiments of the present disclosure, a first image to be processed is acquired; for any one of a plurality of pixel points in the first image, a dispersion distribution rule of the pixel point is determined according to the channel values of the pixel point; the dispersion type of the pixel point is determined according to its dispersion distribution rule; and each pixel point is then de-dispersed according to its dispersion type to obtain a de-dispersed second image. By using channel values to identify pixel points of different dispersion types and de-dispersing each type separately, the method improves the accuracy of dispersion processing.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a block diagram illustrating a system involved in image processing according to an exemplary embodiment;
FIG. 2 is a flow diagram illustrating a method of image processing according to an exemplary embodiment;
FIG. 3 is a flow diagram illustrating a method of image processing according to an exemplary embodiment;
FIG. 4 is a schematic diagram illustrating a designated direction according to an exemplary embodiment;
FIG. 5 is a block diagram of an image processing apparatus according to an exemplary embodiment;
FIG. 6 is a schematic diagram illustrating a configuration of a computer device, according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Fig. 1 is a schematic diagram of an image processing system according to an exemplary embodiment. The image processing system implements the image processing method and may run on any computer device with an image processing function, such as a computer or a mobile phone, or on an image capturing device with an image processing function; the embodiments of the present disclosure do not limit the computer device. As shown in Fig. 1, the image processing system includes an input module, a pixel-by-pixel processing module, a de-dispersion processing module, and an output module. The output end of the input module is connected to the input end of the pixel-by-pixel processing module, the output end of the pixel-by-pixel processing module is connected to the input end of the de-dispersion module, and the output end of the de-dispersion module is connected to the output module.
The input module is configured to receive a third image to be processed, and output the third image to be processed to the pixel-by-pixel processing module.
The pixel-by-pixel processing module receives the third image from the input module and determines the dispersion type of each of a plurality of pixel points in the third image. It determines a designated pixel point from the third image and, taking the designated pixel point as the center, determines a region whose detection width is the designated width as the first image. The designated pixel point is a pixel point whose gray value is within a preset gray value range, whose gray value differs from the gray value of an edge pixel point of the first image by less than a first preset threshold value, and whose difference between the maximum channel value and the minimum channel value is larger than a second preset threshold value. The pixel-by-pixel processing module is further configured to determine, for any one of the plurality of pixel points in the first image, whether each channel satisfies a dispersion distribution rule, to determine the dispersion type of a dispersed pixel point according to the channel types that satisfy the rule, and to input the dispersed pixel points to the de-dispersion processing module.
In a possible implementation manner, the pixel-by-pixel processing module screens out the pixel points of the high-temperature dispersion type and those of the low-temperature dispersion type by judging, for any one of the plurality of pixel points, which dispersion distribution rule each of its channels satisfies. Accordingly, the pixel-by-pixel processing module may include a first judging unit, a second judging unit, and a third judging unit. The input end of the first judging unit is connected to the output end of the input module, the output end of the first judging unit is connected to the input ends of the second and third judging units, and the output ends of the second and third judging units are connected to the de-dispersion processing module. The first judging unit is a G-channel judging unit that screens the pixel points in the first image meeting a first preset condition and determines the dispersion distribution rule of the G channel of any one of those pixel points. The first preset condition may be that the gray value of the pixel point is within a preset range.
The second judging unit is a B-channel judging unit that determines the dispersion distribution rule of the B channel of any one of the plurality of pixel points, screens out the pixel points whose G and B channels both satisfy the dispersion distribution rule, and determines the dispersion type of those pixel points as the high-temperature dispersion type. The third judging unit is an R-channel judging unit that determines the dispersion distribution rule of the R channel of any one of the plurality of pixel points, screens out the pixel points whose G and R channels both satisfy the dispersion distribution rule, and determines the dispersion type of those pixel points as the low-temperature dispersion type.
It should be noted that, before the second and third judging units determine the dispersion type of a pixel point, the pixel points meeting the B-channel screening condition and the R-channel screening condition may first be screened out of the plurality of pixel points. Correspondingly, the second judging unit may screen the pixel points input to it according to the B-channel screening condition and determine the dispersion distribution rule of the B channel of the pixel points meeting that condition, and the third judging unit may screen the pixel points input to it according to the R-channel screening condition and determine the dispersion distribution rule of the R channel of the pixel points meeting that condition.
The de-dispersion module receives the high-temperature and low-temperature dispersion type pixel points sent by the pixel-by-pixel processing module, de-disperses them according to the high-temperature dispersion coefficient and the low-temperature dispersion coefficient respectively to obtain de-dispersed pixel points, and sends the de-dispersed pixel points to the output module.
The output module is used for receiving the de-dispersed pixel points output by the de-dispersion module, generating a second image according to the de-dispersed pixel points, and outputting the second image.
In a possible implementation manner, the image processing system further includes a global processing module, an input end of the global processing module is connected to an output end of the pixel-by-pixel processing module, and an output end of the global processing module is connected to an input end of the de-dispersion module.
The global processing module is used for receiving the two types of pixel points output by the pixel-by-pixel processing module, determining the dispersion type of each pixel point in the third image according to the pixel points output by the pixel-by-pixel processing module, and sending each dispersed pixel point in the third image to the de-dispersion module.
Correspondingly, the de-dispersion module is further configured to receive the high-temperature dispersion type pixel point and the low-temperature dispersion type pixel point sent by the global processing module, perform de-dispersion processing on the high-temperature dispersion type pixel point and the low-temperature dispersion type pixel point according to the high-temperature dispersion coefficient and the low-temperature dispersion coefficient, respectively, obtain a de-dispersed pixel point, and send the de-dispersed pixel point to the output module.
FIG. 2 is a flowchart illustrating an image processing method according to an exemplary embodiment. As shown in FIG. 2, the method includes the following steps.
Step 201: a first image to be processed is acquired.
Step 202: and determining the dispersion distribution rule of any one of the pixel points in the first image according to the channel value of the pixel point.
Step 203: and determining the dispersion type of the pixel point according to the dispersion distribution rule of the pixel point.
Step 204: and according to the dispersion type of the pixel point, performing dispersion removal processing on the pixel point to obtain a second image.
In a possible implementation manner, the determining a dispersion distribution rule of the pixel point according to the channel value of the pixel point includes:
determining the dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the B channel of the pixel point; or,
and determining the dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the R channel of the pixel point.
In another possible implementation manner, the method further includes:
determining a difference value between the channel value of the G channel and the channel value of the B channel of the pixel point according to the channel value of the G channel, the channel value of the B channel and the channel value of the R channel of the pixel point to obtain a first difference value, and determining a difference value between the channel value of the G channel and the channel value of the R channel of the pixel point to obtain a second difference value;
when the first difference is smaller than the second difference, executing the step of determining the dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the B channel of the pixel point;
and when the first difference is larger than the second difference, executing the step of determining the dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the R channel of the pixel point.
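The branch above can be sketched as a small helper. The function name, the treatment of channel values as plain integers, and the use of absolute differences are illustrative assumptions, not taken from the patent text:

```python
def pick_companion_channel(r, g, b):
    """Decide which channel pairs with G for the dispersion check.

    first difference = |G - B|, second difference = |G - R|; per the
    text, a smaller first difference selects the G/B (high-temperature)
    path, otherwise the G/R (low-temperature) path is taken.
    """
    first = abs(g - b)   # first difference
    second = abs(g - r)  # second difference
    return 'B' if first < second else 'R'
```
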
In another possible implementation manner, the determining a dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the B channel of the pixel point includes:
for any one appointed direction in a plurality of appointed directions, determining a plurality of pixel points of the pixel point in the appointed direction;
and when the channel value of the G channel of the pixel point is larger than the channel values of the G channels of all the pixel points in the appointed direction, and the channel value of the B channel of the pixel point is larger than the channel values of the B channels of all the pixel points in the appointed direction, determining that the dispersion distribution rule of the pixel point is a high-temperature dispersion distribution rule.
In another possible implementation manner, the determining a dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the R channel of the pixel point includes:
for any one appointed direction in a plurality of appointed directions, determining a plurality of pixel points of the pixel point in the appointed direction;
and when the channel values of the G channels of the pixel points are all larger than the channel values of the G channels of the pixel points in the appointed direction, and the channel values of the R channels of the pixel points are all larger than the channel values of the R channels of the pixel points in the appointed direction, determining that the dispersion distribution rule of the pixel points is a low-temperature dispersion distribution rule.
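The two distribution-rule checks above can be sketched together as follows. The channel ordering (R, G, B), the return labels, and the function name are assumptions for illustration; the eight direction steps follow fig. 4:

```python
import numpy as np

# Eight designated directions as (row step, col step); cf. fig. 4.
DIRECTIONS = [(0, -1), (0, 1), (-1, 0), (1, 0),
              (-1, 1), (1, -1), (-1, -1), (1, 1)]

def distribution_rule(img, i, j, n):
    """Classify the dispersion distribution rule at pixel (i, j).

    img is an H x W x 3 array ordered (R, G, B); n is the detection
    width. Returns 'high' when G and B both dominate all n pixels in
    some designated direction, 'low' when G and R do, else None.
    """
    r0, g0, b0 = img[i, j]
    for di, dj in DIRECTIONS:
        neigh = [img[i + k * di, j + k * dj] for k in range(1, n + 1)]
        g_ok = all(g0 > p[1] for p in neigh)
        if g_ok and all(b0 > p[2] for p in neigh):
            return 'high'   # high-temperature dispersion distribution rule
        if g_ok and all(r0 > p[0] for p in neigh):
            return 'low'    # low-temperature dispersion distribution rule
    return None
```
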
In another possible implementation manner, the determining the dispersion type of the pixel point according to the dispersion distribution rule of the pixel point includes:
when the dispersion distribution rule of the pixel point is a high-temperature dispersion distribution rule, determining that the dispersion type of the pixel point is a high-temperature dispersion type;
and when the dispersion distribution rule of the pixel point is a low-temperature dispersion distribution rule, determining that the dispersion type of the pixel point is a low-temperature dispersion type.
In another possible implementation manner, the performing a de-dispersion process on the pixel point according to the dispersion type of the pixel point to obtain a second image includes:
processing the channel value of each channel of the pixel point according to the dispersion processing coefficient corresponding to the dispersion type of the pixel point to obtain the de-dispersion channel value of each channel of the pixel point;
and modifying the channel value of each channel of the pixel point in the first image into the de-dispersion channel value to obtain the second image.
In another possible implementation manner, before performing a de-dispersion process on the pixel point according to the dispersion type of the pixel point to obtain the second image, the method further includes:
selecting a pixel point with a high-temperature dispersion type and a pixel point with a low-temperature dispersion type from a plurality of pixel points of the first image according to the dispersion types of the plurality of pixel points in the first image;
generating a first dispersion matrix according to the high-temperature dispersion type pixel points, and generating a second dispersion matrix according to the low-temperature dispersion type pixel points;
traversing the first dispersion matrix and the second dispersion matrix through a traversal window;
when the number of the high-temperature dispersion type pixel points in the traversal window in the first dispersion matrix is larger than a first preset threshold value, determining all the pixel points in the traversal window as the high-temperature dispersion type pixel points;
and when the number of the low-temperature dispersion type pixel points in the traversal window in the second dispersion matrix is larger than a second preset threshold value, determining all the pixel points in the traversal window as the low-temperature dispersion type pixel points.
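The traversal-window step above can be sketched on one binary dispersion matrix as follows; the window size, the threshold value, and the function name are assumptions, and the same routine would be run once per dispersion matrix:

```python
import numpy as np

def fill_traversal_windows(mask, win, threshold):
    """Slide a win x win traversal window over a binary dispersion
    matrix (1 = pixel of that dispersion type); wherever the count of
    flagged pixels inside the window exceeds `threshold`, mark the
    whole window as that dispersion type."""
    h, w = mask.shape
    out = mask.copy()
    for top in range(h - win + 1):
        for left in range(w - win + 1):
            # The count is taken on the original mask so that windows
            # already filled in `out` do not cascade.
            if mask[top:top + win, left:left + win].sum() > threshold:
                out[top:top + win, left:left + win] = 1
    return out
```
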
In another possible implementation manner, the processing the channel value of each channel of the pixel according to the dispersion processing coefficient corresponding to the dispersion type of the pixel to obtain the de-dispersed channel value of each channel of the pixel includes:
determining the minimum channel value in the plurality of channel values of the pixel point;
for each channel of the pixel point, determining a compensation value of the channel according to the channel value of the channel and the minimum channel value, wherein the compensation value of the channel is in positive correlation with the minimum channel value and in negative correlation with the dispersion processing coefficient;
and determining a de-dispersion channel value of the channel according to the compensation value of the channel and the channel value of the channel, wherein the de-dispersion channel value is positively correlated with the compensation value and the channel value respectively.
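The patent states only the correlations, not a concrete formula. One blend consistent with them is sketched below: the compensation value grows with the minimum channel value and shrinks as the dispersion processing coefficient grows, and the output grows with both the compensation and the original channel value. The exact expression is an assumption; a useful side effect of this choice is that an achromatic pixel (R = G = B) is left unchanged:

```python
def dedisperse_pixel(rgb, coeff):
    """De-disperse one pixel (list of three channel values).

    comp = min(rgb) / coeff, and each output channel is the blend
    (v + comp) / (1 + 1/coeff) = (coeff*v + min) / (coeff + 1),
    which pulls every channel toward the minimum channel value.
    """
    m = min(rgb)
    comp = m / coeff                      # compensation value (assumed form)
    return [round((v + comp) / (1 + 1 / coeff)) for v in rgb]
```
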
In another possible implementation manner, the acquiring the first image to be processed includes:
acquiring a third image, and determining a designated pixel point from the third image;
determining a first image with the designated pixel point as the center and the detection width as the designated detection width in the third image;
the designated pixel point is a pixel point of which the gray value is within a preset gray value range, the difference between the gray value and the gray value of the edge pixel point of the first image is smaller than a first preset threshold value, and the difference between the maximum channel value and the minimum channel value is larger than a second preset threshold value.
In the embodiment of the disclosure, a first image to be processed is obtained; for any one of a plurality of pixel points in the first image, the dispersion distribution rule of the pixel point is determined according to the channel values of the pixel point; the dispersion type of the pixel point is determined according to its dispersion distribution rule; and de-dispersion processing is then performed on each pixel point according to its dispersion type, obtaining a de-dispersed second image. By determining the channel values of the pixel points, pixel points of different dispersion types are identified and de-dispersed separately, which improves the accuracy of the dispersion processing.
FIG. 3 is a flowchart illustrating an image processing method according to an exemplary embodiment. As shown in FIG. 3, the method includes the following steps.
Step 301: the computer device obtains a first image to be processed.
The first image is an image containing dispersed pixel points. The first image may be the image to be processed, or a partial region of the image to be processed. When the first image is a partial image region of the image to be processed, the process of the computer device acquiring the first image may be realized through steps (A1)-(A2):
(A1) The computer device acquires a third image and determines designated pixel points from the third image.
The third image is the image to be processed. Accordingly, the manner in which the computer device acquires the third image may vary with the type of the computer device.
For example, when the computer device is an image capturing device with an image processing function, the process of acquiring the third image may be: the computer device shoots the current scene and obtains the third image thus captured. For example, the computer device may be an endoscope, a commonly used medical instrument composed of a bendable part, a light source and a lens. In use, the endoscope can enter the human body through a natural duct or through a small surgical incision to reach the organ to be examined, and acquires images through its lens, so that the user can directly observe changes of the relevant part of the organ. Accordingly, in the embodiments of the present disclosure, the image to be processed may be an image captured by an endoscope.
(A2) The computer device determines, in the third image, the first image with the designated pixel point as the center and the designated detection width as the detection width.
The designated pixel point is a pixel point screened out by the computer equipment according to a preset condition. The computer device can screen the designated pixel point according to the gray value of the pixel point.
In a possible implementation manner, the computer device may select a pixel point, as the designated pixel point, where the gray value is within a preset gray value range, the difference between the gray value and the gray value of the edge pixel point of the first image is smaller than a first preset threshold, and the difference between the maximum channel value and the minimum channel value is larger than a second preset threshold.
The preset gray value range may be set and changed as needed, and in the embodiment of the present disclosure, the preset range is not specifically limited. For example, the gray value range may be th_min < I_{i,j} < th_max, wherein I_{i,j} represents the gray value of pixel point a_{i,j}, and th_max and th_min are respectively the upper limit and the lower limit of the preset gray value range.
In this implementation manner, the computer device screens out the pixel points with the gray values within the preset gray value range from the first image according to the gray value of each pixel point, and then determines the dispersion type of the pixel points. By screening the pixel points with the gray values within the preset range, the range of the pixel points to be determined when the computer equipment determines the dispersion type of the pixel points is reduced, and the image processing efficiency is improved.
In another possible implementation manner, the computer device may further select, as the designated pixel, a pixel in which a difference between a gray value of the pixel and a gray value of an edge pixel of the first image is smaller than a first preset threshold and a difference between a maximum channel value and a minimum channel value is larger than a second preset threshold. That is, the computer device needs to determine whether the gray value of the pixel point satisfies the following condition:
I_{i,j} - I_{i,j-N} > th_start_end, I_{i,j} - I_{i,j+N} > th_start_end, I_{i,j} - I_{i-N,j} > th_start_end, I_{i,j} - I_{i+N,j} > th_start_end, I_{i,j} - I_{i-N,j+N} > th_start_end, I_{i,j} - I_{i+N,j-N} > th_start_end, I_{i,j} - I_{i-N,j-N} > th_start_end, I_{i,j} - I_{i+N,j+N} > th_start_end. Wherein, I_{i,j} represents the gray value of pixel point a_{i,j}; I_{i,j-N}, I_{i,j+N}, I_{i-N,j}, I_{i+N,j}, I_{i-N,j+N}, I_{i+N,j-N}, I_{i-N,j-N}, I_{i+N,j+N} respectively represent the gray values of the pixel points at distance N from a_{i,j} in the different designated directions; and th_start_end is the first preset threshold, which may be set and changed according to requirements. Correspondingly, when a pixel point meets all of the above conditions, it is taken as a designated pixel point.
In this implementation manner, the computer device screens out the pixel points meeting the screening condition from the first image according to the gray value of each pixel point, and then determines the dispersion type of the pixel points. By screening the pixel points with the gray values within the preset range, the range of the pixel points to be determined when the computer equipment determines the dispersion type of the pixel points is reduced, and the image processing efficiency is improved.
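The two gray-value conditions described above (the preset range test and the eight-direction difference test, following the displayed inequalities) can be sketched together; the function name and the list-of-lists image representation are assumptions:

```python
def is_designated_pixel(gray, i, j, n, th_min, th_max, th_start_end):
    """Check whether pixel (i, j) qualifies as a designated pixel:
    its gray value lies inside the preset range (th_min, th_max), and
    it exceeds the gray value of each of the eight pixels at distance
    n by more than th_start_end."""
    g = gray[i][j]
    if not (th_min < g < th_max):
        return False
    # Offsets of the eight compared pixels, per the inequalities above.
    offsets = [(0, -n), (0, n), (-n, 0), (n, 0),
               (-n, n), (n, -n), (-n, -n), (n, n)]
    return all(g - gray[i + di][j + dj] > th_start_end
               for di, dj in offsets)
```
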
After determining that the gray value of the pixel point meets the preset condition, the computer device may further determine the difference between the maximum channel value and the minimum channel value of the pixel point, screen out the pixel points of which this difference is larger than the second preset threshold, and take those pixel points as the designated pixel points.
By screening out the possibly dispersed pixel points from the third image through the preset conditions and then carrying out subsequent processing only on them, processing every pixel point in the image is avoided; since the possibly dispersed pixel points are screened out in advance, the efficiency of image processing is further improved.
It should also be noted that the computer device may instead first determine the channel value of each channel of the pixel points, select as candidate pixel points, according to those channel values, the pixel points whose difference between the maximum channel value and the minimum channel value is larger than the second preset threshold, and then select from the candidate pixel points, as the designated pixel points, those whose gray-value difference from the edge pixel points of the first image is smaller than the first preset threshold.
Another point to be noted is that the computer device may screen out the designated pixel points according to either of the two screening manners alone, or by combining the two. When combining them, the screening may be performed in either order: the computer device may first screen out part of the pixel points according to the first implementation manner and then screen out the designated pixel points from that part according to the second implementation manner; or it may first screen out part of the pixel points according to the second implementation manner and then screen out the designated pixel points from that part according to the first implementation manner.
After the computer device screens out a plurality of designated pixel points in the third image, a plurality of first images are determined by taking each designated pixel point as the center and the designated detection width as the detection width.
The detection width refers to the width from the center of the first image to the edge of the detection range corresponding to the first image. The detection width may be set and changed as needed, and in the embodiment of the present disclosure, the value of the detection width is not particularly limited. For example, the detection width may be 4, 10, 20, or 50, etc. Referring to fig. 4, when the detection width is n, the detected pixel point ranges in the eight directions are respectively a_{i,j}-a_{i,j-n}, a_{i,j}-a_{i,j+n}, a_{i,j}-a_{i-n,j}, a_{i,j}-a_{i+n,j}, a_{i,j}-a_{i-n,j-n}, a_{i,j}-a_{i-n,j+n}, a_{i,j}-a_{i+n,j-n}, a_{i,j}-a_{i+n,j+n}. Wherein, a_{i,j} represents the current designated pixel point, i and j represent the position coordinates of the pixel point, and n represents the detection width.
Step 302: for any one of a plurality of pixel points in the first image, the computer device determines a plurality of pixel points in the designated direction of the pixel point.
In this embodiment, the computer device may process every pixel point in the first image, or may select part of the pixel points from the first image for processing. Correspondingly, the plurality of pixel points may be all the pixel points in the first image, or part of the pixel points in the first image.
The designated direction refers to any one of a plurality of designated directions of the pixel point. In this step, the computer device needs to determine any one pixel point of the plurality of pixel points in the first image, and then determine a plurality of pixel points in each of the plurality of designated directions of that pixel point. The plurality of designated directions may be set and changed as needed, and in the embodiment of the present disclosure, the number and the directions of the plurality of designated directions are not particularly limited. For example, as shown in fig. 4, there may be 8 designated directions; taking the current pixel point a_{i,j} as an example, if the upward direction in fig. 4 is taken as north, the eight directions are respectively the west, east, north, south, northeast, northwest, southwest and southeast directions with the pixel point a_{i,j} as the origin.
For each of the plurality of designated directions, the computer device needs to determine a plurality of pixel points in the designated direction, and the plurality of pixel points may be a plurality of pixel points selected in the designated direction of the pixel point according to the detection width.
Step 303: the computer equipment determines the difference value between the channel value of the G channel and the channel value of the B channel of the pixel point according to the channel value of the G channel, the channel value of the B channel and the channel value of the R channel of the pixel point to obtain a first difference value, and determines the difference value between the channel value of the G channel and the channel value of the R channel of the pixel point to obtain a second difference value.
In this step, the computer device first determines the channel value of each channel of the pixel point, obtaining the channel value of the G channel, the channel value of the B channel and the channel value of the R channel of the pixel point. The computer device determines, according to the channel values of the pixel point, a first difference between the channel value of the G channel and the channel value of the B channel, which may be |a_{i,j}(G) - a_{i,j}(B)|, wherein a_{i,j}(G) represents the channel value of the G channel of pixel point a_{i,j} and a_{i,j}(B) represents the channel value of the B channel of pixel point a_{i,j}. The computer device also determines a second difference between the channel value of the G channel and the channel value of the R channel, which may be |a_{i,j}(G) - a_{i,j}(R)|, wherein a_{i,j}(R) represents the channel value of the R channel of pixel point a_{i,j}.
It should be noted that, in order to reduce the erroneous judgment, the computer device may further perform screening again to process the screened pixel points. In one possible implementation, the computer device selects a pixel point having a first difference greater than a first preset value.
The first preset value may be set and modified as needed, and is not specifically limited in the embodiment of the present disclosure. That is, the computer device may select the pixel points meeting the condition a_{i,j}(G) - a_{i,j}(B) > gb_diff, wherein a_{i,j}(G) represents the channel value of the G channel, a_{i,j}(B) represents the channel value of the B channel, and gb_diff represents the first preset value.
In the implementation mode, the computer equipment further screens the pixel points, so that the misjudgment is further reduced, and the accuracy of the dispersion removal processing is improved.
In another possible implementation manner, the computer device selects a pixel point whose R channel value is not the minimum channel value among the channel values of the pixel point.
In this step, the computer device eliminates the interference of the R channels of other pixel points through the condition a_{i,j}(R) ≠ min(a_{i,j}(R), a_{i,j}(G), a_{i,j}(B)), wherein a_{i,j}(R) represents the channel value of the R channel, and min(a_{i,j}(R), a_{i,j}(G), a_{i,j}(B)) represents the smallest channel value of the three channels.
In the implementation mode, the computer equipment further screens the pixel points, so that the misjudgment is further reduced, and the accuracy of the dispersion removal processing is improved.
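The two re-screening checks above can be combined in a small predicate; the function name and the boolean return convention are assumptions:

```python
def passes_rescreening(r, g, b, gb_diff):
    """Apply both additional screening checks described above: the
    G-B gap must exceed gb_diff, and the R channel must not be the
    smallest of the three channel values."""
    return (g - b > gb_diff) and (r != min(r, g, b))
```
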
Step 304: and when the first difference is smaller than the second difference, the computer equipment determines the dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the B channel of the pixel point.
When the first difference is smaller than the second difference, the probability that the current pixel point is a high-temperature dispersion pixel point is high. The computer device respectively determines the dispersion distribution rule of the G channel and the dispersion distribution rule of the B channel, and determines the dispersion distribution rule of the pixel point according to the two. In this step, the computer device determines the dispersion distribution rules of the pixel point in a plurality of designated directions respectively.
The process can be realized by the following steps (1) to (2), including:
(1) for any one of a plurality of designated directions, the computer device determines a plurality of pixel points in the designated direction of the pixel point.
In this step, the computer device determines a plurality of designated directions of the pixel point, and then determines, according to the detection width, a plurality of pixel points within the detection width in each of the plurality of designated directions.
The detection width is a detection range for each pixel point, and the detection width can be set and changed as required. For example, the detection width may be 4, 10, 20, or 50, etc.
With continued reference to fig. 4, when the detection width is n, the detected pixel point ranges in the eight directions are respectively a_{i,j}-a_{i,j-n}, a_{i,j}-a_{i,j+n}, a_{i,j}-a_{i-n,j}, a_{i,j}-a_{i+n,j}, a_{i,j}-a_{i-n,j-n}, a_{i,j}-a_{i-n,j+n}, a_{i,j}-a_{i+n,j-n}, a_{i,j}-a_{i+n,j+n}. Wherein, a_{i,j} represents the current pixel point, i and j represent the position coordinates of the pixel point, and n represents the detection width.
(2) When the channel value of the G channel of the pixel point is larger than the channel values of the G channels of all the pixel points in the appointed direction, and the channel value of the B channel of the pixel point is larger than the channel values of the B channels of all the pixel points in the appointed direction, the computer device determines that the dispersion distribution rule of the pixel point is a high-temperature dispersion distribution rule.
And for the G channel of the pixel point, detecting the channel value of the G channel of the pixel point by taking the pixel point as a starting point, and judging whether the channel value is greater than the channel values of the G channels of other pixels in any specified direction. When the channel values of the G channels of the pixel points in the designated direction are all larger than the channel values of the G channels of other pixel points in the designated direction, marking the G channels of the pixel points as first numerical values in the designated direction; and when the channel values of the G channels of the pixel points in the designated direction are not all larger than the channel values of the G channels of other pixel points in the designated direction, marking the G channels of the pixel points as second numerical values in the designated direction. In the embodiment of the present disclosure, the identification manners of the first numerical value and the second numerical value are not particularly limited. For example, when the channel value of the G channel of the pixel in any given direction is greater than the channel values of the G channels of the other pixels in the given direction, the given direction of the G channel of the pixel may be marked as 1, and when the channel value of the G channel of the pixel in any given direction is not greater than the channel values of the G channels of the other pixels in the given direction, the given direction of the G channel of the pixel may be marked as 0. Accordingly, the computer device may determine the tag value of the G channel of the pixel point in the following manner.
If a_{i,j}(G) > a_{i,j-n}(G) for n = 1, 2, ..., N, the horizontal West direction of the G channel of the pixel point satisfies the dispersion distribution rule, marked as G_West = 1, wherein G_West represents the horizontal West direction of the G channel of the pixel point.
If a_{i,j}(G) > a_{i,j+n}(G) for n = 1, 2, ..., N, the horizontal East direction of the G channel of the pixel point satisfies the dispersion distribution rule, marked as G_East = 1, wherein G_East represents the horizontal East direction of the G channel of the pixel point.
If a_{i,j}(G) > a_{i-n,j}(G) for n = 1, 2, ..., N, the vertical North direction of the G channel of the pixel point satisfies the dispersion distribution rule, marked as G_North = 1, wherein G_North represents the vertical North direction of the G channel of the pixel point.
If a_{i,j}(G) > a_{i+n,j}(G) for n = 1, 2, ..., N, the vertical South direction of the G channel of the pixel point satisfies the dispersion distribution rule, marked as G_South = 1, wherein G_South represents the vertical South direction of the G channel of the pixel point.
If a_{i,j}(G) > a_{i-n,j+n}(G) for n = 1, 2, ..., N, the northeast direction of the G channel of the pixel point satisfies the dispersion distribution rule, marked as G_NE = 1, wherein G_NE represents the northeast direction of the G channel of the pixel point.
If a_{i,j}(G) > a_{i+n,j-n}(G) for n = 1, 2, ..., N, the southwest direction of the G channel of the pixel point satisfies the dispersion distribution rule, marked as G_SW = 1, wherein G_SW represents the southwest direction of the G channel of the pixel point.
If a_{i,j}(G) > a_{i-n,j-n}(G) for n = 1, 2, ..., N, the northwest direction of the G channel of the pixel point satisfies the dispersion distribution rule, marked as G_NW = 1, wherein G_NW represents the northwest direction of the G channel of the pixel point.
If a_{i,j}(G) > a_{i+n,j+n}(G) for n = 1, 2, ..., N, the southeast direction of the G channel of the pixel point satisfies the dispersion distribution rule, marked as G_SE = 1, wherein G_SE represents the southeast direction of the G channel of the pixel point.
Wherein, a_{i,j}(G) represents the channel value of the G channel of pixel point a_{i,j}; a_{i,j-n}(G), n = 1, 2, ..., N, represent the channel values of the G channels of the N pixel points in the horizontal West direction; a_{i,j+n}(G) represent those of the N pixel points in the horizontal East direction; a_{i-n,j}(G) represent those in the vertical North direction; a_{i+n,j}(G) represent those in the vertical South direction; a_{i-n,j+n}(G) represent those in the northeast direction; a_{i+n,j-n}(G) represent those in the southwest direction; a_{i-n,j-n}(G) represent those in the northwest direction; and a_{i+n,j+n}(G) represent those in the southeast direction.
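The eight direction marks above can be computed for any single channel plane with one generic routine; the dictionary keys and the list-of-lists channel representation are assumptions for illustration:

```python
def direction_flags(chan, i, j, n):
    """Compute the eight direction marks for one channel plane: 1 when
    the centre's channel value strictly exceeds those of all n pixel
    points in that designated direction, else 0 (mirroring the marks
    G_West, G_East, ..., G_SE described above)."""
    steps = {'West': (0, -1), 'East': (0, 1),
             'North': (-1, 0), 'South': (1, 0),
             'NE': (-1, 1), 'SW': (1, -1),
             'NW': (-1, -1), 'SE': (1, 1)}
    c = chan[i][j]
    return {name: int(all(c > chan[i + k * di][j + k * dj]
                          for k in range(1, n + 1)))
            for name, (di, dj) in steps.items()}
```
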
For the B channel of the pixel point, the channel value of the B channel is detected with the pixel point as the starting point, and it is judged whether this channel value is greater than the channel values of the B channels of the other pixel points in any designated direction. When the channel value of the B channel of the pixel point is greater than the channel values of the B channels of all the other pixel points in a designated direction, the B channel of the pixel point is marked with the first numerical value in that designated direction; when it is not greater than all of them, the B channel of the pixel point is marked with the second numerical value in that designated direction. For example, when the channel value of the B channel of the pixel point in any one designated direction is greater than the channel values of the B channels of the other pixel points in that direction, the designated direction of the B channel of the pixel point may be marked as 1; otherwise, it may be marked as 0. Correspondingly, the computer device may determine the mark value of the B channel of the pixel point in the following manner.
If a_{i,j}(B) > a_{i,j-n}(B), n = 1, 2, ..., N, the horizontal westward direction of the B channel of the pixel point satisfies the dispersion distribution rule, which is marked as B_West = 1, where B_West represents the horizontal westward direction of the B channel of the pixel point.
If a_{i,j}(B) > a_{i,j+n}(B), n = 1, 2, ..., N, the horizontal eastward direction of the B channel of the pixel point satisfies the dispersion distribution rule, which is marked as B_East = 1, where B_East represents the horizontal eastward direction of the B channel of the pixel point.
If a_{i,j}(B) > a_{i-n,j}(B), n = 1, 2, ..., N, the vertical northward direction of the B channel of the pixel point satisfies the dispersion distribution rule, which is marked as B_North = 1, where B_North represents the vertical northward direction of the B channel of the pixel point.
If a_{i,j}(B) > a_{i+n,j}(B), n = 1, 2, ..., N, the vertical southward direction of the B channel of the pixel point satisfies the dispersion distribution rule, which is marked as B_South = 1, where B_South represents the vertical southward direction of the B channel of the pixel point.
If a_{i,j}(B) > a_{i-n,j+n}(B), n = 1, 2, ..., N, the northeast direction of the B channel of the pixel point satisfies the dispersion distribution rule, which is marked as B_NE = 1, where B_NE represents the northeast direction of the B channel of the pixel point.
If a_{i,j}(B) > a_{i+n,j-n}(B), n = 1, 2, ..., N, the southwest direction of the B channel of the pixel point satisfies the dispersion distribution rule, which is marked as B_SW = 1, where B_SW represents the southwest direction of the B channel of the pixel point.
If a_{i,j}(B) > a_{i-n,j-n}(B), n = 1, 2, ..., N, the northwest direction of the B channel of the pixel point satisfies the dispersion distribution rule, which is marked as B_NW = 1, where B_NW represents the northwest direction of the B channel of the pixel point.
If a_{i,j}(B) > a_{i+n,j+n}(B), n = 1, 2, ..., N, the southeast direction of the B channel of the pixel point satisfies the dispersion distribution rule, which is marked as B_SE = 1, where B_SE represents the southeast direction of the B channel of the pixel point.
Wherein, a_{i,j}(B) represents the channel value of the B channel of pixel point a_{i,j}; a_{i,j-n}(B), n = 1, 2, ..., N, represents the channel values of the B channel of the N pixel points in the horizontal westward direction; a_{i,j+n}(B), n = 1, 2, ..., N, represents the channel values of the B channel of the N pixel points in the horizontal eastward direction; a_{i-n,j}(B), n = 1, 2, ..., N, represents the channel values of the B channel of the N pixel points in the vertical northward direction; a_{i+n,j}(B), n = 1, 2, ..., N, represents the channel values of the B channel of the N pixel points in the vertical southward direction; a_{i-n,j+n}(B), n = 1, 2, ..., N, represents the channel values of the B channel of the N pixel points in the northeast direction; a_{i+n,j-n}(B), n = 1, 2, ..., N, represents the channel values of the B channel of the N pixel points in the southwest direction; a_{i-n,j-n}(B), n = 1, 2, ..., N, represents the channel values of the B channel of the N pixel points in the northwest direction; and a_{i+n,j+n}(B), n = 1, 2, ..., N, represents the channel values of the B channel of the N pixel points in the southeast direction.
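The per-direction check described above (a channel value strictly exceeding the channel values of the N pixel points along a designated direction) can be sketched as follows. This is an illustrative sketch rather than the patent's implementation: the names DIRECTIONS, channel_dominates and direction_flags are assumptions, and the direction convention assumes row index i grows southward and column index j grows eastward, matching the subscripts above; the treatment of pixels near the image border is also an added assumption.

```python
import numpy as np

# The eight designated directions as (row step, column step), assuming
# row index i grows southward and column index j grows eastward.
DIRECTIONS = {
    "West":  (0, -1), "East":  (0, 1),
    "North": (-1, 0), "South": (1, 0),
    "NE": (-1, 1), "SW": (1, -1),
    "NW": (-1, -1), "SE": (1, 1),
}

def channel_dominates(channel, i, j, di, dj, n_steps):
    """True when channel[i, j] is strictly greater than the channel values
    of the N pixel points along the (di, dj) direction."""
    h, w = channel.shape
    for n in range(1, n_steps + 1):
        r, c = i + n * di, j + n * dj
        if not (0 <= r < h and 0 <= c < w):
            # Direction runs off the image; treated here as not satisfied
            # (an assumption, the patent does not specify the border case).
            return False
        if channel[i, j] <= channel[r, c]:
            return False
    return True

def direction_flags(channel, i, j, n_steps):
    """Mark value (1 or 0) of each designated direction for one channel,
    e.g. the B_West, B_East, ... marks for the B channel."""
    return {name: int(channel_dominates(channel, i, j, di, dj, n_steps))
            for name, (di, dj) in DIRECTIONS.items()}
```

For example, direction_flags(b_channel, i, j, N) yields a dictionary of the eight B-channel mark values for pixel point a_{i,j}.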
It should be noted that the computer device may determine the dispersion distribution rule of the G channel of the pixel point first and then that of the B channel; it may determine the dispersion distribution rule of the B channel first and then that of the G channel; or it may determine the two at the same time. The embodiment of the present disclosure does not specifically limit the order in which the computer device determines the dispersion distribution rules of the G channel and the B channel of the pixel point.
After the computer device determines the dispersion distribution rules of the G channel and the B channel of the pixel point, it determines whether the dispersion distribution rule of the pixel point is the high-temperature dispersion distribution rule according to the dispersion distribution rules of the G channel and the B channel in the same designated direction. When the channel value of the G channel and the channel value of the B channel of the pixel point both satisfy the dispersion distribution rule in the same designated direction, the dispersion distribution rule of the pixel point is determined to be the high-temperature dispersion distribution rule. That is, when the mark values of the G channel and the B channel of the pixel point in the designated direction are both the first numerical value, the pixel point is determined to satisfy the high-temperature dispersion distribution rule in the designated direction.
For example, if G_West = 1 and B_West = 1, it is determined that the horizontal westward direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule.
If G_East = 1 and B_East = 1, it is determined that the horizontal eastward direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule.
If G_North = 1 and B_North = 1, it is determined that the vertical northward direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule.
If G_South = 1 and B_South = 1, it is determined that the vertical southward direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule.
If G_NW = 1 and B_NW = 1, it is determined that the northwest direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule.
If G_SE = 1 and B_SE = 1, it is determined that the southeast direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule.
If G_NE = 1 and B_NE = 1, it is determined that the northeast direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule.
If G_SW = 1 and B_SW = 1, it is determined that the southwest direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule.
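The per-direction combination of the two channels' mark values described above can be sketched as follows; the function name combine_rules is an illustrative assumption, not from the patent.

```python
def combine_rules(g_flags, other_flags):
    """Combine per-direction mark values of the G channel with those of a
    second channel (B for the high-temperature rule, R for the low-temperature
    rule): a direction satisfies the rule only when both marks are 1."""
    return {d: int(g_flags[d] == 1 and other_flags[d] == 1) for d in g_flags}
```

For example, combine_rules({"West": 1, "East": 0}, {"West": 1, "East": 1}) returns {"West": 1, "East": 0}: only the westward direction, where both channels are marked 1, satisfies the high-temperature dispersion distribution rule.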
In this implementation, the computer device determines the dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the B channel of the pixel point, so that it can determine the pixel points satisfying the high-temperature dispersion distribution rule from the plurality of pixel points, determine the pixel points of the high-temperature dispersion type from the channel values of the determined pixel points, and perform dispersion removal processing on the pixel points of the high-temperature dispersion type, which improves the accuracy of the dispersion processing.
After the computer device determines the dispersion distribution rules of the pixel point in the plurality of designated directions, it determines the dispersion type of the pixel point according to those rules. In a possible implementation manner, when the dispersion distribution rule in at least one of the plurality of designated directions of the pixel point is the high-temperature dispersion distribution rule, the dispersion type of the pixel point is determined to be the high-temperature dispersion type. In another possible implementation manner, when the dispersion distribution rules in all of the plurality of designated directions of the pixel point are the high-temperature dispersion distribution rule, the dispersion type of the pixel point is determined to be the high-temperature dispersion type.
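The two possible implementation manners above (at least one designated direction versus all designated directions satisfying the rule) can be sketched as follows; the function name dispersion_type and the require_all toggle are illustrative assumptions.

```python
def dispersion_type(direction_rules, require_all=False):
    """Decide the dispersion type of a pixel point from its per-direction
    dispersion distribution rules, e.g. {"West": 1, "East": 0, ...}.

    require_all=False sketches the first possible implementation (at least
    one designated direction satisfies the high-temperature rule);
    require_all=True sketches the second (all designated directions must)."""
    satisfied = [bool(v) for v in direction_rules.values()]
    hit = all(satisfied) if require_all else any(satisfied)
    return "high_temperature" if hit else "none"
```

The same sketch applies to the low-temperature case later in the text by substituting the low-temperature rules and label.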
In this implementation, the computer device determines the dispersion type of a pixel point satisfying the high-temperature dispersion distribution rule as the high-temperature dispersion type, and then performs dispersion removal processing on the pixel point according to its dispersion type, so that dispersion of different dispersion types is removed separately, which improves the accuracy of the dispersion processing.
In addition, after the computer device determines the dispersion distribution rule of the pixel point, it may determine the dispersion type of the pixel point according to the dispersion distribution rule. In a possible implementation manner, the computer device may directly determine the dispersion type of the pixel point as the dispersion type corresponding to its dispersion distribution rule.
In another possible implementation manner, after the computer device determines the dispersion distribution rule of the pixel point, the dispersion distribution rule may also be determined as the dispersion distribution rule of the pixel points in the designated direction corresponding to that rule.
For example, if G_West = 1 and B_West = 1, it is determined that the horizontal westward direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i,j-N} is the high-temperature dispersion distribution rule.
If G_East = 1 and B_East = 1, it is determined that the horizontal eastward direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i,j+N} is the high-temperature dispersion distribution rule.
If G_North = 1 and B_North = 1, it is determined that the vertical northward direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i-N,j} is the high-temperature dispersion distribution rule.
If G_South = 1 and B_South = 1, it is determined that the vertical southward direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i+N,j} is the high-temperature dispersion distribution rule.
If G_NW = 1 and B_NW = 1, it is determined that the northwest direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i-N,j-N} is the high-temperature dispersion distribution rule.
If G_SE = 1 and B_SE = 1, it is determined that the southeast direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i+N,j+N} is the high-temperature dispersion distribution rule.
If G_NE = 1 and B_NE = 1, it is determined that the northeast direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i-N,j+N} is the high-temperature dispersion distribution rule.
If G_SW = 1 and B_SW = 1, it is determined that the southwest direction of pixel point a_{i,j} satisfies the high-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i+N,j-N} is the high-temperature dispersion distribution rule.
In this implementation, the dispersion distribution rule of the pixel point is determined from the channel value of the G channel and the channel value of the B channel of the pixel point, so that the computer device can determine the dispersion distribution rule of the pixel point according to the channel values of different channels, which facilitates subsequently determining the dispersion type of the pixel point according to different dispersion distribution rules and then performing dispersion removal processing on the pixel point according to its dispersion type, improving the accuracy of the dispersion processing.
Another point to be described is that, in the embodiment of the present disclosure, the computer device may mark the pixel points satisfying the high-temperature dispersion distribution rule in a high-temperature dispersion matrix. Correspondingly, the computer device determines, according to the dispersion distribution rule of each pixel point, whether the pixel point is a high-temperature dispersion pixel point, and marks it in the high-temperature dispersion matrix. In another possible implementation manner, when the computer device determines, according to the dispersion distribution rules of the G channel and the B channel of the pixel point, that the plurality of pixel points in a designated direction starting from the pixel point satisfy the dispersion distribution rule, the plurality of pixel points are marked in the high-temperature dispersion matrix.
When pixel points a_{i,j} to a_{i,j-N} satisfy the high-temperature dispersion distribution rule, let h_mask_{i,j-n} = 1, n = 0, 1, 2, ..., N, where h_mask denotes the high-temperature dispersion matrix and h_mask_{i,j-n} = 1, n = 0, 1, 2, ..., N, denotes that the points from h_mask_{i,j} to h_mask_{i,j-N} in the high-temperature dispersion matrix are marked as 1. When pixel points a_{i,j} to a_{i,j+N} satisfy the high-temperature dispersion distribution rule, let h_mask_{i,j+n} = 1, n = 0, 1, 2, ..., N, that is, the points from h_mask_{i,j} to h_mask_{i,j+N} in the high-temperature dispersion matrix are marked as 1. When pixel points a_{i,j} to a_{i-N,j} satisfy the high-temperature dispersion distribution rule, let h_mask_{i-n,j} = 1, n = 0, 1, 2, ..., N, that is, the points from h_mask_{i,j} to h_mask_{i-N,j} in the high-temperature dispersion matrix are marked as 1. When pixel points a_{i,j} to a_{i+N,j} satisfy the high-temperature dispersion distribution rule, let h_mask_{i+n,j} = 1, n = 0, 1, 2, ..., N, that is, the points from h_mask_{i,j} to h_mask_{i+N,j} in the high-temperature dispersion matrix are marked as 1.
When pixel points a_{i,j} to a_{i-N,j+N} satisfy the high-temperature dispersion distribution rule, let h_mask_{i-n,j+n} = 1, n = 0, 1, 2, ..., N, that is, the points from h_mask_{i,j} to h_mask_{i-N,j+N} in the high-temperature dispersion matrix are marked as 1. When pixel points a_{i,j} to a_{i+N,j-N} satisfy the high-temperature dispersion distribution rule, let h_mask_{i+n,j-n} = 1, n = 0, 1, 2, ..., N, that is, the points from h_mask_{i,j} to h_mask_{i+N,j-N} in the high-temperature dispersion matrix are marked as 1. When pixel points a_{i,j} to a_{i-N,j-N} satisfy the high-temperature dispersion distribution rule, let h_mask_{i-n,j-n} = 1, n = 0, 1, 2, ..., N, that is, the points from h_mask_{i,j} to h_mask_{i-N,j-N} in the high-temperature dispersion matrix are marked as 1. When pixel points a_{i,j} to a_{i+N,j+N} satisfy the high-temperature dispersion distribution rule, let h_mask_{i+n,j+n} = 1, n = 0, 1, 2, ..., N, that is, the points from h_mask_{i,j} to h_mask_{i+N,j+N} in the high-temperature dispersion matrix are marked as 1.
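The marking of the high-temperature dispersion matrix h_mask described above can be sketched as follows. The function name mark_direction is an illustrative assumption, and the bounds check near the image border is an added assumption the patent does not spell out.

```python
import numpy as np

def mark_direction(h_mask, i, j, di, dj, n_steps):
    """Set h_mask_{i + n*di, j + n*dj} = 1 for n = 0, 1, ..., N when the run
    of pixel points starting at a_{i,j} along direction (di, dj) satisfies
    the high-temperature dispersion distribution rule."""
    h, w = h_mask.shape
    for n in range(0, n_steps + 1):
        r, c = i + n * di, j + n * dj
        if 0 <= r < h and 0 <= c < w:  # skip points outside the image
            h_mask[r, c] = 1
    return h_mask
```

For example, mark_direction(h_mask, i, j, 0, -1, N) marks the points from h_mask_{i,j} to h_mask_{i,j-N} (the horizontal westward run) as 1; the same low-temperature matrix described later can be filled the same way.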
Step 305: and when the first difference is larger than the second difference, the computer equipment determines the dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the R channel of the pixel point.
When the first difference is greater than the second difference, the probability that the current pixel point is a low-temperature dispersion pixel point is high. The computer device respectively determines the dispersion distribution rule of the G channel and the dispersion distribution rule of the R channel of the pixel point, and then determines the dispersion distribution rule of the pixel point according to the two. In this step, the computer device determines the dispersion distribution rules of the pixel point in a plurality of designated directions respectively.
The process can be realized by the following steps (1) to (2), including:
(1) for any one of a plurality of designated directions, the computer device determines a plurality of pixel points in the designated direction of the pixel point.
This step is similar to step (1) in step 304, and is not described herein again.
(2) When the channel value of the G channel of the pixel point is greater than the channel values of the G channels of the pixel points in the designated direction, and the channel value of the R channel of the pixel point is greater than the channel values of the R channels of the pixel points in the designated direction, the computer device determines that the dispersion distribution rule of the pixel point is the low-temperature dispersion distribution rule.
In this step, the method for determining the dispersion distribution rule of the G channel of the pixel point is similar to the method for determining the dispersion distribution rule of the G channel of the pixel point by the computer device in step (2) of step 304, and details are not repeated here.
For the R channel of the pixel point, with the pixel point as a starting point, the computer device detects the channel value of the R channel of the pixel point and judges whether it is greater than the channel values of the R channels of the other pixel points in any designated direction. When the channel value of the R channel of the pixel point is greater than the channel values of the R channels of all the other pixel points in the designated direction, the R channel of the pixel point is marked as a first numerical value in the designated direction; when the channel value of the R channel of the pixel point is not greater than the channel values of the R channels of all the other pixel points in the designated direction, the R channel of the pixel point is marked as a second numerical value in the designated direction. For example, when the channel value of the R channel of the pixel point is greater than the channel values of the R channels of the other pixel points in a designated direction, the designated direction of the R channel of the pixel point may be marked as 1; when it is not greater than all of them, the designated direction of the R channel of the pixel point may be marked as 0. Accordingly, the computer device may determine the mark value of the R channel of the pixel point in the following manner.
If a_{i,j}(R) > a_{i,j-n}(R), n = 1, 2, ..., N, the horizontal westward direction of the R channel of the pixel point satisfies the dispersion distribution rule, which is marked as R_West = 1, where R_West represents the horizontal westward direction of the R channel of the pixel point.
If a_{i,j}(R) > a_{i,j+n}(R), n = 1, 2, ..., N, the horizontal eastward direction of the R channel of the pixel point satisfies the dispersion distribution rule, which is marked as R_East = 1, where R_East represents the horizontal eastward direction of the R channel of the pixel point.
If a_{i,j}(R) > a_{i-n,j}(R), n = 1, 2, ..., N, the vertical northward direction of the R channel of the pixel point satisfies the dispersion distribution rule, which is marked as R_North = 1, where R_North represents the vertical northward direction of the R channel of the pixel point.
If a_{i,j}(R) > a_{i+n,j}(R), n = 1, 2, ..., N, the vertical southward direction of the R channel of the pixel point satisfies the dispersion distribution rule, which is marked as R_South = 1, where R_South represents the vertical southward direction of the R channel of the pixel point.
If a_{i,j}(R) > a_{i-n,j+n}(R), n = 1, 2, ..., N, the northeast direction of the R channel of the pixel point satisfies the dispersion distribution rule, which is marked as R_NE = 1, where R_NE represents the northeast direction of the R channel of the pixel point.
If a_{i,j}(R) > a_{i+n,j-n}(R), n = 1, 2, ..., N, the southwest direction of the R channel of the pixel point satisfies the dispersion distribution rule, which is marked as R_SW = 1, where R_SW represents the southwest direction of the R channel of the pixel point.
If a_{i,j}(R) > a_{i-n,j-n}(R), n = 1, 2, ..., N, the northwest direction of the R channel of the pixel point satisfies the dispersion distribution rule, which is marked as R_NW = 1, where R_NW represents the northwest direction of the R channel of the pixel point.
If a_{i,j}(R) > a_{i+n,j+n}(R), n = 1, 2, ..., N, the southeast direction of the R channel of the pixel point satisfies the dispersion distribution rule, which is marked as R_SE = 1, where R_SE represents the southeast direction of the R channel of the pixel point.
Wherein, a_{i,j}(R) represents the channel value of the R channel of pixel point a_{i,j}; a_{i,j-n}(R), n = 1, 2, ..., N, represents the channel values of the R channel of the N pixel points in the horizontal westward direction; a_{i,j+n}(R), n = 1, 2, ..., N, represents the channel values of the R channel of the N pixel points in the horizontal eastward direction; a_{i-n,j}(R), n = 1, 2, ..., N, represents the channel values of the R channel of the N pixel points in the vertical northward direction; a_{i+n,j}(R), n = 1, 2, ..., N, represents the channel values of the R channel of the N pixel points in the vertical southward direction; a_{i-n,j+n}(R), n = 1, 2, ..., N, represents the channel values of the R channel of the N pixel points in the northeast direction; a_{i+n,j-n}(R), n = 1, 2, ..., N, represents the channel values of the R channel of the N pixel points in the southwest direction; a_{i-n,j-n}(R), n = 1, 2, ..., N, represents the channel values of the R channel of the N pixel points in the northwest direction; and a_{i+n,j+n}(R), n = 1, 2, ..., N, represents the channel values of the R channel of the N pixel points in the southeast direction.
In this implementation, the computer device determines the dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the R channel of the pixel point, so that it can determine the pixel points satisfying the low-temperature dispersion distribution rule from the plurality of pixel points, determine the pixel points of the low-temperature dispersion type from the channel values of the determined pixel points, and perform dispersion removal processing on the pixel points of the low-temperature dispersion type, which improves the accuracy of the dispersion processing.
It should be noted that the computer device may determine the dispersion distribution rule of the G channel of the pixel point first and then that of the R channel; it may determine the dispersion distribution rule of the R channel first and then that of the G channel; or it may determine the two at the same time. The embodiment of the present disclosure does not specifically limit the order in which the computer device determines the dispersion distribution rules of the G channel and the R channel of the pixel point.
After the computer device determines the dispersion distribution rules of the G channel and the R channel of the pixel point, it determines whether the dispersion distribution rule of the pixel point is the low-temperature dispersion distribution rule according to the dispersion distribution rules of the G channel and the R channel in the same designated direction. When the channel value of the G channel and the channel value of the R channel of the pixel point both satisfy the dispersion distribution rule in the same designated direction, the dispersion distribution rule of the pixel point is determined to be the low-temperature dispersion distribution rule. That is, when the mark values of the G channel and the R channel of the pixel point in the designated direction are both the first numerical value, the pixel point is determined to satisfy the low-temperature dispersion distribution rule in the designated direction.
For example, if G_West = 1 and R_West = 1, it is determined that the horizontal westward direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule.
If G_East = 1 and R_East = 1, it is determined that the horizontal eastward direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule.
If G_North = 1 and R_North = 1, it is determined that the vertical northward direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule.
If G_South = 1 and R_South = 1, it is determined that the vertical southward direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule.
If G_NW = 1 and R_NW = 1, it is determined that the northwest direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule.
If G_SE = 1 and R_SE = 1, it is determined that the southeast direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule.
If G_NE = 1 and R_NE = 1, it is determined that the northeast direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule.
If G_SW = 1 and R_SW = 1, it is determined that the southwest direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule.
It should be noted that, in the embodiment of the present disclosure, after determining the dispersion distribution rule of a pixel, the computer device determines the dispersion type of the pixel according to the dispersion distribution rule of the pixel, and correspondingly, when the dispersion distribution rule of the pixel is a high-temperature dispersion distribution rule, determines that the dispersion type of the pixel is a high-temperature dispersion type; and when the dispersion distribution rule of the pixel point is a low-temperature dispersion distribution rule, determining that the dispersion type of the pixel point is a low-temperature dispersion type.
After the computer device determines the dispersion distribution rules of the pixel point in the plurality of designated directions, it determines the dispersion type of the pixel point according to those rules. In a possible implementation manner, when the dispersion distribution rule in at least one of the plurality of designated directions of the pixel point is the low-temperature dispersion distribution rule, the dispersion type of the pixel point is determined to be the low-temperature dispersion type. In another possible implementation manner, when the dispersion distribution rules in all of the plurality of designated directions of the pixel point are the low-temperature dispersion distribution rule, the dispersion type of the pixel point is determined to be the low-temperature dispersion type.
In this implementation, the computer device determines the dispersion type of a pixel point satisfying the low-temperature dispersion distribution rule as the low-temperature dispersion type, and then performs dispersion removal processing on the pixel point according to its dispersion type, so that dispersion of different dispersion types is removed separately, which improves the accuracy of the dispersion processing.
In addition, after the computer device determines the dispersion distribution rule of the pixel point, it may determine the dispersion type of the pixel point according to the dispersion distribution rule. In a possible implementation manner, the computer device may directly determine the dispersion type of the pixel point as the dispersion type corresponding to its dispersion distribution rule.
In another possible implementation manner, after the computer device determines the dispersion distribution rule of the pixel point, the dispersion distribution rule may also be determined as the dispersion distribution rule of the pixel points in the designated direction corresponding to that rule.
If G_West = 1 and R_West = 1, it is determined that the horizontal westward direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i,j-N} is the low-temperature dispersion distribution rule.
If G_East = 1 and R_East = 1, it is determined that the horizontal eastward direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i,j+N} is the low-temperature dispersion distribution rule.
If G_North = 1 and R_North = 1, it is determined that the vertical northward direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i-N,j} is the low-temperature dispersion distribution rule.
If G_South = 1 and R_South = 1, it is determined that the vertical southward direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i+N,j} is the low-temperature dispersion distribution rule.
If G_NW = 1 and R_NW = 1, it is determined that the northwest direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i-N,j-N} is the low-temperature dispersion distribution rule.
If G_SE = 1 and R_SE = 1, it is determined that the southeast direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i+N,j+N} is the low-temperature dispersion distribution rule.
If G_NE = 1 and R_NE = 1, it is determined that the northeast direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i-N,j+N} is the low-temperature dispersion distribution rule.
If G_SW = 1 and R_SW = 1, it is determined that the southwest direction of pixel point a_{i,j} satisfies the low-temperature dispersion distribution rule, and the dispersion distribution rule of pixel points a_{i,j} to a_{i+N,j-N} is the low-temperature dispersion distribution rule.
In this implementation manner, the dispersion distribution rule of a pixel point is determined through the channel value of its G channel and the channel value of its R channel, so that the computer device can determine the dispersion distribution rule of the pixel point according to the channel values of different channels. This facilitates subsequently determining the dispersion type of the pixel point according to different dispersion distribution rules, and then performing de-dispersion processing on the pixel point according to its dispersion type, thereby improving the accuracy of dispersion processing.
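The per-direction check described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the function name, the assumed RGB channel order, and the neighborhood length N are all assumptions, and the diagonal offsets follow the patent's own indexing convention (e.g. northwest corresponds to a_{i-n,j+n}).

```python
import numpy as np

# Offsets (row step, column step) for the eight designated directions,
# matching the patent's indexing: West = a_{i,j-n}, NW = a_{i-n,j+n}, etc.
DIRECTIONS = {
    "West": (0, -1), "East": (0, 1), "North": (-1, 0), "South": (1, 0),
    "NW": (-1, 1), "SE": (1, -1), "NE": (-1, -1), "SW": (1, 1),
}

def low_temp_directions(img, i, j, n_len=2):
    """Return the designated directions in which pixel (i, j) satisfies the
    low-temperature rule: its G and R channel values both strictly exceed
    those of each of the next n_len pixels in that direction."""
    h, w, _ = img.shape
    r, g = int(img[i, j, 0]), int(img[i, j, 1])  # assumed RGB channel order
    hits = []
    for name, (di, dj) in DIRECTIONS.items():
        ok = True
        for n in range(1, n_len + 1):
            y, x = i + n * di, j + n * dj
            if not (0 <= y < h and 0 <= x < w):
                ok = False  # run leaves the image; treat as not satisfied
                break
            # G_dir = 1 and R_dir = 1 only if both channels strictly decrease
            if not (g > img[y, x, 1] and r > img[y, x, 0]):
                ok = False
                break
        if ok:
            hits.append(name)
    return hits
```

A bright pixel surrounded by darker neighbors would satisfy the rule in every in-bounds direction, while a pixel equal to its neighbors satisfies it in none.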
Another point to be described is that, in the embodiment of the present disclosure, just as the computer device may mark the pixel points meeting the high-temperature dispersion rule in the high-temperature dispersion matrix, the computer device correspondingly determines whether each pixel point is a low-temperature dispersion pixel point according to its dispersion distribution rule and marks it in the low-temperature dispersion matrix. In another possible implementation manner, when the computer device determines, according to the channel values of the G channel and the R channel of a pixel point, that a plurality of pixel points in a designated direction starting from that pixel point follow the low-temperature dispersion distribution rule, it marks those pixel points in the low-temperature dispersion matrix.
When the pixel points a_{i,j} to a_{i,j-N} satisfy the low-temperature dispersion distribution rule, let l_mask_{i,j-n} = 1 (n = 0, 1, …, N), where l_mask denotes the low-temperature dispersion matrix; that is, the points from l_mask_{i,j} to l_mask_{i,j-N} in the low-temperature dispersion matrix are marked as 1. When the pixel points a_{i,j} to a_{i,j+N} satisfy the low-temperature dispersion distribution rule, let l_mask_{i,j+n} = 1 (n = 0, 1, …, N); that is, the points from l_mask_{i,j} to l_mask_{i,j+N} in the low-temperature dispersion matrix are marked as 1. When the pixel points a_{i,j} to a_{i-N,j} satisfy the low-temperature dispersion distribution rule, let l_mask_{i-n,j} = 1 (n = 0, 1, …, N); that is, the points from l_mask_{i,j} to l_mask_{i-N,j} in the low-temperature dispersion matrix are marked as 1. When the pixel points a_{i,j} to a_{i+N,j} satisfy the low-temperature dispersion distribution rule, let l_mask_{i+n,j} = 1 (n = 0, 1, …, N); that is, the points from l_mask_{i,j} to l_mask_{i+N,j} in the low-temperature dispersion matrix are marked as 1.
When the pixel points a_{i,j} to a_{i-N,j+N} satisfy the low-temperature dispersion distribution rule, let l_mask_{i-n,j+n} = 1 (n = 0, 1, …, N); that is, the points from l_mask_{i,j} to l_mask_{i-N,j+N} in the low-temperature dispersion matrix are marked as 1. When the pixel points a_{i,j} to a_{i+N,j-N} satisfy the low-temperature dispersion distribution rule, let l_mask_{i+n,j-n} = 1 (n = 0, 1, …, N); that is, the points from l_mask_{i,j} to l_mask_{i+N,j-N} in the low-temperature dispersion matrix are marked as 1. When the pixel points a_{i,j} to a_{i-N,j-N} satisfy the low-temperature dispersion distribution rule, let l_mask_{i-n,j-n} = 1 (n = 0, 1, …, N); that is, the points from l_mask_{i,j} to l_mask_{i-N,j-N} in the low-temperature dispersion matrix are marked as 1. When the pixel points a_{i,j} to a_{i+N,j+N} satisfy the low-temperature dispersion distribution rule, let l_mask_{i+n,j+n} = 1 (n = 0, 1, …, N); that is, the points from l_mask_{i,j} to l_mask_{i+N,j+N} in the low-temperature dispersion matrix are marked as 1.
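The marking of a run of pixel points in the low-temperature dispersion matrix can be sketched as follows; the helper name and the out-of-bounds handling are assumptions for illustration:

```python
import numpy as np

def mark_low_temp(l_mask, i, j, di, dj, n_len):
    """Mark l_mask_{i + n*di, j + n*dj} = 1 for n = 0, 1, ..., n_len,
    i.e. the run from (i, j) along one designated direction in the
    low-temperature dispersion matrix."""
    h, w = l_mask.shape
    for n in range(n_len + 1):
        y, x = i + n * di, j + n * dj
        if 0 <= y < h and 0 <= x < w:  # skip points that fall outside the image
            l_mask[y, x] = 1
    return l_mask
```

For example, marking the horizontal west run (di = 0, dj = -1) from (i, j) sets l_mask_{i,j}, l_mask_{i,j-1}, …, l_mask_{i,j-N} to 1.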
Step 306: and the computer equipment processes the channel value of each channel of the pixel point according to the dispersion processing coefficient corresponding to the dispersion type of the pixel point to obtain the de-dispersion channel value of each channel of the pixel point.
After the computer device determines the dispersion types of a plurality of pixel points, it may directly execute this step to perform de-dispersion processing on the high-temperature dispersion type pixel points and the low-temperature dispersion type pixel points respectively. Alternatively, the computer device may determine a dispersion processing area according to the high-temperature dispersion type and low-temperature dispersion type pixel points and perform de-dispersion processing on the pixel points in that area. The process may be realized by the following steps (A1)-(A4), including:
(A1) the computer equipment generates a first dispersion matrix according to the high-temperature dispersion type pixel points and generates a second dispersion matrix according to the low-temperature dispersion type pixel points.
The computer device can generate dispersion matrixes of different dispersion types according to the dispersion types of the pixel points in the first image. It should be noted that the first dispersion matrix and the second dispersion matrix may be the same matrix, or may be two separate matrices, which is not specifically limited in the embodiments of the present disclosure. When the first dispersion matrix and the second dispersion matrix are the same matrix, the computer device may mark the high-temperature dispersion type pixel points in the matrix as 1, the low-temperature dispersion type pixel points as 2, the non-dispersion type pixel points as 0, and so on. When the first dispersion matrix and the second dispersion matrix are two matrices, the computer device may mark the high-temperature dispersion type pixel points in the first dispersion matrix as 1 and mark the low-temperature dispersion type and non-dispersion type pixel points as 0; similarly, it may mark the low-temperature dispersion type pixel points in the second dispersion matrix as 1 and mark the high-temperature dispersion type and non-dispersion type pixel points as 0.
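Step (A1) can be sketched as follows, assuming the two-matrix variant in which each dispersion type gets its own 0/1 matrix; the string label values are illustrative, not taken from the patent:

```python
import numpy as np

def build_dispersion_matrices(labels):
    """labels: 2-D array-like of per-pixel dispersion labels,
    here assumed to be 'high', 'low' or 'none'.
    Returns the first (high-temperature) and second (low-temperature)
    dispersion matrices as separate 0/1 matrices."""
    labels = np.asarray(labels)
    first = (labels == "high").astype(np.uint8)   # high-temp pixels -> 1, rest -> 0
    second = (labels == "low").astype(np.uint8)   # low-temp pixels -> 1, rest -> 0
    return first, second
```

In the single-matrix variant the same labels would instead be encoded as 1, 2 and 0 in one matrix, as described above.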
(A2) The computer device traverses through a traversal window in the first dispersion matrix and the second dispersion matrix.
The size of the traversal window may be set and changed as needed, and in the embodiment of the present disclosure, the size of the traversal window is not specifically limited.
(A3) When the number of the high-temperature dispersion type pixel points in the traversal window in the first dispersion matrix is larger than a first preset threshold value, the computer equipment determines all the pixel points in the traversal window as the high-temperature dispersion type pixel points.
The first preset threshold may be set and changed as needed, and in the embodiment of the present disclosure, the first preset threshold is not specifically limited.
(A4) And when the number of the low-temperature dispersion type pixel points in the traversal window in the second dispersion matrix is larger than a second preset threshold value, the computer equipment determines all the pixel points in the traversal window as the low-temperature dispersion type pixel points.
The second preset threshold may be the same as or different from the first preset threshold, and the second preset threshold may also be set and changed as needed, which is not specifically limited in the embodiment of the present disclosure.
In the implementation mode, the pixel points of the high-temperature dispersion type and the low-temperature dispersion type are traversed through the traversal window, and whether other pixel points in the traversal window are dispersed or not and the type of dispersion are determined according to the number and the type of the dispersion pixel points in the traversal window, so that misjudgment is reduced, and the accuracy of dispersion processing is improved.
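Steps (A2)-(A4) can be sketched as follows for one dispersion matrix (the same routine would be run on both the first and second matrices). The window size, the threshold, and the choice to test counts against the original matrix rather than the partially updated one are assumptions:

```python
import numpy as np

def densify(mask, win=3, threshold=4):
    """Slide a win x win traversal window over a 0/1 dispersion matrix;
    wherever the count of marked pixels exceeds the preset threshold,
    mark every pixel in that window as the corresponding dispersion type."""
    out = mask.copy()
    h, w = mask.shape
    for top in range(0, h - win + 1):
        for left in range(0, w - win + 1):
            # count is taken on the original mask so windows do not cascade
            if mask[top:top + win, left:left + win].sum() > threshold:
                out[top:top + win, left:left + win] = 1  # whole window marked
    return out
```

With win = 3 and threshold = 4, a window containing five marked pixels is promoted to nine marked pixels, while sparser windows are left unchanged.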
After the computer device determines the pixel points of the high-temperature dispersion type and the low-temperature dispersion type, the dispersion processing coefficients corresponding to different dispersion types can be determined according to the pixel points of the high-temperature dispersion type and the pixel points of the low-temperature dispersion type respectively, and then the pixel points are subjected to de-dispersion processing according to the dispersion processing coefficients. The process may be implemented by the following steps (B1) - (B3), including:
(B1) the computer device determines a minimum channel value of the plurality of channel values for the pixel point.
(B2) For each channel of the pixel point, the computer device determines a compensation value of the channel according to the channel value of the channel and the minimum channel value, wherein the compensation value of the channel is in positive correlation with the minimum channel value and in negative correlation with the dispersion processing coefficient.
(B3) And the computer equipment determines a de-dispersion channel value of the channel according to the compensation value of the channel and the channel value of the channel, wherein the de-dispersion channel value is positively correlated with the compensation value and the channel value respectively.
For example, the dispersion processing coefficients for the low-temperature dispersion type may be α1, α2 and α3, and the dispersion processing coefficients for the high-temperature dispersion type may be β1, β2 and β3.
In one possible implementation, the computer device determines a minimum channel value of the plurality of channel values of the pixel point; determining the difference value between each channel value and the minimum channel value, and multiplying the difference value between the channel value and the minimum channel value by a dispersion processing coefficient to obtain a compensation value of the channel; and taking the sum of the compensation value and the minimum channel value of the pixel point as the channel value of the pixel point after the channel is subjected to dispersion removal.
In the implementation mode, the computer equipment determines the compensation value of the channel of the dispersion pixel point according to the dispersion removal coefficient, the channel value and the difference value of the minimum channel value, and then performs dispersion removal processing on the pixel point, so that the respective processing of the pixel points with different dispersion types is realized, and the accuracy of the dispersion removal processing is improved.
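Steps (B1)-(B3), as concretely described in the implementation above, can be sketched as follows; the function name and the coefficient values are illustrative:

```python
def dedisperse(r, g, b, coeffs=(0.5, 0.5, 0.5)):
    """De-disperse one pixel: coeffs are the per-channel dispersion
    processing coefficients (alpha or beta values, depending on type)."""
    m = min(r, g, b)               # step (B1): minimum channel value
    out = []
    for value, alpha in zip((r, g, b), coeffs):
        comp = alpha * (value - m) # step (B2): compensation value
        out.append(comp + m)       # step (B3): de-dispersed channel value
    return tuple(out)
```

For instance, with coefficients of 0.5 a pixel (200, 100, 50) is pulled toward its minimum channel, yielding (125, 75, 50), while a neutral pixel whose three channels are equal is left unchanged.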
Correspondingly, the process of performing the de-dispersion processing on the low-temperature dispersion type pixel points by the computer device can be represented by the following formulas:

a'_{i,j}(R) = α1 × (a_{i,j}(R) − min(a_{i,j}(R, G, B))) + min(a_{i,j}(R, G, B))
a'_{i,j}(G) = α2 × (a_{i,j}(G) − min(a_{i,j}(R, G, B))) + min(a_{i,j}(R, G, B))
a'_{i,j}(B) = α3 × (a_{i,j}(B) − min(a_{i,j}(R, G, B))) + min(a_{i,j}(R, G, B))

wherein a'_{i,j}(R) represents the de-dispersed channel value of the R channel of the low-temperature dispersion type pixel point, a'_{i,j}(G) represents the de-dispersed channel value of the G channel, and a'_{i,j}(B) represents the de-dispersed channel value of the B channel; α1 represents the dispersion processing coefficient corresponding to the low-temperature dispersion of the R channel, α2 represents the dispersion processing coefficient corresponding to the low-temperature dispersion of the G channel, and α3 represents the dispersion processing coefficient corresponding to the low-temperature dispersion of the B channel; min(a_{i,j}(R, G, B)) represents the smallest channel value of the three channels of the pixel point.
In the implementation mode, the computer equipment determines the corresponding dispersion processing coefficient for the low-temperature dispersion type pixel point and performs the dispersion removing processing on the low-temperature dispersion type pixel point, so that the respective processing on the different dispersion type pixel points is realized, and the accuracy of the dispersion removing processing is improved.
The process of performing the de-dispersion processing on the high-temperature dispersion type pixel points by the computer device can be represented by the following formulas:

a'_{i,j}(R) = β1 × (a_{i,j}(R) − min(a_{i,j}(R, G, B))) + min(a_{i,j}(R, G, B))
a'_{i,j}(G) = β2 × (a_{i,j}(G) − min(a_{i,j}(R, G, B))) + min(a_{i,j}(R, G, B))
a'_{i,j}(B) = β3 × (a_{i,j}(B) − min(a_{i,j}(R, G, B))) + min(a_{i,j}(R, G, B))

wherein a'_{i,j}(R) represents the de-dispersed channel value of the R channel of the high-temperature dispersion type pixel point, a'_{i,j}(G) represents the de-dispersed channel value of the G channel, and a'_{i,j}(B) represents the de-dispersed channel value of the B channel; β1 represents the dispersion processing coefficient corresponding to the high-temperature dispersion of the R channel, β2 represents the dispersion processing coefficient corresponding to the high-temperature dispersion of the G channel, and β3 represents the dispersion processing coefficient corresponding to the high-temperature dispersion of the B channel; min(a_{i,j}(R, G, B)) represents the smallest channel value of the three channels of the pixel point.
In the implementation mode, the computer equipment determines the corresponding dispersion processing coefficient for the high-temperature dispersion type pixel points and performs the dispersion removing processing on each pixel point in the high-temperature dispersion type pixel points, so that the respective processing on the different dispersion type pixel points is realized, and the dispersion removing processing precision is improved.
The dispersion processing coefficient is a number greater than 0, and may be set as needed, and in the embodiment of the present disclosure, the dispersion processing coefficient is not specifically limited. For example, the dispersion handling coefficient may be 0.2, 0.5, 1, 1.2, 1.5, or the like.
In the embodiment of the disclosure, a first image to be processed is obtained, and for any one of a plurality of pixel points in the first image, a dispersion distribution rule of the pixel point is determined according to a channel value of the pixel point; and determining the dispersion type of the pixel point according to the dispersion distribution rule of the pixel point, and then performing dispersion removal processing on the pixel point according to the dispersion type of each pixel point to obtain a second image after dispersion removal. The method has the advantages that the pixel points of different dispersion types are determined by determining the channel values of the pixel points, and the pixel points of different dispersion types are respectively subjected to dispersion removing treatment, so that the accuracy of dispersion treatment is improved.
FIG. 5 is a block diagram illustrating an apparatus in accordance with an example embodiment. The apparatus is for performing the steps performed when performing the above method, see fig. 5, the apparatus comprising:
an obtaining module 501, configured to obtain a first image to be processed;
a first determining module 502, configured to determine, for any one of multiple pixel points in the first image, a dispersion distribution rule of the pixel point according to a channel value of the pixel point;
a second determining module 503, configured to determine a dispersion type of the pixel according to a dispersion distribution rule of the pixel;
the processing module 504 is configured to perform a de-dispersion processing on the pixel point according to the dispersion type of the pixel point, so as to obtain a second image.
In a possible implementation manner, the first determining module 502 is further configured to determine a dispersion distribution rule of the pixel point according to a channel value of a G channel and a channel value of a B channel of the pixel point; or determining the dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the R channel of the pixel point.
In another possible implementation manner, the apparatus further includes:
a third determining module, configured to determine, according to the channel value of the G channel, the channel value of the B channel, and the channel value of the R channel of the pixel point, a difference between the channel value of the G channel and the channel value of the B channel of the pixel point to obtain a first difference, and determine a difference between the channel value of the G channel and the channel value of the R channel of the pixel point to obtain a second difference;
the first determining module 502 is further configured to determine a dispersion distribution rule of the pixel point according to a channel value of a G channel and a channel value of a B channel of the pixel point when the first difference is smaller than the second difference;
the first determining module 502 is further configured to determine a dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the R channel of the pixel point when the first difference is greater than the second difference.
In another possible implementation manner, the first determining module 502 is further configured to determine, for any one of a plurality of designated directions, a plurality of pixel points in the designated direction of the pixel point; and when the channel value of the G channel of the pixel point is larger than the channel values of the G channels of all of the plurality of pixel points in the designated direction, and the channel value of the B channel of the pixel point is larger than the channel values of the B channels of all of the plurality of pixel points in the designated direction, determine that the dispersion distribution rule of the pixel point is a high-temperature dispersion distribution rule.
In another possible implementation manner, the first determining module 502 is further configured to determine, for any one of a plurality of designated directions, a plurality of pixel points in the designated direction of the pixel point; and when the channel values of the G channels of the pixel points are all larger than the channel values of the G channels of the pixel points in the appointed direction, and the channel values of the R channels of the pixel points are all larger than the channel values of the R channels of the pixel points in the appointed direction, determining that the dispersion distribution rule of the pixel points is a low-temperature dispersion distribution rule.
In another possible implementation manner, the second determining module 503 is further configured to determine that the dispersion type of the pixel is a high-temperature dispersion type when the dispersion distribution rule of the pixel is a high-temperature dispersion distribution rule; and when the dispersion distribution rule of the pixel point is a low-temperature dispersion distribution rule, determining that the dispersion type of the pixel point is a low-temperature dispersion type.
In another possible implementation manner, the processing module 504 is further configured to process the channel value of each channel of the pixel point according to the dispersion processing coefficient corresponding to the dispersion type of the pixel point, so as to obtain a de-dispersion channel value of each channel of the pixel point; and modify the channel value of each channel of the pixel point in the first image into the de-dispersion channel value to obtain the second image.
In another possible implementation manner, the apparatus further includes:
the selection module is used for selecting a pixel point with a high-temperature dispersion type and a pixel point with a low-temperature dispersion type from a plurality of pixel points of the first image according to the dispersion types of the plurality of pixel points in the first image;
the generating module is used for generating a first dispersion matrix according to the high-temperature dispersion type pixel points and generating a second dispersion matrix according to the low-temperature dispersion type pixel points;
the traversing module is used for traversing in the first dispersion matrix and the second dispersion matrix through a traversing window;
a fourth determining module, configured to determine, when the number of high-temperature dispersion-type pixel points in the traversal window in the first dispersion matrix is greater than a first preset threshold, all the pixel points in the traversal window as the high-temperature dispersion-type pixel points;
and the fifth determining module is used for determining all the pixel points in the traversal window as the pixel points of the low-temperature dispersion type when the number of the pixel points of the low-temperature dispersion type in the traversal window in the second dispersion matrix is greater than a second preset threshold value.
In another possible implementation manner, the processing module 504 is further configured to determine a minimum channel value of the multiple channel values of the pixel point; for each channel of the pixel point, determining a compensation value of the channel according to the channel value of the channel and the minimum channel value, wherein the compensation value of the channel is in positive correlation with the minimum channel value and in negative correlation with the dispersion processing coefficient; and determining a de-dispersion channel value of the channel according to the compensation value of the channel and the channel value of the channel, wherein the de-dispersion channel value is positively correlated with the compensation value and the channel value respectively.
In another possible implementation manner, the obtaining module 501 is further configured to obtain a third image, and determine a specified pixel point from the third image; determining a first image with the designated pixel point as the center and the detection width as the designated detection width in the third image; the designated pixel point is a pixel point of which the gray value is within a preset gray value range, the difference between the gray value and the gray value of the edge pixel point of the first image is smaller than a first preset threshold value, and the difference between the maximum channel value and the minimum channel value is larger than a second preset threshold value.
In the embodiment of the disclosure, a first image to be processed is obtained, and for any one of a plurality of pixel points in the first image, a dispersion distribution rule of the pixel point is determined according to a channel value of the pixel point; and determining the dispersion type of the pixel point according to the dispersion distribution rule of the pixel point, and then performing dispersion removal processing on the pixel point according to the dispersion type of each pixel point to obtain a second image after dispersion removal. The method has the advantages that the pixel points of different dispersion types are determined by determining the channel values of the pixel points, and the pixel points of different dispersion types are respectively subjected to dispersion removing treatment, so that the accuracy of dispersion treatment is improved.
It should be noted that: in the image processing apparatus provided in the above embodiment, only the division of the above functional modules is taken as an example for illustration during image processing, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the apparatus is divided into different functional modules to complete all or part of the above described functions. In addition, the image processing apparatus and the image processing method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments in detail and are not described herein again.
FIG. 6 is a block diagram illustrating a configuration of a computer device 600 according to an example embodiment. The computer device 600 may be: the mobile terminal comprises equipment with a shooting function, such as a smart phone, a tablet computer, a notebook computer or a desktop computer. Computer device 600 may also be referred to by other names such as user equipment, portable terminals, laptop terminals, desktop terminals, and the like.
Generally, the computer device 600 includes: a processor 601 and a memory 602.
The processor 601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 601 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 601 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, processor 601 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 602 may include one or more computer-readable storage media, which may be non-transitory. The memory 602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 602 is used to store at least one instruction for execution by the processor 601 to implement the image processing method provided by the method embodiments in the present disclosure.
In some embodiments, the computer device 600 may further optionally include: a peripheral interface 603 and at least one peripheral. The processor 601, memory 602, and peripheral interface 603 may be connected by buses or signal lines. Various peripheral devices may be connected to the peripheral interface 603 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 604, a touch screen display 605, a camera 606, an audio circuit 607, a positioning component 608, and a power supply 609.
The peripheral interface 603 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 601 and the memory 602. In some embodiments, the processor 601, memory 602, and peripheral interface 603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 601, the memory 602, and the peripheral interface 603 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 604 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 604 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 604 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 604 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 604 may communicate with other control devices via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 604 may also include NFC (Near Field Communication) related circuits, which are not limited by this disclosure.
The touch display 605 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the touch display screen 605 is a touch display screen, the touch display screen 605 also has the ability to acquire touch signals on or over the surface of the touch display screen 605. The touch signal may be input to the processor 601 as a control signal for processing. At this point, the touch display 605 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the touch display screen 605 may be one, providing the front panel of the computer device 600; in other embodiments, the touch screen display 605 can be at least two, respectively disposed on different surfaces of the computer device 600 or in a folded design; in still other embodiments, the touch display 605 may be a flexible display disposed on a curved surface or on a folded surface of the computer device 600. Even more, the touch screen display 605 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The touch Display 605 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 606 is used to capture images or video. Optionally, camera assembly 606 includes a front camera and a rear camera. Generally, a front camera is provided on a front panel of the control apparatus, and a rear camera is provided on a rear surface of the control apparatus. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 606 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The positioning component 608 is used to locate the current geographic location of the computer device 600 to implement navigation or LBS (Location Based Service). The positioning component 608 may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 609 is used to supply power to the various components in the computer device 600. The power supply 609 may be alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 609 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging, and may also support fast-charging technology.
In some embodiments, the computer device 600 also includes one or more sensors 610. The one or more sensors 610 include, but are not limited to: acceleration sensor 611, gyro sensor 612, pressure sensor 613, fingerprint sensor 614, optical sensor 615, and proximity sensor 616.
The acceleration sensor 611 may detect the magnitude of acceleration along the three coordinate axes of a coordinate system established with respect to the computer device 600. For example, the acceleration sensor 611 may be used to detect the components of gravitational acceleration along the three coordinate axes. The processor 601 may control the touch display screen 605 to display the user interface in landscape or portrait view according to the gravitational acceleration signal collected by the acceleration sensor 611. The acceleration sensor 611 may also be used to collect motion data for games or users.
The gyro sensor 612 may detect the body orientation and rotation angle of the computer device 600, and may cooperate with the acceleration sensor 611 to capture the user's 3D motion on the computer device 600. Based on the data collected by the gyro sensor 612, the processor 601 may implement the following functions: motion sensing (such as changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 613 may be disposed on the side bezel of the computer device 600 and/or beneath the touch display screen 605. When the pressure sensor 613 is disposed on the side bezel of the computer device 600, it can detect the user's grip signal on the computer device 600, and the processor 601 performs left/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 613. When the pressure sensor 613 is disposed beneath the touch display screen 605, the processor 601 controls operable controls on the UI according to the user's pressure operations on the touch display screen 605. The operable controls include at least one of a button control, a scroll-bar control, an icon control, and a menu control.
The fingerprint sensor 614 is used to collect the user's fingerprint, and the processor 601 (or the fingerprint sensor 614 itself) identifies the user's identity from the collected fingerprint. Upon identifying the user as trusted, the processor 601 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings. The fingerprint sensor 614 may be disposed on the front, back, or side of the computer device 600. When a physical key or vendor logo is provided on the computer device 600, the fingerprint sensor 614 may be integrated with the physical key or vendor logo.
The optical sensor 615 is used to collect the ambient light intensity. In one embodiment, the processor 601 may control the display brightness of the touch display screen 605 based on the ambient light intensity collected by the optical sensor 615: when the ambient light intensity is high, the display brightness of the touch display screen 605 is increased; when it is low, the display brightness is decreased. In another embodiment, the processor 601 may also dynamically adjust the shooting parameters of the camera assembly 606 according to the ambient light intensity collected by the optical sensor 615.
The proximity sensor 616, also known as a distance sensor, is typically disposed on the front panel of the computer device 600. The proximity sensor 616 is used to measure the distance between the user and the front face of the computer device 600. In one embodiment, when the proximity sensor 616 detects that this distance is gradually decreasing, the processor 601 controls the touch display screen 605 to switch from the screen-on state to the screen-off state; when the proximity sensor 616 detects that the distance is gradually increasing, the processor 601 controls the touch display screen 605 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in FIG. 6 does not constitute a limitation of the computer device 600, which may include more or fewer components than shown, combine certain components, or employ a different arrangement of components.
The embodiments of the present disclosure also provide a computer-readable storage medium, applied to a terminal, in which at least one instruction, at least one program, a code set, or an instruction set is stored. The instruction, program, code set, or instruction set is loaded and executed by a processor to implement the operations performed by the computer device in the image processing method of the foregoing embodiments.
Those skilled in the art will understand that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs operations has been described in detail in the embodiment related to the method, and will not be described in detail here.
It is to be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
Claims (12)
1. An image processing method, characterized in that the method comprises:
acquiring a first image to be processed;
for any pixel point of a plurality of pixel points in the first image, determining a dispersion distribution rule of the pixel point according to channel values of the pixel point;
determining the dispersion type of the pixel points according to the dispersion distribution rule of the pixel points;
and according to the dispersion type of the pixel point, performing dispersion removal processing on the pixel point to obtain a second image.
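The three steps of claim 1 can be sketched end to end as follows. Everything inside the helper is a simplified stand-in, not the patented method: the classification uses a fixed channel-gap test and the correction pulls channels toward their minimum; the patent leaves the concrete rules to the dependent claims.

```python
def remove_dispersion(image):
    """End-to-end sketch of claim 1 on a nested-list RGB image.

    All thresholds and formulas here are illustrative placeholders:
    a large G-B or G-R gap stands in for the dispersion distribution
    rules, and averaging each channel with the minimum stands in for
    the de-dispersion processing.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            r, g, b = image[y][x]
            # steps 2-3: classify the pixel by its dispersion distribution
            if g - b > 50:      # stand-in for the high-temperature rule
                kind = "high"
            elif g - r > 50:    # stand-in for the low-temperature rule
                kind = "low"
            else:
                continue        # no dispersion detected at this pixel
            # step 4: de-disperse by pulling channels toward their minimum
            m = min(r, g, b)
            out[y][x] = tuple((v + m) // 2 for v in (r, g, b))
    return out
```

A one-pixel image with a strong green-blue gap is desaturated, while a neutral pixel passes through unchanged.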
2. The method of claim 1, wherein determining the dispersion distribution rule of the pixel points according to the channel values of the pixel points comprises:
determining a dispersion distribution rule of the pixel points according to the channel values of the G channel and the B channel of the pixel points; or,
and determining the dispersion distribution rule of the pixel points according to the channel values of the G channel and the R channel of the pixel points.
3. The method of claim 2, further comprising:
according to the channel values of the G channel, the B channel, and the R channel of the pixel point, determining the difference between the channel value of the G channel and the channel value of the B channel of the pixel point to obtain a first difference value, and determining the difference between the channel value of the G channel and the channel value of the R channel of the pixel point to obtain a second difference value;
when the first difference is smaller than the second difference, executing the step of determining the dispersion distribution rule of the pixel points according to the channel values of the G channel and the B channel of the pixel points;
and when the first difference is larger than the second difference, executing the step of determining the dispersion distribution rule of the pixel points according to the channel values of the G channel and the R channel of the pixel points.
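The channel-pair selection of claim 3 can be sketched as a small helper; the function name and the tie-breaking behaviour are assumptions, since the claim only compares the two differences.

```python
def select_rule_channels(r, g, b):
    """Pick the channel pair used for the dispersion-distribution check.

    Per claim 3: compute the first difference (G minus B) and the
    second difference (G minus R), then use the G/B rule when the
    first difference is smaller and the G/R rule when it is larger.
    The claim does not say what happens on a tie; this sketch
    defaults to the G/B pair.
    """
    first_diff = g - b    # first difference value
    second_diff = g - r   # second difference value
    return "GB" if first_diff <= second_diff else "GR"
```

A pixel whose blue channel tracks green closely is checked with the G/B rule; one whose red channel tracks green closely is checked with the G/R rule.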
4. The method of claim 2, wherein determining the dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the B channel of the pixel point comprises:
for any one of a plurality of designated directions, determining a plurality of pixel points located in the designated direction of the pixel point;
and when the channel value of the G channel of the pixel point is larger than the channel values of the G channels of all of the plurality of pixel points in the designated direction, and the channel value of the B channel of the pixel point is larger than the channel values of the B channels of all of the plurality of pixel points in the designated direction, determining that the dispersion distribution rule of the pixel point is a high-temperature dispersion distribution rule.
5. The method of claim 2, wherein determining the dispersion distribution rule of the pixel point according to the channel value of the G channel and the channel value of the R channel of the pixel point comprises:
for any one of a plurality of designated directions, determining a plurality of pixel points located in the designated direction of the pixel point;
and when the channel value of the G channel of the pixel point is larger than the channel values of the G channels of all of the plurality of pixel points in the designated direction, and the channel value of the R channel of the pixel point is larger than the channel values of the R channels of all of the plurality of pixel points in the designated direction, determining that the dispersion distribution rule of the pixel point is a low-temperature dispersion distribution rule.
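The direction-based dominance test shared by claims 4 and 5 can be sketched as one function parameterised by the channel pair. The designated directions, the number of neighbours examined, and treating a single satisfied direction as sufficient are all illustrative assumptions; the claims leave them unspecified.

```python
def matches_dispersion_rule(img, x, y, pair, directions, steps=2):
    """Check whether pixel (x, y) dominates its neighbours on `pair`.

    img is a nested list of (R, G, B) tuples; `pair` is ("G", "B")
    for the high-temperature rule of claim 4 or ("G", "R") for the
    low-temperature rule of claim 5. `directions` holds unit offsets
    such as (1, 0), and `steps` is the assumed number of neighbouring
    pixel points examined per direction.
    """
    idx = {"R": 0, "G": 1, "B": 2}
    h, w = len(img), len(img[0])
    centre = img[y][x]
    for dx, dy in directions:
        dominated = True
        for s in range(1, steps + 1):
            nx, ny = x + dx * s, y + dy * s
            if not (0 <= nx < w and 0 <= ny < h):
                dominated = False  # neighbourhood runs off the image
                break
            nb = img[ny][nx]
            # the centre pixel must exceed the neighbour on both channels
            if not all(centre[idx[c]] > nb[idx[c]] for c in pair):
                dominated = False
                break
        if dominated:
            return True
    return False
```

On a flat gray patch with one bright green-blue pixel, the G/B (high-temperature) rule fires while the G/R (low-temperature) rule does not.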
6. The method of claim 1, wherein determining the dispersion type of the pixel according to the dispersion distribution rule of the pixel comprises:
when the dispersion distribution rule of the pixel point is a high-temperature dispersion distribution rule, determining that the dispersion type of the pixel point is a high-temperature dispersion type;
and when the dispersion distribution rule of the pixel point is a low-temperature dispersion distribution rule, determining that the dispersion type of the pixel point is a low-temperature dispersion type.
7. The method according to claim 1, wherein performing the de-dispersion processing on the pixel point according to the dispersion type of the pixel point to obtain a second image comprises:
processing the channel value of each channel of the pixel point according to the dispersion processing coefficient corresponding to the dispersion type of the pixel point to obtain the de-dispersion channel value of each channel of the pixel point;
and modifying the channel value of each channel of the pixel point in the first image into the corresponding de-dispersion channel value to obtain the second image.
8. The method according to claim 1 or 7, wherein before performing the de-dispersion processing on the pixel point according to the dispersion type of the pixel point to obtain the second image, the method further comprises:
selecting pixel points of a high-temperature dispersion type and pixel points of a low-temperature dispersion type from the plurality of pixel points of the first image according to the dispersion types of the plurality of pixel points in the first image;
generating a first dispersion matrix according to the high-temperature dispersion type pixel points, and generating a second dispersion matrix according to the low-temperature dispersion type pixel points;
traversing the first dispersion matrix and the second dispersion matrix with a traversal window;
when the number of high-temperature dispersion type pixel points in the traversal window in the first dispersion matrix is larger than a first preset threshold value, determining all the pixel points in the traversal window as high-temperature dispersion type pixel points;
and when the number of low-temperature dispersion type pixel points in the traversal window in the second dispersion matrix is larger than a second preset threshold value, determining all the pixel points in the traversal window as low-temperature dispersion type pixel points.
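The traversal-window step of claim 8 amounts to a majority-style densification of a binary dispersion matrix. A minimal sketch, with assumed window size and threshold (the claim only calls them a traversal window and a preset threshold), applied to one matrix at a time:

```python
def fill_traversal_window(mask, win=3, threshold=4):
    """Densify a binary dispersion matrix with a sliding window.

    mask is a nested list where 1 marks a pixel point of one
    dispersion type. Wherever a win x win traversal window contains
    more than `threshold` marked pixels, every pixel inside that
    window is marked. Window size and threshold are illustrative.
    """
    h, w = len(mask), len(mask[0])
    out = [row[:] for row in mask]
    for y in range(h - win + 1):
        for x in range(w - win + 1):
            count = sum(mask[y + j][x + i]
                        for j in range(win) for i in range(win))
            if count > threshold:
                # flag every pixel point covered by this window position
                for j in range(win):
                    for i in range(win):
                        out[y + j][x + i] = 1
    return out
```

The same helper would be run once on the first (high-temperature) matrix and once on the second (low-temperature) matrix.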
9. The method according to claim 7, wherein processing the channel value of each channel of the pixel point according to the dispersion processing coefficient corresponding to the dispersion type of the pixel point to obtain the de-dispersion channel value of each channel of the pixel point comprises:
determining a minimum channel value in a plurality of channel values of the pixel points;
for each channel of the pixel point, determining a compensation value of the channel according to the channel value of the channel and the minimum channel value, wherein the compensation value of the channel is in positive correlation with the minimum channel value and in negative correlation with the dispersion processing coefficient;
and determining a de-dispersion channel value of the channel according to the compensation value of the channel and the channel value of the channel, wherein the de-dispersion channel value is in positive correlation with the compensation value and the channel value respectively.
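Claim 9 fixes only correlations, not a formula: the compensation value grows with the minimum channel value and shrinks as the dispersion processing coefficient grows, and the de-dispersion channel value grows with both the compensation value and the original channel value. One plausible instantiation that satisfies those correlations (an assumption, not the patented formula) shrinks each channel toward the minimum:

```python
def dedisperse_pixel(rgb, coef=2.0):
    """Compute de-dispersion channel values for one pixel point.

    With coef >= 1, each channel is scaled down and topped up by a
    compensation value derived from the minimum channel value:
    compensation rises with the minimum and falls with coef, and the
    output rises with both the compensation and the input channel.
    """
    m = min(rgb)                 # minimum of the pixel's channel values
    compensation = m / coef      # up with m, down with coef
    return tuple(v * (1 - 1 / coef) + compensation for v in rgb)
```

Note that the minimum channel is left unchanged, so the correction only reduces the colour cast, never the overall brightness floor.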
10. The method of claim 1, wherein the acquiring the first image to be processed comprises:
acquiring a third image, and determining a designated pixel point from the third image;
determining, in the third image, the first image centered on the designated pixel point and having a designated detection width;
wherein the designated pixel point is a pixel point whose gray value is within a preset gray value range, whose gray value differs from the gray value of an edge pixel point of the first image by less than a first preset threshold value, and whose difference between the maximum channel value and the minimum channel value is larger than a second preset threshold value.
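The three screening conditions of claim 10 can be sketched as a predicate. The luma weights and every threshold value below are illustrative assumptions; the claim only names a preset gray range and two preset thresholds.

```python
def is_designated_pixel(rgb, edge_gray, gray_range=(30, 220),
                        gray_diff_limit=40, spread_limit=60):
    """Screen a candidate designated pixel point (claim 10 sketch).

    A pixel qualifies when its gray value lies inside a preset range,
    differs from the gray value of an edge pixel point by less than a
    first threshold, and has a max-min channel spread above a second
    threshold (i.e. it is noticeably coloured, as dispersion fringes
    are).
    """
    r, g, b = rgb
    gray = 0.299 * r + 0.587 * g + 0.114 * b  # common luma weights (assumed)
    return (gray_range[0] <= gray <= gray_range[1]
            and abs(gray - edge_gray) < gray_diff_limit
            and max(rgb) - min(rgb) > spread_limit)
```

A strongly coloured mid-gray pixel near an edge passes, while a neutral gray pixel fails on the channel-spread condition.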
11. An image processing apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring a first image to be processed;
the first determining module is used for determining the dispersion distribution rule of any one of a plurality of pixel points in the first image according to the channel value of the pixel point;
the second determining module is used for determining the dispersion type of the pixel point according to the dispersion distribution rule of the pixel point;
and the processing module is used for performing dispersion removal processing on the pixel points according to the dispersion types of the pixel points to obtain a second image.
12. A computer device, characterized in that the computer device comprises:
at least one processor; and
at least one memory;
the at least one memory stores one or more programs configured for execution by the at least one processor, the one or more programs including instructions for performing the image processing method of any of claims 1 to 10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910926843.7A CN111340894B (en) | 2019-09-27 | 2019-09-27 | Image processing method, device and computer equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111340894A true CN111340894A (en) | 2020-06-26 |
CN111340894B CN111340894B (en) | 2023-07-14 |
Family
ID=71185204
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910926843.7A Active CN111340894B (en) | 2019-09-27 | 2019-09-27 | Image processing method, device and computer equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111340894B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1092233A * | 1992-10-30 | 1994-09-14 | Sharp Corporation | Projection image display apparatus and method of correcting its color non-uniformity |
CN106572342A * | 2016-11-10 | 2017-04-19 | Beijing QIYI Century Science & Technology Co., Ltd. | Image anti-distortion and anti-dispersion processing method, device and virtual reality device |
US20170161583A1 * | 2015-12-03 | 2017-06-08 | Le Holdings (Beijing) Co., Ltd. | Testing device and method thereof |
CN108124141A * | 2017-12-15 | 2018-06-05 | Zhejiang Dahua Technology Co., Ltd. | Image processing method and device |
WO2018137773A1 * | 2017-01-27 | 2018-08-02 | Sony Mobile Communications Inc | Method and device for blind correction of lateral chromatic aberration in color images |
CN108399606A * | 2018-02-02 | 2018-08-14 | Beijing QIYI Century Science & Technology Co., Ltd. | Image adjustment method and device |
Non-Patent Citations (1)
Title |
---|
Hou, Weijia: "Research on and Use of Dispersion Correction and Compensation Technology for Panasonic Cameras" * |
Also Published As
Publication number | Publication date |
---|---|
CN111340894B (en) | 2023-07-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109829456B (en) | Image identification method and device and terminal | |
CN109360222B (en) | Image segmentation method, device and storage medium | |
CN110059685A (en) | Word area detection method, apparatus and storage medium | |
CN109302632B (en) | Method, device, terminal and storage medium for acquiring live video picture | |
CN109522863B (en) | Ear key point detection method and device and storage medium | |
CN114170349A (en) | Image generation method, image generation device, electronic equipment and storage medium | |
CN111028144A (en) | Video face changing method and device and storage medium | |
CN110839174A (en) | Image processing method and device, computer equipment and storage medium | |
CN110941375A (en) | Method and device for locally amplifying image and storage medium | |
CN110991457A (en) | Two-dimensional code processing method and device, electronic equipment and storage medium | |
CN112135191A (en) | Video editing method, device, terminal and storage medium | |
CN111754386A (en) | Image area shielding method, device, equipment and storage medium | |
CN112396076A (en) | License plate image generation method and device and computer storage medium | |
CN113038165A (en) | Method, apparatus and storage medium for determining a set of coding parameters | |
CN111586279B (en) | Method, device and equipment for determining shooting state and storage medium | |
CN114494073A (en) | Image processing method, device, equipment and storage medium | |
CN111753606A (en) | Intelligent model upgrading method and device | |
CN111860064A (en) | Target detection method, device and equipment based on video and storage medium | |
CN112882094B (en) | First-arrival wave acquisition method and device, computer equipment and storage medium | |
CN113709353B (en) | Image acquisition method and device | |
CN110443841B (en) | Method, device and system for measuring ground depth | |
CN112243083B (en) | Snapshot method and device and computer storage medium | |
CN110620935B (en) | Image processing method and device | |
CN113824902A (en) | Method, device, system, equipment and medium for determining time delay of infrared camera system | |
CN111757146B (en) | Method, system and storage medium for video splicing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||