CN111524152B - Depth estimation-based method and system for improving calculation efficiency of underwater image restoration - Google Patents
- Publication number: CN111524152B (application CN202010380531.3A)
- Authority: CN (China)
- Prior art keywords: image; depth estimation; point; value; depth
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/13 — Image analysis; segmentation; edge detection
- G06T5/77 — Image enhancement or restoration; retouching; inpainting; scratch removal
- G06T7/136 — Segmentation; edge detection involving thresholding
- G06T7/155 — Segmentation; edge detection involving morphological operators
- G06T7/194 — Segmentation; edge detection involving foreground-background segmentation
- G06T7/50 — Depth or shape recovery
- G06T7/90 — Determination of colour characteristics
- G06T2207/10004 — Still image; photographic image
- G06T2207/20036 — Morphological image processing
- G06T2207/30168 — Image quality inspection
- Y02A90/30 — Assessment of water resources
Abstract
The invention provides a depth estimation-based method and system for improving the calculation efficiency of underwater image restoration. The method sequentially performs graying of the original image, edge detection on the grayed image, a morphological closing operation, separation of the background and the distant view, and normalization, obtaining a scene depth estimation image in which near-view pixel values are small and distant-view pixel values are large; a depth threshold P is then set, where P is the maximum depth estimation value in the near-view region. If the scene depth estimation value at a coordinate point of the original image is smaller than the depth threshold P, the point is set as an adopted point and its data is used in the subsequent calculation. The invention proposes the concept of improving the calculation efficiency of underwater image restoration based on depth estimation: according to the depth estimation information of the original image, the amount of pixel information is compressed by retaining the pixel information of the near-view region of the depth estimate and ignoring that of the distant-view region, thereby improving the working efficiency of the algorithm.
Description
Technical Field
The invention relates to the technical field of underwater image sharpening, and in particular to a depth estimation-based method and system for improving the calculation efficiency of underwater image restoration.
Background
The conventional approach to underwater image sharpening is to process an image captured by an underwater camera with image processing methods that eliminate, as far as possible, the influence of light absorption and scattering in water, so as to obtain an image with higher definition than the original. Image restoration methods recover the original information of the image by using a priori knowledge of the image degradation process; image enhancement methods adjust the image directly in a color space or transform domain, without requiring such a priori model.
When light is transmitted in water it is absorbed and scattered by the water medium and by suspended particles, so underwater photographs suffer from blurring or fogging, reduced contrast, color distortion, and uneven brightness, which greatly hampers target feature extraction, detection, and tracking. Although existing methods improve the quality of underwater images to a certain extent, such sharpening methods typically obtain certain parameter values by calculating over and comparing every pixel in the image, which makes them time-consuming and inefficient.
Disclosure of Invention
The invention aims to solve the technical problem of how to improve the underwater image processing efficiency.
The invention solves the technical problems through the following technical means:
A depth estimation-based method for improving the calculation efficiency of underwater image restoration comprises:
S01, performing graying processing on the original image;
S02, performing Sobel operator edge detection on the grayed image;
S03, performing a morphological closing operation on the result of step S02 to separate the background from the distant view;
S04, performing normalization on the result of step S03 to obtain a scene depth estimation image with small near-view pixel values and large distant-view pixel values;
S05, setting a depth threshold P, where P is the maximum depth estimation value in the near-view region; if the scene depth estimation value at a coordinate point of the original image is smaller than the depth threshold P, the point is set as an adopted point and its data is used in the subsequent calculation; otherwise the data of that coordinate point is not adopted.
The method proposes the concept of improving the calculation efficiency of underwater image restoration based on depth estimation: according to the depth estimation information of the original image, the amount of pixel information is compressed by retaining the pixel information of the near-view region of the depth estimate and ignoring that of the distant-view region, thereby improving the working efficiency of the algorithm.
Preferably, the edge detection principle in step S02 is as follows (the standard Sobel operator formulation):

G_X = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * I
G_Y = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]] * I
G = sqrt(G_X^2 + G_Y^2)

where G_X and G_Y are respectively the horizontal and vertical gradients of the original image, I is the RGB information of the pixel points of the original image, and G is the final merged gradient.
Preferably, the determination rule in step S05 is:

f(x, y) = 1 (adopted), if d(x, y) < P; f(x, y) = 0 (not adopted), if d(x, y) >= P

where f(x, y) indicates whether the RGB information of the pixel with coordinates (x, y) in the image is adopted, and d(x, y) is the scene depth value at coordinates (x, y) in the image.
The invention also provides a depth estimation-based system for improving the calculation efficiency of underwater image restoration, comprising:
a graying processing module, used for performing graying processing on the original image;
an edge detection module, used for performing Sobel operator edge detection on the grayed image;
a morphological closing operation module, used for performing a morphological closing operation on the result obtained by the edge detection module to separate the background from the distant view;
a normalization processing module, used for performing normalization processing on the result obtained by the morphological closing operation module to obtain a scene depth estimation image with small near-view pixel values and large distant-view pixel values;
a calculation module, used for setting a depth threshold P, where P is the maximum depth estimation value in the near-view region; if the scene depth estimation value at a coordinate point of the original image is smaller than the depth threshold P, the point is set as an adopted point and its data is used in the subsequent calculation; otherwise the data of that coordinate point is not adopted.
Preferably, the edge detection principle is as follows (the standard Sobel operator formulation):

G_X = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * I
G_Y = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]] * I
G = sqrt(G_X^2 + G_Y^2)

where G_X and G_Y are respectively the horizontal and vertical gradients of the original image, I is the RGB information of the pixel points of the original image, and G is the final merged gradient.
Preferably, the judgment rule in the calculation module is:

f(x, y) = 1 (adopted), if d(x, y) < P; f(x, y) = 0 (not adopted), if d(x, y) >= P

where f(x, y) indicates whether the RGB information of the pixel with coordinates (x, y) in the image is adopted, and d(x, y) is the scene depth value at coordinates (x, y) in the image.
The invention has the advantages that:
aiming at the existing defects, the invention provides a concept of improving the efficiency of the underwater image restoration calculation based on the depth estimation according to the different importance of the original image in the near view area and the far view area, namely, according to the depth estimation information of the original image, the quantity compression of the pixel point information is realized by reserving the pixel point information of the near view area in the depth estimation and neglecting the pixel point information of the far view area, and the aim of improving the working efficiency of the algorithm is achieved. Specifically, according to the depth estimation information of the original image, the distant points with lower importance in the scene are ignored, the calculated amount of a compression program is achieved, and the working efficiency of the algorithm is improved. Compared with other methods for realizing the calculation amount compression of the underwater image restoration method by reducing the number of calculation points through the comparison of experimental data, the method can provide better calculation amount reduction for the underwater image restoration method with pixel point operation on the premise of ensuring the detail information, edge information and color information of the result image as much as possible.
Drawings
FIG. 1 is a flowchart of an underwater image restoration calculation efficiency improvement method based on depth estimation according to an embodiment of the present invention;
fig. 2 is a time-consuming reduction rate statistical graph of three experiments for verifying a depth estimation-based underwater image restoration calculation efficiency improvement method in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the embodiments of the present invention, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, the present embodiment provides a method for improving efficiency of underwater image restoration calculation based on depth estimation, including
S01, performing graying processing on the original image;
S02, performing Sobel operator edge detection on the grayed image; the edge detection principle is as follows (the standard Sobel operator formulation):

G_X = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * I
G_Y = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]] * I
G = sqrt(G_X^2 + G_Y^2)

where G_X and G_Y are respectively the horizontal and vertical gradients of the original image, I is the RGB information of the pixel points of the original image, and G is the final merged gradient.
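As an illustration of the grayed-image Sobel step, a minimal pure-NumPy sketch (the function name and the explicit-shift correlation are choices made here for clarity, not taken from the patent):

```python
import numpy as np

def sobel_gradient(gray):
    """Merged Sobel gradient G = sqrt(G_X^2 + G_Y^2) of a grayscale image."""
    # Conventional 3x3 Sobel kernels for the horizontal (G_X) and vertical (G_Y) gradients.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)
    h, w = gray.shape
    padded = np.pad(gray.astype(float), 1, mode="edge")
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    # Correlate each kernel with the image via nine shifted views.
    for i in range(3):
        for j in range(3):
            window = padded[i:i + h, j:j + w]
            gx += kx[i, j] * window
            gy += ky[i, j] * window
    return np.sqrt(gx ** 2 + gy ** 2)
```

A vertical step edge in the input produces a strong response along the edge columns and zero response in flat regions, which is what the closing operation of the next step then consolidates.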
S03, performing a morphological closing operation on the result of step S02 to separate the background from the distant view;
S04, performing normalization on the result of step S03 to obtain a scene depth estimation image with small near-view pixel values and large distant-view pixel values;
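Steps S03 and S04 can be sketched as a grayscale closing (dilation followed by erosion) and a min-max normalization; the 3x3 window is an assumption, since the patent does not specify the structuring element:

```python
import numpy as np

def close_and_normalize(edge_map):
    """Morphological closing (3x3 dilation then 3x3 erosion) of the edge map,
    followed by min-max normalization to [0, 1]."""
    def filter3x3(img, reduce_fn, pad_value):
        h, w = img.shape
        padded = np.pad(img.astype(float), 1, mode="constant",
                        constant_values=pad_value)
        # Stack the nine shifted views of the image and reduce across them.
        views = [padded[i:i + h, j:j + w] for i in range(3) for j in range(3)]
        return reduce_fn(np.stack(views), axis=0)

    dilated = filter3x3(edge_map, np.max, pad_value=-np.inf)   # dilation
    closed = filter3x3(dilated, np.min, pad_value=np.inf)      # erosion
    lo, hi = closed.min(), closed.max()
    return (closed - lo) / (hi - lo) if hi > lo else np.zeros_like(closed)
```

Closing fills small gaps between nearby edge responses, which is what merges the textured near view into a connected region before thresholding.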
S05, setting a depth threshold P, where P is the maximum depth estimation value in the near-view region; if the scene depth estimation value at a coordinate point of the original image is smaller than the depth threshold P, the point is set as an adopted point and its data is used in the subsequent calculation; otherwise the data of that coordinate point is not adopted. The judgment rule is:

f(x, y) = 1 (adopted), if d(x, y) < P; f(x, y) = 0 (not adopted), if d(x, y) >= P

where f(x, y) indicates whether the RGB information of the pixel with coordinates (x, y) in the image is adopted, and d(x, y) is the scene depth value at coordinates (x, y) in the image.
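The rule of step S05 amounts to a Boolean mask over the depth estimation image; a minimal sketch (the 1 = adopted encoding and the function names are assumptions made here):

```python
import numpy as np

def adopted_mask(depth, P):
    """f(x, y) = 1 (adopted) where the depth estimate d(x, y) < P, else 0."""
    return (depth < P).astype(np.uint8)

def adopted_pixels(image, depth, P):
    """RGB values of the adopted points only, for the downstream calculation."""
    return image[adopted_mask(depth, P).astype(bool)]
```

A restoration routine with per-pixel operations then loops only over `adopted_pixels(...)` instead of the whole image, which is where the compression of the calculation load comes from.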
The method proposes the concept of improving the calculation efficiency of underwater image restoration based on depth estimation: according to the depth estimation information of the original image, the amount of pixel information is compressed by retaining the pixel information of the near-view region of the depth estimate and ignoring that of the distant-view region, thereby improving the working efficiency of the algorithm.
The invention also provides a depth estimation-based system for improving the calculation efficiency of underwater image restoration, comprising:
the graying processing module is used for performing graying processing on the original image;
the edge detection module is used for carrying out Sobel operator edge detection on the image which is subjected to graying processing;
the morphological closing operation module is used for performing morphological closing operation on the result obtained by the edge detection module to realize the separation of the background and the distant view;
the normalization processing module is used for performing normalization processing on the result obtained by the morphological closing operation module to obtain a scene depth estimation image with small near-view pixel values and large distant-view pixel values;
the calculation module is used for setting a depth threshold value P, wherein the P is the maximum value of depth estimation values in the close-range region; and if the scene depth estimation value of the coordinate point of the original image is smaller than the depth threshold value P, setting the point as an adopted point, using the data information of the point for data calculation of a subsequent program, and otherwise, not adopting the data information of the coordinate point.
Preferably, the edge detection principle is as follows (the standard Sobel operator formulation):

G_X = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * I
G_Y = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]] * I
G = sqrt(G_X^2 + G_Y^2)

where G_X and G_Y are respectively the horizontal and vertical gradients of the original image, I is the RGB information of the pixel points of the original image, and G is the final merged gradient.
Preferably, the judgment rule in the calculation module is:

f(x, y) = 1 (adopted), if d(x, y) < P; f(x, y) = 0 (not adopted), if d(x, y) >= P

where f(x, y) indicates whether the RGB information of the pixel with coordinates (x, y) in the image is adopted, and d(x, y) is the scene depth value at coordinates (x, y) in the image.
In order to verify the effectiveness of the method, the method is experimentally verified.
(1) The experimental foundation is as follows:
the experimental equipment is a notebook computer, msi microscoon GP62 QE-215XCN, a CPU, intel core i75700HQ, a main frequency 2.7GHz, a 8G internal memory and a double display card. And an underwater image restoration method in scene depth estimation and white balance-based underwater image restoration is taken as an experimental subject, and the calculated amount is compressed in the process of calculating RGB channels by a least square method from a middle step diagram and an original diagram in the method to obtain related parameters, so that the working efficiency is improved.
(2) Time-consuming reduction rate experiments:
In order to verify the efficiency improvement of the method in time consumption, the threshold parameter P was tested 3 times in the experiment; the selected value can be expressed as:

P = (S_MAX - S_MIN) * B_L

where S_MAX is the maximum of the depth estimation values, S_MIN is the minimum, and B_L is a scaling factor. B_L was set to 25% in the first experiment, 50% in the second, and 75% in the third. The calculation efficiency improvement of the original algorithm after applying the method is expressed by the time-consumption reduction rate Q, defined as:

Q = (T_1 - T_2) / T_1 * 100%

where T_1 is the time consumed by the original algorithm and T_2 is the time consumed after applying the method.
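The reduction rate Q can be computed directly; the timings below are illustrative values, not the patent's measurements:

```python
def time_reduction_rate(t_original, t_method):
    """Q = (T1 - T2) / T1 * 100: percentage of run time saved by the method."""
    return (t_original - t_method) / t_original * 100.0

# Illustrative: a restoration pass dropping from 10 s to 6.8 s saves 32%.
q = time_reduction_rate(10.0, 6.8)
```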
The time-consumption reduction rate statistics of the three experiments are shown in fig. 2.
as shown in fig. 2, when the threshold P is set to 75% of the difference between the nearest point and the farthest point in the depth estimation map, the average reduction rate of the consumed time is about 29.4%; when the threshold P is set to 50% of the difference in value, the average elapsed time reduction rate is about 32%; when the threshold P is set to 25% of the difference in value, the average elapsed time reduction rate is about 33.6%.
(3) Results of different methods with equal screening points
Aiming at the characteristics of underwater images and the purpose of image restoration, color cast, contrast, average gradient, and PSNR (peak signal-to-noise ratio) values are selected as evaluation indices. The other methods compared, which also reduce the number of points to be calculated, are random point selection and interval point selection.
The color deviation value measures the deviation between the color of the result image and the color under a standard reference light source; the smaller it is, the better the image matches human observation. The contrast value reflects the liveliness and richness of the image colors: the larger the difference between the brightest and darkest points of the color channels, the higher the contrast, the more distinct the image, and the richer and more prominent the color expression. The average gradient reflects the definition and texture variation of the image; the larger it is, the clearer the image. The PSNR value is an image quality evaluation index based on error sensitivity; it measures the similarity between the result image and the original, with a larger value indicating a smaller difference from the original.
The random point-taking method generates, for each pixel to be operated on in the algorithm, one random number in the range 0 to N_R; if the random number is larger than a set threshold M_R (M_R < N_R), the numerical information of that coordinate point is used in the calculation, otherwise it is not. The interval point-taking method takes the image origin as the initial point and sets m_r and n_r as the interval thresholds for the x-axis and y-axis respectively; if the remainder of a point's coordinate divided by the corresponding interval threshold is 0, the point is an adopted point.
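The two baseline selection schemes, and the percentile threshold used to match their point counts, can be sketched as follows. The function names are ours; note that under the literal remainder rule an interval threshold of 1 adopts every point, so an interval of 2 per axis is used here to realize 25% sampling, which is an interpretation rather than the patent's exact parameterization:

```python
import numpy as np

def random_mask(shape, n_r=4, m_r=3, seed=None):
    """Adopt a pixel when its random draw in 0..n_r exceeds the threshold m_r."""
    rng = np.random.default_rng(seed)
    draws = rng.integers(0, n_r + 1, size=shape)  # integers in 0..n_r inclusive
    return draws > m_r

def interval_mask(shape, m_r=2, n_r=2):
    """Adopt a pixel when both coordinates leave remainder 0 when divided
    by the interval thresholds (2 per axis keeps 25% of the points)."""
    ys, xs = np.indices(shape)
    return (xs % m_r == 0) & (ys % n_r == 0)

def percentile_threshold(depth, pct=25):
    """Depth value below which the nearest pct% of points lie (threshold P)."""
    return np.percentile(depth, pct)
```

Applying any of these masks to the same original image with matched adopted-point counts reproduces the comparison set-up of the experiment.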
The methods are compared on the result images obtained when the number of adopted pixels from the same original image is equal. First, the values of all points on the depth estimation map of the present method are sorted from small to large; the smaller the value, the closer the point is to the acquisition equipment. The threshold P is taken at the 25th percentile of this sequence:

P = S_25%

where S_25% is the depth value of the point at the 25th percentile of the sorted sequence. The parameters of the random point-taking method are set to N_R = 4 and M_R = 3, and those of the interval point-taking method to m_r = 1 and n_r = 1. With the numbers of adopted pixels of the random point-taking method, the interval point-taking method, and the present method made equal, several 640 × 480 underwater images were processed. The mean values of the obtained image quality indices are shown in Table 1:
TABLE 1 mean values of image quality indices
From the above statistics, when the numbers of adopted pixels are equal, the results of random point-taking and interval point-taking are similar: random point-taking is slightly better in color cast and gradient, but not by much, while the contrast, PSNR value, and running time of the present method are clearly superior to those of the other methods. In conclusion, the present method has less influence on the detail texture and overall error of the original result image and is more advantageous in running time.
The above examples are only intended to illustrate the technical solution of the present invention, and not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (6)
1. A depth estimation-based method for improving the calculation efficiency of underwater image restoration, characterized by comprising:
S01, performing graying processing on an original image;
S02, performing Sobel operator edge detection on the grayed image;
S03, performing a morphological closing operation on the result of step S02 to separate the background from the distant view;
S04, performing normalization on the result of step S03 to obtain a scene depth estimation image with small near-view pixel values and large distant-view pixel values;
S05, setting a depth threshold P, where P is the maximum depth estimation value in the near-view region; and if the scene depth estimation value at a coordinate point of the original image is smaller than the depth threshold P, setting the point as an adopted point and using its data in the subsequent calculation, otherwise not adopting the data of that coordinate point.
2. The depth estimation-based method for improving the calculation efficiency of underwater image restoration according to claim 1, characterized in that the edge detection principle in step S02 is as follows (the standard Sobel operator formulation):

G_X = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * I
G_Y = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]] * I
G = sqrt(G_X^2 + G_Y^2)

where G_X and G_Y are respectively the horizontal and vertical gradients of the original image, I is the RGB information of the pixel points of the original image, and G is the final merged gradient.
3. The depth estimation-based method for improving the calculation efficiency of underwater image restoration according to claim 1 or 2, characterized in that the judgment rule in step S05 is:

f(x, y) = 1 (adopted), if d(x, y) < P; f(x, y) = 0 (not adopted), if d(x, y) >= P

where f(x, y) indicates whether the RGB information of the pixel with coordinates (x, y) in the image is adopted, and d(x, y) is the scene depth value at coordinates (x, y) in the image.
4. A depth estimation-based system for improving the calculation efficiency of underwater image restoration, characterized by comprising:
The graying processing module is used for performing graying processing on the original image;
the edge detection module is used for carrying out Sobel operator edge detection on the image which is subjected to graying processing;
the morphological closing operation module is used for performing a morphological closing operation on the result obtained by the edge detection module to separate the background from the distant view;
the normalization processing module is used for performing normalization processing on the result obtained by the morphological closing operation module to obtain a scene depth estimation image with small near-view pixel values and large distant-view pixel values;
the calculation module is used for setting a depth threshold value P, wherein the P is the maximum value of depth estimation values in the close-range region; and if the scene depth estimation value of the coordinate point of the original image is smaller than the depth threshold value P, setting the point as an adopted point, using the data information of the point for data calculation of a subsequent program, and otherwise, not adopting the data information of the coordinate point.
5. The depth estimation-based system for improving the calculation efficiency of underwater image restoration according to claim 4, characterized in that the edge detection principle is as follows (the standard Sobel operator formulation):

G_X = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * I
G_Y = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]] * I
G = sqrt(G_X^2 + G_Y^2)

where G_X and G_Y are respectively the horizontal and vertical gradients of the original image, I is the RGB information of the pixel points of the original image, and G is the final merged gradient.
6. The depth estimation-based system for improving the calculation efficiency of underwater image restoration according to claim 4, characterized in that the judgment rule in the calculation module is:

f(x, y) = 1 (adopted), if d(x, y) < P; f(x, y) = 0 (not adopted), if d(x, y) >= P

where f(x, y) indicates whether the RGB information of the pixel with coordinates (x, y) in the image is adopted, and d(x, y) is the scene depth value at coordinates (x, y) in the image.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202010380531.3A (CN111524152B) | 2020-05-08 | 2020-05-08 | Depth estimation-based method and system for improving calculation efficiency of underwater image restoration |
Publications (2)

| Publication Number | Publication Date |
| --- | --- |
| CN111524152A | 2020-08-11 |
| CN111524152B | 2023-04-11 |
Family

ID=71905143

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202010380531.3A | Depth estimation-based method and system for improving calculation efficiency of underwater image restoration | 2020-05-08 | 2020-05-08 |

Country Status (1)

| Country | Link |
| --- | --- |
| CN | CN111524152B (en) |
Families Citing this family (1)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| CN114419045A | 2022-03-30 | 2022-04-29 | 武汉中导光电设备有限公司 | Method, device and equipment for detecting defects of photoetching mask plate and readable storage medium |
Family Cites Families (3)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| US8131098B2 | 2007-07-06 | 2012-03-06 | Panasonic Corporation | Image processing device, image processing method, image processing system, program, storage medium, and integrated circuit |
| CN103400362B | 2013-07-30 | 2015-11-25 | 中国人民解放军第三军医大学第三附属医院 | Accident close-range figure and Aerial Images merge the method obtaining clear scene graph mutually |
| CN108932700A | 2018-05-17 | 2018-12-04 | 常州工学院 | Self-adaption gradient gain underwater picture Enhancement Method based on target imaging model |
Legal Events

| Code | Title |
| --- | --- |
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |