CN109472749B - Edge enhancement algorithm for ultra-wide-angle image - Google Patents

Edge enhancement algorithm for ultra-wide-angle image

Info

Publication number: CN109472749B (application CN201811265228.8A; earlier publication CN109472749A)
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 向北海
Applicant and current assignee: Hunan Upixels Technology Co ltd
Legal status: Active (granted)

Classifications

    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL > G06T5/00 Image enhancement or restoration > G06T5/73 Deblurring; Sharpening
    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL > G06T5/00 Image enhancement or restoration > G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction

Abstract

The invention provides an edge enhancement algorithm for an ultra-wide-angle image, comprising the following steps: acquiring an ultra-wide-angle image; constructing an edge enhancement model of the ultra-wide-angle image and solving the centre coordinates and radius of its circular image region; transforming the ultra-wide-angle image from RGB space to YUV space and calculating the mean of the Y component at each pixel; selecting from the ultra-wide-angle image a buffer region Ω1 formed by the pixels whose distance to the circle centre is greater than 0.5r, and performing mean compensation on the pixels in Ω1; selecting a buffer region Ω2 formed by the pixels whose distance to the circle centre is greater than 0.5r and less than 0.75r, and performing fusion compensation on the pixels in Ω2; and obtaining the edge-enhanced ultra-wide-angle image. After the mean of the Y component is calculated, the edge region of the ultra-wide-angle image is enhanced by mean compensation and fusion compensation, yielding an ultra-wide-angle image of higher definition. The invention is applied to the field of image processing.

Description

Edge enhancement algorithm for ultra-wide-angle image
Technical Field
The invention relates to the technical fields of image processing, computer vision and virtual reality, and in particular to an edge enhancement algorithm for ultra-wide-angle images.
Background
With the continuous development of artificial intelligence in recent years, computer vision technology has been applied ever more widely, and many applications have run up against the limited field of view of ordinary lenses. An ultra-wide-angle lens has an extremely large field of view, capturing 189 degrees or even 220 degrees of scene information in a single shot, far beyond the field of view of an ordinary lens system and well beyond the natural observation capability of humans. It is therefore widely used in security monitoring, industrial and medical imaging, intelligent transportation and other fields.
Although its wide field of view has won the ultra-wide-angle lens broad application, the special optical characteristics of the lens leave the captured image with serious quality problems: on one hand the image as a whole is severely distorted, and on the other the definition of the edge region is markedly lower than that of the centre. Many distortion correction algorithms have been proposed for the former problem, but the latter still lacks a good solution.
Disclosure of Invention
Aiming at the problem in the prior art that the definition of the edge region of an ultra-wide-angle image is markedly lower than that of the central region, the invention aims to provide an edge enhancement algorithm for ultra-wide-angle images.
The technical scheme adopted by the invention is as follows:
an edge enhancement algorithm for ultra-wide-angle images, comprising the following steps:
S1, acquiring an ultra-wide-angle image, in which the effective information of the whole scene is concentrated in a single circular area;
S2, constructing an edge enhancement model of the ultra-wide-angle image: setting up a plane coordinate system X-O-Y with the ultra-wide-angle image lying in the XOY plane, and calculating the centre coordinates (x0, y0) and the radius r of the ultra-wide-angle image A(x, y);
S3, transforming the ultra-wide-angle image A(x, y) from RGB space to YUV space, and calculating the mean Mean(x, y) of the Y component at each pixel of A(x, y);
S4, selecting from the ultra-wide-angle image a buffer region Ω1 formed by the pixels whose distance to the circle centre is greater than 0.5r, and performing mean compensation on the pixels in Ω1 using the Mean(x, y) obtained in step S3;
S5, selecting from the ultra-wide-angle image a buffer region Ω2 formed by the pixels whose distance to the circle centre is greater than 0.5r and less than 0.75r, and performing fusion compensation on the mean-compensated pixels in Ω2;
S6, obtaining the edge-enhanced ultra-wide-angle image.
As a further improvement of the technical scheme, the centre coordinates (x0, y0) and the radius r of the ultra-wide-angle image A(x, y) are obtained as follows:
S21, converting the colour ultra-wide-angle image A(x, y) into a grey-scale image G(x, y);
S22, binarizing the grey-scale image G(x, y) to obtain a binarized image GB(x, y);
S23, calculating the centre coordinates (x0, y0) and the radius r of the ultra-wide-angle image:
x0 = (1/N)·Σx'
y0 = (1/N)·Σy'
r = √(N/π)
in the formula, N is the total number of white pixels in the binarized image GB(x, y), Σx' is the sum of the abscissas of all white pixels, and Σy' is the sum of the ordinates of all white pixels; the radius follows from the disc-area relation N = π·r².
As a further improvement of the above technical solution, the ultra-wide-angle image A(x, y) is transformed from RGB space to YUV space by the standard conversion:
Y = 0.299·R + 0.587·G + 0.114·B
U = -0.147·R - 0.289·G + 0.436·B
V = 0.615·R - 0.515·G - 0.100·B
in the formula, R, G and B denote the R, G and B components in RGB space, and Y, U and V denote the Y, U and V components in YUV space.
As a further improvement of the above technical solution, the process of calculating the mean Mean(x, y) of the Y component at each pixel of the ultra-wide-angle image A(x, y) is specifically:
S31, traversing the pixels of the ultra-wide-angle image A(x, y) and calculating the integral image GY(x, y) of the Y component:
for each pixel of A(x, y), the sum of the Y-component values of all pixels in the rectangle spanned by the top-left corner of the image and that pixel is computed; this sum is the value of the integral image at that pixel;
S32, calculating the mean Mean(x, y) of the Y component at each pixel of A(x, y) from the integral image GY(x, y):
Mean(x, y) = [GY(x + w/2, y + h/2) - GY(x - w/2, y + h/2) - GY(x + w/2, y - h/2) + GY(x - w/2, y - h/2)] / (w·h)
where w and h are the width and height of the mean window, respectively.
As a further improvement of the above technical solution, in step S4 the mean compensation of the pixels in the buffer region Ω1 is performed as:
B(x, y) = A(x, y) + α(x, y)·[A(x, y) - Mean(x, y)]
[the defining formula for the weighting coefficient α(x, y) is rendered only as an image in the original document]
in the formula, α(x, y) is a weighting coefficient and B(x, y) is the pixel value after mean compensation, where (x, y) ∈ Ω1.
As a further improvement of the above technical solution, in step S5 the fusion compensation of the pixels in the buffer region Ω2 is performed as:
C(x, y) = β(x, y)·A(x, y) + (1 - β(x, y))·B(x, y)
[the defining formula for the weighting coefficient β(x, y) is rendered only as an image in the original document]
in the formula, β(x, y) is a weighting coefficient and C(x, y) is the pixel value after fusion compensation, where (x, y) ∈ Ω2.
As a further improvement of the above technical solution, in step S6 the edge-enhanced ultra-wide-angle image is assembled piecewise (the original renders this expression as an image; D(x, y) denotes the output and d(x, y) the distance from pixel (x, y) to the circle centre):
D(x, y) = A(x, y)  if d(x, y) ≤ 0.5r
D(x, y) = C(x, y)  if 0.5r < d(x, y) < 0.75r
D(x, y) = B(x, y)  if d(x, y) ≥ 0.75r
the invention has the beneficial technical effects that:
after the mean value of the ultra-wide-angle image on the Y component is calculated, the edge part of the ultra-wide-angle image is enhanced through mean value compensation and fusion compensation, the ultra-wide-angle image with higher definition is obtained, the algorithm complexity is low, the edge enhancement effect is obvious, the definition of the edge part of the ultra-wide-angle image can be effectively improved in real time, and the defects of the existing ultra-wide-angle image processing method are effectively overcome.
Drawings
FIG. 1 is a schematic flow chart of the present embodiment;
fig. 2 is a schematic diagram of the distribution of the buffer region Ω1 in the ultra-wide-angle image;
fig. 3 is a schematic diagram of the distribution of the buffer region Ω2 in the ultra-wide-angle image.
Detailed Description
In order to facilitate the practice of the invention, further description is provided below with reference to specific examples.
An edge enhancement algorithm for an ultra-wide-angle image, as shown in fig. 1, comprises the following steps:
S1, acquiring an ultra-wide-angle image, in which the effective information of the whole scene is concentrated in a single circular area;
S2, constructing an edge enhancement model of the ultra-wide-angle image: setting up a plane coordinate system X-O-Y with the ultra-wide-angle image lying in the XOY plane, and calculating the centre coordinates (x0, y0) and the radius r of the ultra-wide-angle image A(x, y):
S21, converting the colour ultra-wide-angle image A(x, y) into a grey-scale image G(x, y);
S22, binarizing the grey-scale image G(x, y) to obtain a binarized image GB(x, y);
S23, calculating the centre coordinates (x0, y0) and the radius r of the ultra-wide-angle image:
x0 = (1/N)·Σx'
y0 = (1/N)·Σy'
r = √(N/π)
in the formula, N is the total number of white pixels in the binarized image GB(x, y), Σx' is the sum of the abscissas of all white pixels, and Σy' is the sum of the ordinates of all white pixels; the radius follows from the disc-area relation N = π·r².
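As an illustration, the centroid-and-area computation of steps S21 to S23 can be written in a few lines of NumPy. This is a minimal sketch rather than the patent's own code; it assumes GB is the binarized image as a 2-D array whose white pixels are non-zero, and it recovers r from the disc-area relation N = π·r².

import numpy as np

def circle_center_radius(GB):
    ys, xs = np.nonzero(GB)      # coordinates of all white pixels in GB(x, y)
    N = xs.size                  # N: total number of white pixels
    x0 = xs.sum() / N            # centroid abscissa: (1/N) * sum of x'
    y0 = ys.sum() / N            # centroid ordinate: (1/N) * sum of y'
    r = np.sqrt(N / np.pi)       # disc area N = pi * r^2, solved for r
    return x0, y0, r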
S3, transforming the ultra-wide-angle image A(x, y) from RGB space to YUV space, and calculating the mean Mean(x, y) of the Y component at each pixel of A(x, y).
The ultra-wide-angle image A(x, y) is transformed from RGB space to YUV space by the standard conversion:
Y = 0.299·R + 0.587·G + 0.114·B
U = -0.147·R - 0.289·G + 0.436·B
V = 0.615·R - 0.515·G - 0.100·B
in the formula, R, G and B denote the R, G and B components in RGB space, and Y, U and V denote the Y, U and V components in YUV space.
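Since only the Y component is used in the following steps, the conversion reduces to one 3 x 3 matrix product per pixel. The sketch below uses the standard conversion coefficients quoted above and assumes rgb is an H x W x 3 floating-point array in RGB channel order.

import numpy as np

# Standard RGB -> YUV conversion matrix (rows give Y, U and V).
RGB2YUV = np.array([[ 0.299,  0.587,  0.114],
                    [-0.147, -0.289,  0.436],
                    [ 0.615, -0.515, -0.100]])

def rgb_to_yuv(rgb):
    # Apply the matrix along the channel axis; Y ends up in channel 0.
    return rgb @ RGB2YUV.T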
The mean Mean(x, y) of the Y component at each pixel of the ultra-wide-angle image A(x, y) is calculated as follows:
S31, traversing the pixels of the ultra-wide-angle image A(x, y) and calculating the integral image GY(x, y) of the Y component:
for each pixel of A(x, y), the sum of the Y-component values of all pixels in the rectangle spanned by the top-left corner of the image and that pixel is computed; this sum is the value of the integral image at that pixel;
S32, calculating the mean Mean(x, y) of the Y component at each pixel of A(x, y) from the integral image GY(x, y):
Mean(x, y) = [GY(x + w/2, y + h/2) - GY(x - w/2, y + h/2) - GY(x + w/2, y - h/2) + GY(x - w/2, y - h/2)] / (w·h)
where w and h are the width and height of the mean window, respectively.
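A sketch of steps S31 and S32: the integral image GY is built with two cumulative sums, after which the window mean costs only four look-ups per pixel. The patent does not spell out its border handling, so here the window is clamped at the image border and divided by the area actually covered, which is one common convention.

import numpy as np

def box_mean(Y, w, h):
    H, W = Y.shape
    # Integral image with an extra zero row/column so that
    # GY[y, x] holds the sum of Y over the rectangle [0, y) x [0, x).
    GY = np.zeros((H + 1, W + 1))
    GY[1:, 1:] = Y.astype(np.float64).cumsum(axis=0).cumsum(axis=1)
    yy, xx = np.mgrid[0:H, 0:W]
    x1 = np.clip(xx - w // 2, 0, W); x2 = np.clip(xx + w // 2 + 1, 0, W)
    y1 = np.clip(yy - h // 2, 0, H); y2 = np.clip(yy + h // 2 + 1, 0, H)
    # Four-corner evaluation of the summed-area table.
    s = GY[y2, x2] - GY[y1, x2] - GY[y2, x1] + GY[y1, x1]
    return s / ((x2 - x1) * (y2 - y1))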
S4, referring to fig. 2, a buffer region Ω1 is selected from the ultra-wide-angle image, formed by the pixels whose distance to the circle centre is greater than 0.5r, and mean compensation is performed on the pixels in Ω1 using the Mean(x, y) obtained in step S3. The specific process is: for any pixel (x, y) in Ω1, the difference between its value A(x, y) and the local mean Mean(x, y) is computed first, and this difference, scaled by a weighting coefficient, is then added back to the pixel. Enlarging a pixel's deviation from its local mean in this way produces the edge enhancement effect.
The calculation formula of the mean compensation is:
B(x, y) = A(x, y) + α(x, y)·[A(x, y) - Mean(x, y)]
[the defining formula for the weighting coefficient α(x, y) is rendered only as an image in the original document]
in the formula, α(x, y) is a weighting coefficient and B(x, y) is the pixel value after mean compensation, where (x, y) ∈ Ω1.
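A sketch of the mean compensation of step S4. The exact formula for α(x, y) survives only as an image in the original document, so a weight is assumed here for illustration: α grows linearly from 0 at d = 0.5r to alpha_max at d = r, strengthening the unsharp-mask style boost toward the blurry rim. Both the ramp and the parameter alpha_max are assumptions.

import numpy as np

def mean_compensate(A_y, Mean_y, x0, y0, r, alpha_max=1.0):
    H, W = A_y.shape
    yy, xx = np.mgrid[0:H, 0:W]
    d = np.hypot(xx - x0, yy - y0)   # distance of each pixel to the centre
    # ASSUMED weighting: zero inside 0.5r, linear ramp to alpha_max at r.
    alpha = np.clip((d - 0.5 * r) / (0.5 * r), 0.0, 1.0) * alpha_max
    # B = A + alpha * (A - Mean): enlarge the deviation from the local mean.
    return A_y + alpha * (A_y - Mean_y)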
S5, referring to fig. 3: after the mean compensation of step S4, the edges of the image are enhanced and its sharpness improves, but a visible fault (seam) appears at the inner boundary of the buffer region Ω1. Therefore a buffer region Ω2 is selected from the ultra-wide-angle image, formed by the pixels whose distance to the circle centre is greater than 0.5r and less than 0.75r, and fusion compensation is performed on the mean-compensated pixels in Ω2: the mean-compensated image within Ω2 is fused with the source image within Ω2, which prevents the fault. The source image within Ω2 is the image before mean compensation, i.e. A(x, y). The specific process is as follows:
C(x, y) = β(x, y)·A(x, y) + (1 - β(x, y))·B(x, y)
[the defining formula for the weighting coefficient β(x, y) is rendered only as an image in the original document]
in the formula, β(x, y) is a weighting coefficient and C(x, y) is the pixel value after fusion compensation, where (x, y) ∈ Ω2.
S6, obtaining the edge-enhanced ultra-wide-angle image, assembled piecewise (the original renders this expression as an image; D(x, y) denotes the output and d(x, y) the distance from pixel (x, y) to the circle centre):
D(x, y) = A(x, y)  if d(x, y) ≤ 0.5r
D(x, y) = C(x, y)  if 0.5r < d(x, y) < 0.75r
D(x, y) = B(x, y)  if d(x, y) ≥ 0.75r
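The fusion of step S5 and the piecewise assembly of step S6 can be sketched together. β(x, y) is likewise available only as an image in the original, so it is assumed here to fall linearly from 1 at d = 0.5r to 0 at d = 0.75r; the output then passes smoothly from the source image at the inner edge of Ω2 to the mean-compensated image at its outer edge, which is exactly the seam removal the text describes.

import numpy as np

def fuse_and_assemble(A_y, B_y, x0, y0, r):
    H, W = A_y.shape
    yy, xx = np.mgrid[0:H, 0:W]
    d = np.hypot(xx - x0, yy - y0)   # distance of each pixel to the centre
    # ASSUMED fusion weight: 1 at d = 0.5r, falling linearly to 0 at 0.75r.
    beta = np.clip((0.75 * r - d) / (0.25 * r), 0.0, 1.0)
    C = beta * A_y + (1.0 - beta) * B_y          # fusion inside the ring
    # Piecewise assembly: source in the centre, blend in the ring, B outside.
    return np.where(d <= 0.5 * r, A_y,
                    np.where(d < 0.75 * r, C, B_y))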
After the mean of the ultra-wide-angle image on the Y component is calculated, the edge region of the image is enhanced by mean compensation and fusion compensation, yielding an ultra-wide-angle image of higher definition. The algorithm has low complexity and a pronounced edge enhancement effect, can improve the definition of the edge region of an ultra-wide-angle image effectively and in real time, and thus remedies the shortcomings of existing ultra-wide-angle image processing methods.
The foregoing description of the preferred embodiments sets out the features of the invention in detail, but is not intended to limit the inventive concept to the particular forms of the embodiments described; other modifications and variations within the spirit of the inventive concept are likewise protected by this patent. The scope of protection is defined by the claims rather than by the detailed description of the embodiments.

Claims (6)

1. An edge enhancement method for an ultra-wide-angle image, characterized by comprising the following steps:
S1, acquiring an ultra-wide-angle image, in which the effective information of the whole scene is concentrated in a single circular area;
S2, constructing an edge enhancement model of the ultra-wide-angle image: setting up a plane coordinate system X-O-Y with the ultra-wide-angle image lying in the XOY plane, and calculating the centre coordinates (x0, y0) and the radius r of the ultra-wide-angle image A(x, y);
S3, transforming the ultra-wide-angle image A(x, y) from RGB space to YUV space, and calculating the mean Mean(x, y) of the Y component at each pixel of A(x, y);
S4, selecting from the ultra-wide-angle image a buffer region Ω1 formed by the pixels whose distance to the circle centre is greater than 0.5r, and performing mean compensation on the pixels in Ω1 using the Mean(x, y) obtained in step S3;
S5, selecting from the ultra-wide-angle image a buffer region Ω2 formed by the pixels whose distance to the circle centre is greater than 0.5r and less than 0.75r, and performing fusion compensation on the mean-compensated pixels in Ω2;
S6, obtaining the edge-enhanced ultra-wide-angle image;
in step S2, the centre coordinates (x0, y0) and the radius r of the ultra-wide-angle image A(x, y) are obtained as follows:
S21, converting the colour ultra-wide-angle image A(x, y) into a grey-scale image G(x, y);
S22, binarizing the grey-scale image G(x, y) to obtain a binarized image GB(x, y);
S23, calculating the centre coordinates (x0, y0) and the radius r of the ultra-wide-angle image:
x0 = (1/N)·Σx'
y0 = (1/N)·Σy'
r = √(N/π)
in the formula, N is the total number of white pixels in the binarized image GB(x, y), Σx' is the sum of the abscissas of all white pixels, and Σy' is the sum of the ordinates of all white pixels.
2. The edge enhancement method for an ultra-wide-angle image according to claim 1, characterized in that in step S3 the ultra-wide-angle image A(x, y) is transformed from RGB space to YUV space by the standard conversion:
Y = 0.299·R + 0.587·G + 0.114·B
U = -0.147·R - 0.289·G + 0.436·B
V = 0.615·R - 0.515·G - 0.100·B
in the formula, R, G and B denote the R, G and B components in RGB space, and Y, U and V denote the Y, U and V components in YUV space.
3. The edge enhancement method for an ultra-wide-angle image according to claim 1, characterized in that in step S3 the mean Mean(x, y) of the Y component at each pixel of the ultra-wide-angle image A(x, y) is calculated as follows:
S31, traversing the pixels of the ultra-wide-angle image A(x, y) and calculating the integral image GY(x, y) of the Y component:
for each pixel of A(x, y), the sum of the Y-component values of all pixels in the rectangle spanned by the top-left corner of the image and that pixel is computed; this sum is the value of the integral image at that pixel;
S32, calculating the mean Mean(x, y) of the Y component at each pixel of A(x, y) from the integral image GY(x, y):
Mean(x, y) = [GY(x + w/2, y + h/2) - GY(x - w/2, y + h/2) - GY(x + w/2, y - h/2) + GY(x - w/2, y - h/2)] / (w·h)
where w and h are the width and height of the mean window, respectively.
4. The edge enhancement method for an ultra-wide-angle image according to claim 1, characterized in that in step S4 the mean compensation of the pixels in the buffer region Ω1 is performed as:
B(x, y) = A(x, y) + α(x, y)·[A(x, y) - Mean(x, y)]
[the defining formula for the weighting coefficient α(x, y) is rendered only as an image in the original document]
in the formula, α(x, y) is a weighting coefficient and B(x, y) is the pixel value after mean compensation, where (x, y) ∈ Ω1.
5. The edge enhancement method for an ultra-wide-angle image according to claim 4, characterized in that in step S5 the fusion compensation of the pixels in the buffer region Ω2 is performed as:
C(x, y) = β(x, y)·A(x, y) + (1 - β(x, y))·B(x, y)
[the defining formula for the weighting coefficient β(x, y) is rendered only as an image in the original document]
in the formula, β(x, y) is a weighting coefficient and C(x, y) is the pixel value after fusion compensation, where (x, y) ∈ Ω2.
6. The edge enhancement method for an ultra-wide-angle image according to claim 5, characterized in that in step S6 the edge-enhanced ultra-wide-angle image is obtained piecewise (the original renders this expression as an image; D(x, y) denotes the output and d(x, y) the distance from pixel (x, y) to the circle centre):
D(x, y) = A(x, y)  if d(x, y) ≤ 0.5r
D(x, y) = C(x, y)  if 0.5r < d(x, y) < 0.75r
D(x, y) = B(x, y)  if d(x, y) ≥ 0.75r
Priority Applications (1)

CN201811265228.8A, filed 2018-10-29 (priority date 2018-10-29): Edge enhancement algorithm for ultra-wide-angle image

Publications (2)

CN109472749A, published 2019-03-15
CN109472749B, granted 2021-10-22

Family

ID=65666535 (one family application: CN201811265228.8A, status Active, CN)

Citations (2)

* Cited by examiner, † Cited by third party

Patent Citations (2)

CN104756151A * (马赛网络股份有限公司), priority 2012-07-03, published 2015-07-01: System and method to enhance and process a digital image
CN106780405A * (长沙全度影像科技有限公司), priority 2017-02-28, published 2017-05-31: A fast fisheye-image edge enhancement method

Family Cites (1)

US9686440B2 * (Intel Corporation), priority 2015-01-12, granted 2017-06-20: Rendering high quality images via micro-segmentation, error diffusion, and edge enhancement

Non-Patent Citations (2)

Dan B. Goldman, "Vignette and Exposure Calibration and Compensation", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 12, December 2010, pp. 2276-2288. *
张海玲 et al., "Application of rough set theory in image enhancement" (粗糙集理论在图像增强处理中的应用), Journal of Tongji University (Natural Science), vol. 36, no. 2, February 2008, pp. 254-257. *



Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant