CN102819850A - Method for detecting edge of color image on basis of local self-adaption color difference threshold - Google Patents

Method for detecting edge of color image on basis of local self-adaption color difference threshold

Info

Publication number
CN102819850A
CN102819850A CN2012102914785A CN201210291478A
Authority
CN
China
Prior art keywords
color difference
pixel
image
value
threshold
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012102914785A
Other languages
Chinese (zh)
Inventor
李勃
杨娴
丁文
董蓉
江登表
廖娟
陈启美
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University
Original Assignee
李勃
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 李勃
Priority to CN2012102914785A
Publication of CN102819850A
Legal status: Pending

Abstract

The invention discloses a method for detecting edges of a color image on the basis of a local self-adaptive color difference threshold, and belongs to the field of image processing and computer vision. The method comprises the following steps: firstly, establishing a weighting function for the color difference threshold that incorporates a background luminance mask and the contrast sensitivity function; determining the color difference threshold of each pixel according to its neighborhood information; if the result of the color-difference gradient operator at the pixel is greater than the threshold, judging the current pixel to be an edge point and displaying it; otherwise, setting the pixel luminance to zero. Tests show that the method takes the luminance masking effect and contrast sensitivity characteristics of the human eye into account, approximates human visual perception, and effectively detects the image edges perceived by the human eye, while avoiding the edge over-detection caused by the decision threshold in traditional algorithms and offering better noise resistance.

Description

Method for color image edge detection based on a local adaptive color difference threshold
Technical field
The present invention relates to the fields of image processing and computer vision, and specifically to a color image edge detection method that simulates the human eye's perception of image information and has good robustness to noise.
Background art
Image edges are defined as the points of discontinuity of the image function; they carry most of the characteristic information of an image and are the key to distinguishing objects from background and regions of interest from surrounding information. Existing edge detection methods mainly convert the image to be processed into a grayscale image and regard the image edge as the set of points whose gray values change abruptly within a neighborhood; classical operators include the Sobel, SUSAN, Laplace and Canny operators. Because these algorithms lack color information, they tend to miss objects whose brightness is the same as the background but whose color differs.
In recent years, edge detection of color images has gradually received attention. One widely used class of color-image edge extraction algorithms improves on the original gray-gradient edge detection operators: these methods replace the luminance difference with a color difference, compute the kernel value distance in the operator template region, compare it with a fixed threshold, and judge whether the pixel at the template center is an edge point. For example, Chen H C (Contrast-based color image segmentation, IEEE Signal Processing Letters, 2004, 11(7): 641-644) proposed converting the gray difference in the Laplace operator into a Euclidean distance computed in the Lab space; 曾俊 (Zeng Jun) et al. (Color image edge detection method using VTV denoising and color difference [J], Optik - International Journal for Light and Electron Optics, 2011, doi:10.1016/j.ijleo.2011.10.0093, and Color image SUSAN edge detection method [J], Computer Engineering and Applications, 2011, 47(15): 194-196) improved the Sobel operator and the SUSAN operator, respectively, to be based on color difference. For this type of algorithm the choice of threshold is very important. Existing algorithms set a single color difference threshold as a global threshold; this threshold selection ignores the influence of local image information on the human eye's perception of color difference, so many invisible edges are over-detected, and the robustness to noise is relatively poor.
Summary of the invention
The present invention proposes an estimation method for a local adaptive color difference threshold based on human vision, and combines it with a gradient operator based on the CIELab color difference for application to color image edge detection, which can avoid the edge over-detection problem caused by the decision threshold in classical methods.
The technical scheme adopted by the present invention is as follows:
Based on the method for the color images edge detection of local auto-adaptive aberration threshold value, detailed process is following:
Step 1: consider the influence of the value of chromatism that the image local background luminance can just be distinguished two kinds of colors in the Lab space, make up background luminance mask weighting function; According to the relation of spatial frequency and contrast perceptual threshold, the texture information of combining image makes up the contrast sensitivity weighting function to the influence of said value of chromatism; Above two kinds of weighting functions are combined the local value of chromatism factor of influence of structure, and the product of this factor of influence and human eye value of chromatism is a self-adaptation aberration perceptual threshold;
Step 2: the edge detection operator with based on gradient is the basis; Begin pointwise processing image from the pixel in the image upper left corner; Calculating with the current pixel point earlier is the nuclear value distance of the aberration gradient masterplate at center; Gray scale difference in its Central Plains gradient operator replaces with the Lab aberration, and whether visible the self-adaptation aberration perceptual threshold of using step 1 to calculate then be used as aberration threshold value; If the result of calculation of the aberration gradient operator of current pixel point, then can be judged current pixel point greater than said self-adaptation aberration perceptual threshold and be marginal point and show, otherwise pixel intensity is changed to zero.
Wherein, the detailed process of calculating self-adaptation aberration perceptual threshold is following in the step 1:
Step 11: when image is handled in pointwise, be masterplate with n * n pixel, the mean flow rate in the calculation template zone is confirmed background luminance mask weight coefficient as the brightness value of central point according to the relation of local luminance and background luminance mask weighting function;
Step 12: when image is handled in pointwise; With n * n pixel is template; The spatial frequency of the difference pixel L of computing center passage, central pixel point a passage and central pixel point b passage; Can see that according to spatial frequency and contrast the relation of threshold value confirms the contrast sensitivity mask coefficient of these three passages respectively, again according to human eye to the perception situation of these three passages coefficient weighted mean with three passages, promptly obtain local contrast sensitivity mask coefficient;
Step 13: with the average of background luminance mask weight coefficient and the local contrast sensitivity mask coefficient factor of influence as local value of chromatism, the product of this factor of influence and human eye value of chromatism is the self-adaptation aberration perceptual threshold of this n * n pixel masterplate central point.
The present invention first constructs a weighting function for the color difference threshold that incorporates the background luminance mask and the contrast sensitivity function (CSF); the color difference threshold of each pixel is determined from its neighborhood information; if the result of the color-difference gradient operator at the pixel is greater than this threshold, the current pixel is judged to be an edge point and displayed, otherwise the pixel luminance is set to zero. Experimental tests show that this method takes into account the luminance masking effect and contrast sensitivity characteristics of the human eye, approximates human visual perception, effectively detects the image edges perceivable by the human eye, avoids the edge over-detection caused by the decision threshold in traditional algorithms, and has good noise resistance.
Description of drawings
Fig. 1 is the relation between background luminance and the luminance mask weight coefficient in the present invention.
Fig. 2 is the relation between the spatial frequency of different color channels and the CSF weighting function.
Fig. 3 shows the edge detection results of different color-difference gradient operators on the peppers, lena and color chart images, where (a) is the original peppers test image, (b) is the detection result of the fixed color-difference threshold algorithm on (a), and (c) is the detection result of the adaptive color-difference threshold algorithm on (a); (d) is the original lena test image, (e) is the detection result of the fixed color-difference threshold algorithm on (d), and (f) is the detection result of the adaptive color-difference threshold algorithm on (d); (g) is the original color chart test image, (h) is the detection result of the fixed color-difference threshold algorithm on (g), and (i) is the detection result of the adaptive color-difference threshold algorithm on (g).
Fig. 4 shows the anti-noise test results of different color-difference gradient operators, where (a) is the grayscale version of the test image, (b) is the detection result of the Sobel algorithm, (c) is the result of the fixed color-difference threshold Sobel algorithm, (d) is the result of the adaptive color-difference threshold Sobel algorithm, (e) is the result of the fixed color-difference threshold SUSAN algorithm, and (f) is the result of the adaptive color-difference threshold SUSAN algorithm.
Embodiment
The present invention is described in detail below with reference to the accompanying drawings.
Step 1: Considering the influence of the local background luminance of the image on the just noticeable color difference (JNCD) between two colors in the Lab space, construct a background luminance mask weighting function. According to the relation between spatial frequency and the contrast perception threshold, and combining the influence of the image texture information on the JNCD, construct a contrast sensitivity weighting function. Combine the two weighting functions to construct a local JNCD influence factor; the product of this influence factor and the human-eye JNCD is called the adaptive color difference perception threshold (AJNCD).
Step 2: Take a gradient-based edge detection operator, such as the Sobel, SUSAN or Laplace operator, as the basis. Process the image point by point starting from the pixel at the upper-left corner; first compute the kernel value distance of the color-difference gradient template centered on the current pixel (i, j), where the gray difference in the original gradient operator is replaced by the Lab color difference, and then use the AJNCD computed in step 1 as the threshold for whether the color difference is visible. If the result of the color-difference gradient in the neighborhood of this pixel is greater than this threshold, the current pixel is judged to be an edge point and displayed; otherwise the pixel luminance is set to zero.
The detailed procedure for computing the AJNCD in step 1 is described below, taking a 5 × 5 pixel template:
Step 11: For a pixel in the image, if the mean luminance of its neighborhood differs from that of other pixels, its JNCD also differs. The relation between background luminance and the background luminance mask weight coefficient is shown in Fig. 1, and its mathematical expression is:
f_1 = \begin{cases} -0.028\,E(Y) + 3.5, & 0 \le E(Y) < 90 \\ 1.0, & 90 \le E(Y) < 150 \\ 0.015\,E(Y) + 1.0, & 150 \le E(Y) < 255 \end{cases} \qquad (1)
where E(Y) denotes the local background luminance, whose value is the mean luminance of the 5 × 5 pixel template region, and f_1 denotes the background luminance mask weight coefficient.
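For illustration only, a minimal Python sketch of the background-luminance mask weight of equation (1); the function name and the use of a 5 × 5 mean are choices made here, not language from the patent, and the piecewise coefficients are taken exactly as printed above.

```python
import numpy as np

def luminance_mask_weight(window: np.ndarray) -> float:
    """Background-luminance mask weight f1 of equation (1).

    `window` is the n x n (here 5 x 5) luminance template centred on the
    current pixel, with values in [0, 255]; E(Y) is its mean luminance.
    """
    e_y = float(np.mean(window))      # E(Y): local background luminance
    if e_y < 90:
        return -0.028 * e_y + 3.5     # dark background
    elif e_y < 150:
        return 1.0                    # mid-range background
    else:
        return 0.015 * e_y + 1.0      # bright background, as printed in (1)
```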
Step 12: In addition, the spatial frequencies of luminance and color also affect the perception characteristics of the human eye. The contrast sensitivity weighting function is computed as follows:
a) First determine the color-space angular frequencies at the current pixel (i, j). The method is as follows: first compute the gradients g(i, j) of the L, a and b channels with the Prewitt operator; then, in an m × n neighborhood window centered on (i, j), count the numbers of zero crossings of the gradient values inside the template in the horizontal and vertical directions, fp(i, j) and fq(i, j). The local horizontal and vertical spatial frequencies of pixel (i, j) are then:
f_h(i,j) = \frac{1}{m}\sum_{p=1}^{m} f_p(i,j), \qquad f_l(i,j) = \frac{1}{n}\sum_{q=1}^{n} f_q(i,j) \qquad (2)
The neighborhood spatial frequency of pixel (i, j) is:
f(i,j) = \max\big(f_h(i,j),\, f_l(i,j)\big) \qquad (3)
The conversion between the hue-angle spatial frequency and f(i, j) is as follows:
[Equation (4), relating the color-space angular frequencies to f(i, j); equation image not reproduced]
where w, u and v are the spatial angular frequencies of the L, a and b channels, in cycles per degree (cpd); d is the viewing distance, usually taken as 50 cm; and Δx denotes the pixel spacing.
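A rough Python sketch of the spatial-frequency estimate of equations (2)-(3), assuming fp and fq are zero-crossing counts of a signed gradient response along the rows and columns of the window; the helper names and the use of scipy.ndimage.prewitt are illustrative assumptions, and the conversion of equation (4) to angular frequency is omitted because that formula is not reproduced above.

```python
import numpy as np
from scipy.ndimage import prewitt

def zero_crossings(values: np.ndarray) -> int:
    """Count sign changes along a 1-D array (zero crossings of the gradient)."""
    s = np.sign(values)
    s = s[s != 0]                      # ignore exact zeros
    return int(np.sum(s[:-1] * s[1:] < 0))

def neighbourhood_spatial_frequency(grad: np.ndarray, i: int, j: int,
                                    m: int = 5, n: int = 5) -> float:
    """f(i, j) of equations (2)-(3) for one Lab channel.

    `grad` is a signed gradient response of that channel; fp / fq are the
    zero-crossing counts along each row / column of the m x n window
    centred on the (interior) pixel (i, j).
    """
    win = grad[i - m // 2:i + m // 2 + 1, j - n // 2:j + n // 2 + 1]
    f_h = np.mean([zero_crossings(row) for row in win])      # eq. (2), rows
    f_l = np.mean([zero_crossings(col) for col in win.T])    # eq. (2), columns
    return float(max(f_h, f_l))                              # eq. (3)

# Example of a signed Prewitt response for the L channel:
# grad_L = prewitt(L_channel.astype(float), axis=1)
```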
b) The present invention defines the color contrast C of the image as the ratio of the color difference ΔE_Lab between a pixel and its neighborhood points in the Lab space to the mean luminance E(Y) of the neighborhood. The contrast visibility threshold and the contrast sensitivity are reciprocals of each other, and the relation between the contrast sensitivity csf and the color-difference visibility threshold can be described by formula (5):
\mathrm{csf} = \frac{1}{C} = \frac{1}{\Delta E_{Lab}/E(Y)} = \frac{E(Y)}{\Delta E_{Lab}} \qquad (5)
The contrast sensitivity function of the Lab space can be divided into luminance contrast sensitivity, red-green contrast sensitivity and yellow-blue contrast sensitivity according to the modulation direction. The present invention adopts the contrast sensitivity function model of Mullen [8], which fits the relation between spatial frequency and the contrast visibility threshold with an exponential form; formulas (6), (7) and (8) express the luminance, red-green and yellow-blue contrast sensitivity functions CSF as functions of the color-space angular frequencies w, u and v respectively:
[Equations (6)-(8): Mullen contrast sensitivity functions for the luminance, red-green and yellow-blue channels; equation images not reproduced]
Substituting the results of a) into formulas (6), (7) and (8) gives the corresponding CSF values.
c) The contrast sensitivity weighting function f2 can be regarded as the weighted mean of the influence factors of the three color-channel CSFs (luminance, red-green and yellow-blue):
\Delta E_{Lab} = \left(\frac{E(Y)}{CSF_L(w)} + 0.5\,\frac{E(Y)}{CSF_{rg}(u)} + 0.25\,\frac{E(Y)}{CSF_{yb}(v)}\right)\Big/\,1.75, \qquad f_2 = \frac{\Delta E_{Lab}}{\mathrm{JNCD}} \quad (\mathrm{JNCD} = 3) \qquad (9)
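As a sketch only, the contrast-sensitivity weight f2 of equation (9) in Python; because the Mullen CSF fits of equations (6)-(8) are not reproduced above, the three sensitivities are passed in as precomputed values, and the function name is an assumption.

```python
def contrast_sensitivity_weight(mean_luminance: float,
                                csf_l: float, csf_rg: float, csf_yb: float,
                                jncd: float = 3.0) -> float:
    """Contrast-sensitivity weight f2 of equation (9).

    csf_l, csf_rg and csf_yb are CSF_L(w), CSF_rg(u) and CSF_yb(v), the
    luminance, red-green and yellow-blue contrast sensitivities obtained
    from equations (6)-(8) at the angular frequencies found in step a).
    """
    e_y = mean_luminance
    delta_e_lab = (e_y / csf_l
                   + 0.5 * e_y / csf_rg
                   + 0.25 * e_y / csf_yb) / 1.75   # weighted mean, eq. (9)
    return delta_e_lab / jncd                      # f2 = dE_Lab / JNCD
```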
Fig. 2 shows the relation between the spatial frequency of the image luminance and color information and the CSF weighting function, with the gray value of the background luminance set to 150. It can be seen from the figure that the larger the spatial frequency, i.e. the richer the color and texture information, the larger the color difference threshold of the corresponding region.
d) Considering the influence of the background luminance mask and the CSF on the human eye's discrimination of color differences, the influence function m of the AJNCD based on local information can be expressed as:
m\big(f_1(E(Y)),\, f_2(E(Y), w, u, v)\big) = \frac{f_1(E(Y)) + f_2(E(Y), w, u, v)}{2} \qquad (10)
The AJNCD can then be computed as:
AJNCD=m×JNCD (11)
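A minimal sketch of how equations (10)-(11) combine the two weights into the per-pixel threshold; `adaptive_jncd` is an illustrative name, and the nominal JNCD of 3 follows equation (9).

```python
def adaptive_jncd(f1: float, f2: float, jncd: float = 3.0) -> float:
    """AJNCD of equations (10)-(11) for one pixel."""
    m = (f1 + f2) / 2.0        # influence function m, eq. (10)
    return m * jncd            # AJNCD = m x JNCD,    eq. (11)

# e.g., combining the two sketches above for one 5 x 5 window `win`:
# ajncd = adaptive_jncd(luminance_mask_weight(win),
#                       contrast_sensitivity_weight(win.mean(), csf_l, csf_rg, csf_yb))
```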
The detailed procedure of said step 2 is as follows:
Taking the Sobel operator as an example, the computation formula of the color-difference-based Sobel operator is as follows:
\mathrm{cdsobel} = \sqrt{d_x^2 + d_y^2}
d_x = cd(f_{i+1,j+1}, f_{i+1,j-1}) + 2\,cd(f_{i,j+1}, f_{i,j-1}) + cd(f_{i-1,j+1}, f_{i-1,j-1})
d_y = cd(f_{i+1,j+1}, f_{i-1,j+1}) + 2\,cd(f_{i+1,j}, f_{i-1,j}) + cd(f_{i+1,j-1}, f_{i-1,j-1}) \qquad (12)
where f(i, j) denotes the pixel at (i, j) and cd denotes the Euclidean distance between two pixels in the Lab space. First compute the kernel value distance of the color-difference gradient template centered on pixel (i, j), with the gray difference in the original gradient operator replaced by the Lab color difference. The present invention computes the AJNCD of the point from the neighborhood information of (i, j); if the color difference between two points in the operator template is greater than this AJNCD, the color gradient value is kept, otherwise it is set to zero. Let the two pixels whose color difference is computed in turn be P1 and P2, and let the adaptive color difference threshold of the center point be AJNCD_K; the computation is as follows:
d_k = \big(\|p_1 - p_2\| - \mathrm{AJNCD}_K\big)\,\alpha(k), \qquad \|p_1 - p_2\| = \sqrt{(L_1 - L_2)^2 + (a_1 - a_2)^2 + (b_1 - b_2)^2} \qquad (13)
where \alpha(k) is a decision function:
\alpha(k) = \begin{cases} 1, & \|p_1 - p_2\| \ge \mathrm{AJNCD}_K \\ 0, & \|p_1 - p_2\| < \mathrm{AJNCD}_K \end{cases} \qquad (14)
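For illustration, a direct (unoptimised) Python sketch of the colour-difference Sobel of equation (12) with the per-pair gating of equations (13)-(14); following the prose above, a pairwise colour difference is kept when it reaches AJNCD_K and zeroed otherwise, which is one reading of equation (13), and the final edge decision compares the gradient magnitude with the same per-pixel threshold. The function and array names are assumptions.

```python
import numpy as np

def color_sobel_edges(lab: np.ndarray, ajncd: np.ndarray) -> np.ndarray:
    """Colour-difference Sobel edge map with an adaptive threshold.

    `lab` is an H x W x 3 CIELab image and `ajncd` an H x W map of the
    adaptive thresholds AJNCD_K from step 1.  Returns a boolean edge map.
    """
    h, w, _ = lab.shape
    edges = np.zeros((h, w), dtype=bool)

    def cd(p, q, thr):
        """Lab colour difference of two pixels, gated by AJNCD_K (eqs. 13-14)."""
        d = float(np.linalg.norm(lab[p].astype(float) - lab[q].astype(float)))
        return d if d >= thr else 0.0

    for i in range(1, h - 1):
        for j in range(1, w - 1):
            t = ajncd[i, j]
            dx = (cd((i + 1, j + 1), (i + 1, j - 1), t)
                  + 2 * cd((i, j + 1), (i, j - 1), t)
                  + cd((i - 1, j + 1), (i - 1, j - 1), t))      # eq. (12)
            dy = (cd((i + 1, j + 1), (i - 1, j + 1), t)
                  + 2 * cd((i + 1, j), (i - 1, j), t)
                  + cd((i + 1, j - 1), (i - 1, j - 1), t))      # eq. (12)
            edges[i, j] = np.hypot(dx, dy) > t                   # edge decision
    return edges
```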
For performance evaluation and comparison of the present invention with other methods, the AJNCD-based Sobel, SUSAN and Laplace operators are used to perform edge detection on a 512 × 512 peppers image, a 480 × 480 color chart image and a 500 × 281 color chart image, respectively. Fig. 3 shows the edge detection results of the color-difference gradient operators using the fixed color difference threshold and the adaptive color difference threshold. It can be seen from the results that the present invention uses an adaptive color difference threshold as the basis for judging whether a pixel is an edge point in regions that differ in luminance, texture richness and color spatial frequency, which avoids the over-detection that occurs in perceptually insensitive regions when the color difference threshold is set improperly, for example the fine edges appearing in the backlit area at the bottom of the peppers in Fig. 3(a). Moreover, the adaptive adjustment of the threshold does not affect the judgment of edges, and the edges between color blocks of similar colors can still be detected clearly and accurately.
The present invention uses a synthetic color-block image and performs edge detection with the fixed color-difference-threshold method and with the method of the present invention, respectively, under Gaussian noise with zero mean and variance 0.002, 0.004, 0.006 and 0.008, and computes the signal-to-noise ratio for each case. The SNR is calculated as in formula (15), where P_i denotes the ideal edge energy and P_n denotes the noise energy.
\mathrm{SNR} = 10\,\lg\!\left(\frac{P_i}{P_n}\right) \qquad (15)
To evaluate the algorithm performance in terms of edge accuracy, the present invention adopts the edge detection figure of merit proposed by Abdou and Pratt:
\mathrm{Pratt} = \frac{1}{\max(I_A, I_I)} \sum_{i=1}^{I_A} \frac{1}{1 + \alpha d_i^2} \times 100\% \qquad (16)
where I_A and I_I are the numbers of detected edge points and ideal edge points respectively, d_i is the perpendicular distance from a detected edge point to the ideal edge line, and α (taken as 0.1 in the experiments) is a constant used to penalize displaced edges; the larger the Pratt value, the better the detector performance, with Pratt = 100% for completely accurate detection.
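A short Python sketch of the two evaluation measures, equations (15) and (16); using a Euclidean distance transform to obtain the distance of each detected edge point to the nearest ideal edge point is one reading of d_i, and the names are illustrative.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def edge_snr(ideal_edge_energy: float, noise_energy: float) -> float:
    """SNR of equation (15): 10 * lg(Pi / Pn)."""
    return 10.0 * np.log10(ideal_edge_energy / noise_energy)

def pratt_fom(detected: np.ndarray, ideal: np.ndarray, alpha: float = 0.1) -> float:
    """Abdou-Pratt figure of merit of equation (16), in percent.

    `detected` and `ideal` are boolean edge maps; I_A and I_I are their
    edge-point counts and d_i the distance of each detected edge point
    to the nearest ideal edge point.
    """
    i_a, i_i = int(detected.sum()), int(ideal.sum())
    d = distance_transform_edt(~ideal)[detected]     # d_i for every detected point
    return float(np.sum(1.0 / (1.0 + alpha * d ** 2)) / max(i_a, i_i) * 100.0)
```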
Table 1. Comparison of edge detection performance
[Table 1: edge detection performance data; table images not reproduced]
From the data in Table 1 it can be seen that, whatever the noise variance, the present invention prevents the edges in perceptually insensitive regions from being detected by setting the AJNCD, and is therefore also superior to the fixed color-difference-threshold method in edge localization accuracy.
Fig. 4 shows the test image and the detection results of the anti-noise test of the present invention. Because the gray-value difference between some color blocks of the test image is only 2, luminance-based edge algorithms cannot detect the edges of most of the right-hand color blocks and produce considerable noise, whereas the fixed color-difference-threshold color gradient algorithm operates on color and can therefore distinguish the edges of all the color blocks.
In the weighting function of the color difference threshold, the background luminance mask factor and the contrast sensitivity factor affect the edge detection performance differently. As can be seen from Fig. 4(a), the luminance of the test-image color blocks differs; without considering the background luminance masking effect, different color blocks are affected by noise to different degrees, as shown in Fig. 4(b), 4(c) and 4(e). After the relation between image background luminance and the visual color difference threshold is taken into account, the detected noise points are distributed more uniformly across color blocks of different luminance, as shown in Fig. 4(d). The contrast sensitivity factor mainly affects the noise immunity of the edge detection operator: adding Gaussian noise is equivalent to increasing the color frequency of the image, and the fixed-threshold algorithm ignores the influence of the color spatial frequency on the visual color difference threshold, so its edge detection results are more seriously affected by noise; the contrast sensitivity factor adaptively adjusts the color difference threshold according to the spatial frequency of the color, so the method of the present invention can accurately detect the edges of all color blocks and is less affected by noise.
The present invention is an adaptive local color-difference visibility threshold estimation method based on human visual effects; it constructs a color-difference visibility threshold influence function that incorporates the background luminance mask and contrast sensitivity, replaces the luminance difference in the color gradient operator with the Lab color difference, and applies the adaptive local color-difference visibility threshold to the color-difference-based gradient operator. Experiments confirm that the present invention can effectively perceive changes in luminance and color, accurately determine image edges, suppress over-detection of visually insensitive information, and has excellent noise resistance.
The above is only an embodiment of the present invention, and the scope of the present invention should not be limited by this description. It should be appreciated by those skilled in the art that any modification or partial replacement that does not depart from the scope of the present invention falls within the scope defined by the claims of the present invention.

Claims (2)

1. A method for color image edge detection based on a local adaptive color difference threshold, the detailed procedure being as follows:
Step 1: considering the influence of the local background luminance of the image on the just-noticeable color difference between two colors in the Lab space, constructing a background luminance mask weighting function; according to the relation between spatial frequency and the contrast perception threshold, and combining the influence of the image texture information on said color difference, constructing a contrast sensitivity weighting function; combining the two weighting functions above to construct a local color-difference influence factor, the product of which with the human-eye just-noticeable color difference is the adaptive color difference perception threshold;
Step 2: taking a gradient-based edge detection operator as the basis, processing the image point by point starting from the pixel at the upper-left corner of the image; first computing the kernel value distance of the color-difference gradient template centered on the current pixel, with the gray difference in the original gradient operator replaced by the Lab color difference, and then using the adaptive color difference perception threshold computed in step 1 as the threshold for whether the color difference is visible; if the result of the color-difference gradient operator at the current pixel is greater than said adaptive color difference perception threshold, judging the current pixel to be an edge point and displaying it; otherwise setting the pixel luminance to zero.
2. The method for color image edge detection based on a local adaptive color difference threshold according to claim 1, characterized in that the detailed procedure of calculating the adaptive color difference perception threshold in said step 1 is as follows:
Step 11: when processing the image point by point, taking an n × n pixel window as the template, computing the mean luminance of the template region as the luminance value of the center point, and determining the background luminance mask weight coefficient from the relation between local luminance and the background luminance mask weighting function;
Step 12: when processing the image point by point, taking an n × n pixel window as the template; computing the spatial frequencies of the L, a and b channels of the center pixel separately; determining the contrast sensitivity mask coefficients of the three channels from the relation between spatial frequency and the contrast visibility threshold; then taking the weighted mean of the three channel coefficients according to the human eye's sensitivity to the three channels to obtain the local contrast sensitivity mask coefficient;
Step 13: taking the average of the background luminance mask weight coefficient and the local contrast sensitivity mask coefficient as the local color-difference influence factor, the product of this influence factor and the human-eye just-noticeable color difference being the adaptive color difference perception threshold of the center point of this n × n pixel template.
CN2012102914785A 2012-08-16 2012-08-16 Method for detecting edge of color image on basis of local self-adaption color difference threshold Pending CN102819850A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012102914785A CN102819850A (en) 2012-08-16 2012-08-16 Method for detecting edge of color image on basis of local self-adaption color difference threshold

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2012102914785A CN102819850A (en) 2012-08-16 2012-08-16 Method for detecting edge of color image on basis of local self-adaption color difference threshold

Publications (1)

Publication Number Publication Date
CN102819850A true CN102819850A (en) 2012-12-12

Family

ID=47303953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012102914785A Pending CN102819850A (en) 2012-08-16 2012-08-16 Method for detecting edge of color image on basis of local self-adaption color difference threshold

Country Status (1)

Country Link
CN (1) CN102819850A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103632378A (en) * 2013-12-22 2014-03-12 东北林业大学 Multi-threshold edge detection method based on point connection drawing game idea
CN104346599A (en) * 2013-07-23 2015-02-11 深圳市腾讯计算机系统有限公司 Detection method of color edge, and image processing device
CN104778729A (en) * 2014-01-09 2015-07-15 上海帝仪科技有限公司 Iris extraction method and equipment under uneven illumination condition
CN105069773A (en) * 2015-06-15 2015-11-18 上海应用技术学院 Self-adapting edge detection calculating method based on combination of mask film and canny algorithm
CN106056584A (en) * 2016-05-24 2016-10-26 努比亚技术有限公司 Foreground-background segmenting device and foreground-background segmenting method
CN106409060A (en) * 2016-10-20 2017-02-15 徐次香 Automobile driving simulator
CN106874818A (en) * 2016-08-30 2017-06-20 阿里巴巴集团控股有限公司 A kind of Digital Object Unique Identifier DOI recognition methods and device
CN108470347A (en) * 2017-02-23 2018-08-31 南宁市富久信息技术有限公司 A kind of color image edge detection method
CN112488997A (en) * 2020-11-18 2021-03-12 浙江大学 Method for detecting and evaluating color reproduction of ancient painting printed matter based on characteristic interpolation
CN113256534A (en) * 2021-06-16 2021-08-13 湖南兴芯微电子科技有限公司 Image enhancement method, device and medium
CN113920068A (en) * 2021-09-23 2022-01-11 北京医准智能科技有限公司 Body part detection method and device based on artificial intelligence and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1636755A2 (en) * 2003-06-18 2006-03-22 British Telecommunications Public Limited Company Method and system for video quality assessment
CN101621708A (en) * 2009-07-29 2010-01-06 武汉大学 Method for computing perceptible distortion of color image based on DCT field
CN101246593B (en) * 2008-03-27 2011-07-20 北京中星微电子有限公司 Color image edge detection method and apparatus

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1636755A2 (en) * 2003-06-18 2006-03-22 British Telecommunications Public Limited Company Method and system for video quality assessment
CN101246593B (en) * 2008-03-27 2011-07-20 北京中星微电子有限公司 Color image edge detection method and apparatus
CN101621708A (en) * 2009-07-29 2010-01-06 武汉大学 Method for computing perceptible distortion of color image based on DCT field

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
CHEN H C, CHEN W J, AND WANG S J: "Contrast-based color image segmentation", 《IEEE SIGNAL PROCESSING LETTERS》 *
KUO-CHENG LIU AND CHUN-HSIEN CHOU: "Perceptual Contrast Estimation for Color Edge Detection", 《IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, SIGNALS AND IMAGE PROCESSING AND 6TH EURASIP CONFERENCE FOCUSED ON SPEECH AND IMAGE PROCESSING, MULTIMEDIA COMMUNICATIONS AND SERVICES》 *
KUO-CHENG LIU: "Color-Edge Detection Based on Discrimination of Noticeable Color Contrasts", 《INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY》 *
MULLEN K T: "The contrast sensitivity of human color vision to", 《THE JOURNAL OF PHYSIOLOGY》 *
吕玮阁: "Research on perceptual contrast evaluation methods and modeling based on color appearance", 《Doctoral Dissertation, Zhejiang University》 *
曾俊, 李德华: "Color image SUSAN edge detection method", 《Computer Engineering and Applications》 *

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104346599A (en) * 2013-07-23 2015-02-11 深圳市腾讯计算机系统有限公司 Detection method of color edge, and image processing device
CN104346599B (en) * 2013-07-23 2019-09-17 深圳市腾讯计算机系统有限公司 A kind of detection method and image processing equipment of color boundary
CN103632378A (en) * 2013-12-22 2014-03-12 东北林业大学 Multi-threshold edge detection method based on point connection drawing game idea
CN104778729A (en) * 2014-01-09 2015-07-15 上海帝仪科技有限公司 Iris extraction method and equipment under uneven illumination condition
WO2015103953A1 (en) * 2014-01-09 2015-07-16 上海帝仪科技有限公司 Method and device for extracting iris image image under condition of non-uniform illumination
CN105069773A (en) * 2015-06-15 2015-11-18 上海应用技术学院 Self-adapting edge detection calculating method based on combination of mask film and canny algorithm
CN105069773B (en) * 2015-06-15 2017-12-26 上海应用技术学院 The auto-adaptable image edge detection computational methods being combined based on mask with canny
CN106056584B (en) * 2016-05-24 2019-08-02 努比亚技术有限公司 A kind of front and back scape segmenting device and method
CN106056584A (en) * 2016-05-24 2016-10-26 努比亚技术有限公司 Foreground-background segmenting device and foreground-background segmenting method
TWI696954B (en) * 2016-08-30 2020-06-21 香港商阿里巴巴集團服務有限公司 Digital object unique identifier (DOI) recognition method and device
WO2018040948A1 (en) * 2016-08-30 2018-03-08 阿里巴巴集团控股有限公司 Digital object unique identifier (doi) recognition method and device
CN106874818A (en) * 2016-08-30 2017-06-20 阿里巴巴集团控股有限公司 A kind of Digital Object Unique Identifier DOI recognition methods and device
CN106874818B (en) * 2016-08-30 2019-11-22 阿里巴巴集团控股有限公司 A kind of Digital Object Unique Identifier DOI recognition methods and device
US10664674B2 (en) 2016-08-30 2020-05-26 Alibaba Group Holding Limited Digital object unique identifier (DOI) recognition method and device
CN106409060B (en) * 2016-10-20 2022-12-06 国网浙江省电力有限公司台州供电公司 Image processing method of automobile driving simulator
CN106409060A (en) * 2016-10-20 2017-02-15 徐次香 Automobile driving simulator
CN108470347A (en) * 2017-02-23 2018-08-31 南宁市富久信息技术有限公司 A kind of color image edge detection method
CN112488997A (en) * 2020-11-18 2021-03-12 浙江大学 Method for detecting and evaluating color reproduction of ancient painting printed matter based on characteristic interpolation
CN112488997B (en) * 2020-11-18 2022-04-29 浙江大学 Method for detecting and evaluating color reproduction of ancient painting printed matter based on characteristic interpolation
CN113256534B (en) * 2021-06-16 2022-01-07 湖南兴芯微电子科技有限公司 Image enhancement method, device and medium
CN113256534A (en) * 2021-06-16 2021-08-13 湖南兴芯微电子科技有限公司 Image enhancement method, device and medium
CN113920068A (en) * 2021-09-23 2022-01-11 北京医准智能科技有限公司 Body part detection method and device based on artificial intelligence and electronic equipment
CN113920068B (en) * 2021-09-23 2022-12-30 北京医准智能科技有限公司 Body part detection method and device based on artificial intelligence and electronic equipment

Similar Documents

Publication Publication Date Title
CN102819850A (en) Method for detecting edge of color image on basis of local self-adaption color difference threshold
CN105931220B (en) Traffic haze visibility detecting method based on dark channel prior Yu minimum image entropy
CN106056155B (en) Superpixel segmentation method based on boundary information fusion
CN102883175B (en) Methods for extracting depth map, judging video scene change and optimizing edge of depth map
CN102800082B (en) No-reference image definition detection method
CN109886960A (en) The method of glass edge defects detection based on machine vision
CN104318266B (en) A kind of image intelligent analyzes and processes method for early warning
CN103345755A (en) Chessboard angular point sub-pixel extraction method based on Harris operator
CN102385753A (en) Illumination-classification-based adaptive image segmentation method
CN102938057B (en) A kind of method for eliminating vehicle shadow and device
CN104021527B (en) Rain and snow removal method in image
CN101615241B (en) Method for screening certificate photos
CN106408526B (en) A kind of visibility detecting method based on multilayer polar plot
CN107578399B (en) Full-reference image quality evaluation method based on boundary feature segmentation
CN108470340A (en) A kind of improved Sobel edge detection algorithms
CN105488475A (en) Method for detecting human face in mobile phone
CN104598914A (en) Skin color detecting method and device
CN105787912A (en) Classification-based step type edge sub pixel localization method
CN102509414B (en) Smog detection method based on computer vision
CN102830045A (en) Fabric spray rating objective evaluating method based on image processing
CN106600615A (en) Image edge detection algorithm evaluation system and method
CN110807406A (en) Foggy day detection method and device
CN106157301A (en) A kind of threshold value for Image Edge-Detection is from determining method and device
Shiting et al. Clustering-based shadow edge detection in a single color image
CN101739678B (en) Method for detecting shadow of object

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
ASS Succession or assignment of patent right

Owner name: NANJING UNIVERSITY

Free format text: FORMER OWNER: LI BO

Effective date: 20121206

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20121206

Address after: No. 22, Hankou Road, Jiangsu 210093, China

Applicant after: Nanjing University

Address before: No. 22, Hankou Road, Jiangsu 210093, China

Applicant before: Li Bo

C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20121212