CN101866484A - Method for computing significance degree of pixels in image - Google Patents

Method for computing significance degree of pixels in image

Info

Publication number
CN101866484A
CN101866484A (application CN201010193853A; granted as CN101866484B)
Authority
CN
China
Prior art keywords
pixel
image
pixels
significance degree
circular areas
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010193853
Other languages
Chinese (zh)
Other versions
CN101866484B (en
Inventor
黄锐
桑农
王岳环
刘乐元
高常鑫
高峻
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN2010101938533A priority Critical patent/CN101866484B/en
Publication of CN101866484A publication Critical patent/CN101866484A/en
Application granted granted Critical
Publication of CN101866484B publication Critical patent/CN101866484B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for computing the significance degree of pixels in an image. For each pixel in the image, a group of neighborhood regions over a multi-scale range is constructed; using the L*a*b* color features, the pixels in each neighborhood region that differ in feature from the given pixel are computed; and the average, over the neighborhood regions, of the proportion of differing pixels is taken as the significance degree of the pixel. The method accounts for multi-scale computation around each pixel, is simple to compute, and is easy to implement. The resulting image significance degree distribution map has the same resolution as the original image and is more reasonable in terms of visual perception, thereby providing robust input for significance-degree-based computational vision applications.

Description

A method for computing the significance degree of pixels in an image
Technical field
The invention belongs to the field of computer vision, and specifically relates to a method for computing the significance degree of pixels in an image.
Background technology
Studies in psychology and cognitive science show that, when a person observes an image, the human visual system quickly concentrates attention on one or several salient regions before examining the rest of the image content, and these salient regions are often the regions of interest. At present, the extraction of salient regions from images has wide application in the computational vision field. For example, in content-based image retrieval (CBIR), if image matching uses features extracted from salient regions, the matching quality and final retrieval hit rate are better than with features extracted from the entire image. Other applications include adaptive image compression (e.g., JPEG2000), adaptive video browsing, target recognition, adaptive image scaling, and so on. In fact, these techniques share a common assumption: the main information content of an image is concentrated in its salient regions, so features extracted from the salient regions not only provide sufficient information for the application but also avoid the interference introduced by non-salient regions. The extraction of salient regions from images therefore has important research value.
At present, in the computational vision field, the usual approach to extracting the salient regions of an image is to first compute the significance degree of each pixel in the image, obtaining a significance degree distribution map (saliency map), and then to threshold this map to obtain a binary segmentation whose foreground is the salient region to be extracted. In this process, the quality of the significance degree distribution map strongly influences the final salient region extraction. Existing methods for computing the significance degree of pixels in an image are mainly divided into methods based on physiological models and methods based on computational models; the significance degree distribution maps obtained by these methods usually have low resolution. Recently, Achanta et al. (R. Achanta, S. Hemami, F. Estrada, et al., "Frequency-tuned salient region detection," IEEE Conf. on CVPR, 2009) proposed a simple computing method whose significance degree distribution map reaches the same resolution as the original image, but in some cases the result appears somewhat unreasonable from the standpoint of human visual perception.
Summary of the invention
The object of the present invention is to provide a method for computing the significance degree of pixels in an image, such that the image significance degree distribution map computed by this method not only has the same resolution as the original image but is also more reasonable in terms of visual perception, thereby providing robust input for significance-degree-based computational vision applications.
A method for computing the significance degree of pixels in an image, wherein for each pixel p in the image the computation proceeds as follows:
(1) Construct a group of neighborhood regions of pixel p over a multi-scale range, wherein:
the neighborhood region nearest to pixel p is a circular region R_0(p), defined as:
R_0(p) = {q | 0 ≤ ‖p − q‖_2 ≤ r_0, q ∈ Λ}
where ‖·‖_2 denotes the Euclidean distance metric, Λ denotes all pixels in the image, and r_0 is the radius of the circular region;
outside the circular region R_0(p), the neighborhood regions are k annular regions bounded by concentric circles centered at pixel p, defined as:
R_i(p) = {t | r_{i−1} ≤ ‖p − t‖_2 ≤ r_i, t ∈ Λ}, i = 1, …, k
where r_i is the radius from the outer boundary of annular region R_i(p) to p,
[formula image in source: bound on the maximum radius r_k in terms of W and H]
W is the width of the image, H is the height of the image, and the radius step between adjacent annular regions is constant;
(2) Using the L*a*b* color features of the pixels in the image, compute respectively, in each of the above neighborhood regions, the pixels that differ in feature from pixel p:
(2.1) in the circular region R_0(p), the set D_0(p) of pixels differing from pixel p is characterized by:
[formula image in source: definition of D_0(p)]
where I_p and I_q denote the L*a*b* color feature values of pixels p and q, and the remaining quantities in the omitted formula are the variances of all pixels in the image on the three channels of the L*a*b* color space;
(2.2) in each annular region R_i(p), the set D_i(p) of pixels differing from pixel p is characterized by:
[formula image in source: definition of D_i(p)]
where M_p^{i−1} (defined in the detailed description) denotes the mean feature vector of the pixels of R_{i−1}(p) similar to p, |·|_card denotes the cardinality of a set, and I_{q′} denotes the L*a*b* color feature value of pixel q′;
(3) Compute the significance degree of pixel p:
S(p) = (1/k) Σ_{i=1}^{k} |D_i(p)|_card / |R_i(p)|_card
The technical effect of the present invention is as follows. As can be seen from the definition of S(p), the computation of the significance degree of pixel p is simply the mean, over the k annular regions of p, of the proportion of pixels dissimilar to p. This definition is both simple and easy to implement: S(p) can be regarded as a filter, so computing the map amounts to filtering the image, and parallel processing can be used to accelerate the computation. The resulting image significance degree distribution map not only has the same resolution as the original image but is also more reasonable visually, thereby providing robust input for significance-degree-based computational vision applications.
Description of drawings
Fig. 1 is a flowchart of the method of the present invention;
Fig. 2 is a schematic diagram of the group of neighborhood regions over a multi-scale range in the present invention;
Fig. 3 compares the results of different methods on six examples, wherein Fig. 3(a) shows six real natural images; Fig. 3(b) shows the results of the method of L. Itti, C. Koch, E. Niebur, "A model of saliency-based visual attention for rapid scene analysis," IEEE Trans. on PAMI, 1998, 20(11): 1254-1259; Fig. 3(c) shows the results of the method of X. Hou, L. Zhang, "Saliency detection: a spectral residual approach," IEEE Conf. on CVPR, 2007; Fig. 3(d) shows the results of the method of R. Achanta, S. Hemami, F. Estrada, et al., "Frequency-tuned salient region detection," IEEE Conf. on CVPR, 2009; and Fig. 3(e) shows the results obtained by the method of the present invention.
Embodiment
The present invention is described in further detail below with reference to the accompanying drawings.
As shown in Fig. 1, the detailed process is as follows:
(1) Given a natural image I, for each pixel p in the image, first construct the group of neighborhood regions of p over a multi-scale range. The neighborhood region nearest to pixel p is a circular region R_0(p), as shown in Fig. 2, defined as:
R_0(p) = {q | 0 ≤ ‖p − q‖_2 ≤ r_0, q ∈ Λ}
where ‖·‖_2 denotes the Euclidean distance metric, Λ denotes all pixels in the image, and r_0 is the radius of the circular region.
Apart from the circular region R_0(p), the remaining neighborhood regions are a group of k annular regions bounded by concentric circles centered at pixel p, as shown in Fig. 2, defined as:
R_i(p) = {t | r_{i−1} ≤ ‖p − t‖_2 ≤ r_i, t ∈ Λ}, i = 1, …, k
where r_i is the radius from the outer boundary of annular region R_i(p) to p. In the present invention, for these k annular regions we set r_i = r_{i−1} + Δr and require the maximum radius r_k to satisfy
[formula image in source: bound on r_k in terms of the image width W and height H]
where W is the width and H is the height of image I. Under this relation, once the number of annular regions k and the radius r_0 have been set, the value of Δr is easily obtained.
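The radius schedule of step (1) can be sketched in code. This is an illustrative sketch, not the patent's implementation: the exact bound on the maximum radius r_k appears only as a formula image in the source, so the sketch assumes r_k = min(W, H) / 2, i.e. the outermost ring reaches half the smaller image dimension, and the function name `ring_radii` is a hypothetical choice.

```python
def ring_radii(width, height, k=3, r0=3):
    """Radii r_0 .. r_k of the multi-scale neighborhood regions.

    ASSUMPTION: the source shows the bound on r_k only as an image;
    here we take r_k = min(width, height) / 2.  With a constant step
    delta_r between adjacent rings, delta_r = (r_k - r_0) / k.
    """
    r_max = min(width, height) / 2.0
    delta_r = (r_max - r0) / k          # constant radius step between rings
    return [r0 + i * delta_r for i in range(k + 1)]
```

For a 640x480 image with the suggested parameters k = 3 and r_0 = 3, this gives radii [3, 82, 161, 240]: ring i occupies the band between consecutive entries.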
(2) Using the L*a*b* color features of the pixels in the image, compute, in each of the above neighborhood regions, all pixels that differ from pixel p to a certain degree.
(2.1) In the circular region R_0(p), the set D_0(p) of pixels that differ from pixel p to a certain degree is defined by the formula:
[formula image in source: definition of D_0(p)]
where I_p and I_q denote the L*a*b* color feature values of pixels p and q, ‖·‖_2 again denotes the Euclidean distance metric, and the remaining quantities in the omitted formula are the variances of all pixels of image I on the three channels of the L*a*b* color space.
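The channel variances and the dissimilarity test of step (2.1) can be sketched as below. The patent's exact criterion for D_0(p) survives only as a formula image, so the threshold used here — the square root of the summed channel variances — is an assumption for illustration, as is the helper name `d0_membership`.

```python
import math

def channel_variances(lab_pixels):
    """Per-channel variances of all image pixels in L*a*b* space.

    lab_pixels: list of (L, a, b) tuples, one per pixel of the image.
    """
    n = len(lab_pixels)
    variances = []
    for c in range(3):
        vals = [px[c] for px in lab_pixels]
        mean = sum(vals) / n
        variances.append(sum((v - mean) ** 2 for v in vals) / n)
    return variances

def d0_membership(ip, iq, variances):
    """Whether pixel q (feature iq) counts as differing from p (feature ip).

    ASSUMPTION: the Euclidean distance between the two L*a*b* vectors is
    compared against sqrt(var_L + var_a + var_b); the source shows the
    actual criterion only as an image.
    """
    return math.dist(ip, iq) > math.sqrt(sum(variances))
```

A pixel whose feature vector lies farther from I_p than the global spread of the image is then collected into D_0(p).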
(2.2) In each annular region R_i(p), the set D_i(p) of pixels that differ from pixel p to a certain degree is defined by the formula:
[formula image in source: definition of D_i(p)]
where I_{q′} denotes the L*a*b* color feature value of pixel q′ and ‖·‖_2 again denotes the Euclidean distance metric. M_p^{i−1} denotes the mean of the L*a*b* color feature vectors of all pixels in annular region R_{i−1}(p) that are similar to pixel p; it is defined as:
M_p^{i−1} = Σ_{q′ ∈ R_{i−1}(p)\D_{i−1}(p)} I_{q′} / |R_{i−1}(p)\D_{i−1}(p)|_card, or I_p if D_{i−1}(p) = R_{i−1}(p)
In this formula, R_{i−1}(p)\D_{i−1}(p) denotes the set of pixels in R_{i−1}(p) similar to pixel p, obtained as the set difference of R_{i−1}(p) and D_{i−1}(p), and |·|_card denotes the cardinality of a set. As a special case, when the set D_{i−1}(p) of pixels dissimilar to p occupies the whole of R_{i−1}(p), i.e., when D_{i−1}(p) = R_{i−1}(p), we let M_p^{i−1} = I_p, the L*a*b* color feature vector of pixel p itself.
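The mean vector M_p^{i−1}, including the fallback to I_p when every ring pixel is dissimilar, can be sketched directly from the formula above. The function name and the mask-based representation of D_{i−1}(p) are illustrative choices, not from the source.

```python
def mean_similar_feature(ring_features, dissimilar_mask, ip):
    """Mean L*a*b* vector M_p^{i-1} over the ring pixels similar to p.

    ring_features:   list of (L, a, b) vectors of the pixels in R_{i-1}(p)
    dissimilar_mask: parallel list of booleans, True where the pixel
                     belongs to D_{i-1}(p) (i.e. is dissimilar to p)
    ip:              the L*a*b* vector of p itself, returned as the
                     fallback when D_{i-1}(p) = R_{i-1}(p)
    """
    similar = [f for f, d in zip(ring_features, dissimilar_mask) if not d]
    if not similar:                 # every ring pixel dissimilar: use I_p
        return ip
    n = len(similar)
    return tuple(sum(f[c] for f in similar) / n for c in range(3))
```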
(3) Based on the above computation, the significance degree S(p) of each pixel p in the image is given by:
S(p) = (1/k) Σ_{i=1}^{k} |D_i(p)|_card / |R_i(p)|_card
where |·|_card denotes the cardinality of a set.
When applying S(p), two parameters must be set: the value of k, i.e. the number of annular regions in Fig. 2, and the radius r_0 of the central circle in Fig. 2; their concrete values are chosen according to the actual situation. A value of r_0 between 3 and 5 is suggested; in the experiments of the present invention we set k = 3 and r_0 = 3.
If S(p) is computed for every pixel in the image, a significance degree distribution map of the image is obtained. Fig. 3 shows a group of real natural images together with the significance degree distribution maps obtained by three other classical algorithms and by the computing method of the present invention. As can be seen from Fig. 3, the image significance degree distribution map obtained by the present invention (Fig. 3(e)) not only has the same resolution as the original image but is also more reasonable visually.
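Putting the steps together, a brute-force sketch of the whole computation on a small image follows. It is illustrative only: the dissimilarity threshold and the bound on r_k are assumptions (the corresponding formulas appear only as images in the source), and for brevity the comparison vector for the first ring is seeded with I_p itself rather than with the mean over R_0(p).

```python
import math

def saliency_map(lab, k=3, r0=3):
    """Per-pixel significance degree S(p), brute force.

    lab: H x W grid (list of lists) of (L, a, b) tuples.
    ASSUMPTIONS (formulas shown only as images in the source):
      - dissimilarity threshold = sqrt(var_L + var_a + var_b)
      - maximum ring radius r_k = min(W, H) / 2
      - ring 1 compares against I_p itself, not the mean over R_0(p)
    """
    H, W = len(lab), len(lab[0])
    flat = [px for row in lab for px in row]
    n = len(flat)
    means = [sum(px[c] for px in flat) / n for c in range(3)]
    thr = math.sqrt(sum(sum((px[c] - means[c]) ** 2 for px in flat) / n
                        for c in range(3)))
    radii = [r0 + i * (min(W, H) / 2.0 - r0) / k for i in range(k + 1)]

    S = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            ref = lab[y][x]                  # stands in for M_p^{i-1}
            ratios = []
            for i in range(1, k + 1):
                ring, dissim = [], []
                for yy in range(H):
                    for xx in range(W):
                        if radii[i - 1] <= math.hypot(xx - x, yy - y) <= radii[i]:
                            ring.append(lab[yy][xx])
                            dissim.append(math.dist(ref, lab[yy][xx]) > thr)
                ratios.append(sum(dissim) / len(ring) if ring else 0.0)
                # mean over the similar ring pixels feeds the next ring
                similar = [f for f, d in zip(ring, dissim) if not d]
                ref = (tuple(sum(f[c] for f in similar) / len(similar)
                             for c in range(3)) if similar else lab[y][x])
            S[y][x] = sum(ratios) / k        # mean dissimilar proportion
    return S
```

On a uniform gray 9x9 image with one strongly colored center pixel, the center receives the highest significance degree, matching the intent of the method.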
According to an exemplary embodiment of the present invention, a computer system used to implement the present invention may comprise, in particular, a central processing unit (CPU), a memory, and an input/output (I/O) interface. The computer system is generally connected through the I/O interface to a display and to various input devices such as a mouse and keyboard; support circuits may include cache memory, power supplies, clock circuits, and a communication bus. The memory may comprise random access memory (RAM), read-only memory (ROM), a disk drive, a tape drive, etc., or a combination thereof. The computer platform also comprises an operating system and microinstruction code. The various processes and functions described herein may be part of the microinstruction code executed by the operating system or part of an application program (or a combination thereof). In addition, various other peripheral devices, such as additional data storage devices and printing devices, may be connected to the computer platform.
It should also be understood that, because some of the constituent system components and method steps depicted in the accompanying drawings may be implemented in software, the actual connections between the system components (or between the process steps) may differ depending on how the present invention is programmed. Given the principles of the invention presented herein, one of ordinary skill in the related art will be able to contemplate these and similar embodiments or configurations of the present invention.

Claims (1)

1. A method for computing the significance degree of pixels in an image, wherein for each pixel p in the image the computation proceeds as follows:
(1) construct a group of neighborhood regions of pixel p over a multi-scale range, wherein:
the neighborhood region nearest to pixel p is a circular region R_0(p), defined as:
R_0(p) = {q | 0 ≤ ||p − q||_2 ≤ r_0, q ∈ Λ}
where ||·||_2 denotes the Euclidean distance metric, Λ denotes all pixels in the image, and r_0 is the radius of the circular region;
outside the circular region R_0(p), the neighborhood regions are k annular regions bounded by concentric circles centered at pixel p, defined as:
R_i(p) = {t | r_{i−1} ≤ ||p − t||_2 ≤ r_i, t ∈ Λ}, i = 1, …, k
where r_i is the radius from the outer boundary of annular region R_i(p) to p,
[formula image in source: bound on the maximum radius in terms of W and H]
W is the width of the image, H is the height of the image, and the radius step between adjacent annular regions is constant;
(2) compute respectively, in each of the above neighborhood regions, the pixels that differ from pixel p:
(2.1) in the circular region R_0(p), the set D_0(p) of pixels differing from pixel p is characterized by:
[formula image in source: definition of D_0(p)]
where I_p and I_q denote the L*a*b* color feature values of pixels p and q, and the remaining quantities in the omitted formula are the variances of all pixels in the image on the three channels of the L*a*b* color space;
(2.2) in each annular region R_i(p), the set D_i(p) of pixels differing from pixel p is characterized by:
[formula image in source: definition of D_i(p)], i = 1, …, k
where M_p^{i−1} denotes the mean L*a*b* color feature vector over the pixels of R_{i−1}(p) similar to p (or I_p if D_{i−1}(p) = R_{i−1}(p)), |·|_card denotes the cardinality of a set, and I_{q′} denotes the L*a*b* color feature value of pixel q′;
(3) compute the significance degree of pixel p: S(p) = (1/k) Σ_{i=1}^{k} |D_i(p)|_card / |R_i(p)|_card.
CN2010101938533A 2010-06-08 2010-06-08 Method for computing significance degree of pixels in image Expired - Fee Related CN101866484B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010101938533A CN101866484B (en) 2010-06-08 2010-06-08 Method for computing significance degree of pixels in image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2010101938533A CN101866484B (en) 2010-06-08 2010-06-08 Method for computing significance degree of pixels in image

Publications (2)

Publication Number Publication Date
CN101866484A true CN101866484A (en) 2010-10-20
CN101866484B CN101866484B (en) 2012-07-04

Family

ID=42958198

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010101938533A Expired - Fee Related CN101866484B (en) 2010-06-08 2010-06-08 Method for computing significance degree of pixels in image

Country Status (1)

Country Link
CN (1) CN101866484B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013044418A1 (en) * 2011-09-30 2013-04-04 Intel Corporation Human head detection in depth images
WO2015196715A1 (en) * 2014-06-24 2015-12-30 小米科技有限责任公司 Image retargeting method and device and terminal
US9665925B2 (en) 2014-06-24 2017-05-30 Xiaomi Inc. Method and terminal device for retargeting images

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1760634A2 (en) * 2002-11-18 2007-03-07 Qinetiq Limited Measurement of mitotic activity
CN101493890A (en) * 2009-02-26 2009-07-29 上海交通大学 Dynamic vision caution region extracting method based on characteristic
CN101520894A (en) * 2009-02-18 2009-09-02 上海大学 Method for extracting significant object based on region significance

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1760634A2 (en) * 2002-11-18 2007-03-07 Qinetiq Limited Measurement of mitotic activity
CN101520894A (en) * 2009-02-18 2009-09-02 上海大学 Method for extracting significant object based on region significance
CN101493890A (en) * 2009-02-26 2009-07-29 上海交通大学 Dynamic vision caution region extracting method based on characteristic

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Pattern Recognition, 2007, Qiling Tang et al., "Extraction of salient contours from cluttered scenes" *
红外与激光工程 (Infrared and Laser Engineering), 2004, Vol. 33, No. 6, Zhang Fang et al., "Research on a fast segmentation method for infrared ship targets guided by salient features" *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013044418A1 (en) * 2011-09-30 2013-04-04 Intel Corporation Human head detection in depth images
US9111131B2 2011-09-30 2015-08-18 Intel Corporation Human head detection in depth images
EP2761533A4 (en) * 2011-09-30 2016-05-11 Intel Corp Human head detection in depth images
US9996731B2 (en) 2011-09-30 2018-06-12 Intel Corporation Human head detection in depth images
WO2015196715A1 (en) * 2014-06-24 2015-12-30 小米科技有限责任公司 Image retargeting method and device and terminal
US9665925B2 (en) 2014-06-24 2017-05-30 Xiaomi Inc. Method and terminal device for retargeting images

Also Published As

Publication number Publication date
CN101866484B (en) 2012-07-04

Similar Documents

Publication Publication Date Title
US20210334524A1 (en) Gesture recognition method and terminal device and computer readable storage medium using the same
US20220092882A1 (en) Living body detection method based on facial recognition, and electronic device and storage medium
US10672140B2 (en) Video monitoring method and video monitoring system
EP3454250B1 (en) Facial image processing method and apparatus and storage medium
US11373443B2 (en) Method and appratus for face recognition and computer readable storage medium
CN102521579B (en) Method for identifying pushing action based on two-dimensional planar camera and system
KR101141643B1 (en) Apparatus and Method for caricature function in mobile terminal using basis of detection feature-point
CN108596193B (en) Method and system for building deep learning network structure aiming at human ear recognition
CN106326830A (en) Fingerprint recognition method and apparatus
Hamdan et al. Face recognition using angular radial transform
CN101577005A (en) Target tracking method and device
CN105069754A (en) System and method for carrying out unmarked augmented reality on image
CN103985136A (en) Target tracking method based on local feature point feature flow pattern
CN101976340B (en) License plate positioning method based on compressed domain
CN110991310A (en) Portrait detection method, portrait detection device, electronic equipment and computer readable medium
CN104850857A (en) Trans-camera pedestrian target matching method based on visual space significant constraints
CN101866484B (en) Method for computing significance degree of pixels in image
CN112380978A (en) Multi-face detection method, system and storage medium based on key point positioning
US20220343507A1 (en) Process of Image
Fan et al. Road vanishing point detection using weber adaptive local filter and salient‐block‐wise weighted soft voting
CN103914690A (en) Shape matching method based on projective invariant
CN113228105A (en) Image processing method and device and electronic equipment
Zhang et al. Driver eye state classification based on cooccurrence matrix of oriented gradients
CN103324753A (en) Image retrieval method based on symbiotic sparse histogram
CN104751139A (en) Fast fingerprint recognition method based on feature points of sweat glands and fingerprint images

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120704

Termination date: 20180608

CF01 Termination of patent right due to non-payment of annual fee