CN102163286B - Pornographic image evaluating method - Google Patents

Pornographic image evaluating method

Info

Publication number
CN102163286B
CN102163286B, CN201010113826A, CN102163286A
Authority
CN
China
Prior art keywords
skin
image
colour
pornographic
connected region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN 201010113826
Other languages
Chinese (zh)
Other versions
CN102163286A (en)
Inventor
胡卫明
左海强
吴偶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Renmin Zhongke Beijing Intelligent Technology Co ltd
Original Assignee
Institute of Automation of Chinese Academy of Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science filed Critical Institute of Automation of Chinese Academy of Science
Priority to CN 201010113826 priority Critical patent/CN102163286B/en
Publication of CN102163286A publication Critical patent/CN102163286A/en
Application granted granted Critical
Publication of CN102163286B publication Critical patent/CN102163286B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a pornographic image evaluation method. The method comprises the following steps: segmenting the skin color of an image; extracting a 31-dimensional feature vector composed of overall image features, human-body part features and human-body shape features; and training an evaluation model, evaluating the pornographic degree of an input image, and outputting an evaluation result. The method not only judges whether the input image is pornographic, but also rates its pornographic degree, outputting a result between 0% and 100%, where a larger value indicates a more pornographic image; this makes it convenient to filter pornographic images by class.

Description

Pornographic image evaluation method
Technical field
The present invention relates to the field of computer application technology, and in particular to a pornographic image evaluation method.
Background technology
At present, special campaigns against obscene, pornographic and vulgar information on the Internet and mobile media are being carried out nationwide and even worldwide. Because teenagers have relatively poor self-control, nearly 80% of juvenile offenders were tempted by harmful online information. To help teenage netizens avoid the corrosion of pornographic information and reduce the probability that they turn to crime, it is particularly necessary to effectively evaluate and filter the pornographic degree of images, so that such harmful information is technically intercepted before teenagers obtain it.
Most prior-art schemes adopt simple features such as the skin-color ratio and fail to consider overall image features, human-body part features and torso shape features, so their recognition rate is low. Moreover, most of them output only a binary result, pornographic or normal, and cannot measure the pornographic degree of an image.
Summary of the invention
(1) Technical problem to be solved
In view of this, the main purpose of the present invention is to solve the problems that the prior art has a low recognition rate and cannot measure the pornographic degree of an image. To this end, the invention provides a pornographic image evaluation method that can evaluate the pornographic degree of an input image and output an evaluation result.
(2) Technical scheme
To achieve the above purpose, the invention provides a pornographic image evaluation method comprising:
Step 1: segmenting the skin color of the input image to obtain skin-color regions;
Step 2: extracting, from a large number of manually labeled image samples and from the input image, 31-dimensional feature vectors composed of three classes of features: overall image features, human-body part features and torso shape features;
Step 3: training an evaluation model, evaluating the pornographic degree of the input image, and outputting a pornographic image evaluation result.
(3) Beneficial effects
It can be seen from the above technical scheme that the present invention has the following advantages:
With the present invention, one can not only judge whether an input image is pornographic, but also evaluate the pornographic degree of the image; the invention improves the recognition rate and can measure the pornographic degree of an image.
1. Because the pornographic image evaluation method provided by the invention considers overall image features, human-body part features and torso shape features, it can effectively evaluate the pornographic degree of an image.
2. Because the pornographic image evaluation method provided by the invention outputs an evaluation of the pornographic degree of the image, rather than a simple yes/no judgment, producing a result from 0% to 100% where a larger value indicates a more pornographic image, pornographic images can conveniently be filtered by class.
Description of drawings
Fig. 1 is the flow chart of the pornographic image evaluation method.
Fig. 2 shows human-body part detection and torso detection results.
Embodiment
To make the purpose, technical scheme and advantages of the present invention clearer, the invention is described in more detail below in conjunction with specific embodiments and with reference to the accompanying drawings.
The execution environment of the present invention is a computer implementing the following three modules:
1. Skin-color extraction module: its main function is to perform skin-color segmentation on the input image and obtain the skin-color regions.
2. Feature extraction module: its main function is to extract from the input image a 31-dimensional feature vector composed of three classes of features: overall image features, human-body part features and torso shape features.
3. Model training and evaluation module: its main function is to extract features from a large number of manually labeled image samples, feed these features into the evaluation model, train the model and obtain its parameters; the trained evaluation model then evaluates the input image and outputs the evaluation result.
The details of each step of the technical scheme of the present invention are explained below in conjunction with Fig. 1.
Step 101: skin-color segmentation of the input image. Any skin-color segmentation algorithm can be used. There are two main classes of such algorithms: pixel-level and region-level. Because region-based algorithms can take attributes such as the texture of skin into account, they usually achieve better segmentation results.
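The choice of segmentation algorithm is left open here. Purely as an illustration of the simpler pixel-level class, the sketch below thresholds each pixel in YCbCr space; the conversion coefficients are the standard BT.601 ones, and the Cb/Cr threshold values are common heuristics, not values taken from this patent:

```python
import numpy as np

def skin_mask_ycbcr(rgb):
    """Pixel-level skin segmentation by thresholding in YCbCr space.

    rgb: H x W x 3 uint8 array. Returns a boolean H x W skin mask.
    The Cb/Cr thresholds are illustrative heuristics only.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # BT.601 RGB -> Cb, Cr conversion
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (77 <= cb) & (cb <= 127) & (133 <= cr) & (cr <= 173)
```

A region-level segmenter would additionally group the resulting mask into connected regions and filter them by texture; that refinement is omitted here.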
Step 102: extract the 31-dimensional feature vector composed of three classes of features: overall image features, human-body part features and torso shape features.
1. Overall image features
The overall features of the image consist of the following 5 parts:
- the image aspect ratio;
- the entropy of the image;
The entropy of a grayscale image is defined as
$\mathrm{Entropy} = -\sum_{i=0}^{255} p_i \log_2 p_i$    (1)
where $p_i$ is the percentage of pixels whose gray value is $i$ among all pixels of the image.
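Equation (1) can be computed directly from the gray-level histogram; a short sketch, assuming an 8-bit grayscale image stored as a NumPy array:

```python
import numpy as np

def gray_entropy(gray):
    """Entropy of a grayscale image per Eq. (1):
    -sum_i p_i * log2(p_i), with p_i the fraction of pixels of
    gray value i. Zero-probability bins are skipped (0*log 0 = 0)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```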
- the skin-color ratio of the image;
The skin-color ratio is the number of skin-color pixels divided by the total number of pixels of the image.    (2)
- the number of skin-color connected regions;
- the geometric moments of the skin-color regions (this feature has 10 dimensions).
The (p+q)-th order geometric moment of an image I(x, y) is defined as
$M_{pq} = \sum_x \sum_y x^p y^q \, I(x, y)$    (3)
where x is the horizontal image coordinate, y the vertical image coordinate, and p, q are arbitrary nonnegative integers.
We extract the geometric moments of the skin-color region up to order 3:
order 0 (p+q=0): $M_{00}$;
order 1 (p+q=1): $M_{10}$, $M_{01}$;
order 2 (p+q=2): $M_{20}$, $M_{11}$, $M_{02}$;
order 3 (p+q=3): $M_{30}$, $M_{21}$, $M_{12}$, $M_{03}$.
These 10 geometric moments serve as the geometric-moment feature of the skin-color region: $M_{00}$, $M_{10}$, $M_{01}$, $M_{20}$, $M_{11}$, $M_{02}$, $M_{30}$, $M_{21}$, $M_{12}$, $M_{03}$.
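A sketch of Eq. (3) applied to a binary skin mask, which plays the role of I(x, y) with skin pixels equal to 1:

```python
import numpy as np

def geometric_moments(mask):
    """Raw geometric moments M_pq of a binary mask per Eq. (3),
    for all orders p+q = 0..3 (10 values), returned as a dict.
    Note: in a NumPy array the row index is y and the column is x."""
    ys, xs = np.nonzero(mask)
    x = xs.astype(np.float64)
    y = ys.astype(np.float64)
    orders = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2),
              (3, 0), (2, 1), (1, 2), (0, 3)]
    return {f"M{p}{q}": float(((x ** p) * (y ** q)).sum()) for p, q in orders}
```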
2. Human-body part features
The next 4 features concern the detection of human-body parts (see Fig. 2). We use cascade classifiers based on Haar features to detect the body parts. The 4 features are:
- the number of detected faces;
- the number of detected breasts;
- the number of detected private parts;
- the ratio of facial skin color to the total skin color of the image.
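The trained cascade detectors themselves are not reproduced here. As background only, the core of a Haar feature is a difference of rectangle sums computed in constant time from an integral image; the minimal sketch below shows that mechanism, not this method's detectors:

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero border: ii[y, x] = sum of img[:y, :x]."""
    return np.pad(img.astype(np.float64).cumsum(0).cumsum(1), ((1, 0), (1, 0)))

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the w x h rectangle with top-left corner (x, y),
    via four integral-image lookups (O(1) per rectangle)."""
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def haar_two_rect(ii, x, y, w, h):
    """A two-rectangle (left/right) Haar feature: the difference between
    the sums of two horizontally adjacent w x h halves."""
    return rect_sum(ii, x, y, w, h) - rect_sum(ii, x + w, y, w, h)
```

A cascade classifier thresholds many such features in sequence; weak stages reject obvious non-targets early.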
3. Torso shape features
The last 7 features are extracted from the largest skin-color connected region and its fitted ellipse (see Fig. 2), and mainly describe the shape of the human torso. The 7 features are:
- the ratio of the area of the largest skin-color connected region to the total skin-color area;
- the distance from the center of the largest skin-color connected region to the image center;
- the geometric invariant moments of the largest skin-color connected region (this feature has 7 dimensions);
The geometric invariant moments of an image I(x, y) are invariant to rotation, translation and scale. To compute them, first introduce the central moments $\mu_{pq}$ of the image:
$\mu_{pq} = \sum_x \sum_y (x - \bar{x})^p (y - \bar{y})^q \, I(x, y)$    (4)
where p, q are arbitrary nonnegative integers and $\bar{x} = M_{10}/M_{00}$, $\bar{y} = M_{01}/M_{00}$ is the centroid of the image region, with $M_{00}$, $M_{10}$, $M_{01}$ the image moments defined in formula (3).
The central moments of orders 0 to 3 can then be expressed as:
$\mu_{00} = M_{00}$,
$\mu_{01} = 0$,
$\mu_{10} = 0$,
$\mu_{11} = M_{11} - \bar{x} M_{01} = M_{11} - \bar{y} M_{10}$,
$\mu_{20} = M_{20} - \bar{x} M_{10}$,
$\mu_{02} = M_{02} - \bar{y} M_{01}$,
$\mu_{21} = M_{21} - 2\bar{x} M_{11} - \bar{y} M_{20} + 2\bar{x}^2 M_{01}$,    (5)
$\mu_{12} = M_{12} - 2\bar{y} M_{11} - \bar{x} M_{02} + 2\bar{y}^2 M_{10}$,
$\mu_{30} = M_{30} - 3\bar{x} M_{20} + 2\bar{x}^2 M_{10}$,
$\mu_{03} = M_{03} - 3\bar{y} M_{02} + 2\bar{y}^2 M_{01}$.
The central moments of an image are only translation invariant. For p+q ≥ 2, moments $\eta_{pq}$ with scale invariance can be constructed as
$\eta_{pq} = \dfrac{\mu_{pq}}{\mu_{00}^{(p+q)/2 + 1}}$    (6)
On this basis, we extract the following geometric invariant moments of the largest skin-color connected region as features; the features $h_1$ to $h_7$ are given by:
$h_1 = \eta_{20} + \eta_{02}$,
$h_2 = (\eta_{20} - \eta_{02})^2 + 4\eta_{11}^2$,
$h_3 = (\eta_{30} - 3\eta_{12})^2 + (3\eta_{21} - \eta_{03})^2$,
$h_4 = (\eta_{30} + \eta_{12})^2 + (\eta_{21} + \eta_{03})^2$,
$h_5 = (\eta_{30} - 3\eta_{12})(\eta_{30} + \eta_{12})[(\eta_{30} + \eta_{12})^2 - 3(\eta_{21} + \eta_{03})^2] + (3\eta_{21} - \eta_{03})(\eta_{21} + \eta_{03})[3(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2]$,
$h_6 = (\eta_{20} - \eta_{02})[(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2] + 4\eta_{11}(\eta_{30} + \eta_{12})(\eta_{21} + \eta_{03})$,    (7)
$h_7 = (3\eta_{21} - \eta_{03})(\eta_{30} + \eta_{12})[(\eta_{30} + \eta_{12})^2 - 3(\eta_{21} + \eta_{03})^2] - (\eta_{30} - 3\eta_{12})(\eta_{21} + \eta_{03})[3(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2]$.
- the circularity of the largest skin-color connected region;
We compute the circularity R of the largest skin-color connected region as
$R = \dfrac{\|F_1\|^2}{\|F_{-1}\|^2}$    (8)
where
$F_1 = \dfrac{1}{N}\left(\sum_{k=0}^{N-1}\left(x_k \cos\dfrac{2\pi k}{N} + y_k \sin\dfrac{2\pi k}{N}\right) + j\sum_{k=0}^{N-1}\left(y_k \cos\dfrac{2\pi k}{N} - x_k \sin\dfrac{2\pi k}{N}\right)\right)$    (9)
$F_{-1} = \dfrac{1}{N}\left(\sum_{k=0}^{N-1}\left(x_k \cos\dfrac{2\pi k}{N} - y_k \sin\dfrac{2\pi k}{N}\right) + j\sum_{k=0}^{N-1}\left(y_k \cos\dfrac{2\pi k}{N} + x_k \sin\dfrac{2\pi k}{N}\right)\right)$    (10)
N is the number of pixels on the contour of the largest skin-color connected region, k is an integer in 0...N-1, $(x_k, y_k)$ are the coordinates of the points on the contour, and j is the imaginary unit ($j^2 = -1$). $F_1$ and $F_{-1}$ are the first-order components of the Fourier transform of the contour of the largest skin-color connected region, computed in the counterclockwise and clockwise directions respectively. R = 0 indicates that the largest skin-color connected region is circular; R = 1 indicates that it is a straight line; R between 0 and 1 indicates that it is elliptical.
- the irregularity of the largest skin-color connected region;
The irregularity IR of the largest skin-color connected region expresses how far the region contour deviates from a circle, and is computed as
$IR = 1.0 - \dfrac{\|F_1\|^2 + \|F_{-1}\|^2}{\sigma^2}$    (11)
where $\sigma^2$ is the variance of all points on the contour of the largest skin-color connected region, given by
$\sigma^2 = \dfrac{1}{N}\sum_{k=0}^{N-1}\left(x_k^2 + y_k^2\right) - \left(\left(\dfrac{1}{N}\sum_{k=0}^{N-1} x_k\right)^2 + \left(\dfrac{1}{N}\sum_{k=0}^{N-1} y_k\right)^2\right)$    (12)
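Equations (8)-(12) operate on the N ordered boundary points of the region. A sketch follows; the contour is assumed to be given as ordered coordinate arrays, and note that in image coordinates the y axis points down, which reverses the apparent traversal direction relative to the usual mathematical convention:

```python
import numpy as np

def fourier_shape_features(xs, ys):
    """Circularity R (Eq. 8) and irregularity IR (Eq. 11) of a closed
    contour given as N ordered boundary points (x_k, y_k).

    F1 and F-1 are the first-order Fourier components of the contour
    (Eqs. 9-10); sigma^2 is the variance of the boundary points (Eq. 12).
    """
    z = np.asarray(xs, dtype=np.float64) + 1j * np.asarray(ys, dtype=np.float64)
    n = z.size
    k = np.arange(n)
    f1 = (z * np.exp(-2j * np.pi * k / n)).mean()   # frequency +1 component, Eq. (9)
    fm1 = (z * np.exp(2j * np.pi * k / n)).mean()   # frequency -1 component, Eq. (10)
    r = abs(f1) ** 2 / abs(fm1) ** 2                # Eq. (8)
    sigma2 = (abs(z) ** 2).mean() - abs(z.mean()) ** 2  # Eq. (12)
    ir = 1.0 - (abs(f1) ** 2 + abs(fm1) ** 2) / sigma2  # Eq. (11)
    return r, ir
```

For a circle traversed in the image's natural orientation all first-order energy lands in $F_{-1}$, giving R = 0 and IR = 0, consistent with the interpretation stated above.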
- the ratio of the major-axis length to the minor-axis length of the ellipse fitted to the largest skin-color connected region;
- the tilt angle of the major axis of the ellipse fitted to the largest skin-color connected region.
Step 103: train the evaluation model, evaluate the pornographic degree of the input image, and output the evaluation result.
The evaluation model is a random forest classifier used in regression mode. A random forest consists of a set of decision trees; in regression mode, the algorithm outputs the mean of the outputs of all decision trees.
First, the training image library is labeled manually: normal images are labeled 0.0 and pornographic images 1.0. Features are extracted from the labeled image samples and fed into the evaluation model; training the model yields its parameters, and the trained evaluation model automatically learns the distributions of pornographic and normal images in the 31-dimensional feature space extracted above. The trained evaluation model is then used to evaluate the input image: each decision tree in the random forest judges the input image independently and outputs a decision of 0 or 1. The final output is the mean of these decisions, i.e. an evaluation result between 0% and 100% representing the pornographic degree of the image; a larger value indicates a more pornographic image, which makes it convenient to filter pornographic images by class.
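The scoring scheme above, 0/1 labels, per-tree decisions, and a mean reported as a percentage, can be sketched with a toy forest. For self-containment the sketch uses bootstrap-aggregated depth-1 stumps in place of full decision trees; a real implementation would use deeper trees with feature subsampling (e.g. a library random forest regressor):

```python
import random

def fit_stump(X, y):
    """Best depth-1 regression stump (feature, threshold, left/right means),
    found by exhaustive search over midpoints, minimizing squared error."""
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        vals = sorted({row[f] for row in X})
        for a, b in zip(vals, vals[1:]):
            t = (a + b) / 2
            left = [yi for row, yi in zip(X, y) if row[f] <= t]
            right = [yi for row, yi in zip(X, y) if row[f] > t]
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            err = (sum((yi - lm) ** 2 for yi in left)
                   + sum((yi - rm) ** 2 for yi in right))
            if err < best_err:
                best_err, best = err, (f, t, lm, rm)
    if best is None:  # degenerate bootstrap sample: constant prediction
        m = sum(y) / len(y)
        best = (0, float("inf"), m, m)
    return best

def stump_predict(stump, x):
    f, t, lm, rm = stump
    return lm if x[f] <= t else rm

def fit_forest(X, y, n_trees=25, seed=0):
    """Bagging: each stump is trained on a bootstrap resample of the data."""
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        forest.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return forest

def score(forest, x):
    """Mean of the per-tree outputs, reported on the 0%-100% scale."""
    return 100.0 * sum(stump_predict(s, x) for s in forest) / len(forest)
```

With labels 0.0 for normal and 1.0 for pornographic samples, `score` directly yields the graded result that enables filtering by class.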
The above are only specific embodiments of the present invention, but the scope of protection of the present invention is not limited thereto. Any conversion or replacement that a person familiar with this technology could conceive within the technical scope disclosed by the present invention should be covered; therefore, the scope of protection of the present invention should be determined by the scope of the appended claims.

Claims (3)

1. A pornographic image evaluation method, characterized in that the method comprises:
Step 1: segmenting the skin color of the input image to obtain skin-color regions;
Step 2: extracting, from a large number of manually labeled image samples and from the input image, 31-dimensional feature vectors composed of three classes of features: overall image features, human-body part features and torso shape features, wherein the overall image features comprise 14 features: the image aspect ratio, the entropy of the image, the skin-color ratio of the image, the number of skin-color connected regions, and the 10-dimensional geometric moments of the skin-color regions; the human-body part features comprise 4 features: the number of detected faces, the number of detected breasts, the number of detected private parts, and the ratio of facial skin color to the total skin color of the image; the torso shape features comprise 13 features: the ratio of the area of the largest skin-color connected region to the total skin-color area, the distance from the center of the largest skin-color connected region to the image center, the 7-dimensional geometric invariant moments of the largest skin-color connected region, the circularity of the largest skin-color connected region, the irregularity of the largest skin-color connected region, the ratio of the major-axis length to the minor-axis length of the ellipse fitted to the largest skin-color connected region, and the tilt angle of the major axis of that fitted ellipse;
wherein the skin-color ratio of the image is defined as the number of skin-color pixels divided by the total number of pixels of the image;
the circularity R of the largest skin-color connected region is computed as
$R = \dfrac{\|F_1\|^2}{\|F_{-1}\|^2}$,
where
$F_1 = \dfrac{1}{N}\left(\sum_{k=0}^{N-1}\left(x_k \cos\dfrac{2\pi k}{N} + y_k \sin\dfrac{2\pi k}{N}\right) + j\sum_{k=0}^{N-1}\left(y_k \cos\dfrac{2\pi k}{N} - x_k \sin\dfrac{2\pi k}{N}\right)\right)$,
$F_{-1} = \dfrac{1}{N}\left(\sum_{k=0}^{N-1}\left(x_k \cos\dfrac{2\pi k}{N} - y_k \sin\dfrac{2\pi k}{N}\right) + j\sum_{k=0}^{N-1}\left(y_k \cos\dfrac{2\pi k}{N} + x_k \sin\dfrac{2\pi k}{N}\right)\right)$,
N is the number of pixels on the contour of the largest skin-color connected region, k is an integer in 0...N-1, $(x_k, y_k)$ are the coordinates of the points on the contour, and j is the imaginary unit ($j^2 = -1$); $F_1$ and $F_{-1}$ are the first-order components of the Fourier transform of the contour of the largest skin-color connected region, computed in the counterclockwise and clockwise directions respectively;
the irregularity IR of the largest skin-color connected region expresses how far the region contour deviates from a circle and is computed as
$IR = 1.0 - \dfrac{\|F_1\|^2 + \|F_{-1}\|^2}{\sigma^2}$,
where $\sigma^2$ is the variance of all points on the contour of the largest skin-color connected region:
$\sigma^2 = \dfrac{1}{N}\sum_{k=0}^{N-1}\left(x_k^2 + y_k^2\right) - \left(\left(\dfrac{1}{N}\sum_{k=0}^{N-1} x_k\right)^2 + \left(\dfrac{1}{N}\sum_{k=0}^{N-1} y_k\right)^2\right)$;
Step 3: training an evaluation model, evaluating the pornographic degree of the input image, and outputting a pornographic image evaluation result.
2. The pornographic image evaluation method according to claim 1, characterized in that the evaluation model uses the random forest algorithm in its regression mode.
3. The pornographic image evaluation method according to claim 1, characterized in that the evaluation result is output in the range 0% to 100%, where a larger value indicates a more pornographic image, which makes it convenient to filter pornographic images by class.
CN 201010113826 2010-02-24 2010-02-24 Pornographic image evaluating method Active CN102163286B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201010113826 CN102163286B (en) 2010-02-24 2010-02-24 Pornographic image evaluating method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201010113826 CN102163286B (en) 2010-02-24 2010-02-24 Pornographic image evaluating method

Publications (2)

Publication Number Publication Date
CN102163286A CN102163286A (en) 2011-08-24
CN102163286B true CN102163286B (en) 2013-03-20

Family

ID=44464503

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010113826 Active CN102163286B (en) 2010-02-24 2010-02-24 Pornographic image evaluating method

Country Status (1)

Country Link
CN (1) CN102163286B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103093180B (en) * 2011-10-28 2016-06-29 阿里巴巴集团控股有限公司 A kind of method and system of pornographic image detecting
CN102542304B (en) * 2012-01-12 2013-07-31 郑州金惠计算机系统工程有限公司 Region segmentation skin-color algorithm for identifying WAP (Wireless Application Protocol) mobile porn image
US9659258B2 (en) 2013-09-12 2017-05-23 International Business Machines Corporation Generating a training model based on feedback
CN103577831B (en) 2012-07-30 2016-12-21 国际商业机器公司 For the method and apparatus generating training pattern based on feedback
CN104281833B (en) * 2013-07-08 2018-12-18 深圳市腾讯计算机系统有限公司 Pornographic image recognizing method and device
CN103839076B (en) * 2014-02-25 2017-05-10 中国科学院自动化研究所 Network sensitive image identification method based on light characteristics
CN104484683B (en) * 2014-12-31 2019-08-02 小米科技有限责任公司 Yellow map chip detection method and device
CN105631015A (en) * 2015-12-31 2016-06-01 宁波领视信息科技有限公司 Intelligent multimedia player
CN111428032B (en) 2020-03-20 2024-03-29 北京小米松果电子有限公司 Content quality evaluation method and device, electronic equipment and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1704966A (en) * 2004-05-28 2005-12-07 中国科学院计算技术研究所 Method for detecting pornographic images

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6751348B2 (en) * 2001-03-29 2004-06-15 Fotonation Holdings, Llc Automated detection of pornographic images

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1704966A (en) * 2004-05-28 2005-12-07 中国科学院计算技术研究所 Method for detecting pornographic images

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yang Jinfeng et al. A novel content-based image recognition and filtering method. Journal on Communications. 2004, vol. 25, no. 07, parts 3-5. *
Wang Zhihong et al. Fund rating model selection based on random forests. Accounting and Finance. 2009, no. 01, full text. *

Also Published As

Publication number Publication date
CN102163286A (en) 2011-08-24

Similar Documents

Publication Publication Date Title
CN102163286B (en) Pornographic image evaluating method
CN105095856B (en) Face identification method is blocked based on mask
CN105844295B (en) A kind of video smoke sophisticated category method based on color model and motion feature
CN103020992B (en) A kind of video image conspicuousness detection method based on motion color-associations
CN102073841B (en) Poor video detection method and device
CN107315998B (en) Vehicle class division method and system based on lane line
CN105335716A (en) Improved UDN joint-feature extraction-based pedestrian detection method
CN105469076B (en) Face alignment verification method based on multi-instance learning
CN105528575B (en) Sky detection method based on Context Reasoning
CN104036323A (en) Vehicle detection method based on convolutional neural network
CN106127137A (en) A kind of target detection recognizer based on 3D trajectory analysis
CN109117788A (en) A kind of public transport compartment crowding detection method merging ResNet and LSTM
CN100589117C (en) Gender recognition method based on gait
CN109816040B (en) Deep learning-based urban inland inundation water depth detection method
CN109558806A (en) The detection method and system of high score Remote Sensing Imagery Change
CN105718889A (en) Human face identity recognition method based on GB(2D)2PCANet depth convolution model
CN104268528A (en) Method and device for detecting crowd gathered region
CN106339657B (en) Crop straw burning monitoring method based on monitor video, device
CN104504395A (en) Method and system for achieving classification of pedestrians and vehicles based on neural network
CN102799872B (en) Image processing method based on face image characteristics
CN103020596A (en) Method for identifying abnormal human behaviors in power production based on block model
CN103020985A (en) Video image saliency detection method based on field quantity analysis
CN103971106A (en) Multi-view human facial image gender identification method and device
CN105956552A (en) Face black list monitoring method
CN111274886A (en) Deep learning-based pedestrian red light violation analysis method and system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20191203

Address after: 250101 2F, Hanyu Jingu new media building, high tech Zone, Jinan City, Shandong Province

Patentee after: Renmin Zhongke (Shandong) Intelligent Technology Co.,Ltd.

Address before: 100080 Zhongguancun East Road, Beijing, No. 95, No.

Patentee before: Institute of Automation, Chinese Academy of Sciences

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20200310

Address after: Room 201, 2 / F, Hanyu Jingu new media building, no.7000, Jingshi Road, Jinan City, Shandong Province, 250000

Patentee after: Renmin Zhongke (Jinan) Intelligent Technology Co.,Ltd.

Address before: 250101 2F, Hanyu Jingu new media building, high tech Zone, Jinan City, Shandong Province

Patentee before: Renmin Zhongke (Shandong) Intelligent Technology Co.,Ltd.

TR01 Transfer of patent right
CP03 Change of name, title or address

Address after: 100176 1401, 14th floor, building 8, No. 8 courtyard, No. 1 KEGU street, Beijing Economic and Technological Development Zone, Daxing District, Beijing (Yizhuang group, high-end industrial area, Beijing Pilot Free Trade Zone)

Patentee after: Renmin Zhongke (Beijing) Intelligent Technology Co.,Ltd.

Address before: Room 201, 2 / F, Hangu Jinggu new media building, 7000 Jingshi Road, Jinan City, Shandong Province

Patentee before: Renmin Zhongke (Jinan) Intelligent Technology Co.,Ltd.

CP03 Change of name, title or address