CN102163286A - Pornographic image evaluating method - Google Patents

Pornographic image evaluating method Download PDF

Info

Publication number
CN102163286A
Authority
CN
China
Prior art keywords
image
pornographic
skin color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010113826
Other languages
Chinese (zh)
Other versions
CN102163286B (en)
Inventor
Hu Weiming (胡卫明)
Zuo Haiqiang (左海强)
Wu Ou (吴偶)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Renmin Zhongke Beijing Intelligent Technology Co ltd
Original Assignee
Institute of Automation of Chinese Academy of Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science filed Critical Institute of Automation of Chinese Academy of Science
Priority to CN 201010113826 priority Critical patent/CN102163286B/en
Publication of CN102163286A publication Critical patent/CN102163286A/en
Application granted granted Critical
Publication of CN102163286B publication Critical patent/CN102163286B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a pornographic image evaluating method. The method comprises the following steps: segmenting the skin color of an image; extracting 31-dimensional features, namely overall features of the image, local part features of the human body and shape features of the human torso, to form feature vectors; and training an evaluation model, evaluating the pornographic degree of an input image, and outputting an evaluation result. The method not only judges whether an input image is pornographic, but also evaluates its pornographic degree, outputting an evaluation result between 0% and 100%, where a larger value indicates a higher pornographic degree, so that pornographic images can be conveniently filtered by grade.

Description

Pornographic image evaluation method
Technical Field
The invention relates to the technical field of computer application, in particular to a pornographic image evaluation method.
Background
Currently, special campaigns against obscene and pornographic information on Internet and mobile phone media are being carried out nationwide and even worldwide. Because of the poor self-control of teenagers, in nearly 80% of juvenile crime cases the offenders were tempted by harmful information on the network. To help teenagers avoid the erosion of pornographic information and reduce the possibility that they embark on a criminal path, it is especially necessary to effectively evaluate and filter the pornographic degree of images and to intercept such harmful information technically before teenagers can access it.
In the prior art, mostly simple features such as the skin color proportion are adopted, and the overall features of the image, the local part features of the human body and the shape features of the human torso are not comprehensively considered. As a result, the recognition rate is low, most methods output only a binary pornographic/normal decision, and the pornographic degree of the image cannot be measured.
Disclosure of Invention
(I) Technical problem to be solved
In view of the above, the present invention is directed to solving the problems of the prior art that the recognition rate is low and the pornographic degree of an image cannot be measured. The present invention therefore provides a pornographic image evaluation method capable of evaluating the pornographic degree of an input image and outputting an evaluation result.
(II) Technical solution
In order to achieve the above object, the present invention provides a pornographic image evaluating method comprising:
step 1: segmenting the skin color of an input image to obtain a skin color area;
step 2: extracting 31-dimensional features including the overall features of an input image, the local part features of a human body and the shape features of the human body on a large number of artificially marked image samples to form feature vectors;
step 3: training an evaluation model, evaluating the pornographic degree of the input image, and outputting a pornographic image evaluation result.
(III) Advantageous effects
According to the technical scheme, the invention has the following advantages:
The invention not only judges whether an input image is a pornographic image, but also evaluates its pornographic degree, improving the recognition rate and providing a measure of the pornographic degree of the image.
1. Because the pornographic image evaluation method provided by the invention comprehensively considers the overall features of the image, the local part features of the human body and the shape features of the human torso, it can effectively evaluate the pornographic degree of an image.
2. The pornographic image evaluation method provided by the invention outputs an evaluation of the pornographic degree of the image rather than a simple pornographic/normal judgment: the evaluation result ranges from 0% to 100%, and a larger value indicates a higher pornographic degree, so that pornographic images can be conveniently filtered by grade.
Drawings
Fig. 1 is a flowchart of a pornographic image evaluation method.
FIG. 2 shows the detection results for local parts and the torso of the human body.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.
The execution environment of the invention consists of three modules, all implemented on a computer:
(1) The skin color extraction module: its main function is to perform skin color segmentation on the input image to obtain the skin color regions.
(2) The feature extraction module: its main function is to extract 31-dimensional features from the input image, including overall image features, local human body part features and human torso shape features, to form feature vectors.
(3) The model training and evaluation module: its main function is to extract features from a large number of manually labeled image samples, input them into the evaluation model, and train the evaluation model to obtain its parameters. The trained evaluation model is then used to evaluate an input image and output the evaluation result.
The details of each step involved in the solution of the invention are described below with reference to FIG. 1.
Step 101: segment the skin color of the input image. Any skin color segmentation algorithm may be employed. Skin color segmentation algorithms fall mainly into two types: pixel-level and region-level. Region-based skin color segmentation algorithms can take into account additional attributes of skin, such as texture, and therefore generally achieve better segmentation results.
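As an illustration of the pixel-level variety, the following is a minimal sketch using the well-known RGB skin heuristic of Peer et al.; this particular rule is an assumption for illustration, not the rule prescribed by the patent.

```python
import numpy as np

def skin_mask(rgb):
    """Pixel-level skin segmentation using a common RGB heuristic.

    rgb: H x W x 3 uint8 array in RGB channel order.
    Returns a boolean H x W mask.
    NOTE: the rule (R>95, G>40, B>20, max-min>15, |R-G|>15, R>G, R>B)
    is a standard illustrative heuristic, not the patent's method.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mx = np.maximum(np.maximum(r, g), b)
    mn = np.minimum(np.minimum(r, g), b)
    return ((r > 95) & (g > 40) & (b > 20) &
            (mx - mn > 15) & (np.abs(r - g) > 15) &
            (r > g) & (r > b))
```

A region-level method would post-process this mask, e.g. by connected-component analysis and texture checks, which is why the text prefers region-based segmentation.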
Step 102: extract the 31-dimensional features, including the overall image features, the local human body part features and the human torso shape features, to form feature vectors.
First, overall features of the image
The overall features of the image consist of 5 parts:
(1) image aspect ratio;
(2) entropy of the image;
The entropy of a grayscale image is defined as:
Entropy = -Σ_{i=0}^{255} p_i · log2(p_i)    (1)
where p_i is the proportion of pixels with gray value i among all pixels of the image.
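Equation (1) can be computed directly from the grayscale histogram; a minimal sketch (the function name and use of NumPy are our own):

```python
import numpy as np

def image_entropy(gray):
    """Entropy of a grayscale image per equation (1).

    gray: 2-D array of integer gray values in [0, 255].
    """
    hist = np.bincount(gray.ravel(), minlength=256)
    p = hist / hist.sum()       # p_i: fraction of pixels with gray value i
    p = p[p > 0]                # the term for p_i = 0 is taken as 0
    return float(-(p * np.log2(p)).sum())
```

A flat, single-valued image has entropy 0, while an image whose pixels are spread evenly over all 256 gray values reaches the maximum of 8 bits.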
(3) the skin color proportion of the image;
The skin color proportion is the ratio of the number of skin color pixels to the total number of pixels in the image:
Skin ratio = N_skin / N_total    (2)
(4) the number of skin color connected regions;
(5) the geometric moments of the skin color region (this feature has 10 dimensions);
The (p+q)-th order geometric moment of an image I(x, y) is defined as:
M_pq = Σ_x Σ_y x^p · y^q · I(x, y)    (3)
where x is the horizontal coordinate of the image, y is the vertical coordinate, and p and q are arbitrary non-negative integers.
We extract the geometric moments of the skin color region up to order 3:
0th-order moment (p + q = 0): M00;
1st-order moments (p + q = 1): M10, M01;
2nd-order moments (p + q = 2): M20, M11, M02;
3rd-order moments (p + q = 3): M30, M21, M12, M03.
These 10 geometric moments form the geometric moment feature of the skin color region: M00, M10, M01, M20, M11, M02, M30, M21, M12, M03.
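The 10-dimensional geometric moment feature can be sketched as follows, taking x as the horizontal (column) and y as the vertical (row) coordinate as in equation (3); the function name is ours and the skin mask is assumed binary:

```python
import numpy as np

def geometric_moments(mask):
    """10-dimensional geometric moment feature of equation (3).

    mask: 2-D array, nonzero on the skin color region.
    Returns [M00, M10, M01, M20, M11, M02, M30, M21, M12, M03].
    """
    I = (np.asarray(mask) != 0).astype(float)
    ys, xs = np.mgrid[0:I.shape[0], 0:I.shape[1]]  # y = row, x = column
    orders = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2),
              (3, 0), (2, 1), (1, 2), (0, 3)]
    return [float((xs ** p * ys ** q * I).sum()) for p, q in orders]
```

For a binary mask, M00 is simply the area of the region and (M10/M00, M01/M00) is its centroid, which the later equations reuse.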
Second, local part features of the human body
The next 4 features are related to the detection of local parts of the human body (see FIG. 2). We use cascade classifiers based on Haar features to detect the local parts of the human body. The 4 features are as follows:
(1) the number of detected faces;
(2) the number of detected breasts;
(3) the number of detected private parts;
(4) the proportion of face skin color in the total skin color of the image;
Third, shape features of the human torso
The last 7 features are extracted from the maximum skin color connected region and its fitted ellipse (see FIG. 2) and mainly describe the shape of the human torso. They are as follows:
(1) the ratio of the area of the maximum skin color connected region to the total skin color area;
(2) the distance from the center of the maximum skin color connected region to the center of the image;
(3) the geometric invariant moments of the maximum skin color connected region (this feature has 7 dimensions);
The geometric invariant moments of an image I(x, y) are invariant to rotation, translation, scale, etc. To calculate them, the central moment μ_pq of the image is introduced first:
μ_pq = Σ_x Σ_y (x - x̄)^p · (y - ȳ)^q · I(x, y)    (4)
where p, q are arbitrary non-negative integers and (x̄, ȳ) = (M10/M00, M01/M00) is the centroid of the image region, with M00, M10, M01 the geometric moments defined in equation (3).
The central moments of order 0 to 3 can be expressed as:
μ00 = M00,
μ01 = 0,
μ10 = 0,
μ11 = M11 - x̄·M01 = M11 - ȳ·M10,
μ20 = M20 - x̄·M10,
μ02 = M02 - ȳ·M01,
μ21 = M21 - 2x̄·M11 - ȳ·M20 + 2x̄²·M01,    (5)
μ12 = M12 - 2ȳ·M11 - x̄·M02 + 2ȳ²·M10,
μ30 = M30 - 3x̄·M20 + 2x̄²·M10,
μ03 = M03 - 3ȳ·M02 + 2ȳ²·M01.
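The identities in (5) can be checked numerically against the direct definition (4); a short sketch (names ours):

```python
import numpy as np

def central_moment(I, p, q):
    """Central moment mu_pq of equation (4), computed directly."""
    I = np.asarray(I, dtype=float)
    ys, xs = np.mgrid[0:I.shape[0], 0:I.shape[1]]  # y = row, x = column
    m00 = I.sum()
    xbar = (xs * I).sum() / m00   # x-bar = M10 / M00
    ybar = (ys * I).sum() / m00   # y-bar = M01 / M00
    return float(((xs - xbar) ** p * (ys - ybar) ** q * I).sum())
```

For example, computing mu20 directly and via M20 - x̄·M10 from (5) gives the same value, confirming that the central moments need only the raw moments of equation (3).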
The central moments of an image are only translation invariant. For p + q ≥ 2, moments η_pq with scale invariance can be constructed using the following formula:
η_pq = μ_pq / μ00^((p+q)/2 + 1)    (6)
On this basis, the following geometric invariant moments are extracted as features from the maximum skin color connected region of the image; features h1 through h7 are given by:
h1 = η20 + η02,
h2 = (η20 - η02)² + (2η11)²,
h3 = (η30 - 3η12)² + (3η21 - η03)²,
h4 = (η30 + η12)² + (η21 + η03)²,
h5 = (η30 - 3η12)(η30 + η12)[(η30 + η12)² - 3(η21 + η03)²] + (3η21 - η03)(η21 + η03)[3(η30 + η12)² - (η21 + η03)²],
h6 = (η20 - η02)[(η30 + η12)² - (η21 + η03)²] + 4η11(η30 + η12)(η21 + η03),    (7)
h7 = (3η21 - η03)(η30 + η12)[(η30 + η12)² - 3(η21 + η03)²] - (η30 - 3η12)(η21 + η03)[3(η30 + η12)² - (η21 + η03)²].
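The quantities above correspond to the classical Hu invariant moments. A sketch combining equations (6) and (7), which also allows rotation invariance to be verified numerically (a 90° rotation of the region leaves h1..h7 unchanged up to floating-point error; names are ours):

```python
import numpy as np

def hu_invariants(I):
    """h1..h7 of equation (7) from normalized moments eta_pq of equation (6)."""
    I = np.asarray(I, dtype=float)
    ys, xs = np.mgrid[0:I.shape[0], 0:I.shape[1]]
    m00 = I.sum()
    xb, yb = (xs * I).sum() / m00, (ys * I).sum() / m00

    def eta(p, q):  # equation (6): mu_pq / mu00^((p+q)/2 + 1)
        mu = ((xs - xb) ** p * (ys - yb) ** q * I).sum()
        return mu / m00 ** ((p + q) / 2 + 1)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    h1 = n20 + n02
    h2 = (n20 - n02) ** 2 + (2 * n11) ** 2
    h3 = (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2
    h4 = (n30 + n12) ** 2 + (n21 + n03) ** 2
    h5 = ((n30 - 3 * n12) * (n30 + n12) * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          + (3 * n21 - n03) * (n21 + n03) * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    h6 = ((n20 - n02) * ((n30 + n12) ** 2 - (n21 + n03) ** 2)
          + 4 * n11 * (n30 + n12) * (n21 + n03))
    h7 = ((3 * n21 - n03) * (n30 + n12) * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          - (n30 - 3 * n12) * (n21 + n03) * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    return np.array([h1, h2, h3, h4, h5, h6, h7])
```

Because a torso may appear at any position, scale and orientation in an image, these invariants describe its shape independently of such variations.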
(4) the circularity of the maximum skin color connected region;
We calculate the circularity R of the maximum skin color connected region using the following formula:
R = ‖F_1‖² / ‖F_{-1}‖²    (8)
where:
F_1 = (1/N) · [ Σ_{k=0}^{N-1} (x_k·cos(2πk/N) + y_k·sin(2πk/N)) + j · Σ_{k=0}^{N-1} (y_k·cos(2πk/N) - x_k·sin(2πk/N)) ]    (9)
F_{-1} = (1/N) · [ Σ_{k=0}^{N-1} (x_k·cos(2πk/N) - y_k·sin(2πk/N)) + j · Σ_{k=0}^{N-1} (y_k·cos(2πk/N) + x_k·sin(2πk/N)) ]    (10)
N is the number of pixel points on the contour of the maximum skin color connected region, k is an integer from 0 to N-1, (x_k, y_k) are the coordinates of the k-th point on the contour, and j is the imaginary unit (j² = -1). F_1 and F_{-1} are the first-order components of the Fourier transform of the contour of the maximum skin color connected region, computed in the counterclockwise and clockwise directions respectively. When R = 0, the maximum skin color connected region is a circle; when R = 1, it degenerates to a straight line; when R is between 0 and 1, it is an ellipse.
(5) the irregularity of the maximum skin color connected region;
The irregularity IR of the maximum skin color connected region represents the degree to which the region contour deviates from a circle, and is calculated using the following formula:
IR = 1.0 - (‖F_1‖² + ‖F_{-1}‖²) / σ²    (11)
where σ² is the variance of all points on the contour of the maximum skin color connected region, given by:
σ² = (1/N) · Σ_{k=0}^{N-1} (x_k² + y_k²) - [ ((1/N) · Σ_{k=0}^{N-1} x_k)² + ((1/N) · Σ_{k=0}^{N-1} y_k)² ]    (12)
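Equations (8)-(12) can be exercised on a synthetic contour; a sketch with names of our own. One caveat, flagged as an assumption: which of F_1, F_{-1} carries the energy depends on the direction in which the contour is traced, so the contour must be traversed in the direction that yields R = 0 for a circle, as the text requires.

```python
import numpy as np

def fourier_shape_features(x, y):
    """Circularity R (eq. 8) and irregularity IR (eq. 11) of a closed contour.

    x, y: coordinates of the N contour points (x_k, y_k), k = 0..N-1.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    N = len(x)
    t = 2 * np.pi * np.arange(N) / N
    c, s = np.cos(t), np.sin(t)
    # Equations (9) and (10): first-order Fourier components of the contour.
    F1 = ((x * c + y * s).sum() + 1j * (y * c - x * s).sum()) / N
    Fm1 = ((x * c - y * s).sum() + 1j * (y * c + x * s).sum()) / N
    R = abs(F1) ** 2 / abs(Fm1) ** 2                   # eq. (8)
    var = (x ** 2 + y ** 2).mean() - (x.mean() ** 2 + y.mean() ** 2)  # eq. (12)
    IR = 1.0 - (abs(F1) ** 2 + abs(Fm1) ** 2) / var    # eq. (11)
    return R, IR
```

On a circular contour this gives R ≈ 0 and IR ≈ 0, while a degenerate contour collapsed onto a line segment gives R ≈ 1, matching the interpretation in the text.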
(6) the ratio of the major axis to the minor axis of the ellipse fitted to the maximum skin color connected region;
(7) the inclination angle of the major axis of the ellipse fitted to the maximum skin color connected region;
Step 103: train the evaluation model, evaluate the pornographic degree of the input image and output the evaluation result.
The evaluation model is a random forest classifier in regression mode. A random forest consists of a set of decision trees; in regression mode, the output of the random forest algorithm is the average of the outputs of all decision trees.
First, a training image library is labeled manually: a normal image is labeled 0.0 and a pornographic image is labeled 1.0. Features are extracted from the labeled image library samples and input into the evaluation model, which is trained to obtain the model parameters; through training, the evaluation model automatically learns the distributions of pornographic images and normal images in the extracted 31-dimensional feature space. The trained evaluation model is then used to evaluate an input image: each decision tree in the random forest classifier judges the input image independently and outputs a decision of 0 or 1. The final output is the average of these decisions, i.e. an evaluation result between 0% and 100% representing the pornographic degree of the image; the larger the value, the higher the pornographic degree, which facilitates graded filtering of pornographic images.
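At evaluation time, the regression-mode averaging described above reduces to averaging the per-tree binary decisions; a minimal sketch (the function name is ours, and in practice a library random forest implementation, e.g. scikit-learn's, would supply the trained trees):

```python
def pornographic_score(tree_decisions):
    """Average the 0/1 decisions of the individual trees (regression mode).

    tree_decisions: iterable of 0/1 outputs, one per decision tree in the
    trained random forest. Returns the evaluation result in percent
    (0-100); larger values mean a higher pornographic degree.
    """
    decisions = list(tree_decisions)
    return 100.0 * sum(decisions) / len(decisions)
```

For example, if 3 of 4 trees vote 1, the score is 75%, which can then be compared against per-grade thresholds to filter images by pornographic degree.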
The above description covers only one embodiment of the present invention, but the scope of the present invention is not limited thereto. Any change or substitution that a person skilled in the art can readily conceive of within the technical scope disclosed by the present invention shall fall within the scope of the present invention, which is therefore defined by the claims.

Claims (3)

1. A pornographic image evaluation method, comprising:
step 1: segmenting the skin color of an input image to obtain a skin color area;
step 2: extracting 31-dimensional features including the overall features of an input image, the local part features of a human body and the shape features of the human body on a large number of artificially marked image samples to form feature vectors;
step 3: training an evaluation model, evaluating the pornographic degree of the input image, and outputting a pornographic image evaluation result.
2. The pornographic image evaluation method according to claim 1, wherein the evaluation model adopts a random forest algorithm in its regression mode.
3. The pornographic image evaluation method according to claim 1, wherein the output evaluation result ranges from 0% to 100%, a larger value indicating a higher pornographic degree of the image, which facilitates graded filtering of pornographic images.
CN 201010113826 2010-02-24 2010-02-24 Pornographic image evaluating method Active CN102163286B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201010113826 CN102163286B (en) 2010-02-24 2010-02-24 Pornographic image evaluating method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201010113826 CN102163286B (en) 2010-02-24 2010-02-24 Pornographic image evaluating method

Publications (2)

Publication Number Publication Date
CN102163286A true CN102163286A (en) 2011-08-24
CN102163286B CN102163286B (en) 2013-03-20

Family

ID=44464503

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010113826 Active CN102163286B (en) 2010-02-24 2010-02-24 Pornographic image evaluating method

Country Status (1)

Country Link
CN (1) CN102163286B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102542304A (en) * 2012-01-12 2012-07-04 郑州金惠计算机系统工程有限公司 Region segmentation skin-color algorithm for identifying WAP (Wireless Application Protocol) mobile porn image
CN103093180A (en) * 2011-10-28 2013-05-08 阿里巴巴集团控股有限公司 Method and system for detecting pornography images
CN103839076A (en) * 2014-02-25 2014-06-04 中国科学院自动化研究所 Network sensitive image identification method based on light characteristics
CN104281833A (en) * 2013-07-08 2015-01-14 深圳市腾讯计算机系统有限公司 Method and device for recognizing pornographic images
CN104484683A (en) * 2014-12-31 2015-04-01 小米科技有限责任公司 Porn picture detection method and device
CN105631015A (en) * 2015-12-31 2016-06-01 宁波领视信息科技有限公司 Intelligent multimedia player
CN103577831B (en) * 2012-07-30 2016-12-21 国际商业机器公司 For the method and apparatus generating training pattern based on feedback
US9659258B2 (en) 2013-09-12 2017-05-23 International Business Machines Corporation Generating a training model based on feedback
US11475879B2 (en) 2020-03-20 2022-10-18 Beijing Xiaomi Pinecone Electronics Co., Ltd. Method and device for evaluating quality of content, electronic equipment, and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020159630A1 (en) * 2001-03-29 2002-10-31 Vasile Buzuloiu Automated detection of pornographic images
CN1704966A (en) * 2004-05-28 2005-12-07 中国科学院计算技术研究所 Method for detecting pornographic images

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020159630A1 (en) * 2001-03-29 2002-10-31 Vasile Buzuloiu Automated detection of pornographic images
CN1704966A (en) * 2004-05-28 2005-12-07 中国科学院计算技术研究所 Method for detecting pornographic images

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Finance &amp; Accounting (《财务与金融》), Issue 01, 2009-02-15, Wang Zhihong et al., "Fund rating model selection based on random forests", full text, relevant to claims 1-3 *
Journal on Communications (《通信学报》), Vol. 25, Issue 07, 2004-07-25, Yang Jinfeng et al., "A new content-based image recognition and filtering method", sections 3-5, relevant to claims 1-3 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103093180B (en) * 2011-10-28 2016-06-29 阿里巴巴集团控股有限公司 A kind of method and system of pornographic image detecting
CN103093180A (en) * 2011-10-28 2013-05-08 阿里巴巴集团控股有限公司 Method and system for detecting pornography images
CN102542304A (en) * 2012-01-12 2012-07-04 郑州金惠计算机系统工程有限公司 Region segmentation skin-color algorithm for identifying WAP (Wireless Application Protocol) mobile porn image
US10346746B2 (en) 2012-07-30 2019-07-09 International Business Machines Corporation Generating a training model based on feedback
CN103577831B (en) * 2012-07-30 2016-12-21 国际商业机器公司 For the method and apparatus generating training pattern based on feedback
US11132618B2 (en) 2012-07-30 2021-09-28 International Business Machines Corporation Generating a training model based on feedback
WO2015003606A1 (en) * 2013-07-08 2015-01-15 Tencent Technology (Shenzhen) Company Limited Method and apparatus for recognizing pornographic image
CN104281833A (en) * 2013-07-08 2015-01-14 深圳市腾讯计算机系统有限公司 Method and device for recognizing pornographic images
CN104281833B (en) * 2013-07-08 2018-12-18 深圳市腾讯计算机系统有限公司 Pornographic image recognizing method and device
US9659258B2 (en) 2013-09-12 2017-05-23 International Business Machines Corporation Generating a training model based on feedback
CN103839076B (en) * 2014-02-25 2017-05-10 中国科学院自动化研究所 Network sensitive image identification method based on light characteristics
CN103839076A (en) * 2014-02-25 2014-06-04 中国科学院自动化研究所 Network sensitive image identification method based on light characteristics
CN104484683A (en) * 2014-12-31 2015-04-01 小米科技有限责任公司 Porn picture detection method and device
CN104484683B (en) * 2014-12-31 2019-08-02 小米科技有限责任公司 Yellow map chip detection method and device
CN105631015A (en) * 2015-12-31 2016-06-01 宁波领视信息科技有限公司 Intelligent multimedia player
US11475879B2 (en) 2020-03-20 2022-10-18 Beijing Xiaomi Pinecone Electronics Co., Ltd. Method and device for evaluating quality of content, electronic equipment, and storage medium

Also Published As

Publication number Publication date
CN102163286B (en) 2013-03-20

Similar Documents

Publication Publication Date Title
CN102163286A (en) Pornographic image evaluating method
CN106096668B (en) The recognition methods and identifying system of watermarked image
CN105095856B (en) Face identification method is blocked based on mask
CN101866421B (en) Method for extracting characteristic of natural image based on dispersion-constrained non-negative sparse coding
CN103177458B (en) A kind of visible remote sensing image region of interest area detecting method based on frequency-domain analysis
CN103218832B (en) Based on the vision significance algorithm of global color contrast and spatial distribution in image
CN109493342B (en) Skin disease picture lesion type classification method based on deep learning
CN102799872B (en) Image processing method based on face image characteristics
CN109558806A (en) The detection method and system of high score Remote Sensing Imagery Change
CN102844766A (en) Human eyes images based multi-feature fusion identification method
CN104915676A (en) Deep-level feature learning and watershed-based synthetic aperture radar (SAR) image classification method
CN104268593A (en) Multiple-sparse-representation face recognition method for solving small sample size problem
CN103116763A (en) Vivo-face detection method based on HSV (hue, saturation, value) color space statistical characteristics
CN103955922A (en) Method for detecting flaws of printed fabric based on Gabor filter
CN108960288B (en) Three-dimensional model classification method and system based on convolutional neural network
CN106570183B (en) A kind of Color Image Retrieval and classification method
Zhu et al. Logarithm gradient histogram: A general illumination invariant descriptor for face recognition
CN1760887A Robust feature extraction and recognition method for iris images
CN107067407B (en) Contour detection method based on non-classical receptive field and linear nonlinear modulation
CN106529395A (en) Signature image recognition method based on deep brief network and k-means clustering
CN104778466A (en) Detection method combining various context clues for image focus region
Feng et al. A new technology of remote sensing image fusion
CN109993124A (en) Based on the reflective biopsy method of video, device and computer equipment
CN103020959A (en) Gravity model-based oceanic front information extraction method
CN104834909A (en) Image characteristic description method based on Gabor synthetic characteristic

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20191203

Address after: 250101 2F, Hanyu Jingu new media building, high tech Zone, Jinan City, Shandong Province

Patentee after: Renmin Zhongke (Shandong) Intelligent Technology Co.,Ltd.

Address before: 100080 Zhongguancun East Road, Beijing, No. 95, No.

Patentee before: Institute of Automation, Chinese Academy of Sciences

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20200310

Address after: Room 201, 2 / F, Hanyu Jingu new media building, no.7000, Jingshi Road, Jinan City, Shandong Province, 250000

Patentee after: Renmin Zhongke (Jinan) Intelligent Technology Co.,Ltd.

Address before: 250101 2F, Hanyu Jingu new media building, high tech Zone, Jinan City, Shandong Province

Patentee before: Renmin Zhongke (Shandong) Intelligent Technology Co.,Ltd.

CP03 Change of name, title or address
CP03 Change of name, title or address

Address after: 100176 1401, 14th floor, building 8, No. 8 courtyard, No. 1 KEGU street, Beijing Economic and Technological Development Zone, Daxing District, Beijing (Yizhuang group, high-end industrial area, Beijing Pilot Free Trade Zone)

Patentee after: Renmin Zhongke (Beijing) Intelligent Technology Co.,Ltd.

Address before: Room 201, 2 / F, Hangu Jinggu new media building, 7000 Jingshi Road, Jinan City, Shandong Province

Patentee before: Renmin Zhongke (Jinan) Intelligent Technology Co.,Ltd.