CN111275111B - Method for classifying same-color fur of the same animal species - Google Patents

Method for classifying same-color fur of the same animal species

Info

Publication number
CN111275111B
CN111275111B (application number CN202010065668.XA)
Authority
CN
China
Prior art keywords
fur
dimensional
elements
tested
information
Prior art date
Legal status
Active
Application number
CN202010065668.XA
Other languages
Chinese (zh)
Other versions
CN111275111A (en)
Inventor
郭拓
王海燕
陈晓
马令坤
孙连山
Current Assignee
Shaanxi University of Science and Technology
Original Assignee
Shaanxi University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Shaanxi University of Science and Technology
Priority to CN202010065668.XA
Publication of CN111275111A
Application granted
Publication of CN111275111B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour

Abstract

The application discloses a method for classifying same-color fur of the same animal species, comprising the following steps: photographing the fur under test N times to obtain N pieces of two-dimensional test fur image information; performing a two-dimensional Fourier transform on each image, summing the transformed data in the column direction to obtain one-dimensional information, and taking the frequency-domain maximum of that one-dimensional information, so that the N images yield a detection vector of N elements; and comparing the N elements of the detection vector pairwise with the N elements of a reference vector to obtain an N×N comparison matrix. If at least 95% of the elements of the comparison matrix are greater than 1, or at least 95% are less than 1, the fur under test and the reference fur do not belong to the same type of fur; otherwise they belong to the same type. The fur is photographed with a camera, the image data are subjected to a two-dimensional Fourier transform, the same-column values are summed into one-dimensional frequency-domain data, the maximum of the one-dimensional data is extracted, and the fur type is judged automatically from the frequency-domain energy, enabling unmanned, automatic and rapid classification of fur.

Description

Method for classifying same-color fur of the same animal species
Technical Field
The application relates to the technical field of fur dyeing, and in particular to a method for classifying same-color fur of the same animal species.
Background
In the fur dyeing industry, the fibers of different animal species absorb dye very differently, so fur of different species must be dyed with different dye formulations. However, even solid-color fur of a single species varies: factors such as place of origin and whether the animals are farm-raised or wild can produce dense-fur pelts and sparse-fur pelts. White rabbit fur is one example; in actual dyeing, two kinds of pure-white rabbit fur, one with dense fur and one with sparse fur, are encountered, and they differ noticeably in dye absorption, so dense-fur and sparse-fur pelts of the same species also require different dye formulations. Solid-color fur of the same animal species therefore needs to be classified before dyeing.
At present, solid-color fur of the same animal species is classified by experienced workers sorting by hand and touch, which is time- and labour-consuming for large-scale dyeing and is an important link holding back automation of the whole fur-dyeing process. To date, no domestic report on classifying same-color fur of the same animal species has been found.
Disclosure of Invention
The embodiments of the application provide a method for classifying same-color fur of the same animal species, in order to solve the problems described in the Background section.
An embodiment of the application provides a method for classifying same-color fur of the same animal species, which comprises the following steps: photographing the fur under test N times to obtain N pieces of two-dimensional test fur image information;
performing a two-dimensional Fourier transform on each piece of two-dimensional test fur image information and then summing in the column direction to obtain one-dimensional information, and taking the frequency-domain maximum of the one-dimensional information, so as to obtain a detection vector of N elements;
comparing the N elements of the detection vector pairwise with the N elements of a reference vector to obtain an N×N comparison matrix; if at least 95% of the elements of the comparison matrix are greater than 1, or at least 95% are less than 1, the fur under test and the reference fur do not belong to the same type of fur; otherwise, they belong to the same type of fur.
Further, an ordinary camera inside a dark box is used to photograph the fur N times; LED light sources are mounted on the inner side walls of the dark box, the camera is mounted at the top of the box interior, and the fur is laid flat at the bottom of the box interior.
Further, for a two-dimensional fur image t(x, y) of K rows and S columns, the two-dimensional Fourier transform formula is as follows:
T(u, v) = Σ_{x=0}^{K-1} Σ_{y=0}^{S-1} t(x, y)·e^{-j2π(ux/K + vy/S)},  u = 0, 1, …, K-1,  v = 0, 1, …, S-1.
further, summing the two-dimensional Fourier transformed information according to the column direction to obtain one-dimensional information, wherein the formula is as follows:
further, for a one-dimensional signal, the frequency domain maximum is calculated as follows:
further, the detection vectors of the N elements constituted by the N two-dimensional fur image information are expressed as follows:
V=[MEng 1 ,MEng 2 ,MEng 3 ,…,MEng N-1, MEng N ]
wherein, MEng i Is the frequency domain maximum of the ith image.
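By way of illustration only, this per-image feature can be sketched in a few lines of NumPy; the function name meng_of_image is chosen here for illustration, and taking the magnitude of the transform before the column-direction sum is an assumption rather than something stated explicitly in the text:

```python
import numpy as np

def meng_of_image(image) -> float:
    """Frequency-domain maximum MEng of one K-row, S-column fur image (illustrative sketch)."""
    t = np.asarray(image, dtype=float)   # two-dimensional image t(x, y) with K rows and S columns
    T = np.fft.fft2(t)                   # two-dimensional Fourier transform
    f = np.abs(T).sum(axis=0)            # column-direction sum -> one-dimensional signal of length S (magnitude assumed)
    return float(f.max())                # frequency-domain maximum MEng
```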
Further, the step of obtaining the reference vector of N elements comprises:
photographing the reference fur N times to obtain N pieces of two-dimensional reference fur image information;
performing a two-dimensional Fourier transform on each piece of two-dimensional reference fur image information and then summing in the column direction to obtain one-dimensional information, and taking the frequency-domain maximum of the one-dimensional information, so as to obtain the reference vector of N elements.
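Reusing the illustrative meng_of_image sketch above, the reference vector could be assembled as follows; the image list here is only a placeholder for the N photographs actually taken in the dark box:

```python
import numpy as np

# Placeholder for the N dark-box photographs of the reference fur (N = 10 here).
reference_images = [np.random.rand(480, 640) for _ in range(10)]

# Reference vector V1 of N elements: one frequency-domain maximum per reference image.
V1 = np.array([meng_of_image(img) for img in reference_images])
```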
The embodiment of the application provides a method for classifying same-color fur of the same animal species which, compared with the prior art, has the following beneficial effects:
according to the application, the fur is photographed by the common camera in the camera, the manufacturing cost of the camera and the photographing cost of the common camera are low, and the photographed image only needs to be subjected to two-dimensional Fourier transform, so that the fur can be realized by a low-cost singlechip system without great calculated amount, and is easy to convert into an actual production line system. Meanwhile, the application can build the same-color fur classification system of the same kind of animals with low cost, the classification system is simple and easy to realize, and can replace fur classification technicians to finish the rapid classification of a large number of furs.
Drawings
FIG. 1 is a schematic flow chart of a method for classifying same-color fur of the same animal species according to an embodiment of the application.
Detailed Description
The following describes the embodiments of the present application clearly and completely with reference to the accompanying drawings. The embodiments described are evidently only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art on the basis of these embodiments without inventive effort fall within the scope of protection of the application.
Referring to FIG. 1, an embodiment of the present application provides a method for classifying same-color fur of the same animal species, the method comprising:
and 1, photographing the tested fur for N times to obtain N pieces of two-dimensional tested fur image information.
And 2, sequentially carrying out two-dimensional Fourier transform and column direction summation on each piece of two-dimensional tested fur image information to obtain one-dimensional information, and obtaining the frequency domain maximum value of the one-dimensional information to obtain detection vectors of N elements.
Information about type differences between same-color pelts caused by factors such as fur density (dense versus sparse fur) is obtained by applying a two-dimensional Fourier transform to the fur image, converting the result into a one-dimensional signal, and taking the maximum of that one-dimensional signal.
Step 3: compare the N elements of the detection vector pairwise with the N elements of the reference vector to obtain an N×N comparison matrix; if at least 95% of the elements of the comparison matrix are greater than 1, or at least 95% are less than 1, the fur under test and the reference fur do not belong to the same type of fur; otherwise, they belong to the same type of fur.
A typical pelt of each animal type is subjected to the two-dimensional Fourier transform, converted into a one-dimensional signal, and the maximum of the one-dimensional signal is taken to obtain the reference values; the N-element vector of a pelt under test and the N-element vector of the reference pelt are then compared pairwise, each test element against each reference element, to obtain the comparison matrix A; finally, matrix A is used to judge whether the pelt under test is of the same type as the reference pelt.
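A minimal sketch of this comparison and decision rule follows, assuming the pairwise comparison is the ratio of each test element to each reference element, which is consistent with the "greater than 1 or less than 1" criterion although the exact form of the comparison is not spelled out in the text:

```python
import numpy as np

def same_type(V_test, V_ref, threshold=0.95):
    """Return True if the tested pelt is judged to be the same type as the reference pelt (sketch)."""
    V_test = np.asarray(V_test, dtype=float)
    V_ref = np.asarray(V_ref, dtype=float)
    A = V_test[:, None] / V_ref[None, :]   # N x N comparison matrix of pairwise ratios
    frac_above = np.mean(A > 1.0)          # fraction of elements greater than 1
    frac_below = np.mean(A < 1.0)          # fraction of elements less than 1
    different = (frac_above >= threshold) or (frac_below >= threshold)
    return not different
```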
Steps 1 to 3 above are described in detail as follows:
In order to classify same-color fur of the same animal species accurately, interference from ambient light must be removed as far as possible when the fur is photographed. A dark box is therefore designed: the fur is placed inside the dark box for photographing, LED light sources are mounted on the left and right inner walls of the box, an ordinary camera is mounted at the top of the box, and the fur is laid flat on the bottom of the box.
For a two-dimensional fur image t(x, y) of K rows and S columns, the two-dimensional Fourier transform can be expressed as follows:
T(u, v) = Σ_{x=0}^{K-1} Σ_{y=0}^{S-1} t(x, y)·e^{-j2π(ux/K + vy/S)}.
After the two-dimensional Fourier transform, the result is summed in the column direction to obtain a one-dimensional signal, calculated as follows:
F(v) = Σ_{u=0}^{K-1} |T(u, v)|.
After the one-dimensional signal is obtained, its frequency-domain maximum is taken, calculated as follows:
MEng = max_v F(v).
N images are collected for each pelt; the frequency-domain maximum of the i-th image is denoted MEng_i, and the vector V formed from the N images is:
V = [MEng_1, MEng_2, MEng_3, …, MEng_{N-1}, MEng_N].
The reference vector is obtained as follows: before dyeing, a fur-classification technician selects a typical pelt of the animal type to be dyed, places it in the dark box and photographs it N times; each picture is subjected to the two-dimensional Fourier transform, converted into a one-dimensional signal, and the maximum of the one-dimensional signal is taken, so the N images yield a vector V1 of N elements, which is used as the reference. The technician only needs to select a typical pelt once within a given period, for example once a year; that is, this step is carried out once a year, and the resulting V1 is stored and reused as the reference for classifying pelts of the same type throughout that year.
Furthermore, once the system holds the V1 vector of a typical pelt of a given animal type, an operator only needs to select that V1 vector as the reference and feed the pelts of that animal one by one into the dark box on a conveyor belt for photographing. N pictures are taken rapidly of each pelt; each picture is subjected to the two-dimensional Fourier transform, converted into a one-dimensional signal, and the maximum of the one-dimensional signal is taken, so the pelt being classified also yields a vector V2 of N elements. Comparing the elements of V1 and V2 pairwise then gives the N×N matrix A.
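Putting the pieces together, the conveyor-belt workflow could be sketched as follows; capture_frames is a hypothetical stand-in for the dark-box camera interface, and meng_of_image and same_type refer to the illustrative functions sketched earlier:

```python
import numpy as np

def classify_pelts(pelts, V1, n_frames=10):
    """Classify each incoming pelt against the stored reference vector V1 (illustrative sketch)."""
    results = []
    for pelt in pelts:
        frames = capture_frames(pelt, n_frames)             # hypothetical: N photos of one pelt in the dark box
        V2 = np.array([meng_of_image(f) for f in frames])   # detection vector of N elements
        results.append("same type" if same_type(V2, V1) else "different type")
    return results
```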
If at least 95% of the elements of matrix A are greater than 1, or at least 95% are less than 1, the pelt under test is judged not to be of the same type as the fur designated by the classification technician, and the classification result is output.
Working principle: for solid-color pelts of the same species, whether the fur is dense or sparse, the colours look the same to the human eye, yet the pelts look and feel noticeably different. The root cause is that the difference in fur density changes how light is reflected, i.e. the reflected light energy reaching the eye differs between the two kinds of pelt. Based on this principle, the fur is placed in a dark box to remove interference from uncertain ambient light and is photographed with a camera; the image data are first subjected to a two-dimensional Fourier transform, the transformed data in the same column are then summed to convert them into one-dimensional frequency-domain data, and the maximum of the one-dimensional data is extracted as the criterion for classifying solid-color pelts of the same species. By analysing the frequency-domain energy of images of same-species solid-color fur, the application judges the fur type automatically from the energy ratio, achieving unmanned, automatic and rapid classification of fur.
In addition, existing methods for identifying easily confused furs illuminate the fur with infrared light and identify it from the characteristic information of the reflected spectrum. Infrared spectroscopy equipment costs far more than the present method and places much stricter demands on the environment, whereas the technology and equipment required here are inexpensive and easy to implement, and rapid fur classification can be achieved without deep expertise in spectroscopy.
Examples:
For example, before dyeing, white rabbit fur is divided into two types, dense fur and sparse fur. A typical sparse-fur pelt is selected as the reference, and N images are collected in the self-designed dark box, with N chosen as 10; the V1 vector is:
A dense-fur pelt under test is then photographed with the camera, the images are subjected to the two-dimensional Fourier transform and converted into one-dimensional signals, and the maxima of the one-dimensional signals are taken, giving the V2 vector:
The pairwise comparison matrix A is then:
Judging matrix A, all 100 of its elements are less than 1, so the pelt under test and the typical reference pelt are not of the same type. This agrees with the actual situation: white dense-fur rabbit pelts and white sparse-fur rabbit pelts are two different types under the practical classification standard, and the application can rapidly classify large batches of same-color fur.
The foregoing discloses only a few specific embodiments of the present application. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the application, and such changes and modifications are intended to fall within the scope of the claims and their equivalents.

Claims (6)

1. A method for classifying same-color fur of the same animal species, characterised by comprising the following steps:
photographing the fur under test N times to obtain N pieces of two-dimensional test fur image information;
performing a two-dimensional Fourier transform on each piece of two-dimensional test fur image information and then summing in the column direction to obtain one-dimensional information, and taking the frequency-domain maximum of the one-dimensional information to obtain a detection vector of N elements;
comparing the N elements of the detection vector pairwise with the N elements of a reference vector to obtain an N×N comparison matrix; if at least 95% of the elements of the comparison matrix are greater than 1, or at least 95% are less than 1, the fur under test and the reference fur do not belong to the same type of fur; otherwise, they belong to the same type of fur.
2. The method for classifying same-color fur of the same animal species according to claim 1, wherein an ordinary camera inside a dark box is used to photograph the fur N times; LED light sources are mounted on the inner side walls of the dark box, the camera is mounted at the top of the box interior, and the fur is laid flat at the bottom of the box interior.
3. The method for classifying same-color fur of the same animal species according to claim 1, wherein, for a two-dimensional fur image t(x, y) of K rows and S columns, the two-dimensional Fourier transform formula is as follows:
4. The method for classifying same-color fur of the same animal species according to claim 3, wherein the information after the two-dimensional Fourier transform is summed in the column direction to obtain one-dimensional information, using the following formula:
5. The method for classifying same-color fur of the same animal species according to claim 4, wherein the frequency-domain maximum of the one-dimensional signal is calculated according to the following formula:
the detection vector of N elements formed from the N pieces of two-dimensional fur image information being expressed as follows:
V = [MEng_1, MEng_2, MEng_3, …, MEng_{N-1}, MEng_N]
where MEng_i is the frequency-domain maximum of the i-th image.
6. The method for classifying same-color fur of the same animal species according to claim 1, wherein the step of obtaining the reference vector of N elements comprises:
photographing the reference fur N times to obtain N pieces of two-dimensional reference fur image information;
performing a two-dimensional Fourier transform on each piece of two-dimensional reference fur image information and then summing in the column direction to obtain one-dimensional information, and taking the frequency-domain maximum of the one-dimensional information to obtain the reference vector of N elements.
CN202010065668.XA 2020-01-20 2020-01-20 Method for classifying same-color fur of the same animal species Active CN111275111B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010065668.XA CN111275111B (en) 2020-01-20 2020-01-20 Method for classifying same-color fur of the same animal species

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010065668.XA CN111275111B (en) 2020-01-20 2020-01-20 Method for classifying same-color fur of the same animal species

Publications (2)

Publication Number Publication Date
CN111275111A CN111275111A (en) 2020-06-12
CN111275111B (en) 2023-10-31

Family

ID=71003425

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010065668.XA Active CN111275111B (en) 2020-01-20 2020-01-20 Method for classifying same-color fur of the same animal species

Country Status (1)

Country Link
CN (1) CN111275111B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003204798A (en) * 2001-10-31 2003-07-22 Japan Spinners Inspecting Foundation Method for discriminating animal hair fiber product
CN101196983A (en) * 2006-12-08 2008-06-11 北京皎佼科技有限公司 Image recognition method
CN101996322A (en) * 2010-11-09 2011-03-30 东华大学 Method for extracting fractal detail feature for representing fabric texture
CN103364397A (en) * 2012-03-31 2013-10-23 佛山市南海天富科技有限公司 Fabric weft density measurement method and device
WO2016039348A1 (en) * 2014-09-10 2016-03-17 一般財団法人ニッセンケン品質評価センター Method for identifying protein fiber
CN104318260A (en) * 2014-10-28 2015-01-28 常州大学 Fur near-infrared spectral discrimination method based on packet support vector machine
CN108088808A (en) * 2016-11-23 2018-05-29 西派特(北京)科技有限公司 Rapid non-destructive near-infrared spectroscopy method for qualitative analysis of fur
CN107037050A (en) * 2017-04-05 2017-08-11 东华大学 Automatic measurement method for the texture period of textile images
CN109211830A (en) * 2018-08-01 2019-01-15 嘉兴市皮毛和制鞋工业研究所 Method for identifying easily confused furs by combining principal component analysis and multi-class discriminant analysis
CN110263708A (en) * 2019-06-19 2019-09-20 郭玮强 Image source recognition method, device and computer-readable storage medium

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Chi-ho Chan et al. "Fabric Defect Detection by Fourier Analysis." IEEE Transactions on Industry Applications, vol. 36, no. 5, 2000, pp. 1267-1276. *
F. Gao et al. "Technical note: Characterization of lipid constitution in Fourier transform infrared spectra and spectroscopic discrimination of animal-derived feedstuffs from different species." 2017 American Society of Animal Science, 2017, pp. 2794-2800. *
Shih-Hsuan Chiu et al. "Textural Defect Segmentation Using a Fourier-Domain Maximum Likelihood Estimation Method." Textile Research Journal, 2002, pp. 253-258. *
吴妍娴 et al. "A new spectral multivariate analysis pattern recognition method" [一种新型光谱多元分析模式识别方法]. Spectroscopy and Spectral Analysis, vol. 37, no. 8, 2017, pp. 2493-2499. *
张红 et al. "Identification of easily confused furs based on principal component analysis of infrared spectra" [基于红外光谱主成分分析的易混毛皮的鉴别研究]. Leather Science and Engineering, vol. 29, no. 3, 2019, pp. 47-52. *
杨静茹 et al. "Identification of animal fur fibers" [动物毛皮纤维的鉴别]. Shanghai Textile Science & Technology, vol. 42, no. 9, 2014, pp. 21-23. *

Also Published As

Publication number Publication date
CN111275111A (en) 2020-06-12

Similar Documents

Publication Publication Date Title
Al-Hiary et al. Fast and accurate detection and classification of plant diseases
Li et al. A multi-scale cucumber disease detection method in natural scenes based on YOLOv5
CN109253975A (en) Apple slight damage hyperspectral detection method based on MSC-CFS-ICA
CN110929624B (en) Construction method of multi-task classification network based on orthogonal loss function
Asmara et al. Chicken meat freshness identification using the histogram color feature
CN106570855A (en) Method and system for quickly judging pork freshness
Fadchar et al. Prediction model for chicken egg fertility using artificial neural network
Turi et al. Classification of Ethiopian coffee beans using imaging techniques
CN115512123A (en) Multi-period key growth characteristic extraction and time period classification method for hypsizygus marmoreus
CN113435254A (en) Sentinel second image-based farmland deep learning extraction method
Patel Bacterial colony classification using atrous convolution with transfer learning
Sayama Seeking open-ended evolution in swarm chemistry II: Analyzing long-term dynamics via automated object harvesting
CN111275111B (en) 2023-10-31 Method for classifying same-color fur of the same animal species
Lainez et al. Deep learning applied to identification of commercial timber species from Peru
CN116912674A (en) Target detection method and system based on improved YOLOv5s network model under complex water environment
Yu et al. Optical filter net: A spectral-aware rgb camera framework for effective green pepper segmentation
CN116416523A (en) Machine learning-based rice growth stage identification system and method
CN113537131B (en) Land resource analysis model training method and analysis method based on image recognition
Saifullah et al. Palm Oil Maturity Classification Using K-Nearest Neighbors Based on RGB and L* a* b Color Extraction
Mansur et al. An Image Processing Techniques Used for Soil Moisture Inspection and Classification
Campos et al. Robust computer vision system for marbling meat segmentation
Cui et al. Automatic Identification Technology of Optical Fiber based on Genetic Neural Network Algorithm
Song et al. Analysis on Chlorophyll Diagnosis of Wheat Leaves Based on Digital Image Processing and Feature Selection.
Chen et al. Application of Plant Phenotype Extraction Using Virtual Data with Deep Learning
CN116310810B (en) Cross-domain hyperspectral image classification method based on spatial attention-guided variable convolution

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant