CN103149163A - Multispectral image textural feature-based beef tenderness detection device and method thereof - Google Patents


Info

Publication number
CN103149163A
Authority
CN
China
Prior art keywords
beef
formula
row
textural characteristics
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2013100473068A
Other languages
Chinese (zh)
Inventor
陈坤杰
孙鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Agricultural University
Original Assignee
Nanjing Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Agricultural University filed Critical Nanjing Agricultural University
Priority to CN2013100473068A priority Critical patent/CN103149163A/en
Publication of CN103149163A publication Critical patent/CN103149163A/en
Pending legal-status Critical Current

Abstract

The invention relates to a multispectral image textural feature-based beef tenderness detection device, which includes a lamp box, an illumination system, a shooting system, an object stage and a computer. The object stage is installed at the bottom of the lamp box. The shooting system includes a CCD digital camera, a multispectral filter and a camera support. The CCD digital camera is fixed on the camera support arranged at the lamp box top. The multispectral filter is loaded in front of the lens of the CCD digital camera, and the lens is aimed at the center of the object stage surface. The computer is connected to the CCD digital camera and is used for processing and storing the acquired multispectral images. The illumination system includes fluorescent lamp assemblies and halogen lamp light source assemblies. The fluorescent lamp assemblies are mounted at the bottom of the object stage to simulate natural light, and the halogen lamp light source assemblies are mounted on two sides of the lamp box top to serve as auxiliary light sources. By means of the multispectral beef tenderness prediction method, effective online tenderness prediction of a beef sample can be conducted, thus laying the foundation for a future beef online quality detection system.

Description

Device and method for detecting beef tenderness based on multispectral image texture features
Technical field
The present invention relates to a device and method for detecting beef tenderness based on multispectral image texture features, and belongs to the field of agricultural product processing and inspection.
Background technology
Meat quality has become a growing research topic worldwide. Beef quality is the principal factor determining consumers' purchase intention and the market price, and timely inspection and control of beef quality during production and distribution plays a very important role in ensuring beef food safety and promoting rational beef production and consumption. Quantification of meat quality indices has long been an important research topic in the meat industry: in a fiercely competitive market, consistently delivering high-grade meat products to consumers is fundamental to a meat enterprise's survival. The leading indices affecting meat quality are color, tenderness, texture, flavor, water content and juiciness, among which tenderness is the main determinant of beef quality. At present, beef tenderness is detected mainly by sensory evaluation or by shear-force measurement: the former determines the tenderness of the meat through trained human tasters, while the latter grades beef tenderness according to its Warner-Bratzler shear force (WBSF) value. Sensory evaluation is a subjective assessment method; panelists must undergo special training, the evaluation process is time- and labor-consuming, and the results are highly subjective and poorly repeatable. The shear-force method yields objective and accurate results, but its procedure is tedious and complex and its detection efficiency is low, so it cannot meet the requirements of on-site or online beef quality inspection. It is therefore necessary to investigate beef quality detection methods and technology further, and to develop an efficient, rapid beef quality detection technique that realizes non-destructive online detection of beef quality.
Machine vision integrates modern computer, information and digital image processing technologies. It can not only replace part of human visual activity, but also offers discrimination sharper than human vision, observing details the human eye cannot capture and thereby obtaining more visual information. Through digital image processing techniques such as pattern recognition, image segmentation and thresholding (binarization), machine vision can be used to quantitatively analyze and describe the principal features of an observed object, and a large body of research and application reports exists in agricultural and food quality inspection. In developed countries represented by the United States and Japan, the application of machine vision to agricultural product processing and inspection is already widespread, and both the theoretical level and the practical capability are markedly ahead of China's.
The texture features of beef images play a very important role in analysis. Using computer vision, texture feature parameters extracted from images of beef of different tenderness grades can accurately reflect indices that significantly affect beef quality, such as muscle fiber size, arrangement and tenderness. Computer vision achieves higher detection accuracy than manual judgment, and its measurement data are objective and stable, making it well suited to automatic grading of beef tenderness.
Summary of the invention
The technical problem to be solved by the invention is to provide a beef tenderness detection method based on multispectral image texture features, which supplies a scientific basis for objective assessment of beef tenderness and helps realize online beef quality detection and automatic beef grading.
To solve the above technical problem, the main technical content of the invention is as follows:
A device for detecting beef tenderness based on multispectral image texture features comprises a lamp box, an illumination system, a camera system, an object stage and a computer. The object stage is installed at the bottom of the lamp box. The camera system comprises a CCD digital camera, a multispectral filter and a camera support; the CCD digital camera is fixed on the camera support mounted on the lamp box top, the multispectral filter is loaded in front of the camera lens, and the lens is aimed at the center of the object stage surface. The computer is connected to the CCD digital camera for processing and storing the acquired multispectral images. The illumination system comprises fluorescent lamp assemblies and halogen light source assemblies; the fluorescent lamp assemblies are mounted at the bottom of the object stage to simulate natural light, and the halogen light source assemblies are mounted on both sides of the lamp box top as auxiliary light sources.
The above object stage is movably mounted at the bottom of the lamp box and can be freely removed.
The above fluorescent lamp assembly comprises two 50 W fluorescent tubes.
The above two 50 W tubes are mounted on either side of the object stage bottom.
Two circular holes are opened on both sides of the above lamp box top for inserting the halogen light source assemblies.
The above multispectral filter comprises filter plates for 4 different wavebands: 440 nm, 550 nm, 710 nm and 810 nm.
A method for detecting beef tenderness based on multispectral image texture features comprises the following steps:
(1) using the above device for detecting beef tenderness based on multispectral image texture features, acquire multispectral images of a beef sample against a black background;
(2) perform image segmentation and image processing to obtain complete muscle-region images of the beef in each waveband;
(3) extract 217 texture feature parameters from the images using three texture feature extraction methods: the gray-level co-occurrence matrix, the fast two-dimensional Fourier transform and the Gabor wavelet transform;
(4) establish stepwise-regression and support-vector-machine tenderness prediction models, build and test the models on tender and tough beef respectively, and then compare the model results.
In the above step (2), the image segmentation and image processing comprise the following steps:
(a) converting the image to grayscale;
(b) reading the pixels of the beef multispectral images of the four wavebands;
(c) locating and segmenting the region-of-interest (ROI) pixels to obtain the muscle ROI image of the multispectral image in each waveband.
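Steps (a) to (c) can be illustrated with a minimal sketch. The `segment_roi` helper and its fixed threshold below are hypothetical, not part of the patent; a real ROI step would locate and segment the muscle region rather than simply threshold against the black background:

```python
import numpy as np

def segment_roi(gray, threshold=30):
    """Isolate the sample from a black background by simple intensity
    thresholding; pixels at or below the threshold are zeroed out.
    (A hypothetical minimal stand-in for the patent's ROI step.)"""
    mask = gray > threshold
    roi = np.where(mask, gray, 0)
    return roi, mask

# Toy 4-band "multispectral" stack: background is 0, sample is bright.
bands = np.zeros((4, 8, 8), dtype=np.uint8)
bands[:, 2:6, 2:6] = 180                     # the beef sample region
rois = [segment_roi(b)[0] for b in bands]    # per-band muscle ROI images
```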
In the above step (3), the gray-level co-occurrence matrix method extracts 88 texture features over four directions. The detailed feature statements and computing formulas are as follows.
Let f(x, y) be a two-dimensional digital image of size M × N with gray levels 0 to n, and let p(i, j) be its gray-level co-occurrence matrix; throughout, i and j denote the row and column indices of the matrix.
F1 Energy:
$\sum_i \sum_j p(i,j)^2$ (formula 1)
F2 Entropy:
$-\sum_i \sum_j p(i,j)\log p(i,j)$ (formula 2)
F3 Dissimilarity:
$\sum_i \sum_j n^k\, p(i,j), \quad |i-j|=n$ (formula 3)
where k = 1 and i ≠ j; n is the gray-level difference;
F4 Contrast:
$\sum_{i=0}^{n}\sum_{j=0}^{n}(i-j)^2\, p(i,j)$ (formula 4)
where n is the gray level;
F5 Inverse difference:
$\sum_i \sum_j \frac{p(i,j)}{1+n}, \quad |i-j|=n$ (formula 5)
where n is the gray-level difference;
F6 Correlation:
$\frac{\sum_i \sum_j (ij)\, p(i,j) - \mu_x \mu_y}{\sigma_x \sigma_y}$ (formula 6)
where $\mu_x, \mu_y, \sigma_x, \sigma_y$ are the means and standard deviations of $p_x$ and $p_y$;
F7 Homogeneity:
$\sum_i \sum_j \frac{1}{1+(i-\mu)^2}\, p(i,j)$ (formula 7)
where μ is the mean;
F8 Autocorrelation:
$\frac{\sum_i \sum_j (ij)\, p(i,j) - \mu_t^2}{\sigma_t^2}$ (formula 8)
where $\mu_t$ and $\sigma_t$ are the mean and standard deviation of the row or column vector;
F9 Cluster shade:
$\sum_{i=1}^{n}\sum_{j=1}^{n}(i+j-\mu_x-\mu_y)^3\, p(i,j)$ (formula 9)
where $\mu_x, \mu_y$ are the means of $p_x$ and $p_y$; n is the gray level;
F10 Cluster prominence:
$\sum_{i=1}^{n}\sum_{j=1}^{n}(i+j-\mu_x-\mu_y)^4\, p(i,j)$ (formula 10)
where $\mu_x, \mu_y$ are the means of $p_x$ and $p_y$; n is the gray level;
F11 Maximum probability:
$\max_{i,j} p(i,j)$ (formula 11)
F12 Sum of squares:
$\sum_i \sum_j (i-\mu)^2\, p(i,j)$ (formula 12)
where μ is the mean;
F13 Sum average:
$\sum_{i=2}^{2n} i\, P_{x+y}(i)$ (formula 13)
where $P_{x+y}$ is the sum distribution of the co-occurrence matrix; n is the gray level;
F14 Sum variance:
$\sum_{i=2}^{2n} (i-f_8)^2\, P_{x+y}(i)$ (formula 14)
where $f_8$ is the sum entropy, see F15; n is the gray level;
F15 Sum entropy:
$-\sum_{i=2}^{2n} P_{x+y}(i)\log\{P_{x+y}(i)\} = f_8$ (formula 15)
where n is the gray level;
F16 Difference variance:
$\sum_{i=0}^{n-1} i^2\, P_{x-y}(i)$ (formula 16)
where $P_{x-y}$ is the difference distribution of the co-occurrence matrix; n is the gray level;
F17 Difference entropy:
$-\sum_{i=0}^{n-1} P_{x-y}(i)\log\{P_{x-y}(i)\}$ (formula 17)
where n is the gray level;
F18 Information measure of correlation (1):
$\frac{HXY - HXY1}{\max\{HX, HY\}}$ (formula 18)
where $HXY = -\sum_i \sum_j p(i,j)\log p(i,j)$, $HXY1 = -\sum_i \sum_j p(i,j)\log\{p_x(i)\, p_y(j)\}$, and HX, HY are the entropies of $p_x$ and $p_y$;
F19 Information measure of correlation (2):
$\left(1 - \exp[-2(HXY2 - HXY)]\right)^{1/2}$ (formula 19)
where $HXY2 = -\sum_i \sum_j p_x(i)\, p_y(j)\log\{p_x(i)\, p_y(j)\}$;
F20 Maximal correlation coefficient, computed from
$Q(i,j) = \sum_k \frac{p(i,k)\, p(j,k)}{p_x(i)\, p_y(k)}$ (formula 20)
where k is the summation index;
F21 Inverse difference normalized (INN):
$\sum_i \sum_j \frac{1}{1+|i-j|\,\sigma}\, p(i,j)$ (formula 21)
where σ is the standard deviation;
F22 Inverse difference moment normalized (IDN):
$\sum_i \sum_j \frac{1}{1+(i-j)^2 \sigma^2}\, p(i,j)$ (formula 22)
where σ is the standard deviation.
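For instance, F1 (energy), F2 (entropy) and F4 (contrast) can be computed from a hand-built co-occurrence matrix in a few lines. This is a sketch only: the `glcm` helper builds a normalized single-direction matrix, whereas the patent accumulates 22 features over four directions:

```python
import numpy as np

def glcm(img, dx, dy, levels):
    """Normalized gray-level co-occurrence matrix p(i, j) for the
    pixel offset (dx, dy)."""
    P = np.zeros((levels, levels), dtype=float)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                P[img[y, x], img[y2, x2]] += 1
    return P / P.sum()

def energy(P):            # F1: sum of squared probabilities
    return (P ** 2).sum()

def entropy(P):           # F2: 0*log(0) treated as 0
    nz = P[P > 0]
    return -(nz * np.log(nz)).sum()

def contrast(P):          # F4: (i - j)^2 weighted sum
    i, j = np.indices(P.shape)
    return ((i - j) ** 2 * P).sum()

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]], dtype=int)
P0 = glcm(img, 1, 0, levels=4)        # 0-degree (horizontal) direction
feats = [energy(P0), entropy(P0), contrast(P0)]
```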
In the above step (3), the fast two-dimensional Fourier transform method extracts 81 two-dimensional Fourier frequency-domain features; the discrete formulation is as follows.
For an M × N image with function f(x, y), where x and y are discrete spatial variables and u and v are discrete frequency variables, the two-dimensional discrete Fourier transform is
$F(u,v) = \frac{1}{MN}\sum_{x=0}^{M-1}\sum_{y=0}^{N-1} f(x,y)\, e^{-j2\pi\left(\frac{ux}{M} + \frac{vy}{N}\right)}$
where u = 0, 1, 2, …, M−1 and v = 0, 1, 2, …, N−1.
The inverse transform is
$f(x,y) = \sum_{u=0}^{M-1}\sum_{v=0}^{N-1} F(u,v)\, e^{\,j2\pi\left(\frac{ux}{M} + \frac{vy}{N}\right)}$
where x = 0, 1, 2, …, M−1 and y = 0, 1, 2, …, N−1.
The Fourier transform of a two-dimensional discrete function has conjugate symmetry, as shown below:
|F(u, v)| = |F(−u, −v)|
Exploiting this conjugate symmetry of the fast two-dimensional Fourier transform, an l × l window of the two-dimensional logarithmic spectrum log|F(u, v)| (l = 1, 2, …, 9) is chosen to describe the beef texture, with the upper-left corner of the window always at the center of the two-dimensional Fourier spectrum. The beef texture features can thus be expressed as $x = (x_1, \ldots, x_p)^T$, where p = l × l is the number of beef texture features, $x_i$ is the spectrum value of the i-th feature of the beef sample texture, and T denotes the transpose. Balancing computational load against test performance, a 9 × 9 window is chosen, giving 81 Fourier-spectrum texture features per beef sample.
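The windowed log-spectrum extraction above can be sketched as follows. The spectrum normalization and the small `1e-12` offset guarding against log(0) are implementation assumptions, not stated in the patent:

```python
import numpy as np

def fourier_window_features(img, l=9):
    """Log-magnitude spectrum features from an l x l window whose
    upper-left corner sits at the (shifted) spectrum centre, per the
    conjugate-symmetry argument above. Returns p = l*l features."""
    F = np.fft.fft2(img) / img.size          # normalized 2-D DFT
    F = np.fft.fftshift(F)                   # move the DC term to the centre
    cy, cx = np.array(F.shape) // 2
    win = np.abs(F[cy:cy + l, cx:cx + l])
    return np.log(win + 1e-12).ravel()       # small offset avoids log(0)

img = np.random.default_rng(0).random((128, 128))
x = fourier_window_features(img, l=9)        # 81 spectral texture features
```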
In the above step (3), the Gabor wavelet transform method extracts 48 texture features, implemented as follows:
(a) Design of the Gabor filter bank
Gabor wavelets are non-orthogonal, so the filtered images contain redundant information. To reduce this redundancy, the filter bank is designed as a multi-resolution, multi-channel decomposition covering the central frequency range from $U_l$ to $U_h$, with K orientations and S scales. The channel parameters a, $\sigma_u$ and $\sigma_v$ are computed by the following formulas:
$a = \left(U_h / U_l\right)^{\frac{1}{S-1}}$, $\quad \sigma_u = \frac{(a-1)U_h}{(a+1)\sqrt{2\ln 2}}$,
$\sigma_v = \tan\left(\frac{\pi}{2K}\right)\left[U_h - 2\ln\left(\frac{2\sigma_u^2}{U_h}\right)\right]\left[2\ln 2 - \frac{(2\ln 2)^2 \sigma_u^2}{U_h^2}\right]^{-\frac{1}{2}}$,
where m = 0, 1, …, S−1; n = 0, 1, …, K−1; u and v are spatial frequency variables.
(b) Extraction of Gabor texture feature values
Given an image I(x, y), its Gabor wavelet transform is defined as
$W_{mn}(x,y) = \int I(x_1,y_1)\, g_{mn}^{*}(x-x_1, y-y_1)\, dx_1\, dy_1$,
where $W_{mn}$ is the transformed image, * denotes the complex conjugate, $g_{mn}$ is the wavelet transform function, and (x, y), $(x_1, y_1)$ are image point coordinates;
here $g_{mn}(x,y) = a^{-m} g(x', y')$, with a > 1; m = 0, 1, …, S−1; n = 0, 1, …, K−1;
$x' = a^{-m}(x\cos\theta + y\sin\theta)$ and $y' = a^{-m}(-x\sin\theta + y\cos\theta)$.
The mean $\mu_{mn}$ of the modulus of the Gabor wavelet coefficients and its standard deviation $\sigma_{mn}$ represent the extracted grayscale image target features, that is:
$\mu_{mn} = \iint \left|W_{mn}(x,y)\right| dx\, dy$ and $\sigma_{mn} = \sqrt{\iint \left(\left|W_{mn}(x,y)\right| - \mu_{mn}\right)^2 dx\, dy}$.
The feature vector is built from $\mu_{mn}$ and $\sigma_{mn}$; with S = 4 scales and K = 6 orientations the feature vector used in the system is
$\bar{f} = \left[\mu_{00}\ \sigma_{00}\ \mu_{01}\ \cdots\ \mu_{35}\ \sigma_{35}\right]$.
Using the designed Gabor filters, each 128 × 128 × 8-bit beef target image yields a texture feature vector of 24 × 2 = 48 elements.
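A bank of this general shape can be sketched with plain NumPy. This uses a simplified real-valued Gabor kernel, not the exact filter design with the $\sigma_u$, $\sigma_v$ formulas above; the kernel size and the σ and frequency schedules are illustrative assumptions:

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, freq):
    """Simple real-valued Gabor kernel: a Gaussian envelope times a
    cosine carrier along the rotated x axis (a sketch, not the exact
    bank designed above)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

def gabor_features(img, scales=4, orientations=6):
    """Mean and standard deviation of each filter-response magnitude,
    giving scales * orientations * 2 features (48 for the 4 x 6 bank)."""
    feats = []
    for m in range(scales):
        for n in range(orientations):
            k = gabor_kernel(15, sigma=2.0 * 2**m,
                             theta=n * np.pi / orientations,
                             freq=0.25 / 2**m)
            # FFT-based convolution of the image with the kernel
            H = np.fft.fft2(k, s=img.shape)
            W = np.abs(np.fft.ifft2(np.fft.fft2(img) * H))
            feats += [W.mean(), W.std()]
    return np.array(feats)

img = np.random.default_rng(1).random((64, 64))
f = gabor_features(img)                      # 4 * 6 * 2 = 48 texture features
```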
The stepwise-regression prediction model of the above step (4) is implemented as follows:
(a) label the beef samples according to their actual measured tenderness values and divide them into a modeling set and a test set;
(b) establish the stepwise-regression model from the 217 texture feature values of the modeling-set samples;
(c) substitute the test-set samples into the established stepwise-regression prediction model;
(d) compare the stepwise-regression predictions with the actual measured tenderness results to obtain the prediction accuracy.
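Steps (a) to (d) amount to fitting a forward-selection least-squares model and scoring it on held-out samples. A toy sketch follows; `forward_stepwise` and the synthetic data are illustrative, not the patent's 217-feature model:

```python
import numpy as np

def residual(Xs, y):
    """Least-squares residual sum of squares for the chosen columns."""
    A = np.column_stack([np.ones(len(y)), Xs])   # intercept + chosen columns
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return ((y - A @ beta) ** 2).sum()

def forward_stepwise(X, y, max_feats=3):
    """Greedy forward selection: repeatedly add the feature whose
    inclusion most reduces the least-squares residual (a minimal
    stand-in for a stepwise-regression model)."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < max_feats:
        best = min(remaining, key=lambda j: residual(X[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return selected

rng = np.random.default_rng(2)
X = rng.standard_normal((40, 10))
y = 3 * X[:, 4] - 2 * X[:, 7] + 0.01 * rng.standard_normal(40)
sel = forward_stepwise(X, y, max_feats=2)    # recovers the informative columns
```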
The support-vector-machine (SVM) prediction model of the above step (4) is implemented as follows:
(a) label the beef samples according to their actual measured tenderness values and divide them into a modeling set and a test set;
(b) establish the support vector machine model from the 217 texture feature values of the modeling-set samples;
(c) select the SVM penalty coefficient and kernel function parameters;
(d) substitute the test-set samples into the established SVM prediction model;
(e) compare the SVM predictions with the actual measured tenderness results to obtain the prediction accuracy.
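A minimal self-contained stand-in for steps (a) to (e) uses a linear SVM trained by hinge-loss subgradient descent. In practice one would use an SVM library and tune the penalty coefficient and kernel parameters as in step (c); the toy data and hyperparameters here are assumptions:

```python
import numpy as np

def train_linear_svm(X, y, C=1.0, lr=0.01, epochs=200):
    """Tiny linear SVM trained by hinge-loss subgradient descent.
    Labels y must be in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1:        # margin violated: hinge active
                w += lr * (C * yi * xi - w / len(X))
                b += lr * C * yi
            else:
                w -= lr * w / len(X)         # regularization-only step
    return w, b

# Toy "tender" (+1) vs "tough" (-1) samples in a 2-feature texture space.
X = np.array([[2.0, 2.0], [2.5, 1.8], [1.8, 2.2],
              [-2.0, -2.0], [-2.2, -1.9], [-1.7, -2.3]])
y = np.array([1, 1, 1, -1, -1, -1])
w, b = train_linear_svm(X, y)
pred = np.sign(X @ w + b)                    # training-set predictions
```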
Through the above technical scheme, the invention has at least the following advantages:
The invention performs texture analysis on 128 × 128 beef sample images, using the gray-level co-occurrence matrix, the two-dimensional fast Fourier transform and the Gabor wavelet transform to extract parameters describing the beef surface texture, which supply the tenderness prediction models with features. Beef tenderness is then predicted by stepwise regression and support vector machines. The best prediction accuracies of the support vector machine on the beef samples are 75% (440 nm), 77% (550 nm), 77% (710 nm) and 77% (810 nm); the best tenderness prediction accuracies of the stepwise-regression model are 85% (440 nm), 75% (550 nm), 75% (710 nm) and 90% (810 nm). Beef tenderness can thus be predicted accurately, which helps lay the foundation for online tenderness detection in commercial production and subsequent automatic grading of beef quality.
The specific embodiments of the invention are described in detail in the following examples and the accompanying drawings.
Description of drawings
Fig. 1 is the structural representation of apparatus of the present invention;
Fig. 2 is a schematic diagram of the original muscle region-of-interest image of a beef sample multispectral picture;
Fig. 3 is a schematic diagram of the processed muscle region-of-interest image of a beef sample multispectral picture;
Fig. 4 is method flow diagram of the present invention.
Embodiment
To further explain the technical means and effects adopted by the invention to achieve the intended purpose, the embodiments, structures, features and effects of the invention are described in detail below with reference to the accompanying drawings and preferred embodiments.
As shown in Fig. 1, a device for detecting beef tenderness based on multispectral image texture features comprises a lamp box 1, an illumination system, a camera system, an object stage 2 and a computer 3. The object stage 2 is installed at the bottom of the lamp box 1. The camera system comprises a CCD digital camera 4, a multispectral filter 8 and a camera support 5; the CCD digital camera 4 is fixed on the camera support 5 mounted on the top of the lamp box 1, the multispectral filter 8 is loaded in front of the lens 9 of the CCD digital camera 4, and the lens 9 is aimed at the center of the surface of the object stage 2. The computer 3 is connected to the CCD digital camera 4 for processing and storing the acquired multispectral images. The illumination system comprises fluorescent lamp assemblies 6 and halogen light source assemblies 7; the fluorescent lamp assemblies 6 are mounted at the bottom of the object stage 2 to simulate natural light, and the halogen light source assemblies 7 are mounted on both sides of the top of the lamp box 1 as auxiliary light sources.
In a preferred version, the object stage 2 is movably mounted at the bottom of the lamp box 1 and can be freely removed.
In a preferred version, the fluorescent lamp assembly 6 comprises two 50 W fluorescent tubes.
In a preferred version, the two 50 W tubes are mounted on either side of the bottom of the object stage 2.
In a preferred version, two circular holes are opened on both sides of the top of the lamp box 1 for inserting the halogen light source assemblies 7.
In a preferred version, the multispectral filter 8 comprises filter plates for 4 different wavebands: 440 nm, 550 nm, 710 nm and 810 nm.
As shown in Figs. 2, 3 and 4, a method for detecting beef tenderness based on multispectral image texture features comprises the following steps:
(1) using the device for detecting beef tenderness based on multispectral image texture features, acquire multispectral images of a beef sample against a black background;
(2) perform image segmentation and image processing to obtain complete muscle-region images of the beef in each waveband;
(3) extract 217 texture feature parameters from the images using three texture feature extraction methods: the gray-level co-occurrence matrix, the fast two-dimensional Fourier transform and the Gabor wavelet transform;
(4) establish stepwise-regression and support-vector-machine tenderness prediction models, build and test the models on tender and tough beef respectively, and then compare the model results.
In the above step (2), the image segmentation and image processing comprise the following steps:
(a) converting the image to grayscale;
(b) reading the pixels of the beef multispectral images of the four wavebands;
(c) locating and segmenting the region-of-interest (ROI) pixels to obtain the muscle ROI image of the multispectral image in each waveband.
In the above step (3), the gray-level co-occurrence matrix method extracts 88 texture features over four directions. Texture is formed by a gray-level distribution that recurs across spatial positions, so a certain gray-level relationship exists between two pixels separated by a given distance in image space, i.e. the spatial correlation property of gray levels in an image. The gray-level co-occurrence matrix is a common method of describing texture by studying this spatial correlation of gray levels. Whereas a gray histogram counts the gray levels of single pixels in an image, a gray-level co-occurrence matrix counts the joint gray levels of pairs of pixels that are a fixed distance apart.
Take any point (x, y) in an M × N image and another point (x+a, y+b) offset from it, and let the gray values of this pair be (g1, g2). As the point (x, y) moves over the whole picture, various (g1, g2) values are obtained; if the number of gray levels is k, there are k² possible combinations of (g1, g2). For the whole picture, count the number of occurrences of each (g1, g2) value and arrange the counts in a square matrix, then normalize them by the total number of occurrences of (g1, g2) into probabilities P(g1, g2); such a matrix is called the gray-level co-occurrence matrix. Different values of the offset (a, b) yield joint probability matrices under different conditions. The value of (a, b) should be selected according to the periodicity of the texture; for finer textures, small offsets such as (1, 0), (1, 1) or (2, 0) are chosen.
When a = 1 and b = 0 the pixel pairs are horizontal, i.e. a 0-degree scan; when a = 0 and b = 1 they are vertical, i.e. a 90-degree scan; when a = 1 and b = 1 they lie along the right diagonal, i.e. a 45-degree scan; and when a = −1 and b = 1 they lie along the left diagonal, i.e. a 135-degree scan.
In this way the joint probability of two pixel gray levels converts the spatial coordinates (x, y) into a description of "gray pairs" (g1, g2), forming the gray-level co-occurrence matrix.
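The 0-degree scan just described can be made concrete on a tiny example (the 3 × 3 image and k = 2 gray levels are illustrative):

```python
import numpy as np

# Counting "gray pairs" (g1, g2) for the horizontal offset (a, b) = (1, 0),
# i.e. the 0-degree scan, on a tiny 3x3 image with k = 2 gray levels.
# The raw count matrix is then normalized by the total number of pairs
# to give the probabilities P(g1, g2).
img = np.array([[0, 0, 1],
                [1, 0, 1],
                [0, 1, 1]])
k = 2
counts = np.zeros((k, k), dtype=int)
for y in range(3):
    for x in range(2):                       # pair (x, y) with (x + 1, y)
        counts[img[y, x], img[y, x + 1]] += 1
P = counts / counts.sum()
```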
Native system improves raising on imagery exploitation algorithm of co-matrix basis, has extracted altogether four direction (0o, 45o, 90o, 135o) upper 22 * 4 totally 88 beef texture gray level co-occurrence matrixes features.The detailed features statement is as follows with computing formula:
If f (x, y) is a width two-dimensional digital image, its size is M * N, and grey level is 0-n, and its gray level co-occurrence matrixes is p (i, j);
The F1 energy:
Σ i Σ j p ( i , j ) 2 (formula one)
Wherein i, j distinguish horizontal stroke and the row in representing matrix;
The F2 entropy:
- Σ i Σ j p ( i , j ) log ( p ( i , j ) ) (formula two)
Wherein i, j distinguish horizontal stroke and the row in representing matrix;
The F3 dissimilarity:
Σ i Σ j n k p ( i , j ) , | i - j | = n (formula three)
Wherein, horizontal stroke and row in i, j difference representing matrix, k=1 and i ≠ j; N is gray level;
The F4 contrast:
Σ i = 0 n Σ j = 0 n ( i - j ) p ( i , j ) 2 (formula four)
Wherein, horizontal stroke and row in i, j difference representing matrix, n is gray level;
The F5 unfavorable variance:
Σ i Σ j p ( i , j ) 1 + n , | i - j | = n (formula five)
Wherein, horizontal stroke and row in i, j difference representing matrix, n is gray level;
The F6 correlativity:
Σ i Σ j ( ij ) p ( i , j ) - μ x μ y σ x σ y (formula six)
Wherein i, j distinguish horizontal stroke and the row in representing matrix, μ x, μ y, σ x, σ yP xAnd p yAverage and standard deviation;
The F7 homogeney:
Σ i Σ j 1 1 + ( i - μ ) 2 p ( i , j ) (formula seven)
Wherein i, j distinguish horizontal stroke and the row in representing matrix, and μ is average;
The F8 auto-correlation:
Σ i Σ j ( i , j ) p ( i , j ) - μ t 2 σ t 2 (formula eight)
where i and j denote the row and column indices of the matrix, and μ_t and σ_t are the mean and standard deviation of the row or column vectors;
F9 cluster shade:
$\sum_{i=1}^{n}\sum_{j=1}^{n}\{i+j-\mu_x-\mu_y\}^3\,p(i,j)$ (formula nine)
where i and j denote the row and column indices of the matrix, μ_x and μ_y are the means of p_x and p_y; n is the number of gray levels;
F10 cluster prominence:
$\sum_{i=1}^{n}\sum_{j=1}^{n}\{i+j-\mu_x-\mu_y\}^4\,p(i,j)$ (formula ten)
where i and j denote the row and column indices of the matrix, μ_x and μ_y are the means of p_x and p_y; n is the number of gray levels;
F11 maximum probability:
$\max_{i,j}\,p(i,j)$ (formula 11)
where i and j denote the row and column indices of the matrix;
F12 sum of squares:
$\sum_i\sum_j (i-\mu)^2\,p(i,j)$ (formula 12)
where i and j denote the row and column indices of the matrix, and μ is the mean;
F13 sum average:
$\sum_{i=2}^{2n} i\,p_{x+y}(i)$ (formula 13)
where i indexes the sum distribution p_{x+y}, obtained by summing p over all entries whose row and column indices add to i; n is the number of gray levels;
F14 sum variance:
$\sum_{i=2}^{2n}(i-f_8)^2\,p_{x+y}(i)$ (formula 14)
where i indexes the sum distribution p_{x+y}; f_8 is the sum entropy of F15; n is the number of gray levels;
F15 sum entropy:
$-\sum_{i=2}^{2n} p_{x+y}(i)\log\{p_{x+y}(i)\}=f_8$ (formula 15)
where i indexes the sum distribution p_{x+y}; n is the number of gray levels;
F16 difference variance:
$\sum_{i=0}^{n-1} i^2\,p_{x-y}(i)$ (formula 16)
where i indexes the difference distribution p_{x-y}, obtained by summing p over all entries whose row and column indices differ by i; n is the number of gray levels;
F17 difference entropy:
$-\sum_{i=0}^{n-1} p_{x-y}(i)\log\{p_{x-y}(i)\}$ (formula 17)
where i indexes the difference distribution p_{x-y}; n is the number of gray levels;
F18 information measure of correlation (1):
$\dfrac{HXY-HXY1}{\max\{HX,HY\}}$ (formula 18)
where $HXY=-\sum_i\sum_j p(i,j)\log(p(i,j))$, $HXY1=-\sum_i\sum_j p(i,j)\log\{p_x(i)\,p_y(j)\}$, and HX, HY are the entropies of p_x and p_y;
F19 information measure of correlation (2):
$\left(1-\exp\left[-2\,(HXY2-HXY)\right]\right)^{\frac{1}{2}}$ (formula 19)
where $HXY=-\sum_i\sum_j p(i,j)\log(p(i,j))$, HX and HY are the entropies of p_x and p_y, and
$HXY2=-\sum_i\sum_j p_x(i)\,p_y(j)\log\{p_x(i)\,p_y(j)\}$;
F20 maximal correlation coefficient:
$Q(i,j)=\sum_k \dfrac{p(i,k)\,p(j,k)}{p_x(i)\,p_y(k)}$ (formula 20)
where i and j denote the row and column indices of the matrix, and k runs over the gray levels;
F21 inverse difference normalized (INN):
$\sum_i\sum_j \dfrac{p(i,j)}{1+|i-j|\,\sigma}$ (formula 21)
where i and j denote the row and column indices of the matrix, and σ is the standard deviation;
F22 inverse difference moment normalized (IDN):
$\sum_i\sum_j \dfrac{p(i,j)}{1+(i-j)^2\,\sigma^2}$ (formula 22)
where i and j denote the row and column indices of the matrix, and σ is the standard deviation.
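As an illustration of how co-occurrence features of this kind can be computed in practice, here is a minimal plain-NumPy sketch (not the patent's implementation; the image, offset and gray-level count are made-up examples):

```python
import numpy as np

def glcm(img, dx, dy, levels):
    """Normalized, symmetric gray-level co-occurrence matrix for offset (dx, dy)."""
    m = np.zeros((levels, levels), dtype=float)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                m[img[y, x], img[y2, x2]] += 1.0
    m += m.T                                  # make the matrix symmetric
    return m / m.sum()                        # normalize to probabilities p(i, j)

rng = np.random.default_rng(0)
img = rng.integers(0, 8, size=(32, 32))       # stand-in for an 8-level beef ROI

p = glcm(img, 1, 0, levels=8)                 # 0-degree direction, distance 1
i, j = np.indices(p.shape)

energy = np.sum(p ** 2)                                  # F1
entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))           # F2
contrast = np.sum((i - j) ** 2 * p)                      # F4 (standard form)
homogeneity = np.sum(p / (1.0 + np.abs(i - j)))          # F7 (standard form)
```

Repeating the computation for the 45-, 90- and 135-degree offsets gives the four-direction feature set the text describes.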
In step (3) above, the fast two-dimensional Fourier transform method extracts 81 two-dimensional Fourier frequency-domain features. The Fourier transform is widely used in image processing and is often called the "second language" of images. This system exploits the frequency-domain properties of the Fourier transform and proposes beef image texture features based on a windowed two-dimensional Fourier spectrum.
The two-dimensional discrete Fourier transform of the function f(x, y) of an M × N image is as follows, where x and y are discrete spatial variables and u and v are discrete frequency variables:
$F(u,v)=\frac{1}{MN}\sum_{x=0}^{M-1}\sum_{y=0}^{N-1}f(x,y)\,e^{-j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)}$
where u = 0, 1, 2, …, M−1 and v = 0, 1, 2, …, N−1;
the inverse transform is:
$f(x,y)=\sum_{u=0}^{M-1}\sum_{v=0}^{N-1}F(u,v)\,e^{\,j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)}$
where x = 0, 1, 2, …, M−1 and y = 0, 1, 2, …, N−1.
When the Fourier spectrum of an image is displayed, its dynamic range exceeds the capability of the display device, so the dynamic range must be compressed; the most common method is a logarithmic adjustment.
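The logarithmic adjustment can be sketched as follows (illustrative only; the 0–255 scaling constant is a common display convention, not taken from the patent):

```python
import numpy as np

f = np.random.default_rng(1).random((64, 64))   # stand-in image
mag = np.abs(np.fft.fft2(f))                    # Fourier magnitude spectrum

# s = c * log(1 + |F|) compresses the spectrum's dynamic range into 0..255.
c = 255.0 / np.log(1.0 + mag.max())
s = c * np.log(1.0 + mag)
```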
The two-dimensional discrete Fourier transform has conjugate symmetry, as shown below:
|F(u, v)| = |F(−u, −v)|
Exploiting this conjugate symmetry, an l × l window of two-dimensional log-spectrum data log|F(u, v)| (l = 1, 2, …, 9) is chosen to describe the beef texture, with the upper-left corner of the window always at the center of the two-dimensional Fourier spectrum; that is, the beef texture features can be expressed as x = (x_1, …, x_p)^T, where p = l × l is the number of beef texture features, x_i is the spectral datum of the i-th feature of the beef sample texture, and the superscript T denotes transpose. Balancing computational cost against actual test results, a 9 × 9 window, i.e. 81 Fourier-spectrum texture features per beef sample, is chosen.
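A minimal sketch of the 81 window features under the stated convention (DC term at the upper-left of the spectrum array; the epsilon guard and the stand-in image are implementation assumptions, not from the patent):

```python
import numpy as np

rng = np.random.default_rng(0)
roi = rng.random((128, 128))             # stand-in for a beef muscle ROI

F = np.fft.fft2(roi) / roi.size          # 2-D DFT with the 1/(MN) factor
log_spec = np.log(np.abs(F) + 1e-12)     # log spectrum; epsilon avoids log(0)

# With the DC term at index (0, 0), the l x l window whose upper-left corner
# sits at the spectrum center is simply the top-left block.
l = 9
x = log_spec[:l, :l].ravel()             # p = l * l = 81 texture features
```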
In step (3) above, the Gabor wavelet transform method extracts 48 texture features.
The Gabor wavelet transform is a relatively recent analysis method. It inherits and develops the localization idea of the short-time Fourier transform while overcoming the drawback that the window size does not vary with frequency; it provides a time–frequency window that changes with frequency, making it an ideal tool for time–frequency analysis and signal processing. Its principal feature is the ability to emphasize selected characteristics of a problem through the transform, so wavelet transforms have been applied successfully in many fields, and the discrete wavelet algorithms in particular are widely used. The Gabor wavelet transform performs excellently in analyzing local frequency and orientation information in digital images and is widely used in computer vision and texture segmentation.
The third beef texture extraction method of the present invention identifies the beef image with a bank of Gabor filters. The center frequencies of the Gabor wavelet filters cover a range from low to high; the filter bank uses 6 orientations and 4 scales to guarantee multi-channel filtering, and the texture features of the beef image target are represented by the mean and standard deviation of the modulus of the Gabor wavelet coefficients. The implementation is as follows:
(a) Design of the Gabor filters
Because Gabor wavelets are non-orthogonal, the filtered images contain redundant information. To reduce this redundancy, the center-frequency range is limited to [U_l, U_h]; K denotes the number of orientations and S the number of scales of the multi-resolution decomposition, and the design strategy is to guarantee multi-channel filtering. The channel-filter parameters a, σ_u and σ_v are computed as:
$a=\left(U_h/U_l\right)^{\frac{1}{S-1}},\qquad \sigma_u=\frac{(a-1)\,U_h}{(a+1)\sqrt{2\ln 2}},$
$\sigma_v=\tan\!\left(\frac{\pi}{2K}\right)\left[U_h-2\ln\!\left(\frac{2\sigma_u^2}{U_h}\right)\right]\left[2\ln 2-\frac{(2\ln 2)^2\,\sigma_u^2}{U_h^2}\right]^{-\frac{1}{2}},$
where m = 0, 1, …, S−1; n = 0, 1, …, K−1; and u, v are the spatial-frequency variables.
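Plugging illustrative numbers into these design formulas (the values of U_l, U_h below are common texture-retrieval choices, assumed here because the patent does not fix them; S and K follow the text):

```python
import numpy as np

U_l, U_h = 0.05, 0.4      # assumed center-frequency range
S, K = 4, 6               # 4 scales, 6 orientations, as in the text

a = (U_h / U_l) ** (1.0 / (S - 1))                               # scale ratio
sigma_u = ((a - 1.0) * U_h) / ((a + 1.0) * np.sqrt(2.0 * np.log(2.0)))
sigma_v = (np.tan(np.pi / (2 * K))
           * (U_h - 2.0 * np.log(2.0 * sigma_u ** 2 / U_h))
           * (2.0 * np.log(2.0)
              - (2.0 * np.log(2.0)) ** 2 * sigma_u ** 2 / U_h ** 2) ** -0.5)
print(a, sigma_u, sigma_v)
```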
(b) Extraction of the Gabor texture feature values
Given an image I(x, y), its Gabor wavelet transform is defined as
$W_{mn}(x,y)=\int I(x_1,y_1)\,g_{mn}^{*}(x-x_1,\,y-y_1)\,dx_1\,dy_1,$
where W_mn is the transformed image, * denotes the complex conjugate, g_mn is the wavelet function, and (x, y), (x_1, y_1) are image point coordinates;
here $g_{mn}(x,y)=a^{-m}g(x',y')$, a > 1; m = 0, 1, …, S−1; n = 0, 1, …, K−1;
$x'=a^{-m}(x\cos\theta+y\sin\theta)$ and $y'=a^{-m}(-x\sin\theta+y\cos\theta)$.
The mean μ_mn and standard deviation σ_mn of the modulus of the Gabor wavelet coefficients represent the extracted gray-image target features, that is:
$\mu_{mn}=\iint|W_{mn}(x,y)|\,dx\,dy$ and $\sigma_{mn}=\sqrt{\iint\left(|W_{mn}(x,y)|-\mu_{mn}\right)^2\,dx\,dy}.$
The feature vector is thus built from μ_mn and σ_mn; with S = 4 and K = 6 the system forms the feature vector
$\bar f=[\mu_{00}\ \sigma_{00}\ \mu_{01}\ \cdots\ \mu_{35}\ \sigma_{35}]$
Applying the designed Gabor filters, the means μ_mn and standard deviations σ_mn yield the feature vector; the texture feature vector of each 128 × 128 × 8-bit beef image target has 24 × 2 = 48 components.
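A compact sketch of the 48-feature extraction described above (plain NumPy with FFT-based convolution; the kernel size, frequencies and envelope widths are illustrative assumptions, not the patent's filter bank):

```python
import numpy as np

def gabor_kernel(freq, theta, sigma, size=15):
    """Complex Gabor kernel: isotropic Gaussian envelope times a complex sinusoid."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)     # coordinates rotated by theta
    yr = -x * np.sin(theta) + y * np.cos(theta)
    env = np.exp(-(xr ** 2 + yr ** 2) / (2.0 * sigma ** 2))
    return env * np.exp(2j * np.pi * freq * xr)

rng = np.random.default_rng(0)
img = rng.random((128, 128))                # stand-in for a 128 x 128 beef ROI

feats = []
for m in range(4):                          # S = 4 scales
    for n in range(6):                      # K = 6 orientations
        k = gabor_kernel(freq=0.05 * 2 ** m,
                         theta=n * np.pi / 6,
                         sigma=4.0 / 2 ** (m / 2))
        # FFT-based convolution (circular boundary, fine for a sketch).
        W = np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(k, img.shape))
        mag = np.abs(W)                     # |W_mn(x, y)|
        feats += [mag.mean(), mag.std()]    # mu_mn and sigma_mn
feats = np.asarray(feats)
print(feats.shape)                          # 24 channels x 2 = 48 features
```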
Based on the 217 texture feature values, this system builds two different tenderness prediction models: a stepwise-regression prediction model and a support-vector-machine (SVM) prediction model.
The stepwise regression procedure is as follows. At every step, the partial regression sum of squares (i.e., the contribution) is computed for each beef-image texture-feature variable already in the regression equation; the variable with the smallest partial regression sum of squares is selected and tested for significance at a preset level. If it is significant, it need not be removed from the equation, and neither do the other variables in the equation (their partial regression sums of squares are all larger than the smallest one, so they need removal even less). If it is not significant, the variable is removed, and the remaining variables in the equation are then tested one by one in ascending order of their partial regression sums of squares, so that all variables with insignificant effects are removed and only significant ones remain. Next, the partial regression sum of squares is computed for each texture-feature variable not yet in the equation; the variable with the largest value is selected and likewise tested for significance at the given level, and if significant it is introduced into the equation. The process repeats until no variable in the equation can be removed and no new variable can be introduced, at which point the stepwise regression ends. The specific implementation steps are as follows:
(a) Label the beef samples according to their actually measured tenderness values and divide them into a prediction set and a test set;
(b) build the stepwise regression model from the 217 texture feature values of the prediction-set samples;
(c) substitute the test-set samples into the fitted stepwise-regression prediction model;
(d) compare the stepwise-regression predictions with the actually tested tenderness results to obtain the prediction accuracy.
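The stepwise idea can be sketched as a simplified forward selection on partial-F statistics (entry steps only; the full procedure described above also removes variables, and the threshold, data and feature count here are made up for illustration):

```python
import numpy as np

def rss(X_sel, y):
    """Residual sum of squares of a least-squares fit with intercept."""
    A = np.column_stack([np.ones(len(y)), X_sel])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sum((y - A @ beta) ** 2))

def forward_stepwise(X, y, f_enter=4.0):
    """Greedy forward selection: add the candidate with the largest partial-F
    until no candidate exceeds the entry threshold."""
    n, p = X.shape
    selected = []
    while True:
        rss_cur = (rss(X[:, selected], y) if selected
                   else float(np.sum((y - y.mean()) ** 2)))
        best, best_f = None, f_enter
        for j in range(p):
            if j in selected:
                continue
            rss_new = rss(X[:, selected + [j]], y)
            df = n - len(selected) - 2        # residual dof of the larger model
            f = (rss_cur - rss_new) / (rss_new / df)
            if f > best_f:
                best, best_f = j, f
        if best is None:
            return selected                   # no significant candidate left
        selected.append(best)

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 10))             # 10 candidate texture features
y = 3 * X[:, 2] - 2 * X[:, 7] + 0.1 * rng.standard_normal(60)
sel = forward_stepwise(X, y)
print(sel)
```

On this synthetic data the two truly informative features (columns 2 and 7) are picked up by the selection.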
The support vector machine (SVM) was proposed in the mid-1990s as a new pattern-recognition method grounded in statistical learning theory. This classification method shows many distinctive advantages in solving small-sample, nonlinear and high-dimensional pattern-recognition problems. The core idea of the SVM is to map samples that are linearly inseparable in the input space into a high-dimensional feature space by introducing a kernel function, so that they become linearly or approximately linearly separable.
The support vector machine has the following advantages:
(1) It is based on the structural risk minimization (SRM) principle and therefore generalizes well;
(2) the algorithm ultimately reduces to a convex optimization problem, guaranteeing global optimality and avoiding the local-minimum problem that is unavoidable with neural networks;
(3) using the kernel trick, a nonlinear problem in the input space is mapped by a nonlinear function into a high-dimensional feature space, where a linear discriminant function is constructed;
(4) it has a rigorous theoretical and mathematical foundation, avoiding the empirical elements of neural-network training;
(5) its algorithmic complexity is independent of the sample dimension.
When an SVM is applied to practical problems, the choice of kernel function and its parameters is crucial: it directly determines the performance of the SVM classifier, so how to choose them is an important aspect of SVM research. Using a radial basis function (RBF) kernel requires two structural parameters: the penalty coefficient C and the kernel parameter σ. Their influence on SVM performance is complex and closely tied to the concrete application, and at present there is no unified method guiding the choice of C and σ. This system therefore implements its own method for automatically computing and extracting the penalty coefficient C and RBF parameter σ, obtaining the values that give the best model prediction rate. The concrete steps are to delimit ranges for C and σ and then substitute the values within these ranges into the support-vector-machine system in turn until the best prediction result is found.
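The C/σ search just described can be sketched with a plain cross-validated grid search (scikit-learn parameterizes the RBF width as gamma rather than σ, with gamma = 1/(2σ²); the grid values and the toy data are illustrative, not the patent's):

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((120, 8))               # stand-in texture features
y = (X[:, 0] + X[:, 1] > 0).astype(int)         # stand-in tender/tough labels

# Grid over the penalty C and the RBF width; each pair is scored by
# 5-fold cross-validation and the best-scoring pair is kept.
grid = {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), grid, cv=5)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```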
The specific implementation step of SVM prediction model is as follows:
(a) Label the beef samples according to their actually measured tenderness values and divide them into a prediction set and a test set;
(b) build the support vector machine model from the 217 texture feature values of the prediction-set samples;
(c) choose the SVM penalty coefficient and kernel-function parameters;
(d) substitute the test-set samples into the fitted SVM prediction model;
(e) compare the SVM predictions with the actually tested tenderness results to obtain the prediction accuracy.
This system classifies and predicts the texture features of the beef samples with stepwise regression and the support vector machine respectively. The results show that, for multispectral image acquisition of the beef samples, the best prediction accuracies of the support vector machine are 75% (440 nm), 77% (550 nm), 77% (710 nm) and 77% (810 nm), while the best tenderness prediction accuracies of the stepwise regression model are 85% (440 nm), 75% (550 nm), 75% (710 nm) and 90% (810 nm). The beef-tenderness detection device and method based on multispectral image texture features of the present invention can therefore perform effective online tenderness prediction of beef samples, laying a foundation for a future online beef quality inspection system.
The above is only a preferred embodiment of the present invention and does not limit the present invention in any form. Although the invention is disclosed above by way of a preferred embodiment, this is not intended to limit it; any person skilled in the art may, without departing from the scope of the technical solution of the present invention, use the technical content disclosed above to make minor changes or modifications yielding equivalent embodiments. Any simple modification, equivalent variation or refinement of the above embodiment made according to the technical essence of the present invention, insofar as it does not depart from the content of the technical solution, still falls within the scope of the technical solution of the present invention.

Claims (13)

1. A beef-tenderness detection device based on multispectral image texture features, characterized by comprising a lamp box, an illumination system, an imaging system, an object stage and a computer, wherein the object stage is installed at the bottom of the lamp box; the imaging system comprises a CCD digital camera, a multispectral filter and a camera support; the CCD digital camera is fixed on the camera support, which is arranged at the top of the lamp box; the multispectral filter is mounted in front of the lens of the CCD digital camera, and the lens is aligned with the center of the object-stage surface; the computer is connected with the CCD digital camera and processes and stores the acquired multispectral images; the illumination system comprises fluorescent-lamp assemblies and halogen-lamp light-source assemblies, the fluorescent-lamp assemblies being mounted at the bottom of the object stage to simulate natural light, and the halogen-lamp light-source assemblies being mounted on both sides of the lamp-box top as auxiliary light sources.
2. The beef-tenderness detection device based on multispectral image texture features according to claim 1, characterized in that: the object stage is movably mounted at the bottom of the lamp box and can be freely taken out.
3. The beef-tenderness detection device based on multispectral image texture features according to claim 1, characterized in that: the fluorescent-lamp assembly comprises two 50 W fluorescent tubes.
4. The beef-tenderness detection device based on multispectral image texture features according to claim 1 or claim 3, characterized in that: the two 50 W fluorescent tubes are respectively arranged on the two sides of the bottom of the object stage.
5. The beef-tenderness detection device based on multispectral image texture features according to claim 1, characterized in that: two circular holes are provided on the two sides of the lamp-box top for inserting the halogen-lamp light-source assemblies.
6. The beef-tenderness detection device based on multispectral image texture features according to claim 1, characterized in that: the multispectral filter comprises filter plates of 4 different wavebands, namely 440 nm, 550 nm, 710 nm and 810 nm.
7. A beef-tenderness detection method based on multispectral image texture features, characterized by comprising the following steps:
(1) using the beef-tenderness detection device based on multispectral image texture features, acquire multispectral images of the beef sample against a black background;
(2) perform spectral-band segmentation of the multispectral images to obtain the beef muscle region of interest (ROI);
(3) extract 217 texture feature parameters of the image using three texture-feature extraction methods: the gray-level co-occurrence matrix, the fast two-dimensional Fourier transform, and the Gabor wavelet transform;
(4) establish stepwise-regression and support-vector-machine tenderness prediction models, perform model building and model testing on tender and tough beef respectively, and then compare the model results.
8. the method for the detection tenderness of beef utilizing based on the multispectral image textural characteristics according to claim 7 is characterized in that: in described step (2), the multispectral image spectral coverage is cut apart to process with image and is comprised the following steps:
(8a), the beef multispectral image of four wave bands being carried out pixel reads;
(8b), by to the orientation and segmentation of area-of-interest pixel, obtain the multispectral image muscle region of interest area image under different-waveband.
9. the method for the detection tenderness of beef utilizing based on the multispectral image textural characteristics according to claim 7 is characterized in that: in described step (3), the gray level co-occurrence matrixes method has been extracted 88 textural characteristics of four direction, and detailed statement and the computing formula of described feature are as follows:
If f (x, y) is a width two-dimensional digital image, its size is M * N, and grey level is 0-n, and its gray level co-occurrence matrixes is p (i, j);
F1 energy:
$\sum_i\sum_j p(i,j)^2$ (formula one)
where i and j denote the row and column indices of the matrix;
F2 entropy:
$-\sum_i\sum_j p(i,j)\log(p(i,j))$ (formula two)
where i and j denote the row and column indices of the matrix;
F3 dissimilarity:
$\sum_i\sum_j n\,k\,p(i,j),\quad |i-j|=n$ (formula three)
where i and j denote the row and column indices of the matrix, k = 1 and i ≠ j; n is the gray-level difference;
F4 contrast:
$\sum_i\sum_j (i-j)^2\,p(i,j)$ (formula four)
where i and j denote the row and column indices of the matrix;
F5 inverse variance:
$\sum_{i\neq j}\dfrac{p(i,j)}{(i-j)^2}$ (formula five)
where i and j denote the row and column indices of the matrix;
F6 correlation:
$\sum_i\sum_j\dfrac{(i-\mu_x)(j-\mu_y)\,p(i,j)}{\sigma_x\,\sigma_y}$ (formula six)
where i and j denote the row and column indices of the matrix, and μ_x, μ_y, σ_x, σ_y are the means and standard deviations of p_x and p_y;
F7 homogeneity:
$\sum_i\sum_j\dfrac{p(i,j)}{1+|i-j|}$ (formula seven)
where i and j denote the row and column indices of the matrix;
F8 autocorrelation:
$\sum_i\sum_j (i\,j)\,p(i,j)$ (formula eight)
where i and j denote the row and column indices of the matrix, and μ_t and σ_t are the mean and standard deviation of the row or column vectors;
F9 cluster shade:
$\sum_{i=1}^{n}\sum_{j=1}^{n}\{i+j-\mu_x-\mu_y\}^3\,p(i,j)$ (formula nine)
where i and j denote the row and column indices of the matrix, μ_x and μ_y are the means of p_x and p_y; n is the number of gray levels;
F10 cluster prominence:
$\sum_{i=1}^{n}\sum_{j=1}^{n}\{i+j-\mu_x-\mu_y\}^4\,p(i,j)$ (formula ten)
where i and j denote the row and column indices of the matrix, μ_x and μ_y are the means of p_x and p_y; n is the number of gray levels;
F11 maximum probability:
$\max_{i,j}\,p(i,j)$ (formula 11)
where i and j denote the row and column indices of the matrix;
F12 sum of squares:
$\sum_i\sum_j (i-\mu)^2\,p(i,j)$ (formula 12)
where i and j denote the row and column indices of the matrix, and μ is the mean;
F13 sum average:
$\sum_{i=2}^{2n} i\,p_{x+y}(i)$ (formula 13)
where i indexes the sum distribution p_{x+y}, obtained by summing p over all entries whose row and column indices add to i; n is the number of gray levels;
F14 sum variance:
$\sum_{i=2}^{2n}(i-f_8)^2\,p_{x+y}(i)$ (formula 14)
where i indexes the sum distribution p_{x+y}; f_8 is the sum entropy of F15; n is the number of gray levels;
F15 sum entropy:
$-\sum_{i=2}^{2n} p_{x+y}(i)\log\{p_{x+y}(i)\}=f_8$ (formula 15)
where i indexes the sum distribution p_{x+y}; n is the number of gray levels;
F16 difference variance:
$\sum_{i=0}^{n-1} i^2\,p_{x-y}(i)$ (formula 16)
where i indexes the difference distribution p_{x-y}, obtained by summing p over all entries whose row and column indices differ by i; n is the number of gray levels;
F17 difference entropy:
$-\sum_{i=0}^{n-1} p_{x-y}(i)\log\{p_{x-y}(i)\}$ (formula 17)
where i indexes the difference distribution p_{x-y}; n is the number of gray levels;
F18 information measure of correlation (1):
$\dfrac{HXY-HXY1}{\max\{HX,HY\}}$ (formula 18)
where $HXY=-\sum_i\sum_j p(i,j)\log(p(i,j))$, $HXY1=-\sum_i\sum_j p(i,j)\log\{p_x(i)\,p_y(j)\}$, and HX, HY are the entropies of p_x and p_y;
F19 information measure of correlation (2):
$\left(1-\exp\left[-2\,(HXY2-HXY)\right]\right)^{\frac{1}{2}}$ (formula 19)
where $HXY=-\sum_i\sum_j p(i,j)\log(p(i,j))$, HX and HY are the entropies of p_x and p_y, and $HXY2=-\sum_i\sum_j p_x(i)\,p_y(j)\log\{p_x(i)\,p_y(j)\}$;
F20 maximal correlation coefficient:
$Q(i,j)=\sum_k \dfrac{p(i,k)\,p(j,k)}{p_x(i)\,p_y(k)}$ (formula 20)
where i and j denote the row and column indices of the matrix, and k runs over the gray levels;
F21 inverse difference normalized (INN):
$\sum_i\sum_j \dfrac{p(i,j)}{1+|i-j|\,\sigma}$ (formula 21)
where i and j denote the row and column indices of the matrix, and σ is the standard deviation;
F22 inverse difference moment normalized (IDN):
$\sum_i\sum_j \dfrac{p(i,j)}{1+(i-j)^2\,\sigma^2}$ (formula 22)
where i and j denote the row and column indices of the matrix, and σ is the standard deviation.
10. the method for the detection tenderness of beef utilizing based on the multispectral image textural characteristics according to claim 7 is characterized in that: in described step (3), the fast two-dimensional fourier transformation method has been extracted 81 two-dimension fourier frequency domain characters,
Function f (x, the y) two dimensional discrete Fourier transform of one width M * N image is as follows, and x and y are discrete real variables, and u and v are the discrete frequency variable;
In formula: u=0,1,2 ..., M-1, v=0,1,2 ..., N-1;
The inverse transformation formula is as follows:
Figure FDA00002818704000064
In formula: u=0,1,2 ..., M-1, v=0,1,2 ..., N-1;
Figure FDA00002818704000065
In formula: x=0,1,2 ..., M-1; Y=0,1,2 ..., N-1;
Two-dimensional discrete function Fourier transform has conjugate symmetry, as shown below,
|F(u,v)|=|F(-u,-v)|
According to the fast two-dimensional fourier transformation conjugate symmetry, choose the two dimensional logarithmic spectral window data log[F (u, v) of l * l] (l=1,2 ..., 9) and the beef textural characteristics described, the window upper left corner is always two-dimentional Fourier spectrum center, and namely the beef textural characteristics can be expressed as
Figure FDA00002818704000071
P=l * l is the Characteristic Number of beef texture, [x] iBe the frequency spectrum data of i feature of beef sample texture, T is dimension; Consider from calculated amount and actual tests, choose 9 * 9, totally 81 beef sample fourier spectrum textural characteristics.
11. the method for the detection tenderness of beef utilizing based on the multispectral image textural characteristics according to claim 7 is characterized in that: in described step (3), the Gabor Wavelet Transform has been extracted 48 textural characteristics, and its specific implementation is as follows:
(11a), the design of Gabor wave filter
Nonopiate the meaning in the image of filtering of Gabor small echo exists redundant information, and in order to reduce redundancy, central frequency range is U lAnd U h, k represents direction number, and s represents scale parameter, and in many resolution decomposition, the strategy of design is exactly to guarantee multi-channel filter; Calculated by following formula, and use respectively a, σ u, σ vRepresent channel filtering:
Figure FDA00002818704000072
Figure FDA00002818704000073
Figure FDA00002818704000074
M=0 wherein, 1 ..., S-1; N=0,1 ..., S-1; U, v are the spatial frequency variable;
(b), Gabor textural characteristics value is extracted
Provide image I (x, y), its Gabor wavelet transformation is defined as
W mn(x,y)=∫I(x 1,y 1)g mn*(x-x 1,y-y 1)dxdy,
W mnRepresent the image after wavelet transformation, * represents complex conjugate, g mnRepresent the wavelet transformation function, x y, x 1, y 1Representative image is put coordinate anyhow respectively;
G wherein mn(x, y)=a -mG (x', y'), a>1; M=0,1 ..., S-1; N=0,1 ..., S-1x'=a -m(xcos θ+ysin θ), and y'=a -m(xsin θ+ycos θ);
Average value mu with the mould of Gabor wavelet conversion coefficient mnWith its standard variance σ mnRepresent the gray level image clarification of objective that extracts, that is:
μ mn=∫ ∫ | W mn(xy) | dxdy, and
Figure FDA00002818704000081
So proper vector μ mnAnd σ mnBuild, the proper vector of using S=4 and K=6 to form in system is
Figure FDA00002818704000082
Use the Gabor wave filter of design, average value mu mnWith its standard variance σ mnObtained proper vector. the texture feature vector of the beef image target of each 128 * 128 * 8bit is 24 * 2=48.
12. the method for the detection tenderness of beef utilizing based on the multispectral image textural characteristics according to claim 7 is characterized in that: the specific implementation step of the successive Regression forecast model in described step (4) is as follows:
(12a), tender degree detected value actual in beef carry out labeling to the beef sample, is divided into forecast set and test set;
(12b), set up the successive Regression model according to 217 textural characteristics values of forecast set sample;
The successive Regression forecast model of (12c), the substitution of test set sample being built up;
(12d), successive Regression forecast model result and the tender degree result of actual test are compared, draw predictablity rate.
13. the method for the detection tenderness of beef utilizing based on the multispectral image textural characteristics according to claim 7 is characterized in that: the specific implementation step of the SVM prediction model in described step (4) is as follows:
(13a), tender degree detected value actual in beef carry out labeling to the beef sample, is divided into forecast set and test set;
(13b), set up supporting vector machine model according to 217 textural characteristics values of forecast set sample;
(13c), choose support vector machine penalty coefficient and kernel functional parameter;
The SVM prediction model of (13d), the substitution of test set sample being built up;
(13e), SVM prediction model result and the tender degree result of actual test are compared, draw predictablity rate.
CN2013100473068A 2013-02-05 2013-02-05 Multispectral image textural feature-based beef tenderness detection device and method thereof Pending CN103149163A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2013100473068A CN103149163A (en) 2013-02-05 2013-02-05 Multispectral image textural feature-based beef tenderness detection device and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2013100473068A CN103149163A (en) 2013-02-05 2013-02-05 Multispectral image textural feature-based beef tenderness detection device and method thereof

Publications (1)

Publication Number Publication Date
CN103149163A true CN103149163A (en) 2013-06-12

Family

ID=48547368

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2013100473068A Pending CN103149163A (en) 2013-02-05 2013-02-05 Multispectral image textural feature-based beef tenderness detection device and method thereof

Country Status (1)

Country Link
CN (1) CN103149163A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104182997A (en) * 2014-08-15 2014-12-03 浙江科技学院 Hyperspectral image compression method and application
CN105701805A (en) * 2016-01-07 2016-06-22 中国农业大学 Pork intramuscular fat content nondestructive testing method based on computer vision
CN105929121A (en) * 2016-04-25 2016-09-07 吉林大学 Mastication robot for detecting tenderness of beef
CN106940292A (en) * 2017-04-25 2017-07-11 合肥工业大学 Bar denier wood raw material quick nondestructive discrimination method of damaging by worms based on multi-optical spectrum imaging technology
CN109632081A (en) * 2018-11-23 2019-04-16 积成电子股份有限公司 Vibration of wind generating set feature extraction and otherness sentence method for distinguishing
CN109883966A (en) * 2019-02-26 2019-06-14 江苏大学 A method of Eriocheir sinensis amount of cure is detected based on multispectral image technology
CN110163101A (en) * 2019-04-17 2019-08-23 湖南省中医药研究院 The difference of Chinese medicine seed and grade quick discrimination method
US11497221B2 (en) 2019-07-19 2022-11-15 Walmart Apollo, Llc Systems and methods for managing meat cut quality

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201464772U (en) * 2009-07-20 2010-05-12 南京农业大学 Portable collecting device for rib-eye images of beef carcass
CN102854148A (en) * 2012-08-30 2013-01-02 中国农业大学 Detection and grading system for tenderness of fresh beef based on multispectral imagery

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201464772U (en) * 2009-07-20 2010-05-12 南京农业大学 Portable collecting device for rib-eye images of beef carcass
CN102854148A (en) * 2012-08-30 2013-01-02 中国农业大学 Detection and grading system for tenderness of fresh beef based on multispectral imagery

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
X.SUN等: "Predicting beef tenderness using color and multispectral image texture features", 《MEAT SCIENCE》, vol. 92, no. 4, 31 December 2012 (2012-12-31) *
陆秋琰等: "牛肉图像采集光照系统的设计与研究", 《农机化研究》, 30 June 2008 (2008-06-30) *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104182997B (en) * 2014-08-15 2017-05-10 Zhejiang University of Science and Technology Hyperspectral image compression method and application
CN104182997A (en) * 2014-08-15 2014-12-03 Zhejiang University of Science and Technology Hyperspectral image compression method and application
CN105701805B (en) * 2016-01-07 2019-01-08 China Agricultural University Non-destructive detection method for pork intramuscular fat content based on computer vision
CN105701805A (en) * 2016-01-07 2016-06-22 China Agricultural University Non-destructive detection method for pork intramuscular fat content based on computer vision
CN105929121A (en) * 2016-04-25 2016-09-07 Jilin University Mastication robot for detecting beef tenderness
CN105929121B (en) * 2016-04-25 2018-07-03 Jilin University Mastication robot for detecting beef tenderness
CN106940292A (en) * 2017-04-25 2017-07-11 Hefei University of Technology Rapid non-destructive discrimination method for worm-damaged badam (almond) raw material based on multispectral imaging technology
CN109632081A (en) * 2018-11-23 2019-04-16 Integrated Electronic Systems Lab Co., Ltd. Method for vibration feature extraction and difference discrimination of wind turbine generator sets
CN109883966A (en) * 2019-02-26 2019-06-14 Jiangsu University Method for detecting the aging degree of Eriocheir sinensis based on multispectral image technology
CN109883966B (en) * 2019-02-26 2021-09-10 Jiangsu University Method for detecting the aging degree of Eriocheir sinensis based on multispectral image technology
CN110163101A (en) * 2019-04-17 2019-08-23 Hunan Academy of Chinese Medicine Method for rapidly distinguishing seeds of traditional Chinese medicinal materials and rapidly judging their grades
CN110163101B (en) * 2019-04-17 2022-09-23 Hunan Academy of Chinese Medicine Method for rapidly distinguishing seeds of traditional Chinese medicinal materials and rapidly judging their grades
US11497221B2 (en) 2019-07-19 2022-11-15 Walmart Apollo, Llc Systems and methods for managing meat cut quality
US11864562B2 (en) 2019-07-19 2024-01-09 Walmart Apollo, Llc Systems and methods for managing meat cut quality

Similar Documents

Publication Publication Date Title
CN103149163A (en) Multispectral image textural feature-based beef tenderness detection device and method thereof
CN104392463B (en) Image salient region detection method based on joint sparse multi-scale fusion
Al Bashish et al. A framework for detection and classification of plant leaf and stem diseases
CN103186904B (en) Image contour extraction method and device
CN107977671A (en) Tongue image classification method based on multi-task convolutional neural networks
CN103729842B (en) Fabric defect detection method based on local statistical features and global saliency analysis
CN102043945B (en) License plate character recognition method based on real-time vehicle tracking and binary index classification
CN109635875A (en) End-to-end network port detection method based on deep learning
CN108765408A (en) Method for building a virtual case library of cancer pathology images and a multi-scale cancer detection system based on convolutional neural networks
CN110210362A (en) Traffic sign detection method based on convolutional neural networks
CN102496023B (en) Pixel-level region-of-interest extraction method
CN102096824B (en) Multispectral image ship detection method based on a selective visual attention mechanism
CN101667245B (en) Face detection method cascading novel detection classifiers based on support vectors
CN107527023B (en) Polarimetric SAR image classification method based on superpixels and topic models
CN106447646A (en) Blind quality evaluation method for unmanned aerial vehicle images
CN109034224A (en) Hyperspectral classification method based on dual-branch networks
CN103185731A (en) Device for detecting beef tenderness based on color image textural features and method thereof
CN101604382A (en) Learning-fatigue recognition and intervention method based on facial expression recognition
CN105809173B (en) Image RSTN-invariant feature extraction and recognition method based on biomimetic visual transformation
CN106506901A (en) Hybrid digital image halftoning method based on a saliency visual attention model
CN106326834A (en) Automatic human gender identification method and apparatus
CN106844739A (en) Remote sensing image change information retrieval method based on neural network co-training
Chen et al. Agricultural remote sensing image cultivated land extraction technology based on deep learning
Muthevi et al. Leaf classification using completed local binary pattern of textures
CN109635726A (en) Landslide identification method based on deep network fusion with symmetric multi-scale pooling

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20130612
