IE922603A1 - Method and apparatus to be used by individual treatment of pieces of meat - Google Patents

Method and apparatus to be used by individual treatment of pieces of meat

Info

Publication number
IE922603A1
Authority
IE
Ireland
Prior art keywords
picture
meat
certain
carcass
video camera
Prior art date
Application number
IE260392A
Inventor
Per Lundsfryd Jensen
Torben Nielsen
Hans Henrik Thodberg
Original Assignee
Slagteriernes Forskningsinst
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed (see https://patents.darts-ip.com/?family=8105530&patent=IE922603(A1)). "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Slagteriernes Forskningsinst filed Critical Slagteriernes Forskningsinst
Publication of IE922603A1 publication Critical patent/IE922603A1/en

Links

Classifications

    • A HUMAN NECESSITIES
    • A22 BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22B SLAUGHTERING
    • A22B5/00 Accessories for use during or after slaughtering
    • A22B5/0017 Apparatus for cutting, dividing or deboning carcasses
    • A22B5/0058 Removing feet or hooves from carcasses
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration by the use of local operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30128 Food products

Abstract

In the automatic shape analysis of individual carcasses (2) it may be necessary to localize certain areas of the meat surface, such as certain anatomical areas. For this purpose the meat is illuminated by a source of light (4), and a picture of the meat surface is recorded by means of a video camera (3). The picture is stored and data processed in a computer (6), after which the computer transmits a signal that depends upon the localization, to be used in the subsequent treatment of the carcass, for example for controlling an automatic cutting machine. The method is characterised by data processing of the stored picture including the use of a directional filtration method which highlights a certain direction in the picture. In this way the said area of the meat surface is located with increased certainty. This can be performed by means of a matrix applied along the said direction.

Description

Method and apparatus to be used by individual treatment of pieces of meat

The present invention relates to a method by individual treatment of pieces of meat, comprising the steps of illuminating a piece of meat by a light source, recording a video picture of the meat surface by means of a video camera, registering the recorded picture, data processing the picture in a computer to localize certain areas of the meat surface, such as certain anatomical areas, outputting a signal depending upon the localization, and using the signal to control the subsequent treatment of the piece of meat.
In connection with automated treatment of carcasses it has been suggested to use video recordings to determine the external characteristics of the carcass, carrying out the treatment on the basis of the characteristics found. Video recordings may for instance be used in connection with semi- or fully automatic classification of carcasses. A video picture of the carcass is registered, and the registered picture is data processed to find the actual shape and colour, which are significant parameters for the classification of a carcass.
Another application of video recordings is for the determination of the meat/fat ratio of a carcass. A picture is recorded of the surface produced by the splitting-up of a carcass into halves and the recorded picture is data processed. The grey areas of the picture are considered to be meat, whereas the light areas represent fat. The black areas are ignored, since they form the background.
Video recordings may also be used in connection with semi or fully automatic processing of carcasses, e.g. for the automatic adjustment of a saw that is to cut up carcass halves. By appropriate data processing of a picture of the carcass, the position of predetermined anatomical parts which correlate with the cutting position can be found. The position of the saw or the carcass may then be adjusted in accordance with the position found, so that the cut is placed correctly. Anatomical parts which may be used for this purpose are e.g. the forelegs or backbone of a split-up carcass.
Mostly, the registered picture is used only for the determination of the contour of the carcass, and the localization of the anatomical area must therefore be made on the basis of the contour data. However, there is a limit to the accuracy and variation of treatment that may be performed on carcasses on the basis of such localizations.
Internal anatomical parts, which may be seen in split-up carcasses, constitute a more precise reference to the automated treatment. The location of the individual spinal members could for instance give a considerably higher accuracy in the determination of the cutting positions than e.g. the contour. Precision is essential in cutting a split-up carcass into three parts.
The problem is, however, that it is extremely difficult to provide a video picture in which the anatomical parts may be identified and located with sufficient certainty. This may be due partly to a relatively poor contrast in the picture between the individual parts, partly to the fact that the carcasses may be of different build, and partly to loose membranes or shreds of meat which in some cases have been pulled in by the splitting-up operation to cover the looked-for anatomical parts and which therefore blur or conceal their presence.
The objective of the present invention is to provide a method by which anatomical parts may be localized with good certainty under conditions prevailing in slaughterhouses.
The method of the invention is characterized in that the data processing of the registered picture in the computer comprises a directional filtration, highlighting a certain direction in the picture.
By means of the present directional filtration there will be an elucidation of the anatomical structures which run in the same direction as the said direction in the picture. The position of the anatomical parts comprising such structures may therefore be determined with increased certainty, and in this way it will be possible to perform an improved automated treatment of the pieces of meat in slaughterhouses.
By means of the present method it is possible to localize certain anatomical areas on a split-up pig carcass with sufficient accuracy to be used in an automatic cutting machine. Controlled by the localization data obtained, a carcass half may be cut up automatically into three parts, fore-end, middle and ham.
Tests have shown that the cutting operations are performed as accurately as the manual cuttings being performed today.
Anatomical areas which have proved suitable for tripartition of a carcass are the joints of the backbone, especially the discs between specific vertebrae and the disc between the last vertebra and the first caudal vertebra (the sharp bend of the backbone).
An advantage of the present measuring method is that no manning is required; furthermore it is not destructive, i.e. there is no deterioration of the meat as a consequence of the measuring.
It has been shown that the method may be performed quickly enough to allow e.g. 360 carcasses to be measured and treated per hour, which is satisfactory under most slaughterhouse conditions.
In the following a number of embodiments of the present invention are described.
The directional filtration may be performed by means of a matrix, the numbers of which are higher in one direction than in other directions.
A square matrix with at least 3*3 and at most 7*7 elements may be applied.
The data processing may furthermore be designed to highlight areas of a certain width.
For each line that can be drawn in the picture parallel with a certain direction a summation of the light values of the pixels of the picture may be performed, and the obtained sum values may be applied for localization of a certain area.
The sum values may be subjected to a non-linear transformation.
The set of data derived from the sum values may be used for the matching of a template representing the searched-for anatomical areas, the template being preferably dislocated and stretched until the greatest similarity with the curve formed by the values of the set of data is obtained.
The source of light may be placed in such a way that it forms a shadow area in the field of vision of the video camera, the optical axes of the video camera and the source of light forming an angle to each other.
Based upon a localized area, e.g. between two specific vertebral joints of a carcass, a calculation may be made by the computer according to an algorithm, which - if desired - may comprise data about the dimension, weight and sex of the carcass, and a signal depending upon the calculation may be output to a treatment plant for the piece of meat to adjust the positions of its tools, e.g. in a cutting machine to adjust the position of the saw in order to perform a correct cutting operation on e.g. the ham of a carcass.
An apparatus to be used by individual treatment of pieces of meat comprises a light source to illuminate a piece of meat, a video camera to record a video picture of the meat surface and a computer for registering and data processing of the picture to localize certain areas of the meat surface, such as certain anatomical areas and a signal output module in the computer for the output of a signal depending upon the localization, to be used by the subsequent treatment of the piece of meat.
The data processing unit of the computer comprises a process of directional filtration highlighting a certain direction in the picture.
The apparatus is able, with improved certainty, to localize anatomical parts which run parallel with the said direction. In this way the subsequent treatment of the piece of meat, controlled by the localization, may take place with increased accuracy.
The light source may be placed in such a way that it forms a shadow area in the field of vision of the video camera, the optical axes of the video camera and the source of light forming an angle to each other.
The invention is explained in greater detail in the following, with reference to the drawings, in which Figure 1 shows an embodiment of a measuring apparatus according to the invention to be used by the partition of split-up pig carcasses, Figure 2 a video picture recorded of the ham area with a marked backbone curve, Figure 3 a data processed section of the picture, Figure 4 the same section after directional filtration, Figure 5 a curve of sum values, and Figures 6a-d data processed curves.
The apparatus of Figure 1 comprises a conveyor with a black conveyor belt 1, on which split-up pig carcasses 2 are placed with the rind facing downwards. The carcasses are conveyed in the direction of the arrow P with the back first. Located over the belt are three CCD video cameras 3, the fields of vision being the fore-end, the ham, and the hind leg of the carcass, respectively. The fore-end and ham cameras are provided with green filters, whereas the hind-leg camera has a red filter. Three light sources 4 illuminate the carcass. Two of the light sources direct the light onto the carcass at an angle of approx. 45°, so that a shadow area is formed in the cavity of the carcass; one side of the shadow area borders immediately on the backbone of the carcass. The third light source directs the light onto the carcass at an angle of 90°.
A framegrabber 5 is connected with the output of each of the cameras. The framegrabber stores a video picture when an electronic trigger signal is given from a central computer 6. It may e.g. be initiated by a light relay 7 placed by the belt, which detects the presence of a carcass inside the field of vision of the camera or by a signal from the conveyance control.
The computer 6 comprises a control and computing unit which retrieves the data of current interest from the grabber and processes them according to a pre-determined process. If required, other measuring data may be used, e.g. information about the current weight or meat/fat thicknesses measured by means of a probe.
The process results in a signal which is an expression of the cutting position. It is used as a control signal for automatic adjustment of a subsequent band saw to obtain a correct cutting position in proportion to the anatomical parts.
In the following is a more detailed description of the treatment of the ham picture and fixing of the point of reference of the ham cut (the sharp point of transition between backbone and caudal vertebra, which is sometimes described as the sharp bend).
Searching for the backbone

Due to the special oblique lighting of the carcass the backbone is fully illuminated, whereas the meat area adjacent to one side of the backbone lies in shadow, see Figure 2.
The recorded and stored picture is built up of pixels which are placed in a raster in lines and columns at uniform intervals. In the first 20 columns of the picture a search is made in the area which contains the pixels with the lowest light values within an area of 15 * 20 pixels. From this shadow area the backbone is found over a 10 pixels wide area as the place where there is the largest positive change (gradient) in the light value and where the average light values before the gradient equal a predetermined value.
When this place has been found, the backbone will be searched for within an interval of +/- 15 pixels. The backbone point is defined as the point where there is the biggest positive gradient and where the average pixel values before the gradient equal a pre-determined value.
When the backbone point has been found, the next point is searched for according to the same criteria as those described above. If the co-ordinate is not detected, it is set equal to the previous co-ordinate. The co-ordinates for the backbone are averaged before they are used in the following calculations. The points found are marked as a backbone curve in Figure 2.
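The backbone-point search described above (find the biggest positive gradient within a window around the previous point) can be sketched in a few lines. This is a minimal illustration: the pixel row, window size and function name are invented for the example, and the check on the average pixel values before the gradient is omitted for brevity.

```python
import numpy as np

def find_backbone_point(row, start, window=15):
    """Search +/- `window` pixels around `start` for the largest
    positive change (gradient) in light value along one pixel line."""
    lo = max(start - window, 0)
    hi = min(start + window, len(row) - 1)
    grad = np.diff(row.astype(int))          # change between neighbouring pixels
    return lo + int(np.argmax(grad[lo:hi]))  # index of the biggest positive gradient

# One line of an 8-bit picture: shadow (low values), then the lit backbone.
row = np.array([10, 12, 11, 13, 12, 60, 200, 210, 205, 200], dtype=np.uint8)
print(find_backbone_point(row, start=5, window=4))
```

The returned index marks the shadow-to-backbone transition on this pixel line; repeating the call line by line yields the backbone curve.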
Calculation of the provisional position of the sharp bend

The provisional position of the sharp bend is determined as the position where the change in the curvature of the backbone curve is biggest. It is marked with the line F in Figure 2.
Picture processing of the backbone

By means of the backbone curve a picture section of 50 x 300 pixels is formed, comprising the backbone and the sharp bend. The upper edge of the section corresponds to the curve. Correction is made for the distortion which is caused by the straightening of the section.
The obtained part picture of the backbone, which is shown in Figure 3, is subjected to a directional filtration which highlights structures at right angles to the backbone, and furthermore perhaps structures of a certain width. For this purpose the following matrix is applied:

-1  1  2  1 -1
-1  1  2  1 -1
-2  2  4  2 -2
-1  1  2  1 -1
-1  1  2  1 -1

A 5 * 5 matrix is formed by the pixel-light values in one corner of the part picture. The scalar product of the two matrices is calculated from the found numerical values and is inserted in the stored picture section instead of the original pixel-values. A new 5 * 5 matrix is formed by the pixel-values located a single pixel column to the right of the first matrix. The product of this matrix and the matrix shown above is formed, after which the found values are inserted in the stored picture section instead of the original values.
The procedure is continued in this way until the right-hand edge of the picture section is reached. Then the process is moved up by one pixel, and the procedure is repeated. When the whole line of pixels at this level has been processed, the process moves up again by one pixel, and the procedure is continued until the pixel-values of the entire picture section have been data processed by means of the matrix that highlights the direction. The picture now looks like the one in Figure 4, in which the discs between the vertebral joints are more clearly visible than in the original picture (Figure 3).
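The sliding scalar product described above is an ordinary cross-correlation of the picture section with the 5 * 5 directional matrix. A minimal numpy sketch (the test image is synthetic; the patent does not specify an implementation):

```python
import numpy as np

# Directional matrix from the description: the emphasized centre column
# highlights vertical structures (the discs running across the backbone).
KERNEL = np.array([[-1, 1, 2, 1, -1],
                   [-1, 1, 2, 1, -1],
                   [-2, 2, 4, 2, -2],
                   [-1, 1, 2, 1, -1],
                   [-1, 1, 2, 1, -1]])

def directional_filter(img, kernel=KERNEL):
    """Slide the kernel over the image; each output value is the
    scalar product of the kernel and the 5x5 patch under it."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1), dtype=int)
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = int(np.sum(img[y:y+kh, x:x+kw] * kernel))
    return out

# A bright vertical stripe (like a disc between vertebrae) scores highest
# when it lies under the kernel's centre column.
img = np.zeros((7, 9), dtype=int)
img[:, 4] = 10
resp = directional_filter(img)
print(resp.shape, resp.max())
```

Structures running parallel to the kernel's heavy column produce strong positive responses, which is exactly the elucidation of the discs visible in Figure 4.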
A five pixels wide edge is cut off the picture section all the way round, after which a simple summing-up of the pixel-light values in each of the columns of pixels in the picture is made. The sum curve is shown in Figure 5.
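The trimming and column summation can be sketched directly; the picture section below is a synthetic stand-in for the filtered section of Figure 4:

```python
import numpy as np

def sum_curve(section, border=5):
    """Cut a `border`-pixel edge off all the way round, then sum the
    pixel-light values in each remaining column of pixels."""
    core = section[border:-border, border:-border]
    return core.sum(axis=0)

section = np.arange(20 * 30).reshape(20, 30)   # stand-in picture section
curve = sum_curve(section)
print(curve.shape)
```

The resulting one-dimensional curve is the sum curve of Figure 5, on which all further processing operates.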
Processing of the sum curve

In Figures 6a-b are shown two sum curves which often occur in practice. The sharp bend searched for is known to be located near position 50, but in order to determine the exact position of the sharp bend the curves have to be data processed, utilizing the fact that the distance between the discs in the backbone is largely constant within the same individual carcass.
In Figure 2 it is faintly shown that the discs appear as white stripes between dark bones. By use of gradient filtration of the curves in Figures 6a-b a positive signal is obtained at the start of a stripe and a negative signal at the conclusion of the stripe. The disc is assumed to be located where the curve on its way from the top to the bottom crosses the average level.
On the curves in Figures 6a-b the following transformation is first performed:

p2(x) = abs(p(x-3) - p(x+3))

After the transformation the curves look as shown in Figures 6c-d. p2(x) is big when p(x-3) is big and/or p(x+3) is small. It may be seen that a peak has appeared at those places where there is a heavy fall over 6 units (pixels) in the horizontal direction of the picture.
There are heavy falls at the discs and small falls in many places due to noise. If a fall is twice as large as another fall, its importance should not be just twice as big; it should be even more significant. In order to reduce the noise the following non-linear transformation of p2(x) is applied:

p3(x) = p2(x)^3

The positions of the discs are now rather obvious with a deviation corresponding to max. one pixel.
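The two transformations p2 and p3 read directly as array operations. A minimal sketch, where the sample curve is invented to show one heavy fall:

```python
import numpy as np

def p2(p):
    """p2(x) = abs(p(x-3) - p(x+3)): large where the sum curve
    changes steeply over 6 pixels; zero at the unpadded ends."""
    out = np.zeros_like(p, dtype=float)
    out[3:-3] = np.abs(p[:-6] - p[6:])
    return out

def p3(v):
    """p3(x) = p2(x)^3: cubing makes large falls far more
    significant than the small falls caused by noise."""
    return v ** 3

# A sum curve with one heavy fall (like the edge of a disc).
p = np.array([5.0, 5, 5, 5, 5, 5, 1, 1, 1, 1, 1, 1])
print(p3(p2(p)))
```

A fall of 4 light units becomes a peak of 64 after cubing, while a noise fall of 1 stays at 1, which is the noise suppression the text describes.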
In order to accentuate weak peaks in fairly noiseless areas, a local renorming by folding (convolution) is performed with the function k:

k(s) = exp(-abs(s/a))

p4(x) = (k ∘ p3)(x) = ∫ k(s) p3(x-s) ds

The obtained curves are shown in Figures 6e-f.
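The folding with k can be sketched as a discrete convolution; the value of a and the kernel half-width are illustrative assumptions, as the patent does not state them:

```python
import numpy as np

def renorm_by_folding(p3_vals, a=3.0, half_width=9):
    """p4 = k folded (convolved) with p3, where k(s) = exp(-abs(s/a))."""
    s = np.arange(-half_width, half_width + 1)
    k = np.exp(-np.abs(s / a))
    return np.convolve(p3_vals, k, mode="same")

p3_vals = np.zeros(21)
p3_vals[10] = 1.0                 # one isolated peak
p4 = renorm_by_folding(p3_vals)
print(round(float(p4[10]), 3))    # the kernel is spread symmetrically around the peak
```

The exponential kernel smears each peak over its neighbourhood, so that isolated weak peaks in quiet regions remain visible relative to their surroundings.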
Calculation of the position of the discs

The function p4(x) is matched with a template with six equidistant edges which represent the positions of the discs between the vertebral joints. The idea is that the curve, after suitable dislocation and scaling, should look like an average curve, which thus serves as a prototype or template. When the curve is interpreted, it is stretched or dislocated within certain predetermined limits. The transformation which gives the biggest overlap with the template is the correct one. The size of the overlap is an expression of the certainty with which the profile has been interpreted.
The procedure of calculation applied thus performs a pattern recognition based upon a total appraisal.
It is predetermined within which limits the first and the last discs are to be searched for. Within the limits every possible position is calculated, stretching the template more or less, and based upon a total appraisal the template which gives the biggest similarity to the curve is found. The position of the discs, including the invisible disc indicated in Figures 6b, 6d and 6f at the transition point between backbone and caudal vertebra (the sharp bend), has thus been determined with great certainty. The positions of the discs are marked in Figure 2 as vertical lines. The provisional position of the sharp bend can now be replaced by the more precise position, which is marked by K.
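The template matching (try every dislocation and stretch within predetermined limits, keep the placement with the biggest overlap) can be sketched as an exhaustive search. The limits, the spacings tried, and scoring by a plain sum of p4 at the template edges are assumptions for illustration:

```python
import numpy as np

def match_template(p4, n_edges=6, spacings=range(8, 15), starts=range(0, 30)):
    """Try every start position (dislocation) and every equidistant
    spacing (stretch) within the limits; score each placement by the
    overlap with the curve and keep the best."""
    best = (-np.inf, None, None)
    for start in starts:
        for d in spacings:
            idx = start + d * np.arange(n_edges)
            if idx[-1] >= len(p4):
                continue
            score = p4[idx].sum()          # overlap with the curve
            if score > best[0]:
                best = (score, start, d)
    return best

p4 = np.zeros(100)
p4[[12, 22, 32, 42, 52, 62]] = 1.0        # six disc peaks 10 pixels apart
score, start, spacing = match_template(p4, spacings=range(8, 13))
print(start, spacing, score)
```

Because every placement is scored, the search also fixes the position of a disc that is invisible in the curve, exactly as the sharp-bend disc is inferred from the five visible ones.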
The entire procedure mentioned above is performed automatically in the computer by means of electronic data processing of the stored picture of the ham part of the carcass. On the basis of the found position of the sharp bend the point of partition is calculated by an algorithm, which among other things may include data about the dimension, weight and sex of the carcass, and a signal is sent to the cutting plant to adjust the saw for ham cutting to a corresponding position, after which the saw automatically performs a correct cutting operation of the ham when the carcass passes the saw during the conveyance on the belt.
By corresponding picture and data processing of the picture recorded of the fore-end of the carcass by the second video camera, the saw for cutting-off of the fore-end may be adjusted in the same way. The disc between specific vertebral joints is here used as point of reference, and the template is 14-toothed with the first edge located at the genik of the carcass.
The picture recording from the third camera is used to localize, among other things, the malleolus of the hind leg of the carcass. A profile curve of the hind leg is formed which is examined in a way known per se. On the basis of a signal about the position of the malleolus a third saw can be adjusted automatically and remove the hind toe in a correct way.

Claims (15)

1. A method by individual treatment of pieces of meat, comprising the steps of: illuminating a piece of meat by a light source, recording a video picture of the meat surface by means of a video camera, registering the recorded picture, processing the registered picture in a computer to localize certain areas of the meat surface, such as certain anatomical areas, outputting a signal depending upon the localization, and using the signal to control the subsequent treatment of the piece of meat, characterized in that the data processing of the registered picture in the computer comprises a directional filtration highlighting a certain direction in the picture.
2. A method according to Claim 1, characterized in that the directional filtration is performed by means of a matrix, the numbers of which are higher in one direction than in other directions.
3. A method according to Claim 2, characterized in that a square matrix with at least 3*3 and at most 7*7 elements is applied.
4. A method according to any of the Claims 1-3, characterized in that the data processing is furthermore designed to highlight areas of a certain width.
5. A method according to any of the Claims 1-4, characterized in that for each line which can be drawn in the picture parallel with a certain direction a summation of the light values of the pixels of the picture is performed, and that the obtained sum values are applied for localization of a certain area.
6. A method according to Claim 5, characterized in that the sum values are subjected to a non-linear transformation.
7. A method according to any of the Claims 5-6, characterized in that the set of data derived from the sum values is used for the matching of a template representing the searched-for anatomical areas, the template being preferably dislocated and stretched until the greatest similarity with the curve formed by the values of the set of data is obtained.
8. A method according to any of the Claims 1-7, characterized in that the source of light is placed in such a way that it forms a shadow area in the field of vision of the video camera, the optical axes of the video camera and the source of light forming an angle to each other.
9. A method according to any of the Claims 1-8, characterized in that, based upon a localized area, e.g. between two specific vertebral joints of a carcass, a calculation is made by the computer according to an algorithm, which, if desired, may comprise data about the dimension, weight and sex of the carcass, and that a signal depending upon the calculation is output to a treatment plant for the piece of meat to adjust the positions of its tools, e.g. in a cutting machine to adjust the position of the saw in order to perform a correct cutting operation on e.g. the ham of a carcass.
10. An apparatus to be used by individual treatment of pieces of meat comprising a light source to illuminate a piece of meat, a video camera to record a video picture of the meat surface and a computer for registering and data processing of the picture to localize certain areas of the meat surface such as certain anatomical areas, and a signal output module in the computer for the output of a signal depending upon the localization, to be used by the subsequent treatment of the piece of meat, characterized in that the data processing unit of the computer comprises a process of directional filtration highlighting a certain direction in the picture.
11. An apparatus according to Claim 10, characterized in that the process of directional filtration comprises a matrix, the numbers of which in one direction are higher than in other directions.
12. An apparatus according to Claim 11, characterized in that the matrix is square with at least 3*3 and at most 7*7 elements.
13. An apparatus according to any of the Claims 10-12, characterized in that the source of light is placed in such a way that it forms a shadow area in the field of vision of the video camera, the optical axes of the video camera and the source of light forming an angle to each other.
14. A method of individual treatment of pieces of meat substantially as hereinbefore described with reference to and as illustrated in the accompanying drawings.
15. An apparatus substantially as hereinbefore described with reference to and as illustrated in the accompanying drawings.
IE260392A 1991-08-23 1992-08-21 Method and apparatus to be used by individual treatment of pieces of meat IE922603A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DK199101504A DK167462B2 (en) 1991-08-23 1991-08-23 Method and plant for use in treating a meat subject

Publications (1)

Publication Number Publication Date
IE922603A1 true IE922603A1 (en) 1993-02-24

Family

ID=8105530

Family Applications (1)

Application Number Title Priority Date Filing Date
IE260392A IE922603A1 (en) 1991-08-23 1992-08-21 Method and apparatus to be used by individual treatment of pieces of meat

Country Status (6)

Country Link
DE (1) DE4228068A1 (en)
DK (1) DK167462B2 (en)
FR (1) FR2680449B1 (en)
GB (1) GB2258916B (en)
IE (1) IE922603A1 (en)
NL (1) NL9201472A (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NZ251947A (en) * 1992-04-13 1996-11-26 Meat Research Corp Image analysis for meat inspection
GB9510171D0 (en) * 1995-05-19 1995-07-12 Univ Bristol A method of and apparatus for locating a spine in a half-carcass
US6860804B2 (en) 1999-08-27 2005-03-01 Kj Maskinfabriken A/S Laying-down system and vision-based automatic primal cutting system in connection therewith
WO2001015538A2 (en) * 1999-08-27 2001-03-08 K.J. Maskinfabriken A/S Arrangement for laying-down half carcasses, and vision-based automatic cutting-up system in connection therewith
EP1289374B1 (en) * 2000-05-30 2007-07-25 Marel HF. An integrated meat processing and information handling method
DE102007017899B4 (en) * 2007-04-13 2017-02-16 Innotech Ingenieursgesellschaft Mbh Apparatus and method for cutting food material
WO2019210421A1 (en) * 2018-05-04 2019-11-07 Xpertsea Solutions Inc A scale for determining the weight of organisms
DE102020006482A1 (en) 2020-10-14 2022-04-14 Innotech Ingenieursgesellschaft Mbh Device for cutting agricultural products and central processing unit with at least one data memory for controlling the device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2728913A1 (en) * 1977-06-27 1979-01-18 Hans Breitsameter METHOD AND DEVICE FOR CLASSIFYING MEAT
DK157380C (en) * 1986-11-06 1991-08-12 Lumetech As METHODS OF OPTICAL, BODY-FREE MEASUREMENT OF MEAT TEXTURE
FR2608899B1 (en) * 1986-12-29 1990-02-23 Simonet Andre PROCESS FOR QUALIFYING CARCASSES OF BUTCHER ANIMALS, AND CORRESPONDING INSTALLATION
DK676487A (en) * 1987-12-22 1989-06-23 Slagteriernes Forskningsinst PROCEDURE FOR DETERMINING QUALITY CHARACTERISTICS OF INDIVIDUAL CREATURE GENERATOR AND PLANT FOR USE IN DETERMINING THE PROPERTIES

Also Published As

Publication number Publication date
DK167462B2 (en) 1999-11-01
DK150491A (en) 1993-02-24
GB9217299D0 (en) 1992-09-30
FR2680449A1 (en) 1993-02-26
DE4228068A1 (en) 1993-03-11
NL9201472A (en) 1993-03-16
GB2258916B (en) 1995-08-02
FR2680449B1 (en) 1994-05-20
DK150491D0 (en) 1991-08-23
DK167462B1 (en) 1993-11-01
GB2258916A (en) 1993-02-24

Similar Documents

Publication Publication Date Title
AU722769B2 (en) Method and apparatus for using image analysis to determine meat and carcass characteristics
US6891961B2 (en) Image analysis systems for grading of meat, predicting quality of meat and/or predicting meat yield of an animal carcass
AU665683B2 (en) Image analysis for meat
DE112010002174B4 (en) Method and device for a practical 3D vision system
Whittaker et al. Fruit location in a partially occluded image
Shimizu et al. Computer-vision-based system for plant growth analysis
KR100948406B1 (en) Chicken carcass quality grade automatic decision and weight measuring system
WO1999004361A1 (en) Method for acquiring, storing and analyzing crystal images
EP2503331A2 (en) Method and system for the real-time automatic analysis of the quality of samples of processed fish meat based on the external appearance thereof, said samples travelling on a conveyor belt such that surface defects can be detected and the fish meat can be sorted according to quality standards
Chen et al. Evaluating fabric pilling with light-projected image analysis
CN106530294A (en) Method for carrying out processing on meibomian gland image to obtain gland parameter information
IE922603A1 (en) Method and apparatus to be used by individual treatment of¹pieces of meat
GB2247524A (en) Automatic carcass grading apparatus and method
CN109300150B (en) Hand bone X-ray image texture feature extraction method for bone age assessment
EP1174034A1 (en) Method for trimming pork bellies
KR102285112B1 (en) Non-destructive measurement system of meat yield of beef carcasses using digital image analysis
CN111259883B (en) Image recognition device and image recognition method
Poelzleitner et al. Automatic inspection of leather surfaces
CN113554579A (en) Leather defect detection method
Hoang et al. Image processing techniques for leather hide ranking in the footwear industry
De Mezzo et al. Weed leaf recognition in complex natural scenes by model-guided edge pairing
JP2020183876A (en) Feature point recognition system and workpiece processing system
EP0743618A2 (en) A method of and apparatus for locating a spine in a half-carcass
Yu et al. Compact imaging system and deep learning based segmentation for objective longissimus muscle area in Korean beef carcass
Chen et al. Inspection of tire tread defects using image processing and pattern recognition techniques

Legal Events

Date Code Title Description
MM9A Patent lapsed through non-payment of renewal fee