GB2258916A - Method of and apparatus for individually image processing pieces of meat - Google Patents


Info

Publication number
GB2258916A
GB2258916A (Application GB9217299A)
Authority
GB
United Kingdom
Prior art keywords
meat
picture
certain
light
video camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9217299A
Other versions
GB2258916B (en)
GB9217299D0 (en)
Inventor
Per Lundsfryd Jensen
Torben Nielsen
Hans Henrik Thodberg
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Slagteriernes Forskningsinstitut
Original Assignee
Slagteriernes Forskningsinstitut
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed litigation Critical https://patents.darts-ip.com/?family=8105530&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=GB2258916(A) "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Slagteriernes Forskningsinstitut filed Critical Slagteriernes Forskningsinstitut
Publication of GB9217299D0 publication Critical patent/GB9217299D0/en
Publication of GB2258916A publication Critical patent/GB2258916A/en
Application granted granted Critical
Publication of GB2258916B publication Critical patent/GB2258916B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current


Classifications

    • A HUMAN NECESSITIES
    • A22 BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22B SLAUGHTERING
    • A22B5/00 Accessories for use during or after slaughtering
    • A22B5/0017 Apparatus for cutting, dividing or deboning carcasses
    • A22B5/0058 Removing feet or hooves from carcasses
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration using local operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30128 Food products

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Food Science & Technology (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Description

METHOD OF AND APPARATUS FOR INDIVIDUALLY TREATING PIECES OF MEAT

The present invention relates to a method of individually treating pieces of meat, comprising the steps of illuminating a piece of meat by a light source, recording a video picture of the meat surface by means of a video camera, registering the recorded picture, data processing the picture in a computer to localize certain areas of the meat surface, such as certain anatomical areas, outputting a signal depending upon the localization, and using the signal to control the subsequent treatment of the piece of meat.
In connection with automated treatment of carcasses it has been suggested to use video recordings to determine the external characteristics of the carcass and to carry out the treatment on the basis of the characteristics found. Video recordings may for instance be used in connection with semi- or fully automatic classification of carcasses. A video picture of the carcass is registered, and the registered picture is data processed to find the actual shape and colour, which are significant parameters for the classification of a carcass.
Another application of video recordings is for the determination of the meat/fat ratio of a carcass. A picture is recorded of the surface produced by the splitting-up of a carcass into halves and the recorded picture is data processed. The grey areas of the picture are considered to be meat, whereas the light areas represent fat. The black areas are ignored, since they form the background.
Video recordings may also be used in connection with semi- or fully automatic processing of carcasses, for example for the automatic adjustment of a saw that is to cut up carcass halves. By appropriate data processing of a picture of the carcass, the position of predetermined anatomical parts which correlate with the cutting position can be found. The position of the saw or the carcass may then be adjusted in accordance with the position found, so that the cut is placed correctly. Anatomical parts which may be used for this purpose are for example the forelegs or backbone of a split-up carcass.
Mostly, the registered picture is used only for the determination of the contour of the carcass, and the localization of the anatomical area must therefore be made on the basis of the contour data. However, there is a limit to the accuracy and variation of treatment that may be performed on carcasses on the basis of such localizations.
Internal anatomical parts, which may be seen in split-up carcasses, constitute a more precise reference for the automated treatment. The location of the individual spinal members could for instance give a considerably higher accuracy in the determination of the cutting positions than for example the contour. Precision is essential in cutting a split-up carcass into three parts.
The problem is, however, that it is extremely difficult to provide a video picture in which the anatomical parts may be identified and located with sufficient certainty. This may be due partly to a relatively poor contrast in the picture between the individual parts, partly to the fact that the carcasses may be of different build, and partly there may in some cases be loose membranes or shreds of meat which by the splitting-up operation have been pulled in to cover the looked-for anatomical parts and which therefore blur or conceal their presence.
The objective of the present invention is to provide a method by which anatomical parts may be localized with good certainty under the conditions prevailing in slaughterhouses.
The method of the invention is characterized in that the data processing of the registered picture in the computer comprises a directional filtration, highlighting a certain direction in the picture.
By means of the present directional filtration there will be an elucidation of the anatomical structures which run in the same direction as the said direction in the picture. The position of the anatomical parts comprising such structures may therefore be determined with increased certainty, and in this way it will be possible to perform an improved automated treatment of the pieces of meat in slaughterhouses.
By means of the present method it is possible to localize certain anatomical areas on a split-up pig carcass with sufficient accuracy to be used in an automatic cutting machine. Controlled by the localization data obtained, a carcass half may be cut up automatically into three parts, fore-end, middle and ham.
Tests have shown that the cutting operations are performed as accurately as the manual cuts performed today.
Anatomical areas which have proved suitable for tripartition of a carcass are the joints of the backbone, especially the discs between specific vertebrae and the disc between the last vertebra and the first caudal vertebra (the sharp bend of the backbone).
An advantage of the present measuring method is that no manning is required, and furthermore it is not destructive, that is there is no deterioration of the meat as a consequence of the measuring.
It has been shown that the method may be performed quickly enough to allow for example 360 carcasses to be measured and treated per hour, which is satisfactory under most slaughterhouse conditions.
In the following a number of preferred embodiments of the present invention are mentioned.
The directional filtration may be performed by means of a matrix, the numbers of which are higher in one direction than in other directions.
A square matrix with at least 3 × 3 and at most 7 × 7 elements may be applied.
The data processing may furthermore be designed to highlight areas of a certain width.
For each line that can be drawn in the picture parallel with a certain direction a summation of the light values of the pixels of the picture may be performed, and the obtained sum values may be applied for localization of a certain area.
The sum values may be subjected to a non-linear transformation.
The set of data derived from the sum values may be used for the matching of a template representing the searched-for anatomical areas, the template being preferably dislocated and stretched until the greatest similarity with the curve formed by the values of the set of data is obtained.
The source of light may be placed in such a way that it forms a shadow area in the field of vision of the video camera, the optical axes of the video camera and the source of light forming an angle to each other.
Based upon a localised area, for example between two specific vertebral joints of a carcass, a calculation may be made by the computer according to an algorithm which - if desired - may comprise data about the dimension, weight and sex of the carcass, and a signal depending upon the calculation may be output to a treatment plant for the piece of meat to adjust the positions of its tool, for example in a cutting machine to adjust the position of the saw in order to perform a correct cutting operation on for example the ham of a carcass.
An apparatus for individually treating pieces of meat comprises a light source to illuminate a piece of meat, a video camera to record a video picture of the meat surface, a computer for registering and data processing of the picture to localize certain areas of the meat surface, such as certain anatomical areas, and a signal output module in the computer for the output of a signal depending upon the localization, to be used for the subsequent treatment of the piece of meat.
The data processing unit of the computer comprises a process of directional filtration highlighting a certain direction in the picture.
The apparatus is able, with improved certainty, to localize anatomical parts which run parallel with the said direction. In this way the subsequent treatment of the piece of meat, controlled by the localization, may take place with increased accuracy.
The light source may be placed in such a way that it forms a shadow area in the field of vision of the video camera, the optical axes of the video camera and the source of light forming an angle to each other.
In the accompanying drawings:
Figure 1 shows a measuring apparatus to be used for partition of split-up pig carcasses,
Figure 2 is a video picture recorded of the ham area with a marked backbone curve,
Figure 3 is a data processed section of the picture,
Figure 4 is the same section after directional filtration,
Figure 5 shows a curve of sum values, and
Figures 6a-d show data processed curves.
The apparatus of Figure 1 comprises a conveyor with a black conveyor belt 1, on which the split-up pig carcasses 2 are placed with the rind facing downwards. The carcasses are conveyed in the direction of the arrow P with the back first. Located over the belt are three CCD video cameras 3, the fields of vision being the fore-end, the ham, and the hind leg of the carcass, respectively. The fore-end camera and the ham camera are provided with green filters, whereas the hind-leg camera has a red filter. Three light sources 4 illuminate the carcass. Two of the light sources direct the light onto the carcass at an angle of approximately 45°, so that a shadow area is formed in the cavity of the carcass, one side of the shadow area bordering immediately on the backbone of the carcass. The third light source directs the light onto the carcass at an angle of 90°.
A framegrabber 5 is connected with the output of each of the cameras. The framegrabber stores a video picture when an electronic trigger signal is given from a central computer 6. It may for example be initiated by a light relay 7 placed by the belt, which detects the presence of a carcass inside the field of vision of the camera or by a signal from the conveyance control.
The computer 6 comprises a control and computing unit which retrieves the data of current interest from the grabber and processes them according to a predetermined process. If required, other measuring data may be used, for example information on the current weight or on meat/fat thicknesses measured by means of a probe.
The process results in a signal which is an expression of the cutting position. It is used as a control signal for automatic adjustment of a subsequent band saw to obtain a correct cutting position in relation to the anatomical parts.
In the following is a more detailed description of the treatment of the ham picture and the fixing of the point of reference of the ham cut (the sharp point of transition between backbone and caudal vertebra, which is sometimes described as the "sharp bend").
Searching for the backbone

Due to the special oblique lighting of the carcass the backbone is fully illuminated, whereas the meat area, which is adjacent to one side of the backbone, lies in shadow, see Figure 2.
The recorded and stored picture is built up of pixels which are placed in a raster of lines and columns at uniform intervals. In the first 20 columns of the picture a search is made for the area which contains the pixels with the lowest light values within an area of 15 × 20 pixels. From this shadow area the backbone is found, over a 10-pixel-wide area, as the place where there is the largest positive change (gradient) in the light value and where the average light values before the gradient equal a predetermined value.
When this place has been found, the backbone will be searched for within an interval of +/- 15 pixels. The backbone point is defined as the point where there is the biggest positive gradient and where the average pixel values before the gradient equal a pre-determined value.
When the backbone point has been found, the next point is searched for according to the same criteria as those described above. If a coordinate is not detected, it is set equal to the previous coordinate. The coordinates of the backbone are averaged before they are used in the following calculations. The points found are marked as a backbone curve in Figure 2.
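The per-column search rule may be sketched as follows in Python. This is an illustrative sketch only: the function name, the five-pixel averaging window, and the target value and tolerance are assumptions, since the patent does not state the predetermined value.

```python
import numpy as np

def backbone_point(column, target_mean=20.0, tol=10.0, window=5):
    """Find the index of the largest positive gradient whose preceding
    average light value matches a preset target (sketch; target_mean,
    tol and window are illustrative assumptions)."""
    column = np.asarray(column, dtype=float)
    grad = np.diff(column)                    # pixel-to-pixel light change
    best, best_grad = None, 0.0
    for i in range(window, len(grad)):
        before = column[i - window:i].mean()  # average light value before the jump
        if grad[i] > best_grad and abs(before - target_mean) <= tol:
            best, best_grad = i, grad[i]
    return best

col = [20] * 10 + [200] * 5                   # dark shadow, then bright backbone
print(backbone_point(col))
```

The column jumps from the shadow level (20) to the illuminated backbone (200) at index 9, which is where the rule fires.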
Calculation of the provisional position of the "sharp bend"

The provisional position of the "sharp bend" is determined as the position where the change in the curvature of the backbone curve is biggest. It is marked with the line "F" in Figure 2.
Picture processing of the backbone

By means of the backbone curve a picture section of 50 x 300 pixels is formed, comprising the backbone and the "sharp bend". The upper edge of the section corresponds to the curve. Correction is made for the distortion which is caused by the straightening of the section. The obtained part picture of the backbone, which is shown in Figure 3, is subjected to a directional filtration which highlights structures at right angles to the backbone, and furthermore perhaps structures of a certain width. For this purpose the following matrix is applied:
    -1  1  2  1 -1
    -1  1  2  1 -1
    -2  2  4  2 -2
    -1  1  2  1 -1
    -1  1  2  1 -1

A 5 × 5 matrix is formed by the pixel-light values in one corner of the part picture. The scalar product of the two matrices is calculated from the found numerical values and is inserted in the stored picture section instead of the original pixel-values. A new 5 × 5 matrix is formed by the pixel-values one pixel column to the right of the first matrix. The product of this matrix and the matrix shown above is formed, after which the found values are inserted in the stored picture section instead of the original values.
The procedure is continued in this way until the right-hand edge of the picture section is reached. Then the window is moved up by one pixel, and the procedure is repeated. When the whole line of pixels at this level has been processed, the window is moved up again by one pixel, and the procedure is continued until the pixel-values of the entire picture section have been data processed by means of the matrix that highlights the direction. The picture now looks like the one in Figure 4, in which the discs between the vertebral joints are more clearly visible than in the original picture (Figure 3).
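The column-by-column, line-by-line filtration described above is a sliding-window correlation, which can be sketched in Python. Note one assumption: the last kernel element is taken as -1 (the printed text shows 1), restoring the symmetry of the matrix; the function name and the test image are illustrative.

```python
import numpy as np

# The 5x5 directional kernel from the description (last element assumed -1).
KERNEL = np.array([
    [-1, 1, 2, 1, -1],
    [-1, 1, 2, 1, -1],
    [-2, 2, 4, 2, -2],
    [-1, 1, 2, 1, -1],
    [-1, 1, 2, 1, -1],
])

def directional_filter(image):
    """Slide the 5x5 window one pixel at a time and store the scalar
    product with KERNEL, as the procedure above describes."""
    h, w = image.shape
    out = np.zeros((h - 4, w - 4))
    for y in range(h - 4):
        for x in range(w - 4):
            out[y, x] = np.sum(image[y:y+5, x:x+5] * KERNEL)
    return out

# A bright vertical stripe (a disc at right angles to the straightened
# backbone) gives a strong response at its centre.
img = np.zeros((9, 9))
img[:, 3:6] = 10.0
resp = directional_filter(img)
print(resp[2, 2])
```

The kernel's column weights sum to (-6, 6, 12, 6, -6), so vertical bright structures roughly three pixels wide are emphasised while uniform areas respond with zero.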
A five pixels wide edge is cut off the picture section all the way round, after which a simple summing-up of the pixel-light values in each of the columns of pixels in the picture is made. The sum curve is shown in Figure 5.
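The trimming and column summation may be sketched as follows; the random array stands in for the filtered 50 × 300 section and is an illustrative assumption.

```python
import numpy as np

# Stand-in for the directionally filtered 50 x 300 picture section.
section = np.random.default_rng(0).random((50, 300))

# Cut a five-pixel edge off all the way round, then sum the light
# values in each remaining pixel column to get the profile of Figure 5.
trimmed = section[5:-5, 5:-5]
p = trimmed.sum(axis=0)       # one sum value per column
print(p.shape)                # (290,)
```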
Processing of the sum curve

In Figures 6a-b are shown two sum curves which often occur in practice. The "sharp bend" searched for is known to be located near position 50, but in order to determine its exact position the curves have to be data processed, utilizing the fact that the distance between the discs in the backbone is largely constant within the same carcass.
In Figure 2 it is faintly shown that the discs appear as white stripes between dark bones. By gradient filtration of the curves in Figures 6a-b a positive signal is obtained at the start of a stripe and a negative signal at the end of the stripe. The disc is assumed to be located where the curve, on its way from the top to the bottom, crosses the average level. On the curves in Figures 6a-b the following transformation is first performed:

p2(x) = abs(p(x-3) - p(x+3))

After the transformation the curves look as shown in Figures 6c-d. p2(x) is big when p(x-3) is big and/or p(x+3) is small. It may be seen that a peak occurs at those places where there is a heavy fall over 6 units (pixels) in the horizontal direction of the picture.
There are heavy falls at the discs and small falls at many places due to noise. If a fall is twice as large as another fall, its importance should not be just twice as big, but it should be even more significant. In order to reduce the noise the following non-linear transformation of p2(x) is applied:
p3(x) = p2(x)^3

The positions of the discs are now rather obvious, with a deviation corresponding to a maximum of one pixel.
In order to accentuate weak peaks in fairly noiseless areas, a local renorming is performed by convolution ("folding") with the function k:

k(s) = exp(-abs(s/a))

p4(x) = (k ∘ p3)(x) = ∫ k(s) p3(x-s) ds

The obtained curves are shown in Figures 6e-f.
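The convolution may be sketched discretely as follows; the width a = 5 pixels and the kernel support are illustrative assumptions, since the patent does not state them.

```python
import numpy as np

def renorm(p3, a=5.0, support=25):
    """Convolve p3 with k(s) = exp(-abs(s/a)) (discrete sketch of the
    folding above; a and support are assumed values)."""
    s = np.arange(-support, support + 1)
    k = np.exp(-np.abs(s / a))
    return np.convolve(p3, k, mode="same")

# Applied to a unit impulse, the result is the kernel itself,
# centred on the peak and decaying exponentially to both sides.
impulse = np.zeros(51)
impulse[25] = 1.0
p4 = renorm(impulse)
```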
Calculation of the position of the discs

The function p4(x) is matched with a template with six equidistant edges which represent the positions of the discs between the vertebral joints. The idea is that the curve, after suitable dislocation and scaling, is to look like an average curve, which thus serves as a prototype or template. When the curve is interpreted, it is stretched or dislocated within certain predetermined limits. The transformation which gives the biggest overlap with the template is the correct one. The size of the overlap is an expression of the certainty with which the profile has been interpreted.
The procedure of calculation applied thus performs a pattern recognition based upon a total appraisal.
It is predetermined within which limits the first and the last discs are to be searched for. Within these limits every possible position is calculated, stretching the template more or less, and based upon a total appraisal the template is found which gives the biggest similarity to the curve. The position of the discs, including the invisible disc indicated in Figures 6b, 6d and 6f at the transition point between backbone and caudal vertebra ("the sharp bend"), has thus been determined with great certainty. The positions of the discs are marked in Figure 2 as vertical lines. The provisional position of the "sharp bend" can now be replaced by the more precise position, which is marked by "K".
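The exhaustive search over dislocation and stretching may be sketched as follows. This is a minimal sketch of the total appraisal: the similarity measure (sum of the curve at the template's peak positions), the search limits and all names are illustrative assumptions.

```python
import numpy as np

def match_template(curve, n_peaks=6, spacings=range(8, 15), starts=range(0, 30)):
    """Try every start (dislocation) and spacing (stretch) of a template
    with n_peaks equidistant edges and keep the biggest overlap."""
    best = (-np.inf, None, None)
    for start in starts:
        for spacing in spacings:
            idx = start + spacing * np.arange(n_peaks)
            if idx[-1] >= len(curve):
                continue                      # template would run off the curve
            overlap = curve[idx].sum()        # assumed similarity measure
            if overlap > best[0]:
                best = (overlap, start, spacing)
    return best                               # (overlap, start, spacing)

# A synthetic curve with six peaks every 10 pixels, starting at 12:
curve = np.zeros(100)
curve[12 + 10 * np.arange(6)] = 1.0
score, start, spacing = match_template(curve)
print(start, spacing)
```

The search recovers start 12 and spacing 10, and the overlap score doubles as the certainty measure mentioned above.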
The entire procedure mentioned above is performed automatically in the computer by means of electronic data processing of the stored picture of the ham part of the carcass. On the basis of the found position of the "sharp bend" the point of partition is calculated by an algorithm, which among other things may include data about the dimension, weight and sex of the carcass, and a signal is sent to the cutting plant to adjust the saw for ham cutting to a corresponding position, after which the saw automatically performs a correct cutting operation on the ham when the carcass passes the saw during the conveyance on the belt.
By corresponding picture and data processing of the picture which is recorded of the fore-end of the carcass by the second video camera, the saw for cutting-off of the fore-end may be adjusted in the same way. The point of reference used here is the disc between specific vertebral joints, whereas the template is 14-toothed with the first edge located at the "genik" of the carcass.
The picture recording from the third camera is used to localize, among other things, the malleolus of the hind leg of the carcass. A profile curve of the hind leg is formed, which is examined in a way known per se. On the basis of a signal about the position of the malleolus a third saw can be adjusted automatically to remove the hind toe in a correct way.

Claims (15)

  1. A method of individually treating pieces of meat, comprising the steps of: illuminating a piece of meat by a light source, recording a video picture of the meat surface by means of a video camera, registering the recorded picture, processing the registered picture in a computer to localize certain areas of the meat surface, such as certain anatomical areas, outputting a signal depending upon the localization, and using the signal to control the subsequent treatment of the piece of meat, characterised in that the data processing of the registered picture in the computer comprises a directional filtration highlighting a certain direction in the picture.
  2. A method according to Claim 1, characterised in that the directional filtration is performed by means of a matrix, the numbers of which are higher in one direction than in other directions.
  3. A method according to Claim 2, characterised in that a square matrix with at least 3 × 3 and at most 7 × 7 elements is applied.
  4. A method according to any preceding claim, characterised in that the data processing is furthermore designed to highlight areas of a certain width.
  5. A method according to any preceding claim, characterised in that for each line which can be drawn in the picture parallel with a certain direction a summation of the light values of the pixels of the picture is performed, and that the obtained sum values are applied for localization of a certain area.
  6. A method according to Claim 5, characterised in that the sum values are subjected to a non-linear transformation.
  7. A method according to Claim 5 or 6, characterised in that the set of data derived from the sum values is used for the matching of a template representing the searched-for anatomical areas, the template being dislocated and stretched until the greatest similarity with the curve formed by the values of the set of data is obtained.
  8. A method according to any preceding claim, characterised in that the source of light is placed in such a way that it forms a shadow area in the field of vision of the video camera, the optical axes of the video camera and the source of light forming an angle to each other.
  9. A method according to any preceding claim, characterised in that, based upon a localized area, a calculation is made by the computer according to an algorithm, and that a signal depending upon the calculation is output to a treatment plant for the piece of meat to adjust the positions of its tools.
  10. An apparatus for individually treating pieces of meat comprising a light source to illuminate a piece of meat, a video camera to record a video picture of the meat surface and a computer for registering and data processing of the picture to localize certain areas of the meat surface such as certain anatomical areas, and a signal output module in the computer for the output of a signal depending upon the localization, to be used for the subsequent treatment of the piece of meat, characterised in that the data processing unit of the computer carries out a process of directional filtration highlighting a certain direction in the picture.
  11. An apparatus according to Claim 10, characterised in that the process of directional filtration comprises a matrix, the numbers of which in one direction are higher than in other directions.
  12. An apparatus according to Claim 11, characterised in that the matrix is square with at least 3 × 3 and at most 7 × 7 elements.
  13. An apparatus according to Claim 10, 11 or 12, characterised in that the source of light is placed in such a way that it forms a shadow area in the field of vision of the video camera, the optical axes of the video camera and the source of light forming an angle to each other.
  14. A method of individually treating pieces of meat substantially as herein described with reference to and as shown in the accompanying drawings.
  15. An apparatus for individually treating pieces of meat substantially as herein described with reference to and as shown in the accompanying drawings.
GB9217299A 1991-08-23 1992-08-14 Method of and apparatus for individually treating pieces of meat Expired - Fee Related GB2258916B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DK199101504A DK167462B2 (en) 1991-08-23 1991-08-23 Method and plant for use in treating a meat subject

Publications (3)

Publication Number Publication Date
GB9217299D0 GB9217299D0 (en) 1992-09-30
GB2258916A true GB2258916A (en) 1993-02-24
GB2258916B GB2258916B (en) 1995-08-02

Family

ID=8105530

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9217299A Expired - Fee Related GB2258916B (en) 1991-08-23 1992-08-14 Method of and apparatus for individually treating pieces of meat

Country Status (6)

Country Link
DE (1) DE4228068A1 (en)
DK (1) DK167462B2 (en)
FR (1) FR2680449B1 (en)
GB (1) GB2258916B (en)
IE (1) IE922603A1 (en)
NL (1) NL9201472A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5793879A (en) * 1992-04-13 1998-08-11 Meat Research Corporation Image analysis for meat
WO2019210421A1 (en) * 2018-05-04 2019-11-07 Xpertsea Solutions Inc A scale for determining the weight of organisms

Families Citing this family (6)

Publication number Priority date Publication date Assignee Title
GB9510171D0 (en) * 1995-05-19 1995-07-12 Univ Bristol A method of and apparatus for locating a spine in a half-carcass
US6860804B2 (en) 1999-08-27 2005-03-01 Kj Maskinfabriken A/S Laying-down system and vision-based automatic primal cutting system in connection therewith
ES2230144T3 (en) * 1999-08-27 2005-05-01 K.J. Maskinfabriken A/S VISION BASED AUTOMATIC CUTTING SYSTEM.
AU2001260601B2 (en) * 2000-05-30 2005-03-03 Marel Hf. An integrated meat processing and information handling method
DE102007017899B4 (en) * 2007-04-13 2017-02-16 Innotech Ingenieursgesellschaft Mbh Apparatus and method for cutting food material
DE102020006482A1 (en) 2020-10-14 2022-04-14 Innotech Ingenieursgesellschaft Mbh Device for cutting agricultural products and central processing unit with at least one data memory for controlling the device

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
DE2728913A1 (en) * 1977-06-27 1979-01-18 Hans Breitsameter METHOD AND DEVICE FOR CLASSIFYING MEAT
DK157380C (en) * 1986-11-06 1991-08-12 Lumetech As METHODS OF OPTICAL, BODY-FREE MEASUREMENT OF MEAT TEXTURE
FR2608899B1 (en) * 1986-12-29 1990-02-23 Simonet Andre PROCESS FOR QUALIFYING CARCASSES OF BUTCHER ANIMALS, AND CORRESPONDING INSTALLATION
DK676487A (en) * 1987-12-22 1989-06-23 Slagteriernes Forskningsinst PROCEDURE FOR DETERMINING QUALITY CHARACTERISTICS OF INDIVIDUAL CREATURE GENERATOR AND PLANT FOR USE IN DETERMINING THE PROPERTIES

Cited By (3)

Publication number Priority date Publication date Assignee Title
US5793879A (en) * 1992-04-13 1998-08-11 Meat Research Corporation Image analysis for meat
US6104827A (en) * 1992-04-13 2000-08-15 Meat & Livestock Australia Limited Image analysis for meat
WO2019210421A1 (en) * 2018-05-04 2019-11-07 Xpertsea Solutions Inc A scale for determining the weight of organisms

Also Published As

Publication number Publication date
FR2680449A1 (en) 1993-02-26
DK167462B2 (en) 1999-11-01
FR2680449B1 (en) 1994-05-20
GB2258916B (en) 1995-08-02
NL9201472A (en) 1993-03-16
DK150491A (en) 1993-02-24
IE922603A1 (en) 1993-02-24
DE4228068A1 (en) 1993-03-11
DK167462B1 (en) 1993-11-01
DK150491D0 (en) 1991-08-23
GB9217299D0 (en) 1992-09-30

Similar Documents

Publication Publication Date Title
AU722769B2 (en) Method and apparatus for using image analysis to determine meat and carcass characteristics
US6891961B2 (en) Image analysis systems for grading of meat, predicting quality of meat and/or predicting meat yield of an animal carcass
Whittaker et al. Fruit location in a partially occluded image
AU665683B2 (en) Image analysis for meat
US3800363A (en) Tuna butchering method and system
EP2370952B1 (en) Arrangement and method for determining a body condition score of an animal
US5241365A (en) Method of area localization of meat, in particular fish, which is initially subjected to illumination
KR20090049487A (en) Chicken carcass quality grade automatic decision and weight measuring system
GB2258916A (en) Method of and apparatus for individually image processing pieces of meat
Chen et al. Evaluating fabric pilling with light-projected image analysis
US20010048758A1 (en) Image position matching method and apparatus
JP2020183876A (en) Feature point recognition system and workpiece processing system
Les et al. Automatic recognition of the kidney in CT images
Hoang et al. Image processing techniques for leather hide ranking in the footwear industry
Talukder et al. Modified binary watershed transform for segmentation of agricultural products
Fiallos et al. Automatic detection of injuries in mammograms using image analysis techniques
JPH06278087A (en) Method and device for automatic cutting of large fish
EP0743618A2 (en) A method of and apparatus for locating a spine in a half-carcass
CN108007929A (en) A kind of automatic judging method of beef physiological makeup
KR100919462B1 (en) Chicken carcass individual management system
US20230389559A1 (en) Determining measure of gaping in fish fillet item
Cesar et al. Shape characterization by using the Gabor transform
Pronin Algorithm for detecting artificial objects against natural backgrounds
CN117173490A (en) Marine product detection classification method and system based on separated and extracted image data
Bertolotti Automatic segmentation of pelvic floor ring on 3D ultrasound

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20010814