EP1060391B1 - Meat color imaging system for predicting palatability and yield - Google Patents


Info

Publication number
EP1060391B1
Authority
EP
European Patent Office
Prior art keywords
lean
meat
video image
image data
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
EP99908231A
Other languages
English (en)
French (fr)
Other versions
EP1060391A1 (de)
Inventor
Keith E. Belk
J. Daryl Tatum
Gary C. Smith
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Colorado State University Research Foundation
Colorado State University
Original Assignee
Colorado State University Research Foundation
Colorado State University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Colorado State University Research Foundation and Colorado State University
Publication of EP1060391A1
Application granted
Publication of EP1060391B1
Anticipated expiration
Current legal status: Expired - Lifetime

Classifications

    • A - HUMAN NECESSITIES
    • A22 - BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22C - PROCESSING MEAT, POULTRY, OR FISH
    • A22C17/00 - Other devices for processing meat or bones
    • A22C17/0073 - Other devices for processing meat or bones using visual recognition, X-rays, ultrasounds, or other contactless means to determine quality or size of portioned meat
    • A22C17/008 - Other devices for processing meat or bones using visual recognition, X-rays, ultrasounds, or other contactless means to determine quality or size of portioned meat for measuring quality, e.g. to determine further processing
    • A - HUMAN NECESSITIES
    • A22 - BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22B - SLAUGHTERING
    • A22B5/00 - Accessories for use during or after slaughtering
    • A22B5/0064 - Accessories for use during or after slaughtering for classifying or grading carcasses; for measuring back fat
    • A22B5/007 - Non-invasive scanning of carcasses, e.g. using image recognition, tomography, X-rays, ultrasound
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 - Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/02 - Food
    • G01N33/12 - Meat; Fish
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/11 - Region-based segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/90 - Determination of colour characteristics
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10024 - Color image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30108 - Industrial image inspection
    • G06T2207/30128 - Food products

Definitions

  • the field of the present invention is prediction of meat palatability and yield. More specifically, the present invention relates to the prediction of meat palatability and yield by use of video image analysis (VIA) to determine the color parameters L* (psychometric lightness), a* (red vs. green), and b* (yellow vs. blue) of the lean and fat portions of a meat animal carcass or cut.
  • VIA video image analysis
  • Marbling score of a carcass has been shown to generally correlate with subsequent cooked meat palatability across a wide range of marbling levels for beef, pork, and lamb. However, between carcasses with the same marbling level, there are substantial differences in palatability. Other factors of the carcass believed to predict palatability include maturity score, muscle pH, and muscle color; these factors may be more valuable in the prediction of palatability of chicken, turkey, and fish. Among those with expertise in carcass examination, e.g. meat scientists and U.S. Department of Agriculture (USDA) graders, some of these factors can be scored and palatability predicted by assigning a USDA Quality Grade, given sufficient examination time.
  • USDA U.S. Department of Agriculture
  • Yield Grades are intended to estimate the cutability and composition of a carcass.
  • Factors used to determine Yield Grades include hot carcass weight, ribeye area (cross-sectional area of the longissimus m. at the 12-13th rib interface), estimated kidney, pelvic, and heart fat percentage, and actual and adjusted subcutaneous fat thickness at the carcass exterior.
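As an illustration of how these factors combine, here is a minimal sketch of the USDA beef Yield Grade equation; the coefficients below are those commonly published for the United States Standards for Grades of Carcass Beef and should be verified against the current standard before use:

```python
# Sketch of the USDA beef Yield Grade equation (coefficients as commonly
# published; verify against the current USDA standard before relying on them).

def usda_yield_grade(adj_fat_in: float, kph_pct: float,
                     hot_wt_lb: float, ribeye_in2: float) -> float:
    """Estimate USDA Yield Grade from the four carcass factors."""
    return (2.50
            + 2.5 * adj_fat_in      # adjusted subcutaneous fat thickness, inches
            + 0.2 * kph_pct         # kidney, pelvic, and heart fat, percent
            + 0.0038 * hot_wt_lb    # hot carcass weight, pounds
            - 0.32 * ribeye_in2)    # ribeye area, square inches

# Example: a carcass with 0.5 in fat, 2.5% KPH, 750 lb, 13.0 sq. in ribeye
yg = usda_yield_grade(0.5, 2.5, 750.0, 13.0)   # about 2.94
```

Lower Yield Grade numbers indicate higher cutability.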
  • the time constraints described above for the calculation of Quality Grades also apply to the calculation of Yield Grades.
  • the parameters that underlie the assignment of Quality Grades and Yield Grades are published by the USDA Agricultural Marketing Service, Livestock and Seed Division, e.g., for beef, the United States Standards for Grades of Carcass Beef.
  • a device for scoring factors predictive of palatability of a meat carcass or cut, in addition to an examination of the carcass or cut by a USDA grader, would allow meat palatability to be more accurately predicted and USDA Quality Grades to be more accurately assigned. This would allow greater consumer confidence in the Quality Grading system, as well as any additional system for certification of conformance to product quality specifications, as would be desired in a "brand-name" program. In either event, more precise sortation of carcasses for determining meat prices would be allowed. This superior sortation would provide economic benefit to those at all segments of the meat production system: restaurateurs, foodservice operators, and retailers; packers; feed lot operators; and ranchers, farmers, and harvesters of pork, lamb, beef and dairy cattle, chicken, turkey, and various fish species. This superior sortation would also benefit scientists in the collection of carcass and cut data for research, and the previous owners of livestock in making genetic and other management decisions.
  • One such device uses a "duo-scan" or “dual-component” image analysis system. Two cameras are used; a first camera on the slaughter floor scans an entire carcass, and a second camera scans the ribeye after the carcass is chilled and ribbed for quartering.
  • video data are recorded from a beef carcass and transferred to a computer.
  • a program run by the computer determines the percentages of the carcass comprised of fat and lean from the recorded image and additional data available, e.g. hot carcass weight.
  • the quantities of cuts at various levels of lean that can be derived from the carcass are then predicted.
  • the system is not able to predict palatability of the observed carcass for augmenting the assignment of a USDA Quality Grade or other purpose related to sorting carcasses based on eating quality.
  • An apparatus for scoring factors predictive of the palatability of a meat animal carcass is desirable. It is desirable for such an apparatus to collect and process data and provide output within the time frame that a carcass is examined by a USDA grader under typical conditions in the packing house, commonly 5-15 sec. It is desirable for such an apparatus to return scores for at least one of, for example, color of lean tissue, color of fat tissue, extent of marbling, average number and variance of marbling flecks per unit area, average size of marbling and the variance of average marbling size, average texture, and firmness of lean tissue. It is desirable for the apparatus to use these measures to assign a grade or a score to carcasses in order that the carcasses can be sorted into groups that reflect accurate differences in cooked meat palatability.
  • An apparatus for measuring the cross-sectional surface area of an exposed, cut muscle (e.g. ribeye) is also desirable.
  • the apparatus uses this measure to assign a grade or score to carcasses in order that the carcasses can be sorted into groups that reflect accurate differences in yield.
  • It is desirable that this apparatus also measure relative areas of cross-section surfaces comprised of fat and/or bone.
  • An apparatus for measuring, predicting, and sorting carcasses on the basis of both palatability and yield is desirable.
  • It is desirable for such an apparatus to be portable, e.g. small and lightweight. It is desirable for the apparatus to be capable of withstanding packing plant environments, e.g. to be mounted in a protective housing.
  • the present invention is related to a method for predicting the palatability of meat, according to claims 1-11, and to apparatus for predicting the palatability of meat according to claims 12-21.
  • US 4,226,540 discloses a method and apparatus for the contact-free determination of quality features of a test object selected from meat products.
  • the test object is radiated with a light source. Radiation emanating from the test object is detected to create definite radiation values. These definite radiation values are then analysed, preferably in comparison to the reference values.
  • a scanning device is employed for the production of a scanning ray. The scanning ray is deflected over the test object and a detector is employed for determining characteristics of points on the test object scanned by the scanning ray. With the method and apparatus disclosed, quality features such as the fresh condition, colour and meat/fat ratio of the meat may be determined.
  • WO 93/21597 discloses a process and apparatus for identifying a target section within an image of a meat section. Colour data for each pixel is stored and the pixels having a predetermined colour characteristic of the target section e.g. a certain red/green ratio, are discriminated.
  • the process involves excising from a refined data set for the discriminated pixels at least one data set for a cluster of pixels representing an external image section which is adjacent to and contacting the target section but which does not form part of the target section, the excising step including analysing the shape of the refined image section represented by the refined data set to identify concavities and examining properties of links and/or properties of sub-sections formed by links to identify valid links demarcating the target section from adjacent touching external image sections.
  • the refined data set after the excising step has been carried out is processed as the data set representing the target section of the image.
  • the present invention is also related to an apparatus for predicting the palatability of meat, comprising: a video camera adapted to provide video image data of at least a portion of the meat; a data processing unit adapted to execute program instructions; a program storage device encoded with program instructions that, when executed, perform a method for predicting the palatability of meat, the method comprising: analyzing the video image data to distinguish at least one lean section of the meat from a non-lean section of the meat; analyzing the video image data corresponding to the lean section; measuring a characteristic of the lean section based on the video image data; and correlating the characteristic to the palatability of the meat.
  • FIG. 1 shows a schematic view of an apparatus of the present invention.
  • FIG. 2 shows a flowchart of a method of the present invention.
  • FIG. 3 shows a flowchart of a computer program analyzing video image data to distinguish at least one lean section of the meat from a non-lean section of the meat, analyzing the video image data corresponding to the lean section, and measuring a characteristic of the lean section based on the video image data.
  • the present invention provides a video image analysis (VIA) system for scoring factors predictive of the palatability of a meat animal carcass.
  • the VIA system is preferably a color VIA system.
  • the VIA system includes a video camera 12, preferably a 3-CCD color video camera, preferably mounted in a camera enclosure (not shown).
  • the video camera 12 optionally features an illumination system 26 mounted either on the camera, on the camera enclosure, or not on the camera but in the camera enclosure.
  • the VIA system also includes a data processing unit 16, the data processing unit 16 interfaced with a program storage device 20 by a program storage device interface 18, and at least one output device 24 by an output device interface 22.
  • the program storage device 20 contains a computer program or programs required for proper processing of video image data, preferably color video image data, by the data processing unit 16.
  • the data processing unit 16 is linked to, and receives data from, the video camera 12 via either a transfer cable 14 or a wireless transmission device (not shown).
  • the data processing unit 16 comprises a standard central processing unit (CPU), and preferably also a software module or hardware device for conversion of analog data to digital data, and processes video image data according to instructions encoded by a computer program stored in the program storage device 20.
  • Video image data can be used in subsequent calculation of the values of characteristics, the values being predictive of palatability, the characteristics including color of lean tissue, color of fat tissue, extent of marbling, average number and variance of marbling flecks per unit area, average size of marbling and the variance of average marbling size, average texture of marbling and lean tissue, and firmness of lean tissue.
  • These values can then be used to sort meat (herein defined as a meat animal carcass, side, or cut, or any portion of a carcass, side, or cut) into groups that vary in predicted subsequent cooked eating quality.
  • the color parameters L*, a*, and b* are used to calculate the values of factors predictive of yield, such as the cross-sectional area of a muscle of interest and other surrounding organs such as fat, bone, and connective tissue. These values can then be used to sort meat into groups that vary in predicted composition.
  • the data processing unit 16 is linked to, and transmits results of data processing to, at least one output device 24 by output device interface 22.
  • results of data processing can also be written to a file in the program storage device 20 via program storage device interface 18.
  • An output device 24 can be a video screen, printer, or other device. It is preferred that at least one output device 24 provide a physical or electronic tag to label the meat 10 with results of data processing, in order to facilitate sortation of meat animal carcasses, cuts, or both into groups with similar predicted palatability and/or yield.
  • the present invention also provides a method of predicting the palatability of the meat 10 and determining the cross-sectional area of the meat 10.
  • video image data collected from meat 10 is recorded by the video camera 12, processed by the data processing unit 16, and the values of palatability and/or muscle cross-sectional area are output by the output device 24 to augment the observations made by a USDA line grader, or other operator responsible for sorting or characterizing meat animal carcasses, in order to allow more accurate assignment of Quality Grades, Yield Grades, and/or other sorting or classification criteria based on the characteristics.
  • An apparatus for use in the present invention comprises a video camera 12 and a data processing unit 16.
  • the video camera 12 can be any such camera known to those of skill in the art. It is important for the video camera 12 to provide output within the time frame allotted for meat carcass examination, typically 5-15 seconds. Preferably the output is in real-time.
  • Such real-time output can use the same technology as the viewfinder on a known camcorder or video camera, the same technology as a known digital camcorder, a known computer-generated real-time display as used in video-conferencing applications, or any other technology known to those of skill in the art.
  • It is preferable for the video camera 12 to be a color video camera, for reasons discussed below.
  • It is preferred that the video camera 12 be small and lightweight, to provide the advantages of portability and flexibility of positioning, i.e. adjusting the camera angle by the user to provide for optimal collection of video image data from the meat 10. It is also preferred that the video camera 12 be durable, in order to better withstand the environment of the packing plant.
  • the power source of the video camera 12 can be either direct current, i.e. a battery secured to electrical contacts from which the video camera 12 can draw power, or alternating current provided from either an electrical outlet or from the data processing unit 16.
  • An illumination system 26 can optionally be used to illuminate the meat surface. This is desirable when ambient lighting is poor or uneven, or when it is desired to examine regions of the meat 10 that are not illuminated by ambient light. Any known illumination system 26 can be used.
  • the power source of the illumination system 26 can be either direct current, i.e. a battery, or alternating current drawn from either an electrical outlet, the video camera 12, or the data processing unit 16. It is preferred that the illumination system 26 be small and lightweight, for reasons discussed in reference to the video camera 12, above.
  • the illumination system 26 can be mounted on the camera, on the outer surface of a camera enclosure, or within a camera enclosure, the camera enclosure described in the following paragraph.
  • the video camera 12 and optional illumination system 26 can be unenclosed or enclosed.
  • the video camera 12 is enclosed in a camera enclosure (not shown) for protection against the environment of packing and processing plants. It is important for the camera enclosure to provide a first aperture for the lens of the video camera 12 to observe the meat 10.
  • the illumination system 26 can be mounted either on the outer surface of the camera enclosure or within the camera enclosure. If mounted within the camera enclosure, the illumination system 26 can be mounted either on the camera or not on the camera. If the illumination system 26 is mounted in the camera enclosure, it is important for an aperture to be provided for illumination of the meat 10, either the first aperture used by the lens of the video camera 12 or a second aperture. In either case, the aperture can be unencased or it can be encased by a pane of a transparent material.
  • the camera enclosure can provide an aperture for the cable to exit the enclosure.
  • This aperture can be the first aperture used by the lens of the video camera 12, the second aperture that can be used by the illumination system 26, or a third aperture. If the cable exits the enclosure from the first or second aperture, and the first or second aperture is encased by a pane of transparent material, it is important to provide a first cable-passage aperture in the pane for passage of the cable. It is preferred that the camera enclosure be constructed from a lightweight material and be only large enough to conveniently fit the video camera 12, and optionally the illumination system 26 described above.
  • If alternating current is to be used as the power source of the video camera 12, it is important for an aperture to be provided to pass the power cable from the video camera 12 to the power source. Any one of the first, second, or third apertures can be used, or a fourth aperture can be used. If the aperture to be used is encased by a pane of transparent material, it is important to provide a second cable-passage aperture in the pane for passage of the power cable. Alternatively, both the power cable and the data-transfer cable can exit the camera enclosure through a single cable-passage aperture.
  • the camera enclosure can be designed with features to more readily allow user grip and manipulation, e.g. handles, helmet mounting, etc., and/or with features to allow fixing in position without user grip and manipulation, e.g. brackets for wall mounting, ceiling mounting, or tripod mounting, among other features.
  • wall, ceiling, or tripod mounting can be to motorized rotatable heads for adjusting camera angle and focal length.
  • the camera enclosure can be designed to be easily opened to allow for convenient maintenance of the video camera 12 or replacement of a battery if direct current is used as the power source of the video camera 12.
  • Maintenance of the illumination system 26 may also be needed, and preferably in this option will be allowed by the same easy-opening design described for the video camera 12 .
  • the easy-opening design can be effected by the use of screws, clamps, or other means widely known in the art. Ease of maintenance is desirable to minimize any downtime that may be encountered.
  • As video image data is photographed by the video camera 12, it is transferred in real time to the data processing unit 16.
  • Data can be transferred by a transfer cable 14 or by a wireless data transmission device (not shown).
  • transfer cable 14 is the preferred medium of transmission based on superior shielding and lower cost.
  • a wireless data transmission device can be a more practical medium of transmission. Any technique of data transfer known to those of skill in the art can be used.
  • the video image data can be sent from the video camera 12 to the data processing unit 16 as either analog or digital data. If sent as analog data, it is important to convert the data to digital data before processing by sending the data to a hardware device (not shown) or software module capable of converting the data. Such a software module may be termed a "video frame-grabber". If the video image data is sent as digital data, no conversion is required before processing the data.
  • a "data processing unit" is defined as including, but not limited to, desktop computers, laptop computers, handheld computers, and dedicated electronic devices. Any data processing unit known in the art can be used in the present invention.
  • the data processing unit 16 can be small and lightweight to provide portability.
  • the data processing unit 16 can be a microcomputer, minicomputer, or mainframe that is not portable.
  • the present invention is not limited to any specific data processing unit, computer, or operating system.
  • An exemplary embodiment, but one not to be construed as limiting, is a PC-compatible computer running an operating system such as DOS, Windows, or UNIX.
  • the choice of hardware device or software module for conversion of analog data to digital data for use in the present invention is dependent on the video camera 12, data processing unit 16, and operating system used, but given these constraints the choice will be readily made by one of skill in the art.
  • the data processing unit 16 comprises a software module that converts RGB color to L*a*b* color.
  • An exemplary software module is found in Hunter Color Vision Systems (Hunter Associates Laboratory, Inc.).
  • It is preferred that the data processing unit 16 include other input devices, e.g. a keyboard, a mouse or trackball, a lightpen, a touchscreen, a stylus, etc., to allow convenient exercise of user options in camera and software operation, data processing, data storage, program output, etc.
  • There are several pieces of software which it is important for the data processing unit 16 to store in a program storage device 20 (examples of program storage devices being a hard drive, a floppy disk drive, a tape drive, a ROM, and a CD-ROM, among others), access from the program storage device 20 via program storage device interface 18, and execute. It is important for the data processing unit 16 to have an operating system, and any necessary software drivers to properly control and retrieve data from the video camera 12 and send output to the at least one output device 24. It is important for the data processing unit 16 to execute a program or programs that can process received video image data, calculate various parameters of the muscle imaged in the received video image data, and output the results of the calculations to an output device 24. An exemplary code for such a program or programs is given in an appendix hereto. An exemplary flowchart for such a program or programs is given as FIG. 3.
  • the video image data can be analyzed for color scale parameters.
  • the video image data is analyzed for the color scale parameters L*, a*, and b*, as defined by the Commission Internationale d'Eclairage (CIE).
  • CIE Commission Internationale d'Eclairage
  • a set of L*a*b* parameters is recorded for each frame.
  • L*, a*, and b* are dimensions of a three-dimensional color space which is standardized to reflect how color is perceived by humans.
  • the L* dimension corresponds to lightness (a value of zero being black, a value of 100 being white)
  • the a* dimension corresponds to relative levels of green and red (a negative value being green, a positive value being red)
  • the b* dimension corresponds to relative levels of blue and yellow (a negative value being blue, a positive value being yellow).
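To make the color space concrete, here is a minimal sketch of the standard CIE conversion from 8-bit sRGB to L*a*b* under a D65 white point. The patent's Hunter conversion module is proprietary; this is the textbook formulation, not the invention's code:

```python
# Standard sRGB -> CIE L*a*b* conversion (D65 reference white).
# Illustrative only; the patent relies on a commercial conversion module.

def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB components to CIE L*, a*, b*."""
    def linearize(c):                  # undo sRGB gamma encoding
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # linear RGB -> CIE XYZ (sRGB primaries, D65)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl

    def f(t):                          # CIE nonlinearity
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116

    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    L_star = 116 * fy - 16             # 0 (black) .. 100 (white)
    a_star = 500 * (fx - fy)           # negative green .. positive red
    b_star = 200 * (fy - fz)           # negative blue .. positive yellow
    return L_star, a_star, b_star
```

Pure white (255, 255, 255) maps to approximately L* = 100 with a* and b* near zero, and pure black maps to L* = 0, matching the dimension definitions above.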
  • the system can capture pixelated video images from areas of 12 to 432 square inches from the muscle of interest, comprising up to 350,000 pixels per measurement, and determine L*, a*, and b* for each pixel.
  • It is desirable for determination of L*a*b* to be performed using the Hunter Associates software conversion module. Once the value of L*a*b* is determined, at least one of the L*, a*, and b* components can be used in subsequent data processing.
  • After determination of L*, a*, and b* for each pixel, a program calculates several parameters of the image for each frame. First, the program outlines the muscle of interest by choosing areas that have tolerances of b* compatible with muscle. A sorting of at least one area of the image into one of two classifications, as in muscle and non-muscle, may be termed a "binary mask." Areas with values of b* compatible with the muscle of interest are then examined for their L* and a* scores for verification and rejection of surrounding tissues invading the outline of the muscle of interest. Further examination need not be performed on areas with L*, a*, and b* scores suggestive of bone, connective tissue, and fat. The surface area of the cross-section of the muscle of interest is determined.
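A sketch of such a binary mask might look like the following; the L*, a*, and b* tolerance windows here are illustrative assumptions, since the patent does not publish its calibrated thresholds:

```python
# Binary-mask sketch: classify pixels as muscle by a b* tolerance, then
# verify candidates with L* and a* bounds.  All thresholds are assumed
# placeholders, not the invention's calibrated values.

def muscle_mask(lab_pixels, b_range=(8.0, 25.0),
                l_range=(25.0, 55.0), a_range=(15.0, 40.0)):
    """Return booleans: True where a pixel's L*a*b* values fall within
    the (assumed) tolerances for lean muscle tissue."""
    mask = []
    for L, a, b in lab_pixels:
        in_b = b_range[0] <= b <= b_range[1]        # first pass: b* tolerance
        in_la = (l_range[0] <= L <= l_range[1] and  # verification: L*, a*
                 a_range[0] <= a <= a_range[1])
        mask.append(in_b and in_la)
    return mask

def muscle_area(mask, pixel_area_sq_in):
    """Cross-sectional muscle area = muscle-pixel count x area per pixel."""
    return sum(mask) * pixel_area_sq_in
```

In practice the per-pixel area would come from the camera's calibrated field of view.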
  • the lean tissue and fat tissue of the muscle can be distinguished and raw L*, a*, and b* scores for the lean tissues of the muscle can be determined. These scores can then be sent to the output device 24 to be displayed in numerical format and/or retained to calculate quality- and yield-determining characteristics as described below. It is known that higher values of b* for lean tissues of muscle correlate with greater tenderness (Wulf et al., 1996). In addition, the fat color of intermuscular fat can also be determined.
  • determinations can be made of the quantity, distribution, dispersion, texture, and firmness of marbling (intramuscular fat deposited within the muscle).
  • the quantity of marbling can be determined by calculating the percentage of muscle surface area with L*, a*, and b* scores compatible with fat tissue.
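That measure reduces to a simple ratio of pixel counts, sketched below; the counts are assumed to come from an earlier L*a*b* classification of pixels as fat or muscle:

```python
# Marbling quantity as the percentage of muscle surface area whose pixels
# have L*a*b* scores compatible with fat tissue.

def marbling_percentage(n_fat_pixels, n_muscle_pixels):
    """Percent of the muscle cross-section covered by marbling flecks."""
    if n_muscle_pixels == 0:
        return 0.0                     # avoid division by zero on empty masks
    return 100.0 * n_fat_pixels / n_muscle_pixels
```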
  • the distribution and dispersion of marbling can be determined.
  • the portion of the image derived from the muscle of interest can be divided into subcells of equal size. A size of 64 x 48 pixels can be used. Within each subcell, the number of marbling flecks can be determined as the number of discrete regions with L*, a*, and b* values corresponding to fat, and the average number of marbling flecks per subcell can be calculated. The variance of numbers of marbling flecks across all subcells can be calculated as well.
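These subcell statistics can be sketched as follows, treating each marbling fleck as a 4-connected region of fat pixels in a boolean mask; the connectivity rule and pure-Python structure are assumptions for illustration:

```python
# Subcell marbling statistics: count discrete fat regions ("flecks") per
# fixed-size subcell, then report the mean and variance of those counts.
# 4-connectivity is an assumed definition of a discrete region.

def fleck_count(cell):
    """Count 4-connected True regions in a boolean 2-D grid."""
    h, w = len(cell), len(cell[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for i in range(h):
        for j in range(w):
            if cell[i][j] and not seen[i][j]:
                count += 1                      # new fleck found
                stack = [(i, j)]
                while stack:                    # flood-fill the region
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and cell[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count

def subcell_stats(fat_mask, subcell_h=48, subcell_w=64):
    """Mean and variance of fleck counts across equal-size subcells."""
    h, w = len(fat_mask), len(fat_mask[0])
    counts = []
    for top in range(0, h, subcell_h):
        for left in range(0, w, subcell_w):
            cell = [row[left:left + subcell_w]
                    for row in fat_mask[top:top + subcell_h]]
            counts.append(fleck_count(cell))
    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / len(counts)
    return mean, var
```

The same flood-fill also yields each region's pixel count, from which the average fleck size and its variance can be derived.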
  • The size of each marbling fleck can be determined throughout the muscle of interest from the number of pixels within each discrete region with L*, a*, and b* values corresponding to fat.
  • the variance of marbling size across all marbling flecks can be calculated as well.
  • the texture and fineness of marbling can also be measured. It is well known that generally, greater amounts of more uniformly-distributed and finer-textured marbling reflect a higher marbling score and thus meat of higher eating quality.
  • the program can use L*, a*, and b* data to calculate the average texture, i.e. cross-sectional surface roughness, of the muscle, and also the firmness of the lean tissue of the cross-sectional muscle. It is well known that the surface roughness of a muscle is inversely correlated with tenderness, and greater firmness is correlated with flavorfulness.
  • characteristics of the lean section of the meat 10 that can be measured include, but are not limited to, the color of the lean tissue, the color of fat tissue, a marbling quantity, a marbling distribution, a marbling dispersion, a marbling texture, a marbling fineness, an average texture of the lean tissue, a firmness of the lean tissue, and a surface area of the lean section.
  • Quantities of the non-lean section of the meat 10 including but not limited to the color of fat and the relative areas of cross-section surfaces comprised of fat, bone, and/or connective tissue, may be calculated as well.
  • the program can output to the output device 24 calculated values of any or all of the characteristics given above: color of lean tissue, color of fat tissue, extent of marbling, average number of marbling flecks per unit area, variance of marbling flecks per unit area, average size of marbling, variance of the average size of marbling, texture and fineness of marbling, average texture of lean tissue, and firmness of lean tissue.
  • the calculated values of the characteristics, if output, are displayed as alphanumeric characters that can be conveniently read by the operator.
  • the area of the cross-sectional surface of the muscle portion of the meat 10 can be calculated and output to the output device 24.
  • further calculations can be performed using the area of the cross-sectional surface of the muscle, other parameters readily seen by one of skill in the art of meat science as calculable from the L*a*b* data, and/or values of parameters input by the operator, to derive estimated Yield Grades or other overall indices of composition of the meat 10.
  • results reported by the program can be output to any output device 24, such as a screen, printer, speaker, etc. If operator evaluation of the results is desired, the results are preferably displayed on a screen. Preferably, the screen is readily visible to the grader, evaluator, or operator at his or her stand. Alternatively, or in addition, it is preferable that results be printed or output in such a manner that they can be transferred and affixed to the meat 10 .
  • the manner for outputting results can be text, symbols, or icons readable by personnel either in the packing plant or at later points in the meat production system. Alternatively, the manner for outputting results can be a barcode or other object that can be read by appropriate equipment and decoded into forms readable by personnel at various points in the production system. Output results can be affixed to the meat 10 by methods well-known to the art, which include, but are not limited to, pins, tacks, and adhesive.
  • the power source of the data processing unit 16 can be either direct current, i.e. a battery, or alternating current drawn from an electrical outlet.
  • the data processing unit 16 can be mounted in a data processing unit enclosure or in the camera enclosure, or can be unenclosed.
  • if the data processing unit 16 is a microcomputer, minicomputer, or mainframe computing resource present in the plant or facility where the apparatus is used, an enclosure is not required.
  • if the data processing unit 16 is a separate, stand-alone, portable entity, it is preferably mounted in a data processing unit enclosure.
  • it is important for the data processing unit enclosure to provide an aperture or apertures for output of data to, or display of data by, the output device 24 .
  • Such an aperture can be unencased or it can be encased by a pane of transparent material, such as glass, plastic, etc.
  • if display is to be performed by an external device, e.g. a remote monitor or printer, it is important for the data processing unit enclosure to provide an aperture for passage of an output cable therethrough.
  • if the data processing unit 16 is powered by alternating current, it is important for the data processing unit enclosure to provide an aperture for passage of a power cable therethrough. If it is desired to store outputs to an internal floppy disk drive, it is important for the enclosure to provide an aperture for insertion and removal of floppy disks. If it is desired to store outputs to an external program storage device 20, it is important for the data processing unit enclosure to provide an aperture for passage of a data-transfer cable therethrough.
  • the data processing unit enclosure is only large enough to conveniently fit the data processing unit 16 , and is lightweight.
  • the data processing unit enclosure can be designed with features to more readily allow user manipulation, e.g. handles.
  • it is preferable that the data processing unit enclosure be amenable to easy opening, to allow convenient maintenance of the data processing unit 16 .
  • the easy-opening design can be effected by means described for the camera enclosure supra .
  • the apparatus described above can be used in methods for predicting the palatability and/or yield of, or augmenting the assignment of USDA grades to, meat animal carcasses or cuts, or for sorting for other purposes (e.g. brand names, product lines, etc.).
  • the first step involves collecting video image data from the meat 10 using the video camera 12 .
  • the second step involves processing the video image data using the data processing unit 16 .
  • the third step involves using the results of the processing step in reporting quality-determining characteristics that can be used to augment USDA graders in the assignment of USDA Quality Grades, in reporting the areas of cross-sectional muscle surfaces that can be used to augment USDA graders in the assignment of USDA Yield Grades, and/or in sorting the meat 10 based on specific requirements of, for example, a brand-name or product line program.
  • the grader or operator's limited time to analyze the meat 10 can be focused on examining parameters most readily examined by a person, providing the grader or operator with more data for each sample of meat 10 in the same amount of time, and allowing more accurate prediction of palatability and assignment of Quality Grade and Yield Grade than is currently possible.
  • this method allows required computations to be performed more quickly and accurately than is currently possible.
  • a population of 324 beef carcasses was examined in an effort to segregate a subpopulation of carcasses with very low probabilities (<0.0003) of having ribeye shear force values of 4.5 kg or greater, and hence of yielding unacceptably tough-eating cuts.
  • of the 324 carcasses, 200 were certified to meet the above standard for tenderness.
  • 77 head were preselected for the tender subpopulation on the basis of expert-determined (beef scientist or USDA Grading Supervisor) marbling scores of Modest, Moderate, or Slightly Abundant, the three highest degrees of marbling in the United States Standards for Grades of Carcass Beef.
  • tenderness values for each of the remaining 247 head were predicted using a multiple regression equation based on CIE a* values for lean and fat, as well as the square of the machine-measured marbling percentage.
  • the multiple regression equation predicted that 123 of the 247 carcasses had a probability of no more than 0.0003 of being not tender. These 123 carcasses were then grouped with the 77 that had been preselected, and certified as tender. The remaining 124 carcasses had a probability of 0.117 of having shear force values in excess of 4.5 kg.
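The description and claims call for per-pixel calculation of L*, a*, and b* color components and for distinguishing lean from non-lean regions by comparing colors. A minimal sketch of such a per-pixel computation, assuming 8-bit sRGB input and the standard D65 conversion; the thresholds (a* > 20 for lean, L* > 60 for fat) are illustrative assumptions, not values taken from the patent:

```python
def rgb_to_lab(r, g, b):
    """Convert one 8-bit sRGB pixel to CIE L*a*b* (D65 white point)."""
    def lin(c):  # sRGB gamma expansion
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    # linear RGB -> XYZ (standard sRGB matrix)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    xn, yn, zn = 0.95047, 1.0, 1.08883  # D65 reference white
    def f(t):
        return t ** (1.0 / 3.0) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def classify_pixel(rgb, a_lean=20.0, l_fat=60.0):
    """Crude lean/fat/other split: lean muscle is strongly red (high a*),
    fat is pale (low a*, high L*). Thresholds are illustrative only."""
    L, a, _ = rgb_to_lab(*rgb)
    if a > a_lean:
        return "lean"
    if L > l_fat:
        return "fat"
    return "other"
```

For example, a muscle-red pixel such as (150, 40, 50) yields a high a* and is classified "lean", while a pale pixel such as (230, 215, 205) is classified "fat".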
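Among the characteristics listed above are marbling quantity, average number of marbling flecks per unit area, and average fleck size. The patent does not publish its algorithm; one plausible sketch treats each 4-connected blob of intramuscular-fat pixels within the lean section as a fleck and counts blobs by flood fill:

```python
from collections import deque

def marbling_metrics(fat_mask):
    """fat_mask: 2D list over the lean section, 1 = intramuscular-fat pixel.
    Returns (fleck count, mean fleck size in pixels, fat-area fraction)."""
    rows, cols = len(fat_mask), len(fat_mask[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for i in range(rows):
        for j in range(cols):
            if fat_mask[i][j] and not seen[i][j]:
                # BFS flood fill: one connected blob = one marbling fleck
                size, q = 0, deque([(i, j)])
                seen[i][j] = True
                while q:
                    y, x = q.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                           and fat_mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                sizes.append(size)
    fat_pixels = sum(sizes)
    mean_size = fat_pixels / len(sizes) if sizes else 0.0
    return len(sizes), mean_size, fat_pixels / (rows * cols)
```

The variance of fleck counts and sizes across sub-regions, also mentioned above, follows from the same per-blob sizes.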
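The "average texture" characteristic is described above as cross-sectional surface roughness derived from the L*, a*, b* data. One simple roughness proxy (an assumption for illustration, not the patent's specified formula) is the mean absolute difference between horizontally and vertically adjacent L* values:

```python
def surface_texture(lstar):
    """lstar: 2D list of per-pixel L* values over the lean section.
    Returns the mean absolute L* difference between adjacent pixels;
    higher values indicate a rougher (and, per the text, less tender) surface."""
    rows, cols = len(lstar), len(lstar[0])
    diffs = []
    for i in range(rows):
        for j in range(cols):
            if j + 1 < cols:  # horizontal neighbor
                diffs.append(abs(lstar[i][j] - lstar[i][j + 1]))
            if i + 1 < rows:  # vertical neighbor
                diffs.append(abs(lstar[i + 1][j] - lstar[i][j]))
    return sum(diffs) / len(diffs)
```

A perfectly uniform lean surface scores 0.0; alternating light and dark pixels score high.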
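The worked example above can be sketched end-to-end. The regression coefficients below are placeholders (the patent does not publish its fitted equation); only the structure follows the text: a shear force predicted from lean a*, fat a*, and marbling percentage squared, converted to a normal-tail probability of exceeding 4.5 kg and compared against the 0.0003 certification cutoff:

```python
import math

def predict_shear_kg(a_lean, a_fat, marbling_pct,
                     coef=(9.0, -0.10, -0.05, -0.002)):
    """Multiple-regression form from the text; coefficients are hypothetical."""
    b0, b1, b2, b3 = coef
    return b0 + b1 * a_lean + b2 * a_fat + b3 * marbling_pct ** 2

def prob_tough(pred_kg, threshold_kg=4.5, resid_sd=0.5):
    """P(shear force > 4.5 kg), assuming normally distributed residuals
    with an assumed residual standard deviation."""
    z = (threshold_kg - pred_kg) / resid_sd
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def certify_tender(pred_kg, cutoff=0.0003):
    """Certify only carcasses whose probability of toughness is <= 0.0003."""
    return prob_tough(pred_kg) <= cutoff
```

Under these assumed numbers, a carcass predicted at 2.5 kg shear force clears the cutoff and is certified, while one predicted at 4.0 kg is not.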


Claims (21)

  1. Method for predicting the palatability of meat, in which:
    color video image data relating to at least a portion of the meat are provided,
    the video image data are analyzed to distinguish at least a lean section of the meat from a non-lean section of the meat,
    the video image data corresponding to the lean section are analyzed,
    a characteristic of the lean section is measured on the basis of the video image data, and the characteristic is correlated with the palatability of the meat, wherein analyzing the video image data to distinguish at least a lean section of the meat from a non-lean section of the meat includes comparing the color of a first region of the video image data with the color of a second region of the video image data, and wherein the comparison comprises calculating at least one of the L*, b*, and a* color components of the video image data.
  2. Method according to claim 1, wherein the video image data comprise a plurality of pixels, and calculating at least one of the L*, b*, and a* color components comprises performing such a calculation for each pixel.
  3. Method according to claim 1, wherein providing the video image data comprises photographing at least a portion of the meat.
  4. Method according to claim 1, wherein providing the video image data comprises illuminating at least a portion of the meat.
  5. Method according to claim 1, wherein the lean section contains lean tissue and fat tissue, and measuring the characteristic of the lean section includes distinguishing lean tissue from fat tissue.
  6. Method according to claim 5, wherein distinguishing the lean tissue from the fat tissue comprises comparing the color of a first region of the video image data with the color of a second region of the video image data.
  7. Method according to claim 6, wherein the video image data comprise a plurality of pixels, and calculating at least one of the L*, b*, and a* color components comprises performing such a calculation for each pixel.
  8. Method according to claim 6, wherein measuring a characteristic of the lean section comprises measuring at least one of the color of the lean tissue, the color of the fat tissue, a marbling quantity, a marbling distribution, a marbling dispersion, a marbling texture, a marbling fineness, an average texture of the lean tissue, a firmness of the lean tissue, a surface area of the lean section, and the values of the non-lean section.
  9. Method according to claim 1, wherein analyzing the video image data to distinguish the lean section from the non-lean section includes distinguishing the lean region from at least one of a fat region, a bone region, and a connective tissue region.
  10. Method according to claim 1, further comprising determining a quality grade for the meat on the basis of the characteristic.
  11. Method according to claim 1, further comprising determining a yield grade for the meat on the basis of the characteristic.
  12. Apparatus for predicting the palatability of meat, comprising:
    a video camera adapted to provide color video image data of at least a portion of the meat,
    a data processing unit adapted to execute program instructions, and
    a program storage unit in which program instructions are stored which, when executed, carry out a method for predicting the palatability of meat, in which method:
    the video image data are analyzed to distinguish at least a lean section of the meat from a non-lean section of the meat,
    the video image data corresponding to the lean section are analyzed,
    a characteristic of the lean section is measured on the basis of the video image data, and
    the characteristic is correlated with the palatability of the meat, wherein analyzing the video image data to distinguish at least a lean section of the meat from a non-lean section of the meat includes comparing the color of a first region of the video image data with the color of a second region of the video image data, and wherein the comparison comprises calculating at least one of the L*, b*, and a* color components of the video image data.
  13. Apparatus according to claim 12, wherein the video image data comprise analog data, and wherein the method further includes converting the analog data into digital data.
  14. Apparatus according to claim 12, wherein the video image data comprise a plurality of pixels, and calculating at least one of the L*, b*, and a* color components comprises performing such a calculation for each pixel.
  15. Apparatus according to claim 12, further comprising an illumination system adapted to illuminate at least a portion of the meat.
  16. Apparatus according to claim 12, wherein the lean section contains lean tissue and fat tissue, and measuring the characteristic of the lean section in the method includes distinguishing lean tissue from the fat tissue.
  17. Apparatus according to claim 16, wherein distinguishing the lean tissue from the fat tissue in the method includes comparing the color of a first region of the video image data with the color of a second region of the video image data.
  18. Apparatus according to claim 12, wherein measuring a characteristic of the lean section in the method includes measuring at least one of the color of the lean tissue, the color of the fat tissue, a marbling quantity, a marbling distribution, a marbling dispersion, a marbling texture, a marbling fineness, an average texture, a firmness of the lean tissue, a surface area of the lean section, and the values of the non-lean section.
  19. Apparatus according to claim 12, wherein analyzing the video image data to distinguish the lean section from the non-lean section in the method includes distinguishing the lean region from at least one of a fat region, a bone region, and a connective tissue region.
  20. Apparatus according to claim 12, wherein the method further comprises determining a quality grade for the meat on the basis of the characteristic.
  21. Apparatus according to claim 12, wherein the method further comprises determining a yield grade for the meat on the basis of the characteristic.
EP99908231A 1998-02-20 1999-02-18 Fleischfarbbilderzeugungssystem zum voraussagen des geschmackes und ertrages Expired - Lifetime EP1060391B1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US7551798P 1998-02-20 1998-02-20
US75517P 1998-02-20
PCT/US1999/003477 WO1999042823A1 (en) 1998-02-20 1999-02-18 Meat color imaging system for palatability and yield prediction

Publications (2)

Publication Number Publication Date
EP1060391A1 EP1060391A1 (de) 2000-12-20
EP1060391B1 true EP1060391B1 (de) 2003-08-06

Family

ID=22126292

Family Applications (1)

Application Number Title Priority Date Filing Date
EP99908231A Expired - Lifetime EP1060391B1 (de) 1998-02-20 1999-02-18 Fleischfarbbilderzeugungssystem zum voraussagen des geschmackes und ertrages

Country Status (8)

Country Link
US (1) US6198834B1 (de)
EP (1) EP1060391B1 (de)
AT (1) ATE246805T1 (de)
AU (1) AU755764B2 (de)
BR (1) BR9908065A (de)
CA (1) CA2322037C (de)
DE (1) DE69910182T2 (de)
WO (1) WO1999042823A1 (de)

Families Citing this family (63)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2780790B1 (fr) * 1998-07-03 2000-08-18 Vitreenne Abattage Procede et dispositif de prediction de la tendrete d'une viande sur le site de transformation a l'aide d'informations biologiques et/ou physico-chimiques et de mesures optiques dans le domaine du visible et du proche infrarouge
DE19837806C1 (de) * 1998-08-20 2000-01-20 Csb Syst Software Entwicklung Verfahren zur Bewertung von Schlachttierhälften durch optische Bildverarbeitung
US6563904B2 (en) 2000-12-01 2003-05-13 Fmc Technologies, Inc. Apparatus and method for detecting and removing undesirable material from workpieces
DE10109586A1 (de) * 2001-02-28 2002-09-05 Philips Corp Intellectual Pty Verfahren und Vorrichtung zur Bildverarbeitung von Röntgenaufnahmen
US20070104840A1 (en) * 2001-05-03 2007-05-10 Singer Michael G Method and system for the determination of palatability
US6751364B2 (en) * 2001-10-15 2004-06-15 Tyson Fresh Meats, Inc. Image analysis systems for grading of meat, predicting quality of meat and/or predicting meat yield of an animal carcass
WO2003034059A1 (en) * 2001-10-18 2003-04-24 Machinery Developments Limited Apparatus and process for analyzing cuts of meat
US6992771B2 (en) * 2001-11-28 2006-01-31 Battelle Memorial Institute Systems and techniques for detecting the presence of foreign material
US6997089B2 (en) 2002-06-25 2006-02-14 Formax, Inc. Optical grading system for slicer apparatus
US6974373B2 (en) 2002-08-02 2005-12-13 Geissler Technologies, Llc Apparatus and methods for the volumetric and dimensional measurement of livestock
US7039220B2 (en) * 2002-08-14 2006-05-02 C-Scan, L.L.P. Methods and apparatus for the dimensional measurement of livestock using a single camera
US7373217B2 (en) * 2003-04-08 2008-05-13 Hormel Foods, Llc Apparatus for slicing a food product and method therefore
US20040236191A1 (en) * 2003-05-19 2004-11-25 Poliska Steven A. System and method for identifying and labeling livestock products, and managing data associated with those products
US6877460B1 (en) * 2003-11-14 2005-04-12 Pheno Imaging, Inc. Animal sorting and grading system using MRI to predict maximum value
WO2005070114A2 (en) * 2004-01-09 2005-08-04 Nickel Brand Software, Inc. Brand recognition system
DE102004047773A1 (de) * 2004-09-27 2006-04-06 Horst Eger Verfahren zur Bestimmung physiologischer Grössen eines Schlachttierkörpers
DE102004055351B4 (de) * 2004-11-17 2006-09-07 Csb-System Ag Gewinnung von Daten zum Klassifizieren von Schlachttierkörpern sowie zur Bestimmung von Qualitäten und Quantitäten derselben
WO2006086450A1 (en) * 2005-02-08 2006-08-17 Cargill Incorporated Meat sortation
AU2007203535B2 (en) * 2005-02-08 2012-09-06 Cargill, Incorporated Meat Sortation
US7444961B1 (en) * 2005-04-11 2008-11-04 Ellis James S Animal sorting and grading system using an internal evaluation to predict maximum value
EP1887874B1 (de) * 2005-05-31 2012-08-01 Teknologisk Institut Verfahren und verwendung einer datenbank für die automatische bestimmung von qualitätseigenschaften eines schlachtkörpers am schlachtband
GB2428182A (en) * 2005-06-24 2007-01-24 Aew Delford Systems Vision system for food cutting and portioning apparatus
GB0512877D0 (en) * 2005-06-24 2005-08-03 Aew Delford Group Ltd Improved vision system
US20090216459A1 (en) * 2005-11-22 2009-08-27 Colorado Seminary Ultrasonic System for Grading Meat Tenderness
US7613330B2 (en) * 2006-04-03 2009-11-03 Jbs Swift & Company Methods and systems for tracking and managing livestock through the production process
US9159126B2 (en) 2006-04-03 2015-10-13 Jbs Usa, Llc System and method for analyzing and processing food product
NZ546808A (en) * 2006-04-26 2007-12-21 Inst Of Geol & Nuclear Science Evaluation of meat tenderness
US8280144B2 (en) * 2007-02-21 2012-10-02 Goldfinch Solutions, Llc System and method for analyzing material properties using hyperspectral imaging
US8260005B2 (en) * 2007-03-30 2012-09-04 Universidad De Santiago De Chile Portable tool for determining meat quality
WO2008127726A1 (en) * 2007-04-13 2008-10-23 Winterlab Limited System and method for determining the kosher status of fish
US20100086655A1 (en) * 2007-05-23 2010-04-08 Michaeal G Singer Process of selecting a preparation method, a packaging and shipping method, or other dispostion of a foodstuff, and process of determining if a foodstuff is fresh or has previously been frozen
US8068899B2 (en) * 2007-07-03 2011-11-29 The Board Of Trustees Of The Leland Stanford Junior University Method and system of using intrinsic-based photosensing with high-speed line scanning for characterization of biological thick tissue including muscle
US20110007151A1 (en) * 2007-07-03 2011-01-13 David Goldberg Imaging Method For Determining Meat Tenderness
US8472675B2 (en) * 2008-05-05 2013-06-25 Biotronics, Inc. Systems, methods and devices for use in filter-based assessment of carcass grading
US8447075B2 (en) * 2008-05-05 2013-05-21 Biotronics, Inc. Systems, methods and devices for using ultrasonic probe pressure information in assessing muscle tissue quality
US8494226B2 (en) * 2008-05-05 2013-07-23 Biotronics, Inc. Systems, methods and devices for use in assessing carcass grading
US8135179B2 (en) * 2008-05-05 2012-03-13 Biotronics, Inc. Systems, methods and devices for use in assessing fat and muscle depth
MX2008013791A (es) * 2008-10-16 2010-05-17 Dino Alejandro Pardo Guzman Dispositivo móvil y método de clasificación universal interactivo para carnes por visión artificial.
AU2010203357B2 (en) 2009-01-10 2014-07-10 Carne Tender, Llc System and method for analyzing properties of meat using multispectral imaging
CN102405394A (zh) 2009-02-27 2012-04-04 体表翻译有限公司 使用三维表示估计物理参数
US20110128373A1 (en) * 2009-11-28 2011-06-02 Tenera Technology, Llc Determining Meat Tenderness
WO2011100343A2 (en) * 2010-02-09 2011-08-18 Dartmouth College System and method for collection and use of magnetic resonance data and microwave data to identify boundaries of interest
WO2012149654A1 (en) * 2011-05-02 2012-11-08 Agridigit Inc Evaluation of animal products based on customized models
DK177704B1 (da) * 2012-11-22 2014-03-24 Attec Danmark As Fremgangsmåde og middel til kontrol af og mulighed for fjernelse af fremmedlegemer i fødevarer
US9699447B2 (en) 2012-11-26 2017-07-04 Frito-Lay North America, Inc. Calibration of a dynamic digital imaging system for detecting defects in production stream
DE102013008003B4 (de) * 2013-05-08 2015-03-19 Freshdetect Gmbh Messgerät zum Messen eines Oberflächenbelags auf einem Messobjekt, insbesondere auf einem Lebensmittel, und dessen Verwendung
ES2477840B1 (es) * 2014-01-20 2015-02-27 Lenz Instruments S.L. Procedimiento para determinar parámetros de calidad en productos cárnicos e instalación correspondiente
US20160356704A1 (en) * 2015-06-07 2016-12-08 Purdue Research Foundation Nondestructive meat tenderness assessment
AU2017229690B2 (en) 2016-03-08 2021-12-16 Enspectra Health, Inc. Non-invasive detection of skin disease
WO2018201082A1 (en) 2017-04-28 2018-11-01 Zebra Medical Technologies, Inc. Systems and methods for imaging and measurement of sarcomeres
US10682018B2 (en) * 2017-09-02 2020-06-16 Anas Alfarra Automated food preparation and dispensing
US11061008B2 (en) * 2018-06-08 2021-07-13 Birko Corporation Artificial animal protein cleaning diagnostic system
NL2021647B1 (nl) * 2018-09-17 2020-05-06 Vitelco B V Computer vision systeem voor de classificatie van bevleesdheid en vetheid van een karkas
US11375739B2 (en) 2018-10-10 2022-07-05 MP Equipment, LLC Method of removing tissue from food product
US10863724B2 (en) * 2018-12-11 2020-12-15 Animal Health Analytics, Inc System and method for tracking and scoring animal health and meat quality
CA3124728A1 (en) * 2019-02-06 2020-08-13 Marel Salmon A/S Imaging based portion cutting
WO2020210203A1 (en) 2019-04-08 2020-10-15 Provisur Technologies, Inc. Apparatus and method for cutting meat products into blocks of meat
US11436716B2 (en) * 2019-04-19 2022-09-06 Canon Kabushiki Kaisha Electronic apparatus, analysis system and control method of electronic apparatus
JP7271286B2 (ja) * 2019-04-19 2023-05-11 キヤノン株式会社 電子機器およびその制御方法
JP7125802B1 (ja) * 2021-06-15 2022-08-25 有限会社 ワーコム農業研究所 牛肉品質判定装置
US11803958B1 (en) 2021-10-21 2023-10-31 Triumph Foods Llc Systems and methods for determining muscle fascicle fracturing
CN115620283B (zh) * 2022-11-17 2023-04-28 武汉理工大学 基于计算机视觉的猪肉大理石纹表型数据测量方法及装置
WO2024132421A1 (en) * 2022-12-23 2024-06-27 Gea Food Solutions Bakel B.V. Food processing line and method for processing foodstuff

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2728717C2 (de) * 1977-06-25 1983-11-10 Pfister Gmbh, 8900 Augsburg Verfahren und Vorrichtung zur berührungsfreien Bestimmung von Qualitätsmerkmalen eines Prüfobjektes der Fleischwaren-Kategorie, insbesondere eines Schlachttierkörpers oder Teilen davon
DE3047490A1 (de) * 1980-12-17 1982-10-21 Pfister Gmbh, 8900 Augsburg Verfahren zur beruehrungsfreien bestimmung von qualitaetsmerkmalen eines pruefobjektes der fleischwaren-kategorie
GB8604751D0 (en) * 1986-02-26 1986-04-03 Analytical Instr Ltd Colour analyser
CA2133825C (en) * 1992-04-13 2002-12-31 Alan Benn Image analysis for meat
WO1994000997A1 (en) 1992-07-03 1994-01-20 Paul Bernard David Newman A quality control and grading system for meat
GB9215866D0 (en) 1992-07-25 1992-09-09 Aew Int Ltd Measurement device
US5339815A (en) * 1992-12-22 1994-08-23 Cornell Research Foundation, Inc. Methods and apparatus for analyzing an ultrasonic image of an animal or carcass
US5398290A (en) 1993-05-03 1995-03-14 Kansas State University Research Foundation System for measurement of intramuscular fat in cattle
US5960105A (en) * 1993-05-03 1999-09-28 Kansas State University Research Foundation Measurement of intramuscular fat in cattle
EP0692090A1 (de) * 1994-02-01 1996-01-17 Tulip International A/S System, vorrichtung und methode zur on-line bestimmung der qualitätseigenschaften von fleisch und beleuchtungsanordnung für fleischteile
US5474085A (en) 1994-02-24 1995-12-12 University Of Prince Edward Island Remote thermographic sensing of livestock
AUPN660195A0 (en) 1995-11-16 1995-12-07 Life Resources Systems Pty Ltd Novel apparatus and method for determining meat characteristics
AU722769B2 (en) * 1996-08-23 2000-08-10 Her Majesty The Queen In Right Of Canada As Represented By The Department Of Agriculture And Agri-Food Canada Method and apparatus for using image analysis to determine meat and carcass characteristics

Also Published As

Publication number Publication date
BR9908065A (pt) 2001-11-13
CA2322037A1 (en) 1999-08-26
ATE246805T1 (de) 2003-08-15
WO1999042823A1 (en) 1999-08-26
US6198834B1 (en) 2001-03-06
AU2771799A (en) 1999-09-06
DE69910182T2 (de) 2004-06-03
EP1060391A1 (de) 2000-12-20
DE69910182D1 (de) 2003-09-11
AU755764B2 (en) 2002-12-19
CA2322037C (en) 2006-04-25

Similar Documents

Publication Publication Date Title
EP1060391B1 (de) Fleischfarbbilderzeugungssystem zum voraussagen des geschmackes und ertrages
Craigie et al. A review of the development and use of video image analysis (VIA) for beef carcass evaluation as an alternative to the current EUROP system and other subjective systems
Qiao et al. Pork quality and marbling level assessment using a hyperspectral imaging system
CA2133825C (en) Image analysis for meat
AU605001B2 (en) A method and apparatus for the determination of quality properties of individual cattle carcasses
Stewart et al. Objective grading of eye muscle area, intramuscular fat and marbling in Australian beef and lamb
Liu et al. Pork carcass evaluation with an automated and computerized ultrasonic system
Lohumi et al. Nondestructive estimation of lean meat yield of South Korean pig carcasses using machine vision technique
Kongsro et al. Prediction of fat, muscle and value in Norwegian lamb carcasses using EUROP classification, carcass shape and length measurements, visible light reflectance and computer tomography (CT)
Horgan et al. Automatic assessment of sheep carcasses by image analysis
WO1991014180A1 (en) Evaluating carcasses by image analysis and object definition
CA2363089A1 (en) Meat imaging system for palatability and yield prediction
Chandraratne et al. Prediction of lamb carcass grades using features extracted from lamb chop images
Jia et al. Prediction of lean and fat composition in swine carcasses from ham area measurements with image analysis
CA2541866A1 (en) Apparatus for meat palatability prediction
AU2004200865A1 (en) Meat imaging system for palatability and yield prediction
Teira et al. Digital-image analysis to predict weight and yields of boneless subprimal beef cuts
MXPA00008218A (en) Meat color imaging system for palatability and yield prediction
Carnier et al. Computer image analysis for measuring lean and fatty areas in cross-sectioned dry-cured hams
Lohumi et al. Erratum to: Nondestructive estimation of lean meat yield of south Korean pig carcasses using machine vision technique
Hopkins Reliability of three sites for measuring fat depth on the beef carcass
WO2024182367A1 (en) A vision-based quality control and audit system and method of auditing, for carcass processing facility
Jeyamkondan et al. Predicting beef tenderness with computer vision
CA2466289A1 (en) Method and apparatus for using image analysis to determine meat and carcass characteristics
Jeyamkondan et al. Segmentation of longissimus dorsi for beef quality grading using computer vision

Legal Events

PUAI  Public reference made under article 153(3) EPC to a published international application that has entered the European phase (original code: 0009012)
17P   Request for examination filed (effective date: 20000823)
AK    Designated contracting states (kind code of ref document: A1): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE
17Q   First examination report despatched (effective date: 20010307)
GRAH  Despatch of communication of intention to grant a patent (original code: EPIDOS IGRA)
GRAA  (expected) grant (original code: 0009210)
AK    Designated contracting states: AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE
PG25  Lapsed in contracting states NL, LI, FI, CY, CH, BE, AT [announced via postgrant information from national office to EPO]: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit (effective date: 20030806)
REG   Reference to a national code: GB, legal event code FG4D
REG   Reference to a national code: CH, legal event code EP
REG   Reference to a national code: IE, legal event code FG4D
REF   Corresponds to DE ref document number 69910182 (date of ref document: 20030911, kind code: P)
PG25  Lapsed in contracting states SE, GR, DK: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit (effective date: 20031106)
PG25  Lapsed in contracting state ES: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit (effective date: 20031117)
NLV1  NL: lapsed or annulled due to failure to fulfill the requirements of art. 29p and 29m of the patents act
PG25  Lapsed in contracting state PT: lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit (effective date: 20040106)
REG   Reference to a national code: CH, legal event code PL
PG25  Lapsed in contracting state LU: lapse because of non-payment of due fees (effective date: 20040218)
PG25  Lapsed in contracting state MC: lapse because of non-payment of due fees (effective date: 20040228)
ET    FR: translation filed
PLBE  No opposition filed within time limit (original code: 0009261)
STAA  Status of the EP patent: no opposition filed within time limit
26N   No opposition filed (effective date: 20040507)
PGFP  Annual fee paid to national office: IT, payment date 20060228, year of fee payment 8
PGFP  Annual fees paid to national offices: IE, 20080131 (year 10); GB, 20080124 (year 10); DE, 20080229 (year 10)
PGFP  Annual fee paid to national office: FR, payment date 20080123, year of fee payment 10
PG25  Lapsed in contracting state IT: lapse because of non-payment of due fees (effective date: 20070218)
GBPC  GB: European patent ceased through non-payment of renewal fee

Effective date: 20090218

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20091030

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20090218

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20090901

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20090218

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20090302