GB2446822A - Quality control of meat products using optical imaging - Google Patents


Info

Publication number
GB2446822A
GB2446822A (application GB0703485A)
Authority
GB
United Kingdom
Prior art keywords
light
image
imaging device
source
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0703485A
Other versions
GB0703485D0 (en)
Inventor
Gareth Jones
Tony Xu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Enfis Ltd
Original Assignee
Enfis Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Enfis Ltd filed Critical Enfis Ltd
Priority to GB0703485A priority Critical patent/GB2446822A/en
Publication of GB0703485D0 publication Critical patent/GB0703485D0/en
Priority to PCT/GB2008/000600 priority patent/WO2008102143A1/en
Publication of GB2446822A publication Critical patent/GB2446822A/en
Withdrawn legal-status Critical Current

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 - Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/21 - Polarisation-affecting properties
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/8806 - Specially adapted optical and illumination features
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/89 - Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
    • G01N21/8901 - Optical details; Scanning details
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 - Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/02 - Food
    • G01N33/12 - Meat; fish

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Food Science & Technology (AREA)
  • Medicinal Chemistry (AREA)
  • Textile Engineering (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

Detecting foreign bodies such as bone in meat products and the like using an optical imaging technique. Objects such as chicken fillets 6a, 6b, 6c are moved along conveyor 4. An imaging device 10 is able to take a first image of the objects backlit by a light source 8, at a first wavelength. Second light sources 22, 24 are arranged so that simultaneously imaging device 10 is able to take a second image of the objects based on light reflected from the surface of the objects at a second wavelength. Light source 8 may be an LED array emitting red light and light sources 22, 24 may be LED arrays emitting green light. Polarization filters 16, 34, 36 may be placed in front of the light sources. A polarization filter 20 may be placed between the conveyor 4 and imaging device 10 having an angle of polarization set perpendicular to that of filters 16, 34, 36. A processor 11 may be adapted to identify dark regions in the two images.

Description

Quality control of meat products and the like

This invention relates to a quality control system and method for food products and the like. For example, one embodiment of the invention provides a method and a device for sensing foreign bodies and the like in products, such as food products.
The preparation of meat products such as chicken breast fillets can be highly automated. One area that still typically relies on extensive use of human operators is the checking of the meat products. Major issues here include screening for discoloured meat and detecting the presence of bone in the meat product. Another area typically requiring human screening is the detection of bone in the fish filleting process.
The presence of bone and bone fragments in some prepared meat products must be kept to an absolute minimum.
Discoloration of meat products can be caused in a number of ways, for example by blood spotting or bruising. Even in circumstances where this does not affect the quality of the meat product, discoloration may result in a product that is unattractive to the consumer, thereby reducing the value of that product.
As noted above, checking for defects such as bone fragments and discoloration is typically a labour-intensive process.
This is not only expensive, but can also be an error-prone process. It would be advantageous if the checking processes could be automated at least to some degree.
The devices and methods of the present invention seek to address at least some of the problems associated with the prior art systems and/or to provide alternatives to the
devices and methods of the prior art.
The present invention provides an apparatus comprising: a first light source arranged to provide light having a first wavelength; a second light source arranged to provide light having a second wavelength different to said first wavelength; an imaging device able to distinguish between light having said first wavelength and light having said second wavelength; and a positioning means, such as a conveyor, for positioning an object under test, wherein: the first light source is positioned relative to the imaging device and the positioning means such that the imaging device is able to take images of the object under test that is backlit by said first source of light; said second light source is positioned relative to the imaging device and the positioning means such that the imaging device is able to take images of the object based on light reflected from the surface of said object; and said imaging device is adapted to generate simultaneously a first image resulting from light provided by said first light source and a second image resulting from light provided by said second light source.
By generating said first and second images at the same time, it is possible to make use of data from both images without having to conduct the kind of complicated image processing steps that result from dealing with images of a moving object taken at different times.
The first image taken in accordance with the present invention may be used to identify the position of one or more regions including a foreign body. The said foreign body may be bone, or bone fragments. The second image taken in accordance with the present invention may be used to identify areas of blood spotting or similar problems that affect the reflectivity of the surface of the object under test.
A processing means adapted to identify dark regions in said first and second images may also be provided. The said dark regions may be identified using a thresholding algorithm which sets a light threshold, below which a portion of an image is deemed to be dark and above which the portion of the image is deemed to be light. In one form of the invention, the thresholding process is conducted on individual pixels of an image and blob analysis is used to identify dark regions within the image.
The images may need to be digitised prior to carrying out the thresholding and blob analysis steps.
The processing means may be adapted to identify dark regions common to said first and second images and/or to identify dark regions in said first image that are not present in said second image and/or to identify dark regions in said second image that are not present in said first image. In one form of the invention, a dark region in said first image that does not have a corresponding dark region in said second image is indicative of the presence of a foreign body that does not permit light to pass therethrough.
In one form of the invention, the processing means is adapted to generate a third image indicating the position of dark regions of said first image that do not correspond with dark regions of said second image. Similarly, the processing means may be adapted to generate an image indicating the position of dark regions that appear in both said first and second images or dark regions that appear only in said second image. In a similar manner, the processing means may be adapted to generate an image indicating the position of bright regions that appear in one or other of the first and second images or that appear in both images.
In one form of the invention, the first image is generated based on the output of a first set of sensor elements of said imaging device and said second image is generated based on the outputs of a second set of sensor elements of said imaging device. Said first and second sets of sensor elements may be red and green CCD sensor elements respectively.
The imaging device may, for example, be a colour camera.
In one form of the invention, the processing means is adapted to identify the planar dimensions of the object under test and to extract image data of that object. In one embodiment, this is achieved by taking an image of the object under test using a blue CCD sensor element.
The light output by said first source of light may be polarized by a first polarizing filter before reaching said object under test and polarized by a second polarizing filter after leaving said object. The first and second polarizing filters may be arranged to have polarization angles substantially perpendicular to one another. In one form of the invention, the object under test perturbs the polarization of said light in a generally random manner.
The object under test receives linearly polarized light from said first light source. Any light that does not pass through the said object is substantially blocked by said second polarizing filter but most of the light that passes through the said object also passes through the second polarizing filter.
The light output by said second source of light may be polarized by a third polarizing filter before reaching said object and polarized by a fourth polarizing filter after being reflected from said object. The third and fourth polarizing filters may be arranged to have polarization angles substantially perpendicular to one another.
The said second and fourth polarizing filters may be the same device.
The said light sources may comprise a plurality of light sources. For example, one or more of the light sources may comprise an LED array, although other light sources, such as lamps, are possible. A diffuser may be provided for scattering; alternatively, a collimating lens may be provided.
In one form of the invention, the light provided by said first light source has a wavelength in the range 600nm to 1000nm. Said light more preferably has a wavelength in the range 600nm to 700nm and still more preferably in the range 630nm to 660nm. The wavelength may be chosen so that it is relatively highly transmissive through the object under test. Of course, this wavelength may depend on the nature of that object.
In one form of the invention, the light provided by said second light source has a wavelength in the range of 400nm to 600nm, more preferably 500nm to 600nm and still more preferably around 540nm to 570nm. In one form of the invention, light having a wavelength of 540nm was found to give good contrast between the flesh of a meat product and blood at or near the surface of that product.
In one form of the invention, the object under test is a food product, such as a meat product. By way of example, the said food product could be a chicken breast fillet or a fish product. In other forms of the invention, the object under test is a non-food product, such as a person's tooth.
Of course, many other applications of the invention are possible.
The present invention also provides a method comprising the steps of: backlighting an object using a first source of light; lighting said object using a second source of light; and using an imaging device to generate a first image of said object as backlit by said first source of light and a second image of said light as it is reflected from said object, the first and second images being generated at the same time, wherein said first source of light has a wavelength different to said second source of light.
The present invention further provides a method of detecting bone in a meat product, the method comprising the steps of: backlighting the meat product using a first source of light; lighting said meat product using a second source of light; and using an imaging device to generate a first image of said meat product as backlit by said first source of light and a second image of said meat product as it is reflected from said object, the first and second images being generated at the same time, wherein said first source of light has a wavelength different to said second source of light.
Processing means may be provided to identify dark regions in the first image of said meat product which are not present in said second image of said meat product. The said dark regions may be potentially indicative of bone.
The method may comprise the step of generating a list of potential bone sites, wherein said list comprises the dark regions identified in said first image of said meat product which are not present in said second image.
The method may comprise the step of excluding areas from said list if the contrast between said area of said first image and the immediate vicinity of said area is below a threshold level. As a result of this process, a PASS output may be provided in the event that no areas of bone are detected and a FAIL output may be provided in the event that areas of bone are detected.
The method may include the step of identifying areas of said second image data that are brighter than a predetermined threshold, which may be indicative of the presence of shards of bone.
A device and method in accordance with the invention will now be described, by way of example only, with reference to the accompanying schematic drawings in which: Fig. 1 shows an apparatus in accordance with an embodiment of the present invention; Fig. 2 is a plan view of part of a conveyor used in the present invention; Fig. 3 is a plan view of part of a conveyor used in accordance with an aspect of the present invention; Fig. 4 shows an exemplary first image generated during the use of the apparatus of Figure 1; Fig. 5 shows an exemplary second image generated during the use of the apparatus of Figure 1; Fig. 6 shows an exemplary third image generated during the use of the apparatus of Figure 1; Fig. 7 is an exemplary image generated by an embodiment of the present invention; and Fig. 8 is a flow chart demonstrating the functionality of the apparatus of Figure 1.
Figure 1 is a schematic representation of an apparatus, indicated generally by the reference numeral 2, in accordance with an embodiment of the present invention.
The apparatus 2 includes a conveyor 4 on which chicken breast fillets 6a, 6b and 6c are moved. A first LED array 8 is located below the conveyor 4. Second and third LED arrays 22 and 24 and an imaging device 10 are located above the conveyor.
As shown in Figure 1, the first LED array 8 comprises an LED array 18, a diffuser 14 and a polarizing filter 16. A second polarizing filter 20 is provided between the conveyor 4 and the imaging device 10; the second polarizing filter 20 has an angle of polarization set perpendicular to that of the polarizing filter 16. The diffuser 14 is provided to diffuse the light from the LEDs in the array 8 in order to remove the images of the individual LEDs in the image produced by the imaging device 10. The polarizing filter 16 passes light having a particular polarization.
In this way, diffuse, linear polarized light is directed towards the conveyor 4.
Chicken breast meat is a diffuse scattering medium that changes the polarization of light that passes through it in a generally random manner. Accordingly, light that has not passed through the chicken breast fillet will retain the linear polarization, but light that has passed through the chicken will not. Thus, by providing a second polarizing filter 20 as part of the imaging device 10 that has an angle of polarization set perpendicular to that of the polarizing filter 16, linearly polarized light that does not pass through the chicken breast fillet will be substantially attenuated by the filter 20, but light passing through the chicken will not generally be so attenuated. In this way, the majority of the image formed by the imaging device 10 is derived from light that has passed through the fillet 6b. Thus, the problem of glare caused by light that does not pass through the chicken breast fillet being brighter than light that does pass through the chicken breast fillet is significantly reduced.
As shown in Figure 1, the LED array 22 includes a diffuser and a polarizing filter 34 and the LED array 24 includes a diffuser 32 and a polarizing filter 36. The polarizing filters 34 and 36 have angles of polarization set perpendicular to that of the polarizing filter 20. The polarization of light that is reflected from the surface of the fillet 6b is generally unchanged; accordingly, light that is so reflected will tend to be blocked by the polarizing filter 20, whereas light that has passed through at least part of the scattering meat product passes through the filter 20. This is useful since meat products such as chicken breast fillets are generally highly reflective.
Information relating to discoloration of the meat product is obtained from light that has passed a short distance into the meat product and then been reflected. If the substantial amount of light reflected from the surface of the meat product is allowed to reach the imaging device 10, the information from the light that has passed a short distance into the meat product (i.e. the light carrying the information of interest) will be swamped by the light reflected from the surface.
Figure 2 is a plan view of part of the conveyor 4. As shown in Figure 2, the conveyor 4 comprises a chain arrangement of an opaque material 26. A substantial number of holes, such as hole 28, allow light to pass through the conveyor 4. The opaque material 26 may, for example, be polypropylene; many other suitable materials would be apparent to the person skilled in the art.
In the use of the conveyor 4, light is blocked by the opaque material 26 of the conveyor but is allowed to pass through the holes 28 in the chain arrangement. In this way, a significant amount of light can pass through the conveyor. The use of an opaque material for the conveyor is preferred to the use of a flexible transparent material, since transparent materials generally change the polarization of light in an uncontrollable manner. Thus, using a transparent material would render the glare-reduction technique described above ineffective. It should be noted that the scattering effect of the chicken breast means that the image of the chain arrangement of the conveyor is not generally visible in the image generated by the imaging device 10. In tests, it was found that the chain arrangement was no longer visible provided that the meat has a thickness of at least 5mm.
The chain arrangement of the conveyor shown in Figure 2 is not essential. For example, the conveyor could take the form of a number of rollers made from an opaque material, with the rollers being spaced to provide gaps through which light can pass.
In the use of the apparatus 2 to detect foreign bodies in a meat product, such as fillets 6a, 6b and 6c, the fillets are transferred along the conveyor 4 (from left to right in the example of Figure 1). In the example of Figure 1, the fillet 6b is positioned between the first LED array 8 and the imaging device 10. The first LED array 8 is used to backlight the fillet 6b and, at the same time, an image of the fillet is taken by the imaging device 10.
The size of a foreign body, such as a bone, that can be detected in such a system on a moving conveyor belt will depend on the quality of the image taken. To obtain a very sharp image, either the light sources (in this case LEDs) are illuminated only over a very short period of time, in the manner of a camera flash, or the capture time of the camera is made very short, so that the image is not blurred. One exemplary implementation of the present invention has a conveyor speed of about 0.7 to 0.9 metres/second and provides a throughput of about 60 samples per minute.
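The flash/exposure trade-off described above can be quantified: the smear of a moving object over one exposure is the conveyor speed multiplied by the exposure time. The sketch below illustrates this with the conveyor speed given in the text; the optical resolution figure (pixels per metre) is an assumed example, not a value taken from this document.

```python
# Illustrative motion-blur estimate for the flash/exposure trade-off
# described above. The 0.8 m/s conveyor speed is from the text; the
# 2000 pixels/metre resolution (0.5 mm per pixel) is an assumption.

def motion_blur_pixels(speed_m_per_s, exposure_s, pixels_per_metre):
    """Smear, in pixels, of a moving object during one exposure."""
    return speed_m_per_s * exposure_s * pixels_per_metre

# A 100 microsecond flash keeps the smear well under one pixel:
blur = motion_blur_pixels(0.8, 100e-6, 2000)
print(round(blur, 6))  # 0.16
```

At a 10 ms exposure the same calculation gives a 16-pixel smear, which illustrates why a short flash or short capture time is needed at this conveyor speed.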
Figure 3 is a schematic plan view of the conveyor 4 in which chicken breast fillets 6a, 6b and 6c are visible.
Also shown in Figure 3 are an emitter 40a and a receiver 40b that form an emitter-receiver pair. The emitter 40a and receiver 40b are located opposite one another on different sides of the conveyor 4. The emitter 40a and receiver 40b are arranged so that a signal output by the emitter 40a is received by the receiver 40b in the absence of any object blocking the path of that signal. Thus, in the exemplary situation of Figure 3 in which a chicken breast fillet 6b is located between the emitter 40a and the receiver 40b, no signal is received at the receiver. In this way, the presence of a fillet on the conveyor 4 can be detected.
By using light that has a relatively high transmission through the chicken meat, the presence of foreign bodies, such as bone, that block the light can be readily detected from the image taken by the imaging device 10. It has been found that the transmission of light through chicken meat is highest when red light is used. For this reason, red (640nm) or near-infra-red light is used in the apparatus of Figure 1.
The presence of a foreign body is indicated by the presence of a dark portion in the image generated by the imaging device 10. In the use of the apparatus 2, the output of the imaging device 10 is passed to a processor 11, which generates a digital image from the output of the imaging device (if that output is not already digital) and performs a thresholding step.
The thresholding step determines which parts of the digitised image are deemed to be dark, and which are deemed to be light. This is achieved by setting a light threshold, below which the image is deemed to be dark and above which the image is deemed to be light. This step is likely to require simple on-site calibration. Some filtering may also be required at this stage to remove noise in the data.
Once the dark areas of the image have been determined, the processor 11 performs blob analysis on the digital image.
A blob in this context is simply a set of connected image pixels that are deemed to be dark. By performing blob analysis in the present invention, different shapes of dark regions can be determined.
Blob analysis is a well established technique that is well known to persons skilled in the art. Accordingly, further discussion of the technique is not required here.
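The thresholding and blob-analysis steps described above might be sketched as follows. The threshold value, the synthetic image and the choice of 4-connectivity are illustrative assumptions, not details prescribed by this document.

```python
# Minimal sketch of the thresholding and blob-analysis steps performed
# by processor 11. The threshold value and 4-connectivity are assumed
# for illustration; a production system would calibrate these on site.

def find_dark_blobs(image, threshold):
    """Return a list of blobs, each a set of (row, col) pixels whose
    level is below the light threshold (i.e. pixels deemed dark)."""
    rows, cols = len(image), len(image[0])
    dark = {(r, c) for r in range(rows) for c in range(cols)
            if image[r][c] < threshold}
    blobs, seen = [], set()
    for start in dark:
        if start in seen:
            continue
        # Flood-fill one 4-connected component of dark pixels.
        blob, stack = set(), [start]
        while stack:
            r, c = stack.pop()
            if (r, c) in seen or (r, c) not in dark:
                continue
            seen.add((r, c))
            blob.add((r, c))
            stack.extend([(r-1, c), (r+1, c), (r, c-1), (r, c+1)])
        blobs.append(blob)
    return blobs

# Synthetic 5x5 image: two separated dark regions on a bright background.
img = [[200, 200, 200, 200, 200],
       [200,  30,  30, 200, 200],
       [200,  30, 200, 200, 200],
       [200, 200, 200,  40, 200],
       [200, 200, 200,  40, 200]]
blobs = find_dark_blobs(img, threshold=100)
print(len(blobs))  # 2
```

Each blob returned here corresponds to one dark region of the kind shown schematically as blobs 100, 102 and 104 in Figure 4.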
Figure 4 shows, schematically, an exemplary result of the blob analysis process. The image of Figure 4 includes three blobs, 100, 102 and 104 indicative of dark areas as detected by the imaging device 10. These dark areas may, for example, indicate the position of bone fragments in the fillet 6b.
Light from the second and third LED arrays 22 and 24 can be used in the detection of discoloured meat, for example meat that has been discoloured as a result of blood spotting or bruising or other types of damage or contamination. The discolouration detection method relies on the fact that different substances reflect light differently. For example, chicken flesh, blood, bone and fat all reflect light differently. It has been discovered that green light (540nm) gives good contrast between blood and normal flesh.
In the use of the system of Figure 1 to detect discolouration of meat products, green light is flashed at the meat product by the second and third LED arrays 22 and 24. The reflected light is captured by the imaging device and the processor 11 performs similar digitising, thresholding and blob analysis steps to those discussed above.
Figure 5 shows, schematically, an exemplary result of the blob analysis process. The image of Figure 5 includes two blobs, 100' and 102' indicative of dark areas as detected by the imaging device 10. These dark areas may, for example, indicate the position of blood spotting in the fillet 6b.
The inventor has discovered that images taken by said imaging device 10 of said fillet 6b when backlit by said first LED array 8 are not able to reliably distinguish between bone fragments or the like and blood spotting or the like, but that images taken by said imaging device 10 of said fillet 6b when lit by said second and third LED arrays 22 and 24 are able to detect the presence of blood spotting, but are not able to detect the presence of bone fragments.
The inventor has realised that it is possible to combine the data resulting from the images when lit by said first LED array (as shown in Figure 4) and the images when lit by said second and third LED arrays (as shown in Figure 5) in order to distinguish between dark portions detected by the arrangement of Figure 1 that result from the presence of bone or bone fragments, and dark portions that result from the presence of blood spotting or the like.
Assume that the blobs 100, 100', 102 and 102' are caused by blood spotting and that the blob 104 is caused by the presence of bone. By removing the blobs in the image of Figure 4 that are also present in Figure 5, an image can be arrived at that shows only the position of the bone fragment. Such an image is shown in Figure 6.
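The combination of the two blob images might be sketched as follows, using the blobs of Figures 4 and 5 as an example. Judging blob correspondence by simple pixel overlap is an assumption, since the matching rule is not prescribed here.

```python
# Sketch of the image combination described above: dark blobs that
# appear in the backlit (red) image but have no counterpart in the
# reflected (green) image are kept as candidate bone sites. Matching
# blobs by pixel overlap is an assumed rule for illustration.

def isolate_bone_blobs(red_blobs, green_blobs):
    """Keep red-image blobs that overlap no green-image blob."""
    green_pixels = set().union(*green_blobs) if green_blobs else set()
    return [b for b in red_blobs if not (b & green_pixels)]

# Blobs as sets of (row, col) pixels, mirroring Figures 4 and 5:
blob_100 = {(1, 1), (1, 2)}    # blood spot, seen in both images
blob_102 = {(5, 5)}            # blood spot, seen in both images
blob_104 = {(8, 2), (8, 3)}    # bone, seen only in the red image
red_image_blobs = [blob_100, blob_102, blob_104]
green_image_blobs = [blob_100, blob_102]
bone = isolate_bone_blobs(red_image_blobs, green_image_blobs)
print(bone == [blob_104])  # True
```

Only blob 104 survives the subtraction, matching the bone-only image of Figure 6.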
The exemplary images of Figures 4 to 6 are, of course, highly schematic. By way of example, Figure 7 shows an exemplary output of a red CCD sensor of an implementation of the present invention, showing a chicken breast fillet on a conveyor. A piece of bone, indicated generally by the reference numeral 105 is visible in the fillet.
The manipulation of the images of Figures 4 and 5 described above is relatively straightforward in principle; however, it is difficult to achieve at high speed. Since high product throughput is a requirement in meat processing plants, it is necessary to provide an arrangement that is able to combine the data from the two measurements quickly.
In particular, the image processing requirements needed in order to combine the data output by said imaging device 10 as a result of light from the first LED array and as a result of light from the second and third LED arrays is not trivial. As noted above, in one embodiment of the invention, the system is required to provide a throughput of 60 chicken pieces per minute.
The present invention addresses this problem by providing a colour imaging device 10 that is able to distinguish between red (or similar) light output by said first LED array and green (or similar) light output by said second and third LED arrays. For example, the colour imaging device 10 may include a set of red CCD sensor elements that are only sensitive to red light and a set of green CCD sensor elements that are only sensitive to green light. In this way, it is possible to simultaneously obtain separate images of the fillet 6b resulting from the lighting by the first LED array 8 and the lighting by the second and third LED arrays 22 and 24. One implementation of the present invention has made use of a 1392 x 1040 pixel BASLER A102kc camera provided by Basler AG, An der Strusbek 60-62, 22926 Ahrensburg, Germany, used in conjunction with a FUJINON CCTV lens (HF9HA-1B). Of course, the skilled person would be aware of many other devices that could be used in the implementation of the present invention.
Figure 8 is a flow chart showing an algorithm 50 for automating the detection of bone fragments and blood discoloration in chicken breast fillets. The algorithm could, of course, be readily adapted for other applications.
As shown in Figure 8, the algorithm 50 starts with the capture of an image by the imaging device 10 at step 52.
The image is separated into red, green and blue mono-colour images using sets of red, green and blue CCD sensor elements in step 54.
The blue image is used to extract an image of a chicken breast fillet from the overall image at step 56. This is achieved by using a pre-set global threshold on the blue image to extract a blob from the background data and filling in the area defined by the blob to define a mask that defines the entire chicken breast fillet.
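The mask extraction of step 56 might be sketched as follows. The global threshold value and the flood-fill approach to filling in the area defined by the blob are illustrative assumptions.

```python
# Sketch of the mask-extraction step: threshold the blue channel to
# separate the fillet from the background, then fill internal holes so
# the mask covers the whole fillet. The threshold value is an assumed
# illustration; the text says it is a pre-set global parameter.

def extract_mask(blue, threshold):
    """Binary mask of the fillet: thresholded blob with holes filled."""
    rows, cols = len(blue), len(blue[0])
    fg = [[blue[r][c] >= threshold for c in range(cols)]
          for r in range(rows)]
    # Flood-fill the background from the image border; any non-fillet
    # pixel not reached this way is an internal hole and gets filled.
    outside, stack = set(), []
    for r in range(rows):
        stack += [(r, 0), (r, cols - 1)]
    for c in range(cols):
        stack += [(0, c), (rows - 1, c)]
    while stack:
        r, c = stack.pop()
        if 0 <= r < rows and 0 <= c < cols and not fg[r][c] \
                and (r, c) not in outside:
            outside.add((r, c))
            stack += [(r-1, c), (r+1, c), (r, c-1), (r, c+1)]
    return [[(r, c) not in outside for c in range(cols)]
            for r in range(rows)]

# 5x5 blue image: a bright fillet ring around one dark internal pixel.
blue = [[0,  0,  0,  0, 0],
        [0, 90, 90, 90, 0],
        [0, 90, 10, 90, 0],
        [0, 90, 90, 90, 0],
        [0,  0,  0,  0, 0]]
mask = extract_mask(blue, 50)
print(mask[2][2])  # True
```

The dark internal pixel is filled in, so the resulting mask defines the entire fillet, not just its bright perimeter.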
It has been found that it is generally quicker to determine the outline of the chicken breast fillet from the blue image data than from either the green or red image data. It has also been found that it is quicker to use green image data than red image data. Accordingly, should blue image data not be available in any particular embodiment of this invention, it would be preferable to use the green image data for this task rather than the red image data. It should be noted that although the LED arrays 8, 22 and 24 output light having wavelengths centred in the red and green bands, it has been found that there is sufficient blue light output by those arrays to enable a useful blue image to be extracted.
Step 58 determines whether or not the processing at step 56 has identified an object to be analysed. If a chicken breast fillet is detected, the algorithm moves to step 60.
Otherwise, the algorithm moves to step 82.
At step 60, the mask obtained in step 56 is applied to the red and green images taken by the imaging device 10 so that the output of step 60 should include data regarding the object under test only, thereby removing image data of the conveyor 4. Removing the image data of the conveyor significantly simplifies the data processing requirements of the system.
At step 62, the red image is analysed to identify areas that could potentially be bone. In one implementation of the invention, this process is carried out as follows.
The masked red area is divided into several small windows, the window size being decided by pre-set parameters ROW_NUM and COL_NUM, which set the number of rows and columns of windows in the masked area respectively. Thus, the window widths and heights are as follows:

Window width = (masked area width) / COL_NUM
Window height = (masked area height) / ROW_NUM
The windows overlap in both the row and column directions, such that the total number of windows is given by: Number of windows = (2 x ROW_NUM - 1) x (2 x COL_NUM - 1). For each window, a local threshold is calculated as follows: Local threshold = (mean value of local window) x pre-set ratio. The threshold is applied to each window to identify each pixel having a level below the local threshold (referred to as a "foreground pixel"). Each foreground pixel represents a potential bone area.
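The window arithmetic above can be sketched as follows, assuming (as the window count implies) that adjacent windows overlap by half a window in each direction. All parameter values are hypothetical and the implementation is only one possible reading of the description.

```python
import numpy as np

def local_threshold_foreground(red: np.ndarray, row_num: int = 4,
                               col_num: int = 4, ratio: float = 0.8):
    """Overlapping-window local thresholding (illustrative sketch).

    Windows are (height / ROW_NUM) by (width / COL_NUM) and overlap by
    half a window in each direction, giving the stated
    (2 x ROW_NUM - 1) x (2 x COL_NUM - 1) window count.  `ratio` is a
    hypothetical pre-set ratio.
    """
    h, w = red.shape
    wh, ww = h // row_num, w // col_num
    foreground = np.zeros_like(red, dtype=bool)
    n_windows = 0
    for i in range(2 * row_num - 1):
        for j in range(2 * col_num - 1):
            r0, c0 = i * wh // 2, j * ww // 2
            win = red[r0:r0 + wh, c0:c0 + ww]
            local_threshold = win.mean() * ratio
            # Pixels darker than the local threshold are "foreground"
            # pixels, i.e. potential bone areas
            foreground[r0:r0 + wh, c0:c0 + ww] |= win < local_threshold
            n_windows += 1
    return foreground, n_windows
```

The local (rather than global) threshold makes the test robust to gradual brightness variation across the fillet, for example from uneven meat thickness.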
The foreground pixel locations are translated into a binary image representing the locations of all potential bone areas.
At step 64, it is determined whether or not step 62 has identified any areas that could potentially be bone. If potential bone areas have been detected, the algorithm passes to step 66; otherwise, the algorithm passes to step 80.
At step 66, the green image of each area that could potentially be bone is compared with a pre-set threshold level to determine whether or not that area is identified as a blood spot area. If the green image is below the threshold, the area is determined to be a blood area and is removed from the list of potential bone areas. Once all potential bone areas have been checked, the algorithm moves to step 68.
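Step 66 can be sketched as a simple filter over the candidate list. The threshold value is hypothetical, and representing each candidate area as a boolean mask is an implementation choice, not something the description prescribes.

```python
import numpy as np

def remove_blood_spots(candidates, green: np.ndarray,
                       blood_threshold: float = 40):
    """Discard candidate areas whose green-image level falls below a
    pre-set threshold; per the description, such areas are identified
    as blood spots rather than bone.  `blood_threshold` is a
    hypothetical value.

    `candidates` is a list of boolean masks, one per potential bone area.
    """
    return [m for m in candidates if green[m].mean() >= blood_threshold]
```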
Steps 68 and 70 form a loop. Step 68 asks if there are any remaining potential bone areas on the list. If there are, then the algorithm proceeds to step 70, which determines whether the area concerned is meat or noise. The steps 68 and 70 are repeated until all areas have been checked to determine whether they are meat or noise. Step 70 is carried out as follows.
The level of each potential bone area is compared with the mean background light level in the area immediately surrounding it. If the ratio of these two levels is below a pre-defined level, then it is determined that the difference is not caused by the presence of bone. For example, the lower light level could be caused by uneven thickness of the meat sample or by noise in the sample signal. This test therefore checks whether or not the contrast between the potential bone area and the surrounding area is sufficiently sharp to indicate the presence of bone. When all potential areas of bone have been analysed in this way, the algorithm moves to step 72.
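The meat-or-noise test of step 70 can be sketched as below. The description does not define the ratio precisely; here it is read as a contrast ratio between the surrounding background and the candidate area (an assumption), so that a sharp, high-contrast dark patch is retained as bone and a shallow dip is discarded. The threshold value and all names are hypothetical.

```python
import numpy as np

def is_bone_candidate(red: np.ndarray, area_mask: np.ndarray,
                      ring_mask: np.ndarray,
                      min_contrast: float = 0.35) -> bool:
    """Return True if the candidate area contrasts sharply enough with
    its immediate surroundings to indicate bone.

    `area_mask` selects the candidate pixels; `ring_mask` selects the
    immediately surrounding background pixels.
    """
    area_level = red[area_mask].mean()
    background = red[ring_mask].mean()
    # Fractional contrast: 0 means no difference from background,
    # values near 1 mean the area is much darker than its surroundings
    contrast = (background - area_level) / background
    return contrast >= min_contrast
```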
Step 72 determines whether or not any potential bone areas remain on the list. If so, the algorithm moves to step 74, which step provides a FAIL output, on the basis that an area of bone has been detected. If not, the algorithm moves to step 76.
Step 76 performs an algorithm for detecting bone shards or bone fragments. It has been found that, in some circumstances, bone shards or fragments can be present in a meat sample without being detected under red backlighting, yet are detectable as bright areas under green lighting. Thus, step 76 determines whether or not any areas of the green output exceed a threshold. If so, a compactness ratio of the area is compared with a pre-set threshold to determine whether or not a bone shard or fragment is present. If a bone shard or fragment is detected, the algorithm moves to step 74, which step provides a FAIL output; otherwise, the algorithm moves to step 80, which step provides a PASS output.
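A sketch of the shard check follows. The description does not define the "compactness ratio"; here it is taken, as one common definition, to be the fraction of a region's bounding box that the region fills, and both threshold values are hypothetical.

```python
import numpy as np
from scipy import ndimage

def detect_bright_shards(green: np.ndarray, bright_threshold: float = 200,
                         compactness_threshold: float = 0.5):
    """Find bright, compact regions in the green image that may be
    bone shards or fragments (illustrative sketch)."""
    # Label connected regions brighter than the threshold
    labels, count = ndimage.label(green > bright_threshold)
    shards = []
    for idx, sl in enumerate(ndimage.find_objects(labels), start=1):
        region = labels[sl] == idx
        # Compactness here: fraction of the bounding box the region fills
        compactness = region.sum() / region.size
        if compactness >= compactness_threshold:
            shards.append(sl)
    return shards
```

The compactness test distinguishes a solid bright fragment from diffuse bright streaks, such as specular glare on the meat surface.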
In some embodiments of the invention, the steps 76 and 78 are omitted so that, if no bone areas are detected at step 72, the algorithm moves directly to step 80.
From either step 74 or step 80, the algorithm moves to step 82. Step 82 simply waits for an interrupt signal that indicates that a further sample has been detected. When such an interrupt signal is received, the algorithm returns to step 58.
Although the invention has been described above in relation to chicken breast fillets, the invention is not so limited.
The present invention could be used with any meat product that has a sufficiently high level of light transmission.
Pork and fish are two examples for which the present invention is particularly well suited.
Furthermore, the invention is not limited to meat products.
Other food and non-food items could be checked in a similar way. For example, defects in many food items could be detected using one or more of the techniques described herein. Other examples include the detection of discolouration or bruising in fruit, the detection of bones in fish and the detection of defects in processed foods such as potato crisps. There are also a number of medical applications, such as imaging testicles, or the hand or wrist, as well as a number of veterinary applications. One particular example could be the imaging of blood flow in the hand as a means of determining blood circulation issues. There are also a number of potential dental applications, such as taking images of teeth, or taking images of the root of a tooth in the gum of a patient.
In each of the embodiments described above, light emitting diodes (LEDs) are used as the light sources. The use of LEDs is not essential: lamp-based or scanned laser light sources are alternatives. Nevertheless, there are a number of advantages associated with using LEDs. For example, LEDs have a narrow wavelength emission, which means that the desired wavelength can be reliably obtained. Further, LEDs can be quickly turned on and off, especially when compared with traditional lamp systems, and cover a large spatial area compared with laser systems. The fast switching speed enables sharp images to be obtained, thereby improving the accuracy of the system. The use of LEDs is efficient; this is advantageous since it reduces the heat output of the light sources.
A number of forms of the present invention are described herein. Several of those forms have a number of variants.
The skilled person will be aware that any of the variants may be applied to any of the forms of the invention.
Accordingly, the present invention is not limited to the specific forms of the invention described herein.

Claims (22)

  1. An apparatus comprising: a first light source arranged to provide light having a first wavelength; a second light source arranged to provide light having a second wavelength different to said first wavelength; an imaging device able to distinguish between light having said first wavelength and light having said second wavelength; and a positioning means for positioning an object under test, wherein: the first light source is positioned relative to the imaging device and the positioning means such that the imaging device is able to take images of the object under test that is backlit by said first source of light; said second light source is positioned relative to the imaging device and the positioning means such that the imaging device is able to take images of the object based on light reflected from the surface of said object; and said imaging device is adapted to generate simultaneously a first image resulting from light provided by said first light source and a second image resulting from light provided by said second light source.
  2. An apparatus as claimed in claim 1, further comprising processing means for processing data received by said imaging device.
  3. An apparatus as claimed in claim 2, wherein said processing means is adapted to identify dark regions in said first and second images.
  4. An apparatus as claimed in claim 3, wherein said processing means is further adapted to identify dark regions common to said first and second images.
  5. An apparatus as claimed in claim 3 or claim 4, wherein said processing means is further adapted to generate a third image indicating the position of dark regions of said first image that do not correspond with dark regions of said second image.
  6. An apparatus as claimed in any one of claims 2 to 5, wherein said processing means is adapted to identify the planar dimensions of said object under test and to extract image data of said object under test.
  7. An apparatus as claimed in any preceding claim, wherein said first image is generated based on the output of a first set of sensor elements of said imaging device and said second image is generated based on the outputs of a second set of sensor elements of said imaging device.
  8. An apparatus as claimed in any preceding claim, further comprising a first and a second polarizing filter, wherein: the first polarizing filter is positioned between said first light source and said positioning means such that, in use, light provided by said first light source is polarized before transmission of said light through said object under test; the second polarizing filter is positioned between said positioning means and said imaging device such that, in use, light provided by said first light source that passes through said object under test is polarized by said second polarizing filter before reaching said imaging device and light provided by said second light source that is reflected by said object under test is polarized by said second polarizing filter; and said first polarization filter has a polarization angle substantially perpendicular to that of said second polarization filter.
  9. An apparatus as claimed in any preceding claim, wherein light output by said first light source has a wavelength in the range 600nm to 1000nm.
  10. An apparatus as claimed in any preceding claim, wherein light output by said second light source has a wavelength in the range of 400nm to 600nm.
  11. An apparatus as claimed in any preceding claim, wherein said positioning means is a conveyor.
  12. A method comprising the steps of: backlighting an object using a first source of light; lighting said object using a second source of light; and using an imaging device to generate a first image of said object as backlit by said first source of light and a second image of said object as light is reflected from said object, the first and second images being generated at the same time, wherein said first source of light has a wavelength different to said second source of light.
  13. A method as claimed in claim 12, further comprising the step of identifying dark regions in said first and second images.
  14. A method as claimed in claim 12 or claim 13, further comprising the step of identifying dark regions in said first image that are not present in said second image.
  15. A method as claimed in any one of claims 12 to 14, further comprising the step of identifying dark regions in said second image that are not present in said first image.
  16. A method as claimed in any one of claims 12 to 15, further comprising the step of identifying the planar dimensions of said object and using said dimensions as a mask when generating said first and second images.
  17. A method as claimed in any one of claims 12 to 16, wherein said object is a meat product.
  18. A method of detecting bone in a meat product, the method comprising the steps of: backlighting the meat product using a first source of light; lighting said meat product using a second source of light; and using an imaging device to generate a first image of said meat product as backlit by said first source of light and a second image of said meat product as light is reflected from said meat product, the first and second images being generated at the same time, wherein said first source of light has a wavelength different to said second source of light.
  19. A method as claimed in claim 18, further comprising the step of identifying dark regions in said first image of said meat product which are not present in said second image of said meat product.
  20. A method as claimed in claim 18 or claim 19, further comprising the step of generating a list of potential bone sites, wherein said list comprises the dark regions identified in said first image of said meat product which are not present in said second image.
  21. A method as claimed in claim 20, further comprising the step of excluding areas from said list if the contrast between said area of said first image and the immediate vicinity of said area is below a threshold level.
  22. A method as claimed in any one of claims 18 to 21, further comprising the step of identifying areas of said second image data that are brighter than a predetermined threshold.
GB0703485A 2007-02-22 2007-02-23 Quality control of meat products using optical imaging Withdrawn GB2446822A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0703485A GB2446822A (en) 2007-02-23 2007-02-23 Quality control of meat products using optical imaging
PCT/GB2008/000600 WO2008102143A1 (en) 2007-02-22 2008-02-21 Quality control of meat products and the like

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0703485A GB2446822A (en) 2007-02-23 2007-02-23 Quality control of meat products using optical imaging

Publications (2)

Publication Number Publication Date
GB0703485D0 GB0703485D0 (en) 2007-04-04
GB2446822A true GB2446822A (en) 2008-08-27

Family

ID=37945573

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0703485A Withdrawn GB2446822A (en) 2007-02-22 2007-02-23 Quality control of meat products using optical imaging

Country Status (2)

Country Link
GB (1) GB2446822A (en)
WO (1) WO2008102143A1 (en)


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101866997A (en) * 2010-05-27 2010-10-20 大余众能光电科技有限公司 Method for improving pattern recognition capability of LED (Light-Emitting Diode) chip and system thereof
EP3198262A4 (en) * 2014-07-21 2018-07-25 7386819 Manitoba Ltd. Method and device for bone scan in meat
WO2017121713A1 (en) 2016-01-11 2017-07-20 Teknologisk Institut A method and device for scanning of objects using a combination of spectral ranges within vision, nir and x-rays
JP6814595B2 (en) * 2016-10-19 2021-01-20 株式会社前川製作所 Meat bone discrimination device and meat bone discrimination method
DK180343B1 (en) 2018-11-26 2021-01-15 Teknologisk Inst System and method for automatic removal of foreign objects from a food surface
JP7271286B2 (en) * 2019-04-19 2023-05-11 キヤノン株式会社 Electronic equipment and its control method
CN110632088A (en) * 2019-08-09 2019-12-31 广州超音速自动化科技股份有限公司 Battery edge sealing and gluing detection equipment and detection method
JP2022099535A (en) * 2020-12-23 2022-07-05 株式会社前川製作所 Object detection device, machine learning implementation device, object detection program, and machine learning implementation program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1003027A1 (en) * 1997-06-17 2000-05-24 Yuki Engineering System Co, Ltd. Device for checking sheet packaging
WO2001049043A1 (en) * 1999-12-27 2001-07-05 Og Technologies Inc Method of simultaneously applying multiple illumination schemes for simultaneous image acquisition in an imaging system
WO2006075164A1 (en) * 2005-01-12 2006-07-20 Enfis Limited Sensing in meat products and the like


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2938654A1 (en) * 2008-11-20 2010-05-21 Sedna METHOD AND DEVICE FOR CONTROLLING THE QUALITY OF FRESHNESS OF FISH.
EP2189789A1 (en) * 2008-11-20 2010-05-26 Sedna Method and device for checking the freshness of a fish
FR2985025A1 (en) * 2011-12-23 2013-06-28 Maf Agrobotic DEVICE AND METHOD FOR NON-DESTRUCTIVE DETECTION OF DEFECTS IN FRUIT AND VEGETABLES
WO2017118757A1 (en) 2016-01-08 2017-07-13 Teknologisk Institut A system and method for determining the presence and/or position of at least one bone in a meat piece
US10251406B2 (en) 2016-01-08 2019-04-09 Teknologist Institut Device for loosening bones from a meat piece such as ribs from a belly piece of slaughtered animal
CN109696440A (en) * 2017-10-20 2019-04-30 丰田自动车株式会社 Check device checks facility and check device fault confirmation method
CN109696440B (en) * 2017-10-20 2021-08-24 丰田自动车株式会社 Inspection device, inspection facility, and inspection device failure confirmation method
LU501123B1 (en) * 2021-12-29 2023-06-29 Analitica D O O Apparatus and method for detecting polymer objects and/or chemical additives in food products

Also Published As

Publication number Publication date
GB0703485D0 (en) 2007-04-04
WO2008102143A1 (en) 2008-08-28

Similar Documents

Publication Publication Date Title
GB2446822A (en) Quality control of meat products using optical imaging
US20080204733A1 (en) Sensing in Meat Products and the Like
US11830179B2 (en) Food inspection assisting system, food inspection assisting apparatus and computer program
WO2008016309A1 (en) Multi-modal machine-vision quality inspection of food products
JP5875186B2 (en) Agricultural product inspection device and agricultural product inspection method
JP2008541007A (en) Food foreign matter detection device
JP6203923B1 (en) Fruit and vegetable inspection equipment
KR102340173B1 (en) Contact lens inspection in a plastic shell
US6433293B1 (en) Method and device for detecting dirt as present on articles, for example eggs
WO2017029864A1 (en) Egg inspection device and egg differentiation system
JP2007178407A (en) Foreign matter intrusion inspection method for inspection object, and device used therefor
JP2008309678A (en) Contaminated egg inspection device
EP3531126B1 (en) Method and apparatus for the inspection of packaged fish products
JP2021193383A (en) Inspection device of egg
US10922810B2 (en) Automated visual inspection for visible particulate matter in empty flexible containers
JP2022000641A (en) Inspection device for eggs
JP6203922B1 (en) Fruit and vegetable inspection equipment
JP2019039758A (en) Egg inspection device
JP2011033612A (en) Agricultural product inspection device
JP7039700B2 (en) Inspection device with light watermark
JP2009168743A (en) Inspection method and inspection device
JP2015184071A (en) Agricultural product inspection device and method for inspecting agricultural product
JP2004333177A (en) Method and apparatus for discriminating object to be inspected
JP2018155540A (en) Light inspection device

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)