CA2207326A1 - Data recognition system - Google Patents

Data recognition system

Info

Publication number
CA2207326A1
Authority
CA
Canada
Prior art keywords
state
data
neural network
substance
histogram
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002207326A
Other languages
French (fr)
Inventor
Charles Tasman Wescott
Leonard George Chadborn Hamey
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Arnotts Biscuits Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Arnotts Biscuits Ltd filed Critical Arnotts Biscuits Ltd
Publication of CA2207326A1 publication Critical patent/CA2207326A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/02Food
    • G01N33/10Starch-containing substances, e.g. dough
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20048Transform domain processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20076Probabilistic image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30128Food products

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Food Science & Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Analytical Chemistry (AREA)
  • Medicinal Chemistry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present invention is directed towards a method of determining the state of a substance (2) having a color property dependent on the state of the substance. The method includes the following steps: forming a pixel image of the substance (2) using a camera (3); projecting the pixel image into a three-dimensional color space using a neural network (7); and comparing the projection (37) with a projection of a second portion of the substance (2) to determine the state of the substance (2) using a second neural network (9).

Description

WO 96/18975 PCT/AU95/00813 DATA RECOGNITION SYSTEM
Field of the Invention The present invention relates to recognition of data characteristics within an image and more particularly to the recognition of characteristics within a baking process.
Background of the Invention There are several factors which influence the results produced by a baking process. For example, in the production of bread or biscuits or other such foodstuffs, a number of factors including the temperature of the oven, the baking period, and the size, shape, thickness and moisture level of the substances (e.g. dough) utilised influence the final product produced at the end of the baking process. Further, these baked foodstuffs are often produced on a commercial scale, in large volumes, using a mass production process.
At the end of the production process, the goods are normally inspected to ensure quality control. Inspection often takes the form of looking at a particular batch of goods coming off a production line in order to ensure their suitability.
Often the inspection process is undertaken by a human observer. It has been found, in practice, that human observers provide subjective judgements that are prone to both long- and short-term variations. Additionally, human observers are only capable of working at certain fixed speeds and are prone to adverse conditions such as boredom, tiredness and/or sickness.
Summary of the Invention It is an object of the present invention to provide a method of determining the state of an object having visual data properties that are subject to variation.
In accordance with a first aspect of the present invention, there is provided a method of determining the state of a substance having a color property dependent on said state, said method comprising the steps of:
forming a pixel image of said substance;
projecting said pixel image into a three dimensional color space; and comparing said projection with a projection of a second portion of said substance, having a predetermined state, so as to determine the state of said substance.
In accordance with a second aspect of the present invention there is disclosed an apparatus which is able to operate in accordance with the above method.
In accordance with a third aspect of the present invention, there is provided a method of determining the state of a first data sample having collectively a series of characteristics, said method comprising the steps of:
projecting said first data sample into a multi-dimensional space; and comparing said projection with a projection of a second data sample having a set of predetermined characteristics, to determine the state of said first data sample characteristics.
Brief Description of the Drawings
The preferred embodiment of the present invention is described with reference to the accompanying drawings, in which:
Fig. 1 illustrates the steps of the preferred embodiment;
Fig. 2 illustrates the input scanned data format utilised by the preferred embodiment;
Figs. 3-8 illustrate the progression of the color change with increased baking levels;
Fig. 9 illustrates a Kohonen self-organising feature map;
Fig. 10 illustrates a first baking curve; and Fig. 11 illustrates a second baking curve for a different biscuit product.
Detailed Description Although the preferred embodiment will be described with reference to baking, it will be apparent to a person skilled in the art that the invention is not limited thereto and applies to all forms of visualisation where the state of a substance can be determined from its appearance.

In the preferred embodiment of the present invention, a detailed analysis of the color characteristics of the baked product is undertaken by a computer system integrated into the production process.
A mass production system 1 for producing baked foodstuffs according to the preferred embodiment of the invention is illustrated schematically in Fig. 1. The system 1 comprises a food processing apparatus (e.g. oven) 4, substance characterisation apparatus 3, 6, 7, 8, 9, and control system 11. Baked products 2 are imaged by a camera 3 as the products exit the oven 4. The image taken by camera 3 is subjected to a number of processing steps, preferably implemented using a microcomputer, which will be described in more detail hereinafter. The processing includes an image processing or calibration step 6, a first neural network processing step 7, an image processing step 8, and a second neural network step 9 to produce an output indication 10 that indicates the extent of the baking of the baked products 2.
The output 10 provides an indication of whether the products 2 are under- or over-baked and can be utilised as an input to the control system 11 to determine if the oven parameters such as heat or time need to be adjusted.
The camera 3 utilised in the preferred embodiment is a three-chip charge coupled device (CCD) camera which produces a 512 × 512 array of red (R), green (G) and blue (B), or RGB, pixel values. This array of pixel values is stored in a "frame grabber" board mounted within the microcomputer. The baking products 2 were imaged using two daylight-color-balanced THORN (Registered Trade Mark) fluorescent lamps for direct illumination and a black curtain to exclude ambient illumination. It should be apparent to a person skilled in the art that other forms of illumination may be used, and/or the background curtain may be eliminated in an industrial production process, although some other form of ambient light shielding may be utilised, without departing from the scope and spirit of the invention. In a first sample, the baked products imaged were "SAO" (Registered Trade Mark) biscuits produced in a production line of the Arnott's Biscuits Limited biscuit factory.

In order to ensure proper color calibration of the imaged biscuits, the biscuits were imaged with a color calibration chart 12 of the form shown in Fig. 2. The scanned image 12' includes biscuit area 13, background 14, reference white background 15, grey scale 16, and reference colors 17. The scanned image 12' and image 12 were then utilised to color calibrate the pixels of the biscuit 13. The process of color calibration to ensure color consistency of sampled images is well known to those skilled in the art of computer imaging. For a discussion of the color calibration process, reference is made to Novak, C.L. and S.A. Shafer 1990, "Supervised Color Constancy Using a Color Chart," Technical Report CMU-CS-140, Carnegie Mellon University School of Computer Science. The paper by Novak et al. sets out more extensive means of color calibration. However, it should be noted that under appropriate circumstances of illumination and instrumentation control, calibration can be dispensed with.
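In its simplest form, the chart-based calibration of block 6 can be approximated by per-channel gain correction against the reference white background 15. The sketch below is illustrative only: the function name `white_balance`, the NumPy representation (RGB values 0-255), and the single-patch approach are assumptions for this sketch; the Novak and Shafer report cited above describes far more thorough supervised methods.

```python
import numpy as np

def white_balance(image, white_patch):
    """Crude chart-based calibration sketch: choose per-channel gains so
    that the reference-white patch maps to (255, 255, 255), then apply
    those gains to the whole image and clip to the valid pixel range.

    image       : (..., 3) array of RGB values in 0..255
    white_patch : (..., 3) array of pixels sampled from the white patch
    """
    # Average the white-patch pixels per channel and derive channel gains.
    gains = 255.0 / white_patch.reshape(-1, 3).mean(axis=0)
    return np.clip(image * gains, 0, 255)
```

A grey-scale strip such as item 16 would additionally allow a non-linear (per-step) correction, which this single-gain sketch deliberately omits.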
On completion of the color calibration process (block 6 of Fig. 1), each biscuit image can be stored as an RGB file. The RGB color space is a well known color space in common use with computer systems. However, it will be apparent to a person skilled in the art that other color systems such as HSV, XYZ or CIE color space coordinates may also be utilised.
Referring now to Figs. 3 to 8, a projection of the pixel values obtained for biscuit 13 (Fig. 2) is shown for different stages of the baking process.
Fig. 3 shows a projection 20 of the pixel image of a "raw" biscuit onto an RGB cube 21.
Figs. 4 to 8 show corresponding projections for biscuits varying from underbaked biscuits (Fig. 4) to overbaked biscuits (Fig. 8). It can be seen from comparing Figs. 4 to 8 that the color characteristics of a biscuit change as that biscuit is subjected to increased levels of baking. The particular progression of pixel data of Figs. 4 to 8 can therefore be utilised to determine the baking state of any particular sample biscuit.
In order to utilise the progression from Fig. 4 to Fig. 8 in the baking process, it is necessary to succinctly and compactly describe an approximation to the data samples of Figs. 4 to 8. This can be done through the production of a "baking curve", which is a one-dimensional representation of the important color variations within the three-dimensional data space of Figs. 4 to 8. One method of production of a baking curve is to utilise a Kohonen self-organising feature map, which is an unsupervised learning technique that is effective in extracting structure from complex experimental data which bears a highly non-linear relationship. A detailed description of Kohonen's self-organising feature map is provided in Neural Network Architectures: An Introduction by Judith E. Dayhoff, published 1990 by Van Nostrand Reinhold, at pages 163-191, the contents of which are hereby incorporated by cross-reference.
Turning now to Fig. 9, there is shown an example of a self-organising map (SOM) as utilised in the preferred embodiment. The SOM 30 has three input nodes 31 which correspond to the red, green and blue color components of pixel values from the digitised color image of the biscuit 13 (Fig. 2). The SOM 30 includes N output nodes 32. Every input node 31 is connected by means of edges, e.g. 33, to each of the output nodes 1 to N. The use of a one-dimensional SOM 30 means that, upon training, the SOM network 30 will map the entire set of RGB pixel values 31 from a biscuit image to a one-dimensional array of points or output nodes 32. This will have two main effects: firstly, it will reduce the input data into a data space of lower dimensionality; and secondly, the interrelationships between the most relevant points in the input data 31 will be retained intact in the output format of the network 30.
If the input 31 is denoted E as follows:
E = (Red, Green, Blue), Eqn 1
and if output node i is connected to the input 31 by a series of edges having weights denoted Ui, where Ui takes the following form:
Ui = (Wi,red, Wi,green, Wi,blue), Eqn 2
where it is assumed that the weights are initially randomly assigned to edges and range over the same set of data values as the pixel component values (in the preferred embodiment from 0 to 255), then the SOM is trained by finding, for each pixel, the output node 32 having weights Ui which are closest to the input pixel E. The degree of "closeness" is normally measured by means of a Euclidean distance measure. The closest output node, having subscript c, can be denoted mathematically as follows:
||E - Uc|| = min over i of ||E - Ui||. Eqn 3
The best or winning node c is then altered in conjunction with nodes within the neighbourhood of c (for example, nodes c - 1 and c + 1). The alteration, for each output node j in the neighbourhood of c (denoted Nc), proceeds by first calculating ΔUj as follows:
ΔUj = a(t) × (E - Uj), Eqn 4
and then deriving a new set of weights Uj(t+1) as shown in equation 5:
Uj(t+1) = Uj(t) + ΔUj. Eqn 5
Those output nodes that are not in a predetermined neighbourhood around the chosen node c are left unaltered.
The function a(t) in equation 4 is known as the "learning rate" and is a monotonically decreasing function, with an alpha function of the following form being suitable:
a(t) = a0 (1 - t/T), Eqn 6
where a0 takes on values in the range of 0.02 to 0.05, t is the current training iteration, and T is the total number of training iterations to be done. The width d of the neighbourhood Nc can also be chosen to vary in a similar manner, as set out in equation 7:
d(t) = d0 (1 - t/T), Eqn 7
where d0 can be chosen to initially be, say, a third of the width of the output nodes.
As an example of a process carried out in accordance with the preferred embodiment, four biscuit samples of "SAO" from Arnott's Biscuits Limited were scanned to yield 47,967 pixels per biscuit, with three separate R, G and B values for each pixel. The four sets of pixels were then shuffled in a random sequence and used as training input to an SOM of the form of Fig. 9 having ten output nodes, for a total of 20 training passes.
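The training procedure of Eqns 1 to 7 can be sketched in code as follows. This NumPy implementation is illustrative only: the function name `train_som`, the random initialisation, and the default parameter values (a0 = 0.04, ten output nodes) are choices made for the sketch, not taken from the specification.

```python
import numpy as np

def train_som(pixels, n_nodes=10, passes=20, a0=0.04, seed=0):
    """Train a 1-D Kohonen SOM on RGB pixel data (sketch of Eqns 1-7).

    pixels : (M, 3) array of RGB values in 0..255.
    Returns the (n_nodes, 3) weight array -- the baking-curve points.
    """
    rng = np.random.default_rng(seed)
    # Weights start random over the same range as the pixel values (Eqn 2).
    U = rng.uniform(0, 255, size=(n_nodes, 3))
    T = passes * len(pixels)          # total number of training iterations
    d0 = n_nodes / 3.0                # initial neighbourhood width (a third)
    t = 0
    for _ in range(passes):
        for E in rng.permutation(pixels):          # shuffled training order
            # Winning node c: smallest Euclidean distance to E (Eqn 3).
            c = int(np.argmin(np.linalg.norm(U - E, axis=1)))
            alpha = a0 * (1 - t / T)               # learning rate (Eqn 6)
            d = d0 * (1 - t / T)                   # neighbourhood width (Eqn 7)
            lo, hi = int(max(0, c - d)), int(min(n_nodes - 1, c + d))
            # Move the winner and its neighbours toward E (Eqns 4 and 5).
            U[lo:hi + 1] += alpha * (E - U[lo:hi + 1])
            t += 1
    return U
```

Because each update is a convex step toward a valid pixel value, the trained weights remain inside the RGB cube; joining them in node order traces the one-dimensional baking curve of Fig. 10.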
Referring now to Fig. 10, there is shown a plot of the ten output node weight values Ui within a three-dimensional color cube 35. The ten points (e.g. 36) are shown joined together by curve 37. The curve 37 is hereby denoted to be the final "baking curve" of the input data.
Turning now to Fig. 11, the process was repeated for a second form of biscuit, comprising Arnott's "MILK COFFEE" (Registered Trade Mark) biscuit, for a 15 node output SOM, and the results 41 are shown plotted within the color cube 40, with the training data being passed through the SOM a total of 50 times. It can be seen that the structure of the baking curve 41 of Fig. 11 is similar to that of the baking curve 37 of Fig. 10, with the two curves 37, 41 occupying different portions of the color cubes 35, 40 and reflecting the differences in ingredients between the products. The color curve 41 of Fig. 11 is also shorter than that of Fig. 10, as the "MILK COFFEE" form of biscuit exhibits more consistent browning than the "SAO" form of biscuit, which has blisters that can cause uneven browning in color.
Once trained, the SOM 30 of Fig. 9 can be utilised as neural network 7 of Fig. 1 to produce, for each input pixel, an output node indicator having the closest position to the input pixel. The closest matching output node 32 (Fig. 9) for each pixel of an image can be subjected to image processing 8 (Fig. 1), which can take the form of histogramming, thereby producing a histogram profile of the scanned baking product 2.
A second neural network 9, which takes the form of a supervised feed forward neural network, can then be subjected to "training" by imaging a large number of biscuits 2 having known baking characteristics, feeding the images through SOM 7, and forming a histogram 8. The histogram 8 can then form the input data to a supervised back propagation neural network which can be trained, in the normal manner, to classify the color level of the baking product (e.g. the biscuit) 2. The samples can be continuously fed through the steps 6 to 9 until the neural network 9 is properly trained to produce output 10 indicating the level of baking.
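A minimal hard-assignment version of steps 7 and 8, mapping each pixel to its closest trained SOM node and counting hits per node, might look as follows. The fuzzy Gaussian weighting described later in this specification refines this; the function name `som_histogram` and the normalisation to unit sum are illustrative assumptions for the sketch.

```python
import numpy as np

def som_histogram(pixels, U):
    """Hard-assignment sketch of steps 7 and 8: for each pixel, find the
    closest SOM node in RGB space (Euclidean distance), then count how
    many pixels map to each node.  The normalised counts would form the
    input vector for the supervised classifier network 9.

    pixels : (M, 3) RGB pixel array
    U      : (N, 3) trained SOM node weights
    """
    # (M, N) matrix of pixel-to-node distances, then nearest node per pixel.
    d = np.linalg.norm(pixels[:, None, :] - U[None, :, :], axis=2)
    nearest = d.argmin(axis=1)
    counts = np.bincount(nearest, minlength=len(U)).astype(float)
    return counts / counts.sum()        # normalised histogram profile
```

In use, histograms from biscuits of known bake level would be paired with target labels to train the back-propagation classifier.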

The output 10 can then be utilised by control system 11, which can take the form of human or automatic process adjustment, to adjust the conditions within oven 4 to improve the baking products 2.
The specified system provides automatic segmentation of the biscuit subject from diverse backgrounds. This is accomplished by the histogramming process, which "weighs" each image pixel as a reducing function of its distance from the histogram points. Thus, pixels that are significantly distant from the baking curve discovered by the self-organising map are down-weighted to the extent that they make little or no contribution to the overall histogram. In practice, this means that it is not necessary for biscuits to be imaged with a specially prepared background. Any background with colors sufficiently dissimilar to the biscuit colors under consideration will suffice. This is of practical benefit when applying the system to on-line monitoring and oven control, as the imaging background may well be a conveyor belt of inconsistent color.
The computation of the histogram 8 from the map produced by the SOM is performed in detail as follows. In principle, the purpose is to obtain histograms in which each bin represents the weighted count of pixels falling within a fuzzy portion of the baking curve 35, 40. The fuzzy portion is defined by a Gaussian weighting function with parameters σx and σy. The parameter σx denotes the spread of the Gaussian weighting function about the baking curve, and it is determined by consideration of the likely color variation around the curve. It is chosen sufficiently small to enable the automatic segmentation process previously described. The parameter σy denotes the spread of the Gaussian weighting function along the baking curve for a particular histogram bin. Normally, σy is greater than σx. A practical implementation of the above technique involves the following steps:
a. The SOM nodes are interpolated to obtain a large number of sampling points. The number of sampling points is determined by σx so as to limit to an acceptable level the aliasing effect caused by sampling the baking curve at discrete points. The Nyquist result in sampling theory applies in this step.

b. The biscuit pixels are histogrammed at the interpolated sampling points.
The distance of each pixel (in RGB color space) from each sampling point is computed, and the Gaussian function with spread σx is used to compute the weighted contribution of that pixel to the histogram bin at that particular sampling point.
c. The histogram produced in (b) is treated as a 1-D signal and filtered with a second Gaussian function with a spread σy. In the process, it is subsampled to the number of input nodes of the feed forward neural network. The value of σy is chosen so as to limit to an acceptable level the aliasing caused by this further subsampling.
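Steps (a) to (c) can be sketched as follows, assuming NumPy and a trained 1-D SOM weight array. The function name `fuzzy_histogram` and the default values for σx, σy and the interpolation factor are placeholders: the specification leaves these to be chosen from the likely color variation around the curve and the Nyquist criterion.

```python
import numpy as np

def fuzzy_histogram(pixels, U, n_bins, sigma_x=10.0, sigma_y=2.0, upsample=8):
    """Sketch of the fuzzy baking-curve histogram, steps (a)-(c).

    pixels : (M, 3) RGB pixel array
    U      : (N, 3) trained SOM node weights (the baking curve)
    n_bins : number of classifier input nodes to subsample down to
    """
    # (a) Interpolate the SOM nodes to a denser set of sampling points S.
    n = len(U)
    t_coarse = np.arange(n)
    t_fine = np.linspace(0, n - 1, n * upsample)
    S = np.stack([np.interp(t_fine, t_coarse, U[:, k]) for k in range(3)],
                 axis=1)

    # (b) Weighted histogram: each pixel contributes to each sampling point
    #     via a Gaussian of its RGB distance (spread sigma_x).  Pixels far
    #     from the curve contribute almost nothing -- this is the automatic
    #     background-segmentation effect described above.
    d = np.linalg.norm(pixels[:, None, :] - S[None, :, :], axis=2)
    h = np.exp(-0.5 * (d / sigma_x) ** 2).sum(axis=0)

    # (c) Smooth along the curve with a second Gaussian (spread sigma_y,
    #     in sampling-point units), then subsample to n_bins values.
    j = np.arange(len(h))
    G = np.exp(-0.5 * ((j[:, None] - j[None, :]) / sigma_y) ** 2)
    G /= G.sum(axis=1, keepdims=True)       # row-normalised smoothing kernel
    h_smooth = G @ h
    idx = np.linspace(0, len(h) - 1, n_bins).round().astype(int)
    return h_smooth[idx]
```

The returned vector would serve as the input to the feed forward classifier network; in a production implementation the Gaussian weights for step (b) could be precomputed per color-cube cell for speed.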
The system may be applied to multi-dimensional data of diverse kinds such as combinations of color, visual texture and/or three-dimensional structure sensing. In some such situations, the set of state points may require a 2-D or 3-D SOM whereas the baking curve requires only a 1-D SOM.
It will be obvious to those skilled in the art that the steps 6 to 9 can be implemented in many different ways, including dedicated neural network hardware and associated computer hardware, or in the form of a software simulation of the neural network system. The preferred method of implementation of steps 6 to 9 is in the form of a software implementation on a standard microcomputer system, as this allows for easy alteration when it is desired to alter the form of baking products.
The foregoing describes only one embodiment of the present invention, and modifications obvious to those skilled in the art can be made thereto without departing from the scope of the invention.

Claims (21)

CLAIM
1. A method of baking foodstuffs by determining the state of a substance having a color property dependent on said state, said method comprising the steps of:
forming a pixel image of said substance, wherein said pixel image comprises a plurality of pixels;
projecting said pixel image into a three dimensional color space;
comparing said projection with a projection of a second portion of said substance, having a predetermined state, so as to determine the state of said substance;
and controlling an oven used to bake said foodstuffs dependent on the state of said substance.
2. The method as claimed in claim 1, wherein said comparing step further comprises the steps of:
deriving a series of state data points from said second portion of said substance; and producing a comparison of said projection with said state data points so as to determine the state of said substance.
3. The method as claimed in claim 2, wherein said deriving step further comprises the step of:
deriving said series of state data points by means of a self organising feature map.
4. The method as claimed in claim 2, wherein said producing step further comprises the step of:
producing a data histogram from said projected pixel image and said state data points, said histogram indicating the number of projected pixels in the neighbourhood of each of said state data points.
5. The method as claimed in claim 4, wherein said producing step further comprises:
deriving a neural network input to a neural network using said data histogram, said neural network being trained to recognise neural network inputs derived from histograms of projected pixel images of substances having a predetermined state and to generate a signal indicative of said predetermined state; and outputting a signal from said neural network indicative of said state of the projected pixel image of said data histogram.
6. A method of baking foodstuffs by determining the state of a first data sample having collectively, a series of characteristics, said method comprising the steps of:
projecting said first data sample comprising a plurality of data points into a multi-dimensional space;
comparing said projection with a projection of a second data sample having a set of predetermined characteristics, so as to determine the state of said first data sample characteristics; and controlling an oven used to bake said foodstuffs dependent on the state of said first data sample characteristics.
7. The method as claimed in claim 6, wherein said comparing step further comprises the steps of:
deriving a series of state data points from said second data sample; and producing a comparison of said projection with said state data points so as to determine the state of said first data sample characteristics.
8. The method as claimed in claim 7, wherein said deriving step further comprises the step of:
deriving said series of state data points by means of a self organising feature map.
9. The method as claimed in claim 8, wherein said producing step further comprises the step of:
producing a data histogram from said first data sample and said state data points, said histogram indicating the number of first samples in the neighbourhood of each of said state data points.
10. The method as claimed in claim 9, wherein said producing step further comprises the steps of:
deriving a neural network input to a neural network using said data histogram, said neural network being trained to recognise neural network inputs derived from histograms of data samples having a predetermined series of characteristics and to generate a signal indicative of said predetermined state; and outputting a signal from said neural network indicative of said state of the projected data samples of said data histogram.
11. A system for baking foodstuffs by determining the state of a substance having a color property dependent on said state, said system comprising:

an imaging device for forming a pixel image of said substance, wherein said pixel image comprises a plurality of pixels;
means for projecting said pixel image into a three dimensional color space;
means for comparing said projection with a projection of a second portion of said substance, having a predetermined state, so as to determine the state of said substance; and means for controlling an oven used to bake said foodstuffs dependent on the state of said substance.
12. The system as claimed in claim 11, wherein said means for comparing said projection further comprises:
means for deriving a series of state data points from said second portion of said substance; and means for producing a comparison of said projection with said state data points so as to determine the state of said substance.
13. The system as claimed in claim 12, wherein said deriving means further comprises:
means for generating a self organising feature map to derive said series of state data points.
14. The system as claimed in claim 12, wherein said means for producing said comparison further comprises:
means for producing a data histogram from said projected pixel image and said state data points, said histogram indicating the number of projected pixels in the neighbourhood of each of said state data points.
15. The system as claimed in claim 14, wherein said producing means further comprises:
means for deriving a neural network input to a neural network using said data histogram, said neural network being trained to recognise neural network inputs derived from histograms of projected pixel images of substances having a predetermined state and to generate a signal indicative of said predetermined state; and means for outputting a signal from said neural network indicative of said state of the projected pixel image of said data histogram.
16. A system for baking foodstuffs by determining the state of a first data sample having collectively, a series of characteristics, said system comprising:

means for projecting said first data sample comprising a plurality of data points into a multi-dimensional space;
means for comparing said projection with a projection of a second data sample having a set of predetermined characteristics, so as to determine the state of said first data sample characteristics; and means for controlling an oven used to bake said foodstuffs dependent on the state of said first data sample characteristics.
17. The system as claimed in claim 16, wherein said means for comparing said projection further comprises:
means for deriving a series of state data points from said second data sample;
and means for producing a comparison of said projection with said state data points so as to determine the state of said first data sample characteristics.
18. The system as claimed in claim 17, wherein said deriving means further comprises:
means for generating a self organising feature map to derive said series of state data points.
19. The system as claimed in claim 18, wherein said means for producing said comparison further comprises:
means for producing a data histogram from said first data sample and said state data points, said histogram indicating the number of first samples in the neighbourhood of each of said state data points.
20. The system as claimed in claim 19, wherein said means for producing said histogram further comprises:
means for deriving a neural network input to a neural network using said data histogram, said neural network being trained to recognise neural network inputs derived from histograms of data samples having a predetermined series of characteristics and to generate a signal indicative of said predetermined state; and means for outputting a signal from said neural network indicative of said state of the projected data samples of said data histogram.
21. An assembly for baking foodstuffs, comprising:
an oven; and the system for baking foodstuffs in accordance with any one of claims 11 to 20.
CA002207326A 1994-12-13 1995-12-01 Data recognition system Abandoned CA2207326A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AUPN0023 1994-12-13
AUPN0023A AUPN002394A0 (en) 1994-12-13 1994-12-13 Data recognition system

Publications (1)

Publication Number Publication Date
CA2207326A1 true CA2207326A1 (en) 1996-06-20

Family

ID=3784530

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002207326A Abandoned CA2207326A1 (en) 1994-12-13 1995-12-01 Data recognition system

Country Status (8)

Country Link
JP (1) JPH10511786A (en)
CN (1) CN1170469A (en)
AU (1) AUPN002394A0 (en)
CA (1) CA2207326A1 (en)
DE (1) DE19581867T1 (en)
GB (1) GB2311369A (en)
NZ (1) NZ296487A (en)
WO (1) WO1996018975A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002312762A (en) * 2001-04-12 2002-10-25 Seirei Ind Co Ltd Grain sorting apparatus utilizing neural network
CN106778912A (en) * 2017-01-13 2017-05-31 湖南理工学院 A kind of full-automatic apparatus for baking and method for cake
CN110111648A (en) * 2019-04-17 2019-08-09 吉林大学珠海学院 A kind of programming training system and method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0437439B1 (en) * 1988-07-14 1996-03-27 Garibaldi Pty. Ltd. Computerised colour matching
JPH0786433B2 (en) * 1990-05-22 1995-09-20 支朗 臼井 Color vision information conversion method and device
US5283839A (en) * 1990-12-31 1994-02-01 Neurosciences Research Foundation, Inc. Apparatus capable of figure-ground segregation
GB2256708A (en) * 1991-06-11 1992-12-16 Sumitomo Heavy Industries Object sorter using neural network
SE470465B (en) * 1992-09-07 1994-04-18 Agrovision Ab Method and apparatus for automatic assessment of grain cores and other granular products

Also Published As

Publication number Publication date
GB2311369A (en) 1997-09-24
WO1996018975A1 (en) 1996-06-20
GB9710197D0 (en) 1997-07-09
AUPN002394A0 (en) 1995-01-12
CN1170469A (en) 1998-01-14
JPH10511786A (en) 1998-11-10
DE19581867T1 (en) 1997-12-11
NZ296487A (en) 2000-01-28

Similar Documents

Publication Publication Date Title
RU2613319C2 (en) Method for assessment and quality control of food products on dynamic production line
US11478108B2 (en) Intelligent identification cooking system for oven
Sapirstein et al. Instrumental measurement of bread crumb grain by digital image analysis
Pedreschi et al. Development of a computer vision system to measure the color of potato chips
Brosnan et al. Improving quality inspection of food products by computer vision––a review
US5818953A (en) Optical characterization method
Paquet-Durand et al. Monitoring baking processes of bread rolls by digital image analysis
US20060081135A1 (en) Industrial overline imaging system and method
WO2005104726A2 (en) Method for on-line machine vision measurement, monitoring and control of organoleptic properties of products for on-line manufacturing processes
DE102019120008A1 (en) Method for operating a cooking appliance and cooking appliance
Chen et al. Visual modeling of laser-induced dough browning
EP3715721A1 (en) Cooking device and method for operating a cooking device
Chakraborty et al. Influence of extrusion conditions on the colour of millet-legume extrudates using digital imagery
Nashat et al. Multi-class colour inspection of baked foods featuring support vector machine and Wilk’s λ analysis
CA2207326A1 (en) Data recognition system
Cevoli et al. Storage of wafer cookies: Assessment by destructive techniques, and non-destructive spectral detection methods
EP3722673A1 (en) Cooking device and method
AU4111496A (en) Data recognition system
Llave et al. Analysis of the color developed during carbonization of grilled fish by kinetics and computer imaging
Hamey et al. Pre-processing colour images with a self-organising map: baking curve identification and bake image segmentation
Yin et al. Image processing techniques for internal texture evaluation of French fries
Du et al. Quality evaluation of pizzas
Khan et al. Automatic quality inspection of bakery products based on shape and color information
McCann et al. Pixel and spatial mechanisms of color constancy
Wasnik et al. Digital image analysis: Tool for food quality evaluation

Legal Events

Date Code Title Description
FZDE Discontinued

Effective date: 19990910