CN1170469A - Data recognition system - Google Patents

Data recognition system

Info

Publication number
CN1170469A
CN1170469A CN95196801A
Authority
CN
China
Prior art keywords
state
steps
projection
neural network
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN95196801A
Other languages
Chinese (zh)
Inventor
C·T·维斯科特
L·G·C·黑梅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ARNOTT'S BISCUITS Ltd
Original Assignee
ARNOTT'S BISCUITS Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ARNOTT'S BISCUITS Ltd filed Critical ARNOTT'S BISCUITS Ltd
Publication of CN1170469A publication Critical patent/CN1170469A/en
Pending legal-status Critical Current


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0004 - Industrial image inspection
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 - Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/02 - Food
    • G01N33/10 - Starch-containing substances, e.g. dough
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10024 - Color image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20048 - Transform domain processing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20076 - Probabilistic image processing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30108 - Industrial image inspection
    • G06T2207/30128 - Food products

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Food Science & Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Analytical Chemistry (AREA)
  • Medicinal Chemistry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present invention is directed towards a method of determining the state of a substance (2) having a color property dependent on the state of the substance. The method includes the following steps: forming a pixel image of the substance (2) using a camera (3); projecting the pixel image into a three-dimensional color space using a neural network (7); and comparing the projection (37), using a second neural network (9), with a projection of a second portion of the substance (2) to determine the state of the substance (2).

Description

Data recognition system
Field of the invention
The present invention relates to the identification of data characteristics in images, and more specifically to feature identification in a baking process.
Background of the invention
Several kinds of factors affect the result produced by a baking process. For example, in producing bread, biscuits or other such foods, many factors, including oven temperature, baking time, and the size, shape, thickness and moisture level of the articles used (such as dough), influence the final product obtained when the baking process finishes. Moreover, such baked goods are normally mass-produced on a commercial scale using large-scale production processes.
During production, the product is usually inspected to ensure quality control. To ensure suitability, inspection usually takes the form of observing products from a particular batch taken off the production line.
The inspection process is normally carried out by a human observer. In practice it has been found that observers provide subjective judgements that are prone to both long-term and short-term variation. In addition, an observer can only work at a certain fixed speed and is subject to adverse conditions such as boredom, fatigue and/or illness.
Summary of the invention
It is an object of the present invention to provide a method of determining the state of an object having a readily varying visual data attribute.
According to a first aspect of the present invention, there is provided a method of determining the state of an object having a color attribute dependent on said state, the method comprising the steps of:
forming a pixel image of said object;
projecting said pixel image into a three-dimensional color space; and
comparing said projection with a projection of a second portion of said object having a predetermined state, so as to determine the state of said object.
According to a second aspect of the present invention, there is disclosed an apparatus operable in accordance with the above method.
According to a third aspect of the present invention, there is provided a method of determining the state of a first data sample collectively having a series of features, the method comprising the steps of:
projecting said first data sample into a multi-dimensional space; and
comparing said projection with a projection of a second data sample having a set of predetermined features, so as to determine the state of said first data sample.
Brief description of the drawings
Preferred embodiments of the present invention are described with reference to the accompanying drawings, in which:
Fig. 1 illustrates the steps of the preferred embodiment;
Fig. 2 illustrates the input scan data layout used by the preferred embodiment;
Figs. 3 to 8 illustrate the progression of color change with increasing degree of baking;
Fig. 9 illustrates a Kohonen self-organizing feature map;
Fig. 10 illustrates a first baking curve; and
Fig. 11 illustrates a second baking curve for a different biscuit product.
Detailed description
Although the preferred embodiment will be described with reference to baking, persons skilled in the art will appreciate that the invention is not limited thereto and is applicable to all forms of visual inspection in which the state of an article can be determined from its appearance.
In the preferred embodiment of the present invention, the color characteristics of baked goods are analyzed in detail by an integrated computer system during production.
Fig. 1 schematically shows a mass production system 1 for producing baked goods according to the preferred embodiment of the present invention. The system 1 comprises a food processing apparatus (such as an oven) 4, article characterization means 3, 6, 7, 8, 9 and a control system 11. As the products leave the oven 4, the baked goods 2 are imaged by a camera 3. The image captured by the camera 3 is subjected to a number of processing steps, preferably implemented on a microcomputer, which are described in more detail below. The processing includes an image processing or calibration step 6, a first neural network processing step 7, an image processing step 8 and a second neural network step 9, which generates an output indication 10 of the degree of baking of the baked goods 2. The output 10 provides an indication of whether the product 2 is under-baked or over-baked, and can be used as an input to the control system 11 to determine whether oven parameters such as heat or time need to be adjusted.
The camera 3 used in the preferred embodiment is a three-chip charge-coupled device (CCD) camera producing a 512 × 512 array of red (R), green (G) and blue (B), i.e. RGB, pixel values. This array of pixel values is stored on a "framegrabber" board installed in the microcomputer. The baked goods 2 are imaged under direct illumination from two daylight-balanced THORN (registered trademark) fluorescent lights, with a black baffle used to exclude ambient lighting. Persons skilled in the art will appreciate that other forms of illumination can be used, and/or that the background baffle can be omitted in an industrial process, or that some other form of ambient light shielding can be used, without departing from the scope and spirit of the invention. In a first example, the baked goods imaged were "SAO" (registered trademark) biscuits produced on a biscuit factory production line of Arnott's Biscuits Limited.
To ensure correct color calibration of the imaged biscuits, the biscuits are imaged together with a color calibration chart 12 of the form shown in Fig. 2. The scanned image 12' comprises a biscuit region 13, a background 14, a reference white background 15, a gray scale 16 and reference colors 17. The scanned image 12' is then used to calibrate the colors of the pixels of the biscuits 13 in the image 12. Color calibration processes that ensure color consistency of sampled images are well known to those familiar with computer imaging techniques. For a discussion of color calibration processes, see Novak, C.L. and S.A. Shafer, "Supervised Color Constancy Using a Color Chart", Technical Report CMU-CS-140, School of Computer Science, Carnegie Mellon University, 1990. The paper by Novak et al. presents a broad range of color calibration methods. It should be noted, however, that calibration can be omitted where the illumination environment and instrumentation are appropriately controlled.
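By way of illustration only, the sort of per-channel rescaling such a calibration step might perform can be sketched in Python (assuming NumPy). The region coordinates of the white patch and the 8-bit target value below are assumptions made for the example; the patent itself does not disclose a particular calibration algorithm.

import numpy as np

def calibrate_rgb(image, white_region, target_white=255.0):
    """Rescale each RGB channel so the reference white patch maps to target_white.

    image        : H x W x 3 float array of raw RGB values from the framegrabber
    white_region : (row_slice, col_slice) locating the reference white patch 15
                   (hypothetical coordinates; known from the chart layout in practice)
    """
    rows, cols = white_region
    # Mean observed value of the white patch, per channel
    observed_white = image[rows, cols, :].reshape(-1, 3).mean(axis=0)
    # Per-channel gain mapping the observed white onto the target white
    gain = target_white / observed_white
    return np.clip(image * gain, 0.0, 255.0)

# Example: calibrate a 512 x 512 frame whose white patch occupies a known area
# calibrated = calibrate_rgb(frame.astype(float), (slice(10, 40), slice(10, 40)))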
When the color calibration process (block 6 of Fig. 1) has been completed, each biscuit image can be stored as an RGB file. The RGB color space is a well-known color space commonly used in computer systems. Persons skilled in the art will appreciate, however, that other color systems such as HSV, XYZ or CIE color space coordinates can also be used.
Referring to Figs. 3 to 8, these show projections of the pixel values obtained from the biscuits 13 (Fig. 2) at different stages of the baking process.
Fig. 3 illustrates the projection 20 of the pixel image of a "raw" biscuit onto the RGB cube 21.
Figs. 4 to 8 illustrate the corresponding projections as the biscuits change from under-baked (Fig. 4) to over-baked (Fig. 8). From a comparison of Figs. 4 to 8 it can be seen that the color characteristics of the biscuits change as the degree of baking increases. Therefore, the particular progression of the pixel data of Figs. 4 to 8 can be used to determine the baking state of any particular sample biscuit.
To make use of the progression of Figs. 4 to 8 in the baking process, it is necessary to describe briefly and concisely how closely a data sample resembles the data samples of Figs. 4 to 8. This can be done by producing a "baking curve", a one-dimensional representation of the significant color changes in the three-dimensional data space of Figs. 4 to 8. One method of generating the baking curve is to use a Kohonen self-organizing feature map, an unsupervised learning technique that is effective in extracting structure from complex empirical data exhibiting highly non-linear relationships. A detailed description of the Kohonen self-organizing feature map is provided in Judith E. Dayhoff, "Neural Network Architectures: An Introduction", Van Nostrand Reinhold, 1990, pages 163-191, the contents of which are incorporated herein by cross-reference.
Turning to Fig. 9, there is shown an example of the self-organizing map (SOM) employed in the preferred embodiment. The SOM 30 has three input nodes 31, corresponding to the red, green and blue color components of the pixel values from the digitized color image of the biscuits 13 (Fig. 2). The SOM 30 comprises N output nodes 32. Each input node 31 is connected by an edge, such as 33, to each of the output nodes 1 to N. Using a one-dimensional SOM 30 means that, once trained, the SOM network 30 will map the whole set of RGB pixel values 31 from a biscuit image onto the one-dimensional array of points or output nodes 32. This has two main effects: firstly, the input data is effectively reduced to a lower-dimensional data space; secondly, the relationships between the most closely related points in the input data 31 are preserved in the output form of the network 30.
If the input 31 is denoted E, as follows:
E = (Red, Green, Blue)    (Equation 1)
and output node i is connected to the inputs 31 by a series of edges having weights denoted Ui, where Ui takes the following form:
Ui = (Wi,red, Wi,green, Wi,blue)    (Equation 2)
where the weights are initially assigned to the edges at random, within the same range of data values as the pixel components (0 to 255 in the preferred embodiment), then the SOM is trained by finding, for each input pixel E, the output node 32 whose weights Ui are closest to that pixel. The degree of "closeness" is usually measured by the Euclidean distance. The closest output node, denoted by the subscript C, can be expressed mathematically as:
||E - UC|| = min over all i of ||E - Ui||    (Equation 3)
The best-matching or "winning" node C and the nodes in the neighborhood of C (for example nodes C-1 and C+1) are then updated. The update of each output node j in the neighborhood of C (denoted Nc) is carried out by first calculating ΔUj as follows:
ΔUj = α(t) × (E - Uj)    (Equation 4)
A new set of weights Uj(t+1) is then derived, as shown in Equation 5:
Uj(t+1) = Uj(t) + ΔUj, for all j in Nc    (Equation 5)
Output nodes not within the predetermined neighborhood around the selected node C remain unchanged.
The function α(t) in Equation 4 is called the "learning rate" and is a monotonically decreasing function; an α function of the following form has been found suitable:
α(t) = α0(1 - t/T)    (Equation 6)
where α0 takes a value in the range 0.02 to 0.05, t is the current training iteration and T is the total number of training iterations to be carried out. The width d of the neighborhood Nc can also be chosen to vary in a similar manner, as given by Equation 7:
d(t) = d0(1 - t/T)    (Equation 7)
where d0 can initially be chosen as, for example, one third of the width of the output node array.
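The training procedure of Equations 1 to 7 can be written down directly. The following Python sketch (assuming NumPy) is an illustration of those equations only; the random seed, the presentation order and the rounding of the shrinking neighborhood are implementation assumptions rather than details taken from the patent.

import numpy as np

def train_som(pixels, n_nodes=10, n_passes=20, alpha0=0.03):
    """Train a 1-D Kohonen SOM on RGB pixels (Equations 1 to 7).

    pixels   : (P, 3) array of R, G, B values in [0, 255]
    n_nodes  : number of output nodes (10 for the "SAO" example)
    n_passes : number of passes through the shuffled training data
    returns  : (n_nodes, 3) array of weights Ui, i.e. the baking curve points
    """
    rng = np.random.default_rng(0)
    U = rng.uniform(0, 255, size=(n_nodes, 3))          # random initial weights (Equation 2)
    d0 = n_nodes / 3.0                                   # initial neighborhood width
    T = n_passes * len(pixels)                           # total number of training iterations
    t = 0
    for _ in range(n_passes):
        for E in rng.permutation(pixels):                # each E is an (R, G, B) pixel as in Equation 1
            alpha = alpha0 * (1.0 - t / T)               # learning rate, Equation 6
            d = max(1, int(round(d0 * (1.0 - t / T))))   # neighborhood width, Equation 7
            C = int(np.argmin(np.linalg.norm(U - E, axis=1)))   # winning node, Equation 3
            lo, hi = max(0, C - d), min(n_nodes, C + d + 1)
            U[lo:hi] += alpha * (E - U[lo:hi])           # update the neighborhood, Equations 4 and 5
            t += 1
    return U

The returned array plays the role of the node weights Ui plotted as the points 36 of Fig. 10.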
As an example of the process carried out according to the preferred embodiment, four sample "SAO" biscuits from Arnott's Biscuits Limited were scanned to produce 47,967 pixels per biscuit, each pixel having three independent R, G and B values. The four sets of pixels were then presented in a random sequence and used as the training input for a total of 20 training passes of an SOM of the form of Fig. 9 having 10 output nodes.
Referring to Fig. 10, there is shown a plot of the weights Ui of the 10 output nodes within the three-dimensional color cube 35. The 10 points (such as 36) are shown joined by a curve 37. The curve 37 thus represents the final "baking curve" for the input data.
Referring to Fig. 11, the process was repeated for a second type of biscuit, the Arnott's "Milk Coffee" (registered trademark) biscuit, using an SOM with 15 output nodes and passing the training data through the SOM a total of 50 times; the result 41 is plotted in the color cube 40. It can be seen that the structure of the baking curve 41 of Fig. 11 is similar to that of the baking curve 37 of Fig. 10, while the two curves 37, 41 occupy different parts of the color cubes 35, 40, reflecting the compositional differences between the products. Because the "Milk Coffee" biscuit browns more uniformly than the "SAO" biscuit, which has blistering that causes uneven browning, the color curve 41 of Fig. 11 is also shorter than that of Fig. 10.
Once trained, the SOM 30 of Fig. 9 can be used as the neural network 7 of Fig. 1 to generate, for each input pixel, an identifier (indicator) of the output node whose position is closest to that input pixel. The closest matching output node 32 (Fig. 9) for each pixel of an image can then be submitted to the image processing 8 (Fig. 1), which can take the form of a histogram representation, thereby generating a histogram profile of the scanned baked goods 2.
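A minimal sketch of this run-time step, under the same NumPy assumptions as above, is given below; it labels each pixel with its closest node and counts the labels, which is the plain (unweighted) form of the histogram 8.

import numpy as np

def node_histogram(pixels, U):
    """Map each pixel to its closest SOM node and count occurrences.

    pixels : (P, 3) array of calibrated RGB values
    U      : (N, 3) array of trained SOM node weights (the baking curve)
    returns: (N,) histogram of nearest-node counts, normalized to sum to 1
    """
    # Squared Euclidean distance from every pixel to every node
    d2 = ((pixels[:, None, :] - U[None, :, :]) ** 2).sum(axis=2)   # shape (P, N)
    nearest = d2.argmin(axis=1)                                     # output node identifier per pixel
    counts = np.bincount(nearest, minlength=len(U)).astype(float)
    return counts / counts.sum()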
The second neural network 9, which takes the form of a supervised feed-forward network, is then "trained" by imaging a large number of biscuits 2 having known baking characteristics, feeding the images through the SOM 7 and forming histograms 8 that are submitted to the second neural network 9. The histograms 8 thus form the input data to a supervised back-propagation neural network, which can be trained in the normal manner to classify the color level of the baked goods (such as biscuits) 2. Samples can be fed continuously through steps 6 to 9 until the neural network 9 is suitably trained to generate the output 10 indicating the baking level.
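The patent does not specify the topology of the second network beyond it being a supervised feed-forward back-propagation network trained on such histograms. As one possible realization only, the following sketch uses scikit-learn's MLPClassifier; the stand-in training arrays, the hidden-layer size and the label coding are invented for the example.

import numpy as np
from sklearn.neural_network import MLPClassifier

# Stand-in training data: in practice these come from steps 6 to 8 applied to
# biscuits of known baking level (histograms: one row per imaged biscuit;
# bake_labels: e.g. 0 = under-baked ... 4 = over-baked).
rng = np.random.default_rng(0)
histograms = rng.random((200, 16))
bake_labels = rng.integers(0, 5, size=200)

classifier = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
classifier.fit(histograms, bake_labels)

# At run time, a new biscuit's histogram yields the output indication 10:
# level = classifier.predict(new_histogram.reshape(1, -1))[0]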
The control system 11 can then use the output 10 to adjust the conditions in the oven 4 so as to improve the baked goods 2; the adjustment can be made manually or by an automated procedure.
The system as described provides automatic segmentation of biscuits against a range of different backgrounds. This is accomplished by weighting each image pixel, during the histogram-forming process, by a decreasing function of its distance from the histogram points. In this way, pixels found to lie significantly away from the baking curve of the self-organizing map are weighted down to the extent that they make little or no contribution to the overall histogram. In practice, this means it is not necessary to image the biscuits against a specially prepared background; any background color sufficiently dissimilar to the reference biscuit colors can be used. This is of practical benefit when the system is applied to on-line monitoring and oven control, since the imaging background is likely to be a conveyor belt of varying color.
The detailed implementation of computing the histogram 8 from the mapping generated by the SOM is as follows. In principle, the aim is to draw a histogram in which each bin represents a weighted count of the pixels falling into a fuzzy region of the baking curve 35, 40. The fuzzy region is defined by a Gaussian weighting function having parameters σx and σy. The parameter σx represents the spread of the Gaussian weighting function in the vicinity of the baking curve, and is determined by considering the possible color variation around the curve. It is chosen small enough to allow the automatic segmentation process described above to operate. The parameter σy represents the spread of the Gaussian weighting function along the baking curve for a particular histogram bin. Typically, σy is greater than σx.
The actual implementation of the above technique comprises the following steps:
(a) The SOM nodes are interpolated to obtain a large number of sample points. The number of sample points is determined by σx, so that the aliasing effects caused by sampling the baking curve at discrete points are limited to an acceptable level. The Nyquist result from sampling theory is applied in this step.
(b) A histogram of the biscuit pixels is formed over the interpolated sample points. The distance of each pixel (in the RGB color space) from each sample point is calculated, and a Gaussian function with spread σx is used to compute the weighted contribution of that pixel to the histogram bin at that particular sample point.
(c) The histogram generated in (b) is treated as a one-dimensional signal and filtered with a second Gaussian function having spread σy. In this process it is subsampled down to the number of input nodes of the feed-forward neural network. The value of σy is chosen so that the aliasing caused by this further subsampling is limited to an acceptable level.
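Steps (a) to (c) can be sketched as follows in Python (assuming NumPy). The interpolation density and the sigma values are illustrative assumptions; the patent constrains them only through the aliasing arguments given above.

import numpy as np

def fuzzy_histogram(pixels, U, n_bins, sigma_x=10.0, sigma_y=2.0, samples_per_edge=8):
    """Gaussian-weighted histogram over an interpolated baking curve (steps (a) to (c)).

    pixels : (P, 3) array of calibrated RGB pixels
    U      : (N, 3) array of trained SOM node weights
    n_bins : number of inputs expected by the feed-forward network
    """
    # (a) interpolate the SOM nodes to obtain densely spaced sample points
    t_nodes = np.arange(len(U))
    t_dense = np.linspace(0, len(U) - 1, (len(U) - 1) * samples_per_edge + 1)
    curve = np.stack([np.interp(t_dense, t_nodes, U[:, c]) for c in range(3)], axis=1)

    # (b) weight each pixel's contribution by a Gaussian of its RGB distance to
    #     each sample point; pixels far from the curve (background) contribute ~0
    d2 = ((pixels[:, None, :] - curve[None, :, :]) ** 2).sum(axis=2)   # shape (P, S)
    hist = np.exp(-d2 / (2.0 * sigma_x ** 2)).sum(axis=0)              # shape (S,)

    # (c) smooth along the curve with a second Gaussian, then subsample down
    #     to the number of network input nodes
    k = np.arange(-3 * int(sigma_y), 3 * int(sigma_y) + 1)
    kernel = np.exp(-(k ** 2) / (2.0 * sigma_y ** 2))
    kernel /= kernel.sum()
    smoothed = np.convolve(hist, kernel, mode="same")
    idx = np.linspace(0, len(smoothed) - 1, n_bins).round().astype(int)
    return smoothed[idx] / max(smoothed.sum(), 1e-12)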
The present system can be applied to different types of multi-dimensional data, such as color combinations, visual texture and three-dimensional structure orientation. In some cases the set of state points may require a two- or three-dimensional SOM, whereas the baking curve requires only a one-dimensional SOM.
It will be apparent to persons skilled in the art that steps 6 to 9 can be implemented in many different ways, including with dedicated neural network hardware and associated computer hardware, or in the form of a software simulation of the neural network system. The preferred approach to implementing steps 6 to 9 is a software implementation on a standard microcomputer system, since this allows changes to be made easily when the type of baked goods is to be varied.
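As an indication of how a software implementation might tie the preceding sketches together (the function names are those of the hypothetical sketches above, not of any code disclosed in the patent):

def inspect_biscuit(frame, white_region, U, classifier, n_bins=16):
    """Run steps 6 to 9 on one camera frame and return the baking-level indication 10."""
    calibrated = calibrate_rgb(frame.astype(float), white_region)   # step 6: color calibration
    pixels = calibrated.reshape(-1, 3)                              # flatten to a list of RGB pixels
    hist = fuzzy_histogram(pixels, U, n_bins)                       # steps 7 and 8: SOM projection and histogram
    return classifier.predict(hist.reshape(1, -1))[0]               # step 9: feed-forward classification, output 10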
The foregoing describes only one embodiment of the present invention, and modifications obvious to persons skilled in the art can be made thereto without departing from the scope of the present invention.
Claims
Amended claims filed under the PCT
1. A method of determining the state of an object having a color attribute dependent on said state, the method comprising the steps of:
forming a pixel image of said object, wherein said pixel image comprises a plurality of pixels;
projecting said pixel image into a three-dimensional color space; and
comparing said projection with a projection of a second portion of said object having a predetermined state, so as to determine the state of said object.
2. A method as claimed in claim 1, wherein said comparing step further comprises the steps of:
deriving a series of state data points from the second portion of said object; and
producing a comparison of said projection with said state data points, so as to determine the state of said object.
3. A method as claimed in claim 2, wherein said deriving step further comprises the step of:
deriving said series of state data points with a self-organizing feature map.
4. A method as claimed in claim 2, wherein said producing step further comprises the step of:
producing a data histogram from said projected pixel image and said state data points, said histogram indicating the number of projected pixels in the vicinity of each of said state data points.
5. A method as claimed in claim 4, wherein said producing step further comprises the steps of:
using said data histogram to derive a neural network input to a neural network, said neural network being trained to recognize neural network inputs derived from histograms of projected pixel images of objects having predetermined states and to produce a signal indicative of said predetermined state; and
outputting from said neural network a signal indicative of said state of the projected pixel image of said data histogram.
6. A method of determining the state of a first data sample collectively having a series of features, the method comprising the steps of:
projecting said first data sample, comprising a plurality of data points, into a multi-dimensional space; and
comparing said projection with a projection of a second data sample having a set of predetermined features, so as to determine the state of said first data sample features.
7. A method as claimed in claim 6, wherein said comparing step further comprises the steps of:
deriving a series of state data points from said second data sample; and
producing a comparison of said projection with said state data points, so as to determine the state of said first data sample features.
8. A method as claimed in claim 7, wherein said deriving step further comprises the step of:
deriving said series of state data points with a self-organizing feature map.
9. A method as claimed in claim 8, wherein said producing step further comprises the step of:
producing a data histogram from said first data sample and said state data points, said histogram indicating the number of first data sample points in the vicinity of each of said state data points.
10. A method as claimed in claim 9, wherein said producing step further comprises the steps of:
using said data histogram to derive a neural network input to a neural network, said neural network being trained to recognize neural network inputs derived from histograms of data samples having predetermined series of features and to produce a signal indicative of said predetermined state; and
outputting from said neural network a signal indicative of said state of the data sample of said data histogram.
11. A method of baking food using the determination method as claimed in any one of claims 1 to 5, the baking method further comprising the step of:
controlling an oven used to bake said food in dependence on the state of said object.
12. A method of baking food using the determination method as claimed in any one of claims 6 to 10, the baking method further comprising the step of:
controlling an oven used to bake said food in dependence on the state of said first data sample features.
13. A system for determining the state of an object, comprising:
an imaging device for forming said pixel image of said object; and
processing means, coupled to said imaging device, for carrying out the method as claimed in any one of claims 1 to 10.
14. An assembly for baking food, comprising:
an oven;
an imaging device for imaging the baked food output from said oven;
processing means, coupled to said imaging device, for carrying out the determination method as claimed in any one of claims 1 to 10; and
control means, coupled to said processing means, for receiving an output signal of said processing means indicating the state of said baked food;
wherein said control means is coupled to said oven so as to operate said oven in accordance with said state of said baked food.

Claims (14)

1. A method of determining the state of an object having a color attribute dependent on said state, the method comprising the steps of:
forming a pixel image of said object;
projecting said pixel image into a three-dimensional color space; and
comparing said projection with a projection of a second portion of said object having a predetermined state, so as to determine the state of said object.
2. A method as claimed in claim 1, wherein said comparing step further comprises the steps of:
deriving a series of state data points from said second portion of said object; and
producing a comparison of said projection with said state data points, so as to determine the state of said object.
3. A method as claimed in claim 2, wherein said deriving step further comprises the step of:
deriving said series of state data points with a self-organizing feature map.
4. A method as claimed in claim 2, wherein said producing step further comprises the step of:
producing a data histogram from said projected pixel image and said state data points, said histogram indicating the number of projected pixels in the vicinity of each of said state data points.
5. A method as claimed in claim 4, wherein said producing step further comprises the steps of:
using said data histogram to derive a neural network input to a neural network, said neural network being trained to recognize neural network inputs derived from histograms of projected pixel images of objects having predetermined states and to produce a signal indicative of said predetermined state; and
outputting from said neural network a signal indicative of said state of the projected pixel image of said data histogram.
6. A method of determining the state of a first data sample collectively having a series of features, the method comprising the steps of:
projecting said first data sample into a multi-dimensional space; and
comparing said projection with a projection of a second data sample having a set of predetermined features, so as to determine the state of said first data sample features.
7. A method as claimed in claim 6, wherein said comparing step further comprises the steps of:
deriving a series of state data points from said second data sample; and
producing a comparison of said projection with said state data points, so as to determine the state of said first data sample features.
8. A method as claimed in claim 7, wherein said deriving step further comprises the step of:
deriving said series of state data points with a self-organizing feature map.
9. A method as claimed in claim 8, wherein said producing step further comprises the step of:
producing a data histogram from said first data sample and said state data points, said histogram indicating the number of first data sample points in the vicinity of each of said state data points.
10. A method as claimed in claim 9, wherein said producing step further comprises the steps of:
using said data histogram to derive a neural network input to a neural network, said neural network being trained to recognize neural network inputs derived from histograms of data samples having predetermined series of features and to produce a signal indicative of said predetermined state; and
outputting from said neural network a signal indicative of said state of the data sample of said data histogram.
11. A method of baking food using the determination method as claimed in any one of claims 1 to 5, the baking method further comprising the step of:
controlling an oven used to bake said food in dependence on the state of said object.
12. A method of baking food using the determination method as claimed in any one of claims 6 to 10, the baking method further comprising the step of:
controlling an oven used to bake said food in dependence on the state of said first data sample features.
13. A system for determining the state of an object, comprising:
an imaging device for forming said pixel image of said object; and
processing means, coupled to said imaging device, for carrying out the method as claimed in any one of claims 1 to 10.
14. An assembly for baking food, comprising:
an oven;
an imaging device for imaging the baked food output from said oven;
processing means, coupled to said imaging device, for carrying out the determination method as claimed in any one of claims 1 to 10; and
control means, coupled to said processing means, for receiving an output signal of said processing means indicating the state of said baked food;
wherein said control means is coupled to said oven so as to operate said oven in accordance with said state of said baked food.
CN95196801A 1994-12-13 1995-12-01 Data recognition system Pending CN1170469A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AUPN0023 1994-12-13
AUPN0023A AUPN002394A0 (en) 1994-12-13 1994-12-13 Data recognition system

Publications (1)

Publication Number Publication Date
CN1170469A true CN1170469A (en) 1998-01-14

Family

ID=3784530

Family Applications (1)

Application Number Title Priority Date Filing Date
CN95196801A Pending CN1170469A (en) 1994-12-13 1995-12-01 Data recognition system

Country Status (8)

Country Link
JP (1) JPH10511786A (en)
CN (1) CN1170469A (en)
AU (1) AUPN002394A0 (en)
CA (1) CA2207326A1 (en)
DE (1) DE19581867T1 (en)
GB (1) GB2311369A (en)
NZ (1) NZ296487A (en)
WO (1) WO1996018975A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110111648A (en) * 2019-04-17 2019-08-09 吉林大学珠海学院 A kind of programming training system and method

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002312762A (en) * 2001-04-12 2002-10-25 Seirei Ind Co Ltd Grain sorting apparatus utilizing neural network
CN106778912A (en) * 2017-01-13 2017-05-31 湖南理工学院 A kind of full-automatic apparatus for baking and method for cake

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03506074A (en) * 1988-07-14 1991-12-26 ガリボールディ プロプライエタリー リミテッド Color matching by computer
JPH0786433B2 (en) * 1990-05-22 1995-09-20 支朗 臼井 Color vision information conversion method and device
US5283839A (en) * 1990-12-31 1994-02-01 Neurosciences Research Foundation, Inc. Apparatus capable of figure-ground segregation
GB2256708A (en) * 1991-06-11 1992-12-16 Sumitomo Heavy Industries Object sorter using neural network
SE470465B (en) * 1992-09-07 1994-04-18 Agrovision Ab Method and apparatus for automatic assessment of grain cores and other granular products


Also Published As

Publication number Publication date
JPH10511786A (en) 1998-11-10
CA2207326A1 (en) 1996-06-20
GB2311369A (en) 1997-09-24
DE19581867T1 (en) 1997-12-11
AUPN002394A0 (en) 1995-01-12
GB9710197D0 (en) 1997-07-09
WO1996018975A1 (en) 1996-06-20
NZ296487A (en) 2000-01-28

Similar Documents

Publication Publication Date Title
Wu et al. Colour measurements by computer vision for food quality control–A review
CN106531060B LED display brightness and chromaticity correction method and device
CN104871175B (en) Food quality is scored and is controlled
CN107144353B (en) A kind of textile chromatism measurement method based on digital camera
CN106596073A (en) Method and system for detecting image quality of optical system, and testing target plate
EP3220101B1 (en) Texture evaluation apparatus, texture evaluation method, and computer-readable recording medium
CN109377487A (en) A kind of fruit surface defect detection method based on deep learning segmentation
CN114066857A (en) Infrared image quality evaluation method and device, electronic equipment and readable storage medium
CN115841794B (en) OLED display screen detection method and device, display image processing method and device
CN109459136A (en) A kind of method and apparatus of colour measurement
CN110044485B (en) Image type fabric color measuring method
CN102414722A (en) Display of effect coatings on electronic display devices
CN1170469A (en) Data recognition system
CN110935646A (en) Full-automatic crab grading system based on image recognition
DE102008044764A1 (en) Method and portable system for quality evaluation of meat
CN112488997B (en) Method for detecting and evaluating color reproduction of ancient painting printed matter based on characteristic interpolation
US20010048765A1 (en) Color characterization for inspection of a product having nonuniform color characteristics
CN106846328A (en) A kind of tunnel brightness detection method based on video
CN109540892A (en) Duck variety discriminating method and system
McCann Visibility of gradients and low spatial frequency sinusoids: evidence for a distance constancy mechanism
CN101499167A (en) Graying method based on spectral reflection characteristics and three-color imaging principle
Zhu et al. Color calibration for colorized vision system with digital sensor and LED array illuminator
CN107079072B (en) The intercrossed calibration of imager
Wasnik et al. Digital image analysis: Tool for food quality evaluation
Balaban et al. Quality evaluation of Alaska pollock (Theragra chalcogramma) roe by image analysis. Part II: Color defects and length evaluation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C01 Deemed withdrawal of patent application (patent law 1993)
WD01 Invention patent application deemed withdrawn after publication