CN107273886A - A cloud-computing-based food data analysis method - Google Patents

A cloud-computing-based food data analysis method

Info

Publication number
CN107273886A
CN107273886A (application CN201710547747.2A)
Authority
CN
China
Prior art keywords
image
information
food
unit
cloud computing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710547747.2A
Other languages
Chinese (zh)
Inventor
屈锐
程思
Current Assignee
Hubei Technology Co Ltd
Original Assignee
Hubei Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hubei Technology Co Ltd
Priority to CN201710547747.2A
Publication of CN107273886A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/2135 Feature extraction based on approximation criteria, e.g. principal component analysis
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G06F18/29 Graphical models, e.g. Bayesian networks
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/145 Illumination specially adapted for pattern recognition, e.g. using gratings
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V2201/06 Recognition of objects for industrial automation

Abstract

The invention belongs to the technical field of food verification and discloses a cloud-computing-based food data analysis method. The method includes: capturing information about food production with a camera unit; processing the transmitted information with a processing unit, which identifies the information through a built-in image-recognition judgment module; transmitting the processed information to a monitoring unit through a wireless routing unit; monitoring, via a network cloud server, the information relayed by the routing unit and raising an alarm on violation information; and sharing data between a mobile terminal, the network cloud server and the monitoring unit for multi-faceted regulation. The invention greatly improves supervision, gives a more accurate and transparent view of factory processes, prevents opportunistic fraud and irregular production, deters unscrupulous manufacturers from producing harmful food, and makes sample inspection more convenient.

Description

A cloud-computing-based food data analysis method
Technical field
The invention belongs to the technical field of food verification, and more particularly relates to a cloud-computing-based food data analysis method.
Background technology
At present, with the development of society, food safety has drawn wide attention from the catering industry and consumers. Especially in the "Internet+" era, as the catering trade goes online and brings convenience to daily life, how to guarantee its quality and safety has become a focus of the whole industry. However, traditional inspection relies on photos provided by the merchant; this approach is too limited to reveal the real process of food processing and production, while on-site factory investigation is labour-intensive and time-consuming, and the production of harmful food is still not prevented.
In summary, the problems of the prior art are: existing cloud-computing-based food data analysis methods rely on a single form of inspection, cannot reveal the real food production process, and cannot obtain quality information about the products; obtaining spot-check quality information through on-site factory investigation wastes time and effort.
Summary of the invention
In view of the problems of the prior art, the invention provides a cloud-computing-based food data analysis method.
The invention is achieved as follows. A cloud-computing-based food data analysis method includes:
capturing information about food production with a camera unit;
processing the information transmitted by the camera unit with a processing unit, the processing unit identifying the information through a built-in image-recognition judgment module; specifically:
collecting N samples as a training set X and obtaining the sample mean m by:
m = (1/N) Σ_{i=1}^{N} x_i,
where x_i ∈ the sample training set X = (x_1, x_2, ..., x_N);
obtaining the scatter matrix S:
S = Σ_{i=1}^{N} (x_i - m)(x_i - m)^T;
obtaining the eigenvalues λ_i of the scatter matrix and the corresponding eigenvectors e_i (the principal components); arranging the eigenvalues in descending order λ_1 ≥ λ_2 ≥ ..., and taking the first p values λ_1, λ_2, ..., λ_p to determine the recognition space E = (e_1, e_2, ..., e_p); in this recognition space, the projection of each element of the training set X is obtained by:
x'_i = E^T x_i,  i = 1, 2, ..., N;
what the above formula yields is the p-dimensional vector obtained from the original vector after PCA dimensionality reduction.
The feature extraction is based on sparse representation, and a recognition algorithm performs multi-image recognition. The specific method is:
detecting the current frame image and sorting by coordinates to obtain the recognition result of each image in the current frame; from the recognition results of the current frame, computing the recognition results of each corresponding image over its n adjacent frames; counting the features of each image, and determining the final feature of the target from the features that agree in more than half (n/2) of the frames;
wherein the reconstruction errors {r_1, r_2, ..., r_n} between the picture to be identified and each category of the image library are calculated, with r_1 < r_2 < ... < r_n, and the final recognition result is determined from the obtained similarity values by a ratio-threshold rule with ratio value T1 = 0.65.
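The PCA training, projection and reconstruction-error decision described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: class templates are taken as mean projections, and the ratio test `r1/r2 < T1` is one plausible reading of the threshold rule with T1 = 0.65, whose exact formula is not reproduced in the text.

```python
import numpy as np

def train_pca(X, p):
    """X: (N, d) training samples. Returns the sample mean m and the
    recognition space E (d, p) spanned by the top-p principal components."""
    m = X.mean(axis=0)                       # sample mean m = (1/N) * sum(x_i)
    S = (X - m).T @ (X - m)                  # scatter matrix S
    vals, vecs = np.linalg.eigh(S)           # eigenvalues, ascending order
    E = vecs[:, np.argsort(vals)[::-1][:p]]  # largest-p eigenvectors e_1..e_p
    return m, E

def project(x, m, E):
    """p-dimensional PCA projection x' = E^T x (centred on the mean here)."""
    return E.T @ (x - m)

def recognise(x, m, E, class_means, T1=0.65):
    """Pick the class with the smallest reconstruction error; reject when
    the two best errors are too close (ratio test with threshold T1)."""
    y = project(x, m, E)
    errs = sorted((float(np.linalg.norm(y - c)), k) for k, c in class_means.items())
    (r1, best), (r2, _) = errs[0], errs[1]
    return best if r1 / r2 < T1 else None    # None = uncertain, rejected
```

With well-separated classes the smallest error r1 is far below r2, so the ratio test accepts; for ambiguous inputs the two errors are comparable and the sample is rejected for re-inspection.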
transmitting the processing unit's information to a monitoring unit through a wireless routing unit;
monitoring, via a network cloud server, the information relayed by the routing unit, and raising an alarm on violation information; the monitoring unit monitoring food safety through a built-in monitoring module; specifically:
First, an integrated information evaluation system between the analysis objects and the safety-index factors is established; the evaluation system consists of n analysis objects and m indices, giving the initial information evaluation matrix:
A' = (a'_ij),  i = 1, 2, ..., n;  j = 1, 2, ..., m.
Each index in A' is normalised:
a_ij = (a'_ij - min_i a'_ij) / (max_i a'_ij - min_i a'_ij),  i = 1, 2, ..., n;  j = 1, 2, ..., m,
where min_i a'_ij is the minimum of column j of A', max_i a'_ij is the maximum of column j of A', and a_ij is the element in row i, column j of the normalised information matrix; the normative information matrix can then be expressed as A = (a_ij).
Then, from the normative information matrix, the proportion of the value of index j under analysis object i is determined:
p_ij = a_ij / Σ_{i=1}^{n} a_ij,  i = 1, 2, ..., n;  j = 1, 2, ..., m.
Finally, the entropy of analysis object i is calculated by the entropy method:
T_i = -(1/ln m) Σ_{j=1}^{m} p_ij ln p_ij,
where T_i is defined as the information entropy of analysis object i and p_ij is the proportion of index j under object i, with i = 1, 2, ..., n and j = 1, 2, ..., m.
Similarly, the safety sub-information entropy S_i of analysis object i can be obtained from the proportions q_ij and m_ij of index j under object i, by a formula parallel to that for T_i.
The entropy values are normalised to obtain H_c ∈ [0, 1]. According to the relation between information entropy and risk, the entropy-based risk grading standard is:
0.8 ≤ H_c ≤ 1: extremely low risk;
0.6 ≤ H_c < 0.8: low risk;
0.4 ≤ H_c < 0.6: moderate risk;
0.2 ≤ H_c < 0.4: high risk;
0 ≤ H_c < 0.2: extremely high risk;
sharing data between a mobile terminal, the network cloud server and the monitoring unit for multi-faceted regulation.
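The entropy evaluation and grading steps above can be sketched as follows. This is a minimal sketch under stated assumptions: the entropy of each analysis object is computed over its m column-normalised index proportions and then min-max normalised to H_c, which is one plausible reading of the formulas whose images are not reproduced in the text.

```python
import numpy as np

def risk_grades(A):
    """A: (n, m) initial information evaluation matrix (n objects, m indices).
    Returns the normalised entropy H_c per object and a risk label following
    the five-band standard (0.8-1 extremely low ... 0-0.2 extremely high)."""
    lo, hi = A.min(axis=0), A.max(axis=0)
    Nrm = (A - lo) / np.where(hi > lo, hi - lo, 1.0)        # a_ij in [0, 1]
    col = Nrm.sum(axis=0)
    P = Nrm / np.where(col > 0, col, 1.0)                   # proportions p_ij
    safeP = np.where(P > 0, P, 1.0)                         # avoid log(0)
    T = -(P * np.log(safeP)).sum(axis=1) / np.log(A.shape[1])  # entropy T_i
    span = T.max() - T.min()
    Hc = (T - T.min()) / span if span > 0 else np.zeros_like(T)
    labels = ["extremely high", "high", "moderate", "low", "extremely low"]
    return Hc, [labels[min(int(h * 5), 4)] for h in Hc]
```

Objects whose index proportions are spread evenly get high entropy (low risk), while objects dominated by a few extreme indices get low entropy (high risk).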
Further, the camera unit includes a CCD camera; the method by which the CCD camera acquires food-production information includes: cyclically projecting, frame by frame, a sinusoidal grating with N-step phase shift (N ≥ 3) onto the measured object; assuming the lateral magnification of the system is M and the surface reflectivity of the object is R(x, y), the light intensity distribution on the image plane of the N-step (N ≥ 3) phase-shifted sinusoidal grating can be expressed as
I_n(x, y) = R(x, y) { I_0 + C_0(x, y) cos[2πf x + 2πn/N] },  n = 0, 1, ..., N - 1,   (1)
where I_0 is the background intensity, C_0(x, y) is the fringe contrast on the grating image plane, and f is the grating frequency on the image plane.
According to imaging theory, the defocused image I_d(x, y; δ) in front of the grating image plane is obtained by convolving the focused image I_i(x, y) with the corresponding system blur function, namely the point spread function h(x, y) of the system, i.e.
I_d(x, y; δ) = h(x, y) * I_i(x, y),   (2)
where the symbol * denotes convolution and I_d(x, y; δ) is the light intensity distribution at a distance δ from the imaging plane.
Further, in a real optical system, owing to diffraction, dispersion and lens aberrations, the blur function h(x, y) of the system is represented by a two-dimensional Gaussian function, i.e.
h(x, y) = [1/(2πσ_h²)] exp[-(x² + y²)/(2σ_h²)],   (3)
where σ_h is the diffusion constant, corresponding to the standard deviation of the point spread function; it is proportional to the radius r of the circle of confusion, i.e. σ_h = C·r, where the value of C depends on the optical system parameters and in most practical cases an approximate value can be taken. From equations (2) and (3), the light intensity distribution in front of the projection image plane is
I_d(x, y; δ) = R(x, y) { I_0 + C_0(x, y) exp(-2π²f²σ_h²) cos[2πf x + 2πn/N] },   (4)
and the fringe modulation distribution before and after the grating projection image plane is
M(x, y) = M_0(x, y) exp(-2π²f²σ_h²),   (5)
where M_0(x, y) is the modulation distribution on the projection image plane. Since the diffusion constant σ_h is proportional to the confusion radius r, and r is proportional to the defocus δ, equation (5) can be rewritten as
M(x, y) = M_0(x, y) exp[-2π²f²c²(d - d_i)²],   (6)
where d is the distance from the measured point to the reference plane, d_i is the distance from the grating projection image plane to the reference plane, and c is a constant determined by the system.
Further, the modulation distribution of the fringes is calculated by the Fourier transform method or by an N-step (N ≥ 3) phase-shift algorithm. When the Fourier transform method is used, taking the Fourier transform along the time axis of any one pixel of the captured image set of equation (4) gives
G(f_di) = G_0(f_di) + G_1(f_di) + G_-1(f_di),   (7)
then choosing a suitable spectral window to filter out the fundamental frequency component G_1(f_di) and applying the inverse Fourier transform to it gives the fundamental signal B(d_i).   (8)
From B(d_i), the contrast C(d_i) of the pixel along the time axis can be calculated, and hence the modulation distribution of that pixel along the time axis. Applying the Fourier transform, spatial filtering and inverse Fourier transform to each pixel of the fringe image yields the modulation distribution of the whole fringe pattern.
When the N-step (N ≥ 3) phase-shift method is used, for the fringe image at any position (frame m) in the captured picture sequence, the modulation at that position is calculated using the N - 1 fringe images before and after it, from frame m - m_1 to frame m + m_2, where m_1 = round[(N - 1)/2], m_2 = N - m_1 - 1, and round denotes rounding to the nearest integer; the expression is
M_m(x, y) = (2/N) sqrt{ [Σ_{j=-m_1}^{m_2} I_{m+j}(x, y) sin(2πj/N)]² + [Σ_{j=-m_1}^{m_2} I_{m+j}(x, y) cos(2πj/N)]² },   (9)
where M_m(x, y) is the modulation value at the position of frame m, and Mod denotes the modulo operation used for cyclic frame indexing at the sequence boundaries.
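The N-step modulation computation can be sketched as follows. This is a minimal sketch of the standard N-step formula M = (2/N)·sqrt[(Σ I_k sin(2πk/N))² + (Σ I_k cos(2πk/N))²], assuming a full N-frame phase-shift cycle is available at each position; the sliding-window and modulo frame indexing described in the text are omitted for clarity.

```python
import numpy as np

def modulation_n_step(frames):
    """frames: (N, H, W) stack of N-step phase-shifted fringe images (N >= 3).
    Returns the per-pixel fringe modulation M(x, y)."""
    N = frames.shape[0]
    k = np.arange(N).reshape(-1, 1, 1)                  # phase-step index
    s = (frames * np.sin(2 * np.pi * k / N)).sum(axis=0)
    c = (frames * np.cos(2 * np.pi * k / N)).sum(axis=0)
    return (2.0 / N) * np.sqrt(s ** 2 + c ** 2)
```

For synthetic fringes I_k = A + B·cos(φ + 2πk/N) this recovers the modulation B at every pixel, independent of the background A and the local phase φ.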
Further, the processing unit includes a computer, and the mobile terminal includes a mobile phone.
Further, the cloud-computing-based food data analysis method also diverts violating products through a detection unit.
Further, a wireless transmitter module built into the detection unit connects wirelessly to the monitoring unit through the routing unit.
Advantages and positive effects of the invention: supervision is greatly improved; the factory's processes are understood more accurately and transparently; opportunistic fraud and irregular production are prevented, and unscrupulous manufacturers are deterred from producing harmful food; at the same time, sample inspection becomes more convenient and time-saving.
The recognition method of the invention improves the efficiency and accuracy of recognition and better protects food safety; the image feature-vector extraction method improves the resolution to a certain extent, which benefits image acquisition and recognition.
The invention proposes a probabilistic model that can concisely express the interaction mechanisms in food production, characterise complex nonlinear and dynamic processes, and predict the risk of the research object; with the development and maturity of big data, cloud computing, the Internet of Things and other technologies, the invention will become important for data research in food science. The invention applies the Fourier transform pixel by pixel along the time axis of the captured pictures to extract the modulation distribution, which effectively avoids mutual influence between pixels within the same fringe frame, and also avoids the loss of detail that occurs when each fringe frame is Fourier-transformed individually to obtain the modulation distribution.
Calculating the modulation values with the N-step phase-shift algorithm using N (N ≥ 3) adjacent fringe frames allows grating projection and picture capture to be carried out continuously, shortening the time needed for grating projection and image acquisition and reducing the number of pictures captured, while still guaranteeing measurement precision, so that accurate image data are obtained.
Brief description of the drawings
Fig. 1 is a structural diagram of the cloud-computing-based food data analysis method provided by an embodiment of the invention.
In the figure: 1. monitoring host; 2. mobile terminal; 3. network cloud server; 4. camera; 5. client; 6. router; 7. network cable; 8. wire.
Embodiment
To further explain the contents, features and effects of the invention, the following embodiments are given and described in detail with reference to the accompanying drawings.
The structure of the invention is explained in detail below with reference to the drawings.
As shown in Fig. 1, the cloud-computing-based food data analysis method provided by an embodiment of the invention includes:
S101: capturing information about food production with a camera unit;
S102: processing the information transmitted by the camera unit with a processing unit;
S103: transmitting the processing unit's information to a monitoring unit through a wireless routing unit;
S104: monitoring, via a network cloud server, the information relayed by the routing unit, and raising an alarm on violation information;
S105: sharing data between a mobile terminal, the network cloud server and the monitoring unit for multi-faceted regulation.
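Steps S101-S105 can be sketched as a simple data flow. The class and field names below are illustrative placeholders rather than terminology from the patent, and a bare compliance flag stands in for the image-recognition module.

```python
from dataclasses import dataclass, field

@dataclass
class FoodMonitorPipeline:
    """Illustrative S101-S105 flow: camera -> processing -> router -> cloud monitor."""
    alarms: list = field(default_factory=list)   # S104: violation alarms raised
    shared: list = field(default_factory=list)   # S105: data shared with the mobile terminal

    def run(self, frames):
        for frame in frames:                            # S101: captured production frames
            info = {"frame": frame,
                    "ok": frame.get("compliant", True)}  # S102: processing / recognition
            self._monitor(self._route(info))            # S103 then S104/S105
        return self.alarms

    def _route(self, info):
        return info                                     # S103: wireless routing (pass-through)

    def _monitor(self, info):
        if not info["ok"]:
            self.alarms.append(info)                    # S104: alarm on violation
        self.shared.append(info)                        # S105: share for regulation
```

Each captured frame flows through every stage, so violations reach the monitoring side in the same pass that shares the data.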
The information transmitted by the camera unit is processed by the processing unit; the processing unit identifies the information through a built-in image-recognition judgment module; specifically:
N samples are collected as a training set X, and the sample mean m is obtained by:
m = (1/N) Σ_{i=1}^{N} x_i,
where x_i ∈ the sample training set X = (x_1, x_2, ..., x_N);
the scatter matrix S is obtained:
S = Σ_{i=1}^{N} (x_i - m)(x_i - m)^T;
the eigenvalues λ_i of the scatter matrix and the corresponding eigenvectors e_i (the principal components) are obtained; the eigenvalues are arranged in descending order λ_1 ≥ λ_2 ≥ ..., and the first p values λ_1, λ_2, ..., λ_p are taken to determine the recognition space E = (e_1, e_2, ..., e_p); in this recognition space, the projection of each element of the training set X is obtained by:
x'_i = E^T x_i,  i = 1, 2, ..., N;
what the above formula yields is the p-dimensional vector obtained from the original vector after PCA dimensionality reduction.
The feature extraction is based on sparse representation, and a recognition algorithm performs multi-image recognition. The specific method is:
the current frame image is detected and sorted by coordinates to obtain the recognition result of each image in the current frame; from the recognition results of the current frame, the recognition results of each corresponding image over its n adjacent frames are computed; the features of each image are counted, and the final feature of the target is determined from the features that agree in more than half (n/2) of the frames;
wherein the reconstruction errors {r_1, r_2, ..., r_n} between the picture to be identified and each category of the image library are calculated, with r_1 < r_2 < ... < r_n, and the final recognition result is determined from the obtained similarity values by a ratio-threshold rule with ratio value T1 = 0.65.
The monitoring unit monitors food safety through a built-in monitoring module; specifically:
First, an integrated information evaluation system between the analysis objects and the safety-index factors is established; the evaluation system consists of n analysis objects and m indices, giving the initial information evaluation matrix:
A' = (a'_ij),  i = 1, 2, ..., n;  j = 1, 2, ..., m.
Each index in A' is normalised:
a_ij = (a'_ij - min_i a'_ij) / (max_i a'_ij - min_i a'_ij),  i = 1, 2, ..., n;  j = 1, 2, ..., m,
where min_i a'_ij is the minimum of column j of A', max_i a'_ij is the maximum of column j of A', and a_ij is the element in row i, column j of the normalised information matrix; the normative information matrix can then be expressed as A = (a_ij).
Then, from the normative information matrix, the proportion of the value of index j under analysis object i is determined:
p_ij = a_ij / Σ_{i=1}^{n} a_ij,  i = 1, 2, ..., n;  j = 1, 2, ..., m.
Finally, the entropy of analysis object i is calculated by the entropy method:
T_i = -(1/ln m) Σ_{j=1}^{m} p_ij ln p_ij,
where T_i is defined as the information entropy of analysis object i and p_ij is the proportion of index j under object i, with i = 1, 2, ..., n and j = 1, 2, ..., m.
Similarly, the safety sub-information entropy S_i of analysis object i can be obtained from the proportions q_ij and m_ij of index j under object i, by a formula parallel to that for T_i.
The entropy values are normalised to obtain H_c ∈ [0, 1]. According to the relation between information entropy and risk, the entropy-based risk grading standard is:
0.8 ≤ H_c ≤ 1: extremely low risk;
0.6 ≤ H_c < 0.8: low risk;
0.4 ≤ H_c < 0.6: moderate risk;
0.2 ≤ H_c < 0.4: high risk;
0 ≤ H_c < 0.2: extremely high risk.
Data are shared between a mobile terminal, the network cloud server and the monitoring unit for multi-faceted regulation.
As a preferred scheme of the embodiment of the invention, the camera unit includes a CCD camera; the method by which the CCD camera acquires food-production information includes: cyclically projecting, frame by frame, a sinusoidal grating with N-step phase shift (N ≥ 3) onto the measured object; assuming the lateral magnification of the system is M and the surface reflectivity of the object is R(x, y), the light intensity distribution on the image plane of the N-step (N ≥ 3) phase-shifted sinusoidal grating can be expressed as
I_n(x, y) = R(x, y) { I_0 + C_0(x, y) cos[2πf x + 2πn/N] },  n = 0, 1, ..., N - 1,   (1)
where I_0 is the background intensity, C_0(x, y) is the fringe contrast on the grating image plane, and f is the grating frequency on the image plane.
According to imaging theory, the defocused image I_d(x, y; δ) in front of the grating image plane is obtained by convolving the focused image I_i(x, y) with the corresponding system blur function, namely the point spread function h(x, y) of the system, i.e.
I_d(x, y; δ) = h(x, y) * I_i(x, y),   (2)
where the symbol * denotes convolution and I_d(x, y; δ) is the light intensity distribution at a distance δ from the imaging plane.
Further, in a real optical system, owing to diffraction, dispersion and lens aberrations, the blur function h(x, y) of the system is represented by a two-dimensional Gaussian function, i.e.
h(x, y) = [1/(2πσ_h²)] exp[-(x² + y²)/(2σ_h²)],   (3)
where σ_h is the diffusion constant, corresponding to the standard deviation of the point spread function; it is proportional to the radius r of the circle of confusion, i.e. σ_h = C·r, where the value of C depends on the optical system parameters and in most practical cases an approximate value can be taken. From equations (2) and (3), the light intensity distribution in front of the projection image plane is
I_d(x, y; δ) = R(x, y) { I_0 + C_0(x, y) exp(-2π²f²σ_h²) cos[2πf x + 2πn/N] },   (4)
and the fringe modulation distribution before and after the grating projection image plane is
M(x, y) = M_0(x, y) exp(-2π²f²σ_h²),   (5)
where M_0(x, y) is the modulation distribution on the projection image plane. Since the diffusion constant σ_h is proportional to the confusion radius r, and r is proportional to the defocus δ, equation (5) can be rewritten as
M(x, y) = M_0(x, y) exp[-2π²f²c²(d - d_i)²],   (6)
where d is the distance from the measured point to the reference plane, d_i is the distance from the grating projection image plane to the reference plane, and c is a constant determined by the system.
Further, the modulation distribution of the fringes is calculated by the Fourier transform method or by an N-step (N ≥ 3) phase-shift algorithm. When the Fourier transform method is used, taking the Fourier transform along the time axis of any one pixel of the captured image set of equation (4) gives
G(f_di) = G_0(f_di) + G_1(f_di) + G_-1(f_di),   (7)
then choosing a suitable spectral window to filter out the fundamental frequency component G_1(f_di) and applying the inverse Fourier transform to it gives the fundamental signal B(d_i).   (8)
From B(d_i), the contrast C(d_i) of the pixel along the time axis can be calculated, and hence the modulation distribution of that pixel along the time axis. Applying the Fourier transform, spatial filtering and inverse Fourier transform to each pixel of the fringe image yields the modulation distribution of the whole fringe pattern.
When the N-step (N ≥ 3) phase-shift method is used, for the fringe image at any position (frame m) in the captured picture sequence, the modulation at that position is calculated using the N - 1 fringe images before and after it, from frame m - m_1 to frame m + m_2, where m_1 = round[(N - 1)/2], m_2 = N - m_1 - 1, and round denotes rounding to the nearest integer; the expression is
M_m(x, y) = (2/N) sqrt{ [Σ_{j=-m_1}^{m_2} I_{m+j}(x, y) sin(2πj/N)]² + [Σ_{j=-m_1}^{m_2} I_{m+j}(x, y) cos(2πj/N)]² },   (9)
where M_m(x, y) is the modulation value at the position of frame m, and Mod denotes the modulo operation used for cyclic frame indexing at the sequence boundaries.
As a preferred scheme of the embodiment of the invention, the processing unit includes a computer, and the mobile terminal includes a mobile phone.
As a preferred scheme of the embodiment of the invention, the cloud-computing-based food data analysis method also diverts violating products through a detection unit.
As a preferred scheme of the embodiment of the invention, a wireless transmitter module built into the detection unit connects wirelessly to the monitoring unit through the routing unit.
The invention greatly improves supervision, gives a more accurate and transparent view of factory processes, prevents opportunistic fraud and irregular production, and deters unscrupulous manufacturers from producing harmful food; at the same time, sample inspection becomes more convenient and time-saving.
The recognition method of the invention improves the efficiency and accuracy of recognition and better protects food safety; the image feature-vector extraction method improves the resolution to a certain extent, which benefits image acquisition and recognition.
The invention proposes a probabilistic model that can concisely express the interaction mechanisms in food production, characterise complex nonlinear and dynamic processes, and predict the risk of the research object; with the development and maturity of big data, cloud computing, the Internet of Things and other technologies, the invention will become important for data research in food science.
Operating principle of the invention:
When the factory produces food, the camera unit monitors in real time; the pictures pass from the processing unit through the routing unit and across the network cloud server, so that the monitoring unit can obtain pictures of food production in real time, thereby preventing irregular production.
When a substandard product is found, the detection unit is controlled to divert it, which avoids confusion and enables traceability.
The above are only preferred embodiments of the invention and do not limit the invention in any form; any simple modification, equivalent variation or alteration of the above embodiments in accordance with the technical spirit of the invention falls within the scope of the technical solution of the invention.

Claims (7)

1. a kind of food data analysis method based on cloud computing, it is characterised in that the food data based on cloud computing point Analysis method includes:
The information of food production is shot by image unit;
The information transmitted by processing unit to image unit is handled;The processing unit is sentenced by built-in image recognition Cover half block enters the identification of row information;Specifically include:
N number of sample is collected as training set X, sample mean m is obtained using following formula:
Wherein, xi∈ sample training collection X=(x1, x2..., xN);
Obtain scatter matrix S:
Obtain the eigenvalue λ of scatter matrixiWith corresponding characteristic vector ei, wherein, eiPrincipal component, by characteristic value from greatly to It is small to be arranged in order λ1, λ2...;Take out p value, λ1, λ2..., λpDetermine identification space E=(e1, e2..., eP), recognize herein Spatially, in training sample X, the point that each element projects to the space is obtained by following formula:
x'i=Etxi, t=1,2 ..., N;
What is obtained by above formula is p dimensional vectors by former vector after PCA dimensionality reductions;
Described feature extraction is based on sparse representation, and many image recognitions are carried out using recognizer;
Specific method is:
Current frame image is detected and sorted by coordinate and draws the recognition result of each image of present frame;According to each figure of present frame The recognition result of picture calculates each corresponding image each adjacent n frames recognition result;The feature of each image is counted, by more than half Number n/2 uniform characteristics determine the final feature of target;
Wherein, the reconstruction error { r between calculating picture to be identified and image library are of all categories1, r2……rn, r1<r2<……<rn, will Obtained Similarity value according toRule determine final recognition result;Wherein T1For than Rate value, T1=0.65;
Pass through wireless routing unit transmission processing unit information to monitoring unit;
By network Cloud Server, the information that monitoring unit is transmitted to routing unit is monitored;And violation information is reported Alert prompting;The monitoring unit is monitored by built-in monitoring module to the safety of food;Specifically include:
First, the integrated information appraisement system set up between analysis object and the safety index factor, appraisement system is by n analysis pair As the system that m index is constituted, so as to obtain initial information Evaluations matrix:
Wherein, i=1,2 ..., n;J=1,2 ..., m;
To each index normalized in A':
Normalized index:
Wherein, i=1,2 ..., n;J=1,2 ..., m;
- matrix A ' in jth row minimum value;
- matrix A ' in jth row maximum;
aijCorrespond to the element that the i-th row j is arranged in-normalization information matrix, normative information matrix A is represented by:
Wherein, i=1,2 ..., n;J=1,2 ..., m;
Then, according to normative information matrix, the proportion of the desired value of jth index under i-th of analysis object is determined:
Wherein, i=1,2 ..., n;J=1,2 ..., m;
Finally, the entropy of i-th of analysis object is calculated by entropy assessment
Wherein, Ti- it is defined as the comentropy of i-th of analysis object;
pijThe proportion of jth index under-i-th analysis object;
I=1,2 ..., n;J=1,2 ..., m;
Similarly, safe sub-information entropy can be tried to achieve, i.e.,:
Wherein Si- it is defined as the safe sub-information entropy of i-th of analysis object;
qijThe proportion of jth index under-i-th analysis object;
mijThe proportion of jth index under-i-th analysis object;
I=1,2 ..., n;J=1,2 ..., m;
Information entropy is normalized, formula is normalized:
According to the relation between information entropy and degree of risk, the entropy-based risk classification standard is divided into:
0.8 ≤ Hc ≤ 1: extremely low risk;
0.6 ≤ Hc < 0.8: low risk;
0.4 ≤ Hc < 0.6: moderate risk;
0.2 ≤ Hc < 0.4: high risk;
0 ≤ Hc < 0.2: extremely high risk;
Data sharing is carried out between the mobile terminal, the network cloud server, and the monitoring unit, so that multi-faceted regulation and control can be performed.
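The entropy evaluation and grading steps above can be sketched in Python as follows; the per-object proportions, the normalization by ln m, and the sample matrix are assumptions made for illustration, not part of the claim:

```python
import numpy as np

GRADES = ["extremely high", "high", "moderate", "low", "extremely low"]

def entropy_risk_grades(a_raw):
    """Entropy-based risk grading sketch. a_raw is the initial matrix A'
    of n analysis objects x m safety indices (each column must have spread)."""
    a_raw = np.asarray(a_raw, dtype=float)
    m = a_raw.shape[1]
    # 1) Min-max normalize each index (column) of A' into [0, 1].
    col_min, col_max = a_raw.min(axis=0), a_raw.max(axis=0)
    a = (a_raw - col_min) / (col_max - col_min)
    # 2) Proportion of the j-th index under the i-th object (rows sum to 1).
    p = a / a.sum(axis=1, keepdims=True)
    # 3) Per-object information entropy, normalized into [0, 1] by ln(m),
    #    treating 0 * ln(0) as 0.
    terms = p * np.log(np.where(p > 0, p, 1.0))
    hc = -terms.sum(axis=1) / np.log(m)
    # 4) Map H_c onto the five 0.2-wide risk bands of the claim.
    return [(float(h), GRADES[min(int(h * 5), 4)]) for h in hc]

scores = entropy_risk_grades([[0.9, 0.8, 0.7], [0.1, 0.9, 0.3], [0.5, 0.2, 0.6]])
```

Objects whose normalized index values are evenly spread (high entropy, Hc near 1) land in the low-risk bands, while objects dominated by a single extreme index (low entropy) land in the high-risk bands.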
2. The food data analysis method based on cloud computing as claimed in claim 1, characterized in that the camera unit comprises a CCD camera; the method by which the CCD camera acquires information on food production comprises: cyclically projecting, in succession, N frames (N ≥ 3) of an N-step phase-shifted sinusoidal grating onto the measured object; assuming the lateral magnification of the system is M and the surface reflectivity of the object is R(x, y), the intensity distribution on the image plane of the N-step (N ≥ 3) phase-shifted sinusoidal grating can be expressed as

In(x, y) = R(x, y)·I0·{1 + C0(x, y)·cos[2πfx + 2πn/N]}, n = 0, 1, …, N − 1    (1)

wherein I0 is the background light intensity, C0(x, y) is the fringe contrast on the grating image plane, and f is the grating frequency of the image plane;
According to imaging theory, the defocused image Id(x, y; δ) near the grating image plane is obtained by convolving its focused image Ii(x, y) with the corresponding system blur function, namely the point spread function h(x, y) of the system, i.e.

Id(x, y; δ) = h(x, y) * Ii(x, y)    (2)

where the symbol * denotes convolution and Id(x, y; δ) is the intensity distribution at a distance δ from the imaging plane.
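As a quick sanity check of the fringe model in equation (1), the snippet below generates N-step phase-shifted fringes on a 1-D cut (with unit reflectivity and an assumed contrast C0 = 0.8, both illustrative) and verifies that averaging the N frames cancels the cosine term, leaving the background intensity I0:

```python
import numpy as np

def phase_shifted_fringes(width, n_steps=4, f=0.05, i0=1.0, c0=0.8):
    """I_n(x) = I0 * (1 + C0 * cos(2*pi*f*x + 2*pi*n/N)), a 1-D cut of the
    eq. (1) fringe model with R(x, y) = 1."""
    x = np.arange(width)
    return [i0 * (1 + c0 * np.cos(2 * np.pi * f * x + 2 * np.pi * n / n_steps))
            for n in range(n_steps)]

frames = phase_shifted_fringes(256)
# The N phase shifts are evenly spaced over 2*pi, so the mean over n
# removes the fringe term and recovers the background intensity I0.
background = np.mean(frames, axis=0)
```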
3. The food data analysis method based on cloud computing as claimed in claim 2, characterized in that, in an actual optical system, owing to the diffraction, dispersion, and lens aberrations of the optical system, the blur function of the system is represented by a two-dimensional Gaussian function h(x, y), i.e.

h(x, y) = [1/(2πσh²)]·exp[−(x² + y²)/(2σh²)]    (3)

where σh is the diffusion constant, corresponding to the standard deviation of the point spread function; it is directly proportional to the blur-circle radius r, i.e. σh = Cr, where the value of C depends on the optical system parameters and can be taken as an approximate constant in most practical cases;
Combining equations (2) and (3), the intensity distribution near the projection image plane is

Idn(x, y; δ) = R(x, y)·I0·{1 + C0(x, y)·exp(−2π²f²σh²)·cos[2πfx + 2πn/N]}    (4)
The fringe modulation distribution before and after the grating projection image plane is

M(x, y) = M0(x, y)·exp(−2π²f²σh²)    (5)

where M0(x, y) is the modulation distribution on the projection image plane; since the diffusion constant σh is directly proportional to the blur-circle radius r, and r is directly proportional to the defocus amount δ, equation (5) can be rewritten as

M(x, y) = M0(x, y)·exp[−2π²f²c²(d − di)²]    (6)

where d is the distance from the measured point to the reference plane, di is the distance from the grating projection image plane to the reference plane, and c is a constant determined by the system.
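Equation (6), as reconstructed above, implies the modulation peaks where the measured point coincides with the projection image plane (d = di), which is what makes depth recoverable by scanning. A minimal numeric illustration, with arbitrary illustrative values for f, c, and di:

```python
import numpy as np

def modulation(d, d_i, m0=1.0, f=0.05, c=2.0):
    """Gaussian defocus model of fringe modulation, eq. (6) as reconstructed:
    M = M0 * exp(-2*pi^2*f^2*c^2*(d - d_i)^2). f, c, d_i are illustrative."""
    return m0 * np.exp(-2 * np.pi**2 * f**2 * c**2 * (d - d_i) ** 2)

# Scan candidate depths; the modulation is maximal at d == d_i, i.e. where
# the measured point lies in the grating projection image plane.
d = np.linspace(0, 10, 1001)
m = modulation(d, d_i=4.2)
d_peak = d[np.argmax(m)]
```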
4. The food data analysis method based on cloud computing as claimed in claim 2, characterized in that the fringe modulation distribution is calculated by a Fourier transform method or by an N-step (N ≥ 3) phase-shift algorithm; when the Fourier transform method is used, a Fourier transform along the time axis is applied to any one pixel of the acquired image sequence of the form of equation (4), giving

G(fdi) = G0(fdi) + G1(fdi) + G−1(fdi)    (7)
A suitable spectral window is chosen to filter out the fundamental-frequency component G1(fdi); applying the inverse Fourier transform to it then yields B(di). From B(di), the contrast C(di) of the pixel along the time axis can be calculated, giving the modulation distribution of that pixel along the time axis. Applying the Fourier transform, spatial filtering, and inverse Fourier transform to each pixel of the fringe pattern in turn yields the modulation distribution of the entire fringe pattern;
When the N-step (N ≥ 3) phase-shift method is used, for the fringe pattern at any one position (the m-th frame) in the acquired image sequence, the modulation distribution at that position is calculated from the N − 1 fringe patterns before and after it together with the frame itself, i.e. from frame m − m1 to frame m + m2, where m1 = round[(N − 1)/2], m2 = N − m1 − 1, and round denotes the rounding operation; the expression is as follows:

Mm(x, y) = (2/N)·sqrt{[Σk Ik(x, y)·sin(2π·Mod(k, N)/N)]² + [Σk Ik(x, y)·cos(2π·Mod(k, N)/N)]²}, k = m − m1, …, m + m2

wherein Mm(x, y) denotes the modulation value at the position of the m-th frame and Mod denotes the modulo operation.
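A sketch of the standard N-step modulation computation; this is the textbook evenly spaced phase-shift form, corresponding to the claim's sliding window in the simple case where the window starts at phase step 0:

```python
import numpy as np

def n_step_modulation(frames):
    """Modulation by the evenly spaced N-step phase-shift formula:
    M = (2/N) * sqrt((sum_n I_n sin(2*pi*n/N))^2 + (sum_n I_n cos(2*pi*n/N))^2)."""
    frames = np.asarray(frames, dtype=float)
    n_steps = frames.shape[0]
    deltas = 2 * np.pi * np.arange(n_steps) / n_steps
    s = np.tensordot(np.sin(deltas), frames, axes=1)  # sum_n I_n * sin(delta_n)
    c = np.tensordot(np.cos(deltas), frames, axes=1)  # sum_n I_n * cos(delta_n)
    return (2.0 / n_steps) * np.hypot(s, c)

# Synthetic 4-step fringes of the form of eq. (1): background 1.0, fringe
# amplitude (i.e. the recoverable modulation) 0.8.
x = np.arange(64)
frames = [1.0 + 0.8 * np.cos(2 * np.pi * 0.05 * x + 2 * np.pi * n / 4)
          for n in range(4)]
mod = n_step_modulation(frames)
```

The sine sum cancels the background and isolates the fringe amplitude, so `mod` recovers the modulation 0.8 at every pixel regardless of the local fringe phase.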
5. The food data analysis method based on cloud computing as claimed in claim 1, characterized in that the processing unit comprises a computer and the mobile terminal comprises a mobile phone.
6. The food data analysis method based on cloud computing as claimed in claim 1, characterized in that the food data analysis method based on cloud computing further diverts non-compliant products through a detection unit.
7. The food data analysis method based on cloud computing as claimed in claim 6, characterized in that a wireless transmitter module built into the detection unit is wirelessly connected to the monitoring unit through the routing unit.
CN201710547747.2A 2017-07-06 2017-07-06 A kind of food data analysis method based on cloud computing Pending CN107273886A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710547747.2A CN107273886A (en) 2017-07-06 2017-07-06 A kind of food data analysis method based on cloud computing

Publications (1)

Publication Number Publication Date
CN107273886A true CN107273886A (en) 2017-10-20

Family

ID=60073381

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710547747.2A Pending CN107273886A (en) 2017-07-06 2017-07-06 A kind of food data analysis method based on cloud computing

Country Status (1)

Country Link
CN (1) CN107273886A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103761504A (en) * 2013-12-31 2014-04-30 江苏图云智能科技发展有限公司 Face recognition system
CN104061879A (en) * 2014-06-19 2014-09-24 四川大学 Continuous-scanning structured light three-dimensional surface shape perpendicular measuring method
CN104239862A (en) * 2014-09-11 2014-12-24 中国电子科技集团公司第二十九研究所 Face recognition method
CN104331744A (en) * 2014-10-17 2015-02-04 中国科学院、水利部成都山地灾害与环境研究所 Debris flow risk degree evaluation method
CN205195873U (en) * 2015-12-02 2016-04-27 陕西电子科技职业学院 Food and beverage safety monitoring and system of traceing back based on thing networking and cloud calculate

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIU Bingchang et al.: "Real-time food safety monitoring system based on cloud services", Modern Science & Technology of Telecommunications *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108742737A (en) * 2018-06-12 2018-11-06 南通市第人民医院 A kind of minimally invasive retractor of lumbar vertebrae three-dimensional
CN109631798A (en) * 2018-12-28 2019-04-16 成都信息工程大学 A kind of 3 d shape vertical measurement method based on π phase shifting method
CN110348760A (en) * 2019-07-18 2019-10-18 秒针信息技术有限公司 Catering manufacture evaluation method and device
CN116993165A (en) * 2023-09-25 2023-11-03 乐百氏(广东)饮用水有限公司 Safety evaluation and risk prediction method and system for fruit and vegetable juice of children
CN116993165B (en) * 2023-09-25 2024-01-30 乐百氏(广东)饮用水有限公司 Safety evaluation and risk prediction method and system for fruit and vegetable juice of children

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171020