CN111079806A - Monitoring method and system for kitchen sanitation - Google Patents
- Publication number
- CN111079806A CN111079806A CN201911217505.2A CN201911217505A CN111079806A CN 111079806 A CN111079806 A CN 111079806A CN 201911217505 A CN201911217505 A CN 201911217505A CN 111079806 A CN111079806 A CN 111079806A
- Authority
- CN
- China
- Prior art keywords
- kitchen
- neural network
- deep neural
- score
- module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/217—Validation; Performance evaluation; Active pattern learning techniques
- G06F18/2193—Validation; Performance evaluation; Active pattern learning techniques based on specific statistical tests
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/46—Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- General Engineering & Computer Science (AREA)
- Computing Systems (AREA)
- Software Systems (AREA)
- Molecular Biology (AREA)
- Computational Linguistics (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Mathematical Physics (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Biology (AREA)
- Probability & Statistics with Applications (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
The invention discloses a monitoring method and system for kitchen sanitation, addressing the difficulty of implementing effective kitchen hygiene monitoring. The whole process is objective and fair, leaves no room for human falsification, provides sound supervision of kitchen hygiene, and can drive real improvement.
Description
Technical Field
The present disclosure relates to monitoring systems, and particularly to a method and system for monitoring kitchen hygiene.
Background
China has a rich food culture: the eight major cuisines each have their own distinctive flavors, and technological progress now lets people in any city taste delicacies from across the country and around the world, with restaurants of every style lining the streets and lanes. Yet in this era of information explosion, food-safety and sanitation problems in restaurants surface endlessly; well-known incidents such as those at 'West Bei oat flour village' and 'grandmother' show that restaurant kitchen problems keep recurring, and what is exposed is only the tip of the iceberg. Many of these problems have human causes, but the more fundamental issue is the lack of a simple, complete monitoring system for supervising and managing the kitchen.
Disclosure of Invention
The purpose of the disclosure is to provide a method and a system for monitoring kitchen sanitation, so that the kitchen sanitation can be effectively monitored.
The technical purpose of the present disclosure is achieved by the following technical solutions:
a method of monitoring kitchen hygiene comprising:
collecting standard images of all areas of a kitchen, extracting features and inputting the features into a first deep neural network to train a first recognition model;
collecting images of all areas when the kitchen works, extracting features and inputting the features into a second deep neural network to train a second recognition model;
and acquiring images of all areas of the kitchen during working, extracting features, inputting the features into the first recognition model and the second recognition model, and obtaining a first recognition result and a second recognition result.
Scoring the first recognition result and the second recognition result to obtain a first score X and a second score Y, and computing the overall grade A = (X + Y)/2, where A, X, Y ∈ [0, 100].
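The scoring step above can be sketched as follows. This is a minimal illustration, assuming (as the summary later in the description states) that the overall grade A is the average of the two model scores:

```python
def overall_score(x: float, y: float) -> float:
    """Combine the two recognition scores into the overall grade A = (X + Y) / 2.

    x, y: hygiene scores in [0, 100] produced from the first and second
    recognition results.
    """
    if not (0 <= x <= 100 and 0 <= y <= 100):
        raise ValueError("scores must lie in [0, 100]")
    return (x + y) / 2
```

For example, a kitchen scoring 80 against the standard-image model and 60 against the working-image model receives an overall grade of 70.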
As a specific embodiment, the score A is input to a terminal and the terminal displays A; there is at least one terminal.
A kitchen hygiene monitoring system comprising:
a control module, namely a central processing unit;
the image acquisition module is used for acquiring standard images of various areas of a kitchen and working images;
the characteristic extraction module is used for extracting the characteristics of the collected image;
the deep neural network module is used for training the extracted features to obtain a recognition model;
the identification module is used for identifying the collected image by adopting the identification model to obtain an identification result; and
the scoring module is used for scoring the identification results.
As a specific embodiment, the deep neural network module includes a first deep neural network and a second deep neural network, and the first deep neural network and the second deep neural network are the same or different.
Furthermore, the models obtained by the training of the first deep neural network and the training of the second deep neural network are respectively a first recognition model and a second recognition model, and a first recognition result and a second recognition result are correspondingly obtained.
Further, the scoring module scores the first recognition result as a first score X and the second recognition result as a second score Y, and computes A = (X + Y)/2, where A, X, Y ∈ [0, 100].
As a specific embodiment, the monitoring system further includes a terminal, the terminal includes a display unit, and the score A is input to the terminal and displayed.
Further, the number of the terminals is at least one.
A computer device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the method of the present disclosure.
In conclusion, the beneficial effects of the present disclosure are as follows. The disclosure provides a monitoring method and system for kitchen sanitation. First, standard images of the kitchen and images taken during work are used to train two recognition models with deep neural networks. During monitoring, kitchen images are fed into the two trained models to obtain recognition results, which are then scored by the scoring module; the average of the two scores is the grade of the kitchen's state at that moment. The score is sent to a terminal and displayed, so the user can see the kitchen's hygiene condition at a glance. The whole process is objective and fair, leaves no room for human falsification, provides sound supervision of kitchen hygiene, and can drive real improvement.
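The pipeline summarized above can be sketched as follows. `RecognitionModel` and the toy scoring functions are hypothetical stand-ins, since the disclosure does not fix a concrete network architecture or feature representation:

```python
from dataclasses import dataclass
from typing import Callable, Sequence

Features = Sequence[float]


@dataclass
class RecognitionModel:
    # Hypothetical stand-in for a trained deep-neural-network recognition
    # model: maps extracted image features to a hygiene score in [0, 100].
    score: Callable[[Features], float]


def monitor_frame(features: Features,
                  first: RecognitionModel,
                  second: RecognitionModel) -> float:
    """One monitoring cycle: two recognitions, then the averaged grade A."""
    x = first.score(features)   # first recognition result -> score X
    y = second.score(features)  # second recognition result -> score Y
    return (x + y) / 2          # A = (X + Y) / 2, shown on the terminal


# Toy models standing in for the trained first and second recognition models.
standard_model = RecognitionModel(score=lambda f: 90.0)
working_model = RecognitionModel(score=lambda f: 70.0)
grade = monitor_frame([0.1, 0.2, 0.3], standard_model, working_model)
```

In this toy run the two models grade the same frame at 90 and 70, giving an overall grade of 80.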
Drawings
Fig. 1 is a schematic flow diagram of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the attached drawings. It is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature.
The monitoring system for kitchen sanitation comprises a control module, an image acquisition module, a feature extraction module, a deep neural network module, an identification module and a grading module. The control module is a central processing unit and controls the operation of the whole system; the deep neural network module includes a first deep neural network and a second deep neural network, which may be the same or different.
Fig. 1 is a schematic flow chart of the present disclosure, in which an image acquisition module first acquires a standard image of a kitchen and an image of the kitchen during working, extracts features of the two images, and then respectively sends the two images to a first deep neural network and a second deep neural network for training to obtain a first recognition model and a second recognition model.
After the recognition models are obtained, the image acquisition module acquires images throughout the kitchen's working hours, extracts features, and inputs the features into the first recognition model and the second recognition model simultaneously to obtain a first recognition result and a second recognition result. The two results are then sent to the scoring module to obtain a first score X and a second score Y, and the overall grade is computed as A = (X + Y)/2, where A, X, Y ∈ [0, 100].
At this point, the kitchen hygiene scoring is complete: A is input to the various terminals and shown by each terminal's display module, so the user can clearly see the kitchen's score and objectively judge its sanitary condition. Because the whole process is handled by the monitoring system rather than by people, it is objective and leaves no room for falsification. When a user goes to any restaurant, they can first check the kitchen hygiene score shown on the terminal; a high score indicates good sanitary conditions and lets them dine with confidence.
As one preferred embodiment, the kitchen hygiene monitoring system may further include a speech synthesis module and a storage module. The speech synthesis module provides voice broadcasting: if the monitoring system finds that some item in the kitchen scores below passing, the speech synthesis module promptly issues an audible reminder, strengthening supervision.
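The alert behavior described above might look like the following sketch. The passing threshold and the message wording are assumptions, as the disclosure does not specify them, and `speak` stands in for the speech synthesis module's output:

```python
PASSING_SCORE = 60.0  # assumed passing threshold; the disclosure does not fix one


def maybe_alert(item: str, score: float, speak) -> bool:
    """Ask the speech synthesis module to issue a reminder for failing items.

    `speak` is a callable standing in for the speech synthesis module.
    Returns True when a reminder was issued.
    """
    if score < PASSING_SCORE:
        speak(f"Hygiene alert: {item} scored {score:.0f}, below passing.")
        return True
    return False


announcements: list[str] = []
maybe_alert("cutting board", 45.0, announcements.append)  # below passing: alerts
maybe_alert("stove top", 88.0, announcements.append)      # passing: silent
```

Only the failing item triggers an announcement, so the reminder draws attention without constant noise.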
On the other hand, a storage module is provided in the monitoring system and connected to the image acquisition module, so all collected images can be saved in it for record keeping; the storage module can be cleared on a set schedule to reduce the memory burden.
The equipment containing the kitchen monitoring system is fitted with a display module, which is also connected to the control module, so scores can be shown on it one by one. A merchant who places the display in front of customers not only accepts supervision but also demonstrates confidence and competence, letting customers dine with greater peace of mind.
In keeping with the above disclosure of exemplary embodiments, it is intended that the scope of the disclosure be defined by the claims and their equivalents.
Claims (9)
1. A method of monitoring kitchen hygiene comprising:
collecting standard images of all areas of a kitchen, extracting features and inputting the features into a first deep neural network to train a first recognition model;
collecting images of all areas when the kitchen works, extracting features and inputting the features into a second deep neural network to train a second recognition model;
collecting images of all areas when the kitchen works, extracting features, inputting the features into the first recognition model and the second recognition model, and obtaining a first recognition result and a second recognition result; and
scoring the first recognition result and the second recognition result to obtain a first score X and a second score Y, and computing A = (X + Y)/2, where A, X, Y ∈ [0, 100].
2. A method of kitchen hygiene monitoring as claimed in claim 1, characterized in that said A is input into a terminal, said terminal displays said A, and there is at least one said terminal.
3. A kitchen hygiene monitoring system, comprising:
a control module, namely a central processing unit;
the image acquisition module is used for acquiring standard images of various areas of a kitchen and working images;
the characteristic extraction module is used for extracting the characteristics of the collected image;
the deep neural network module is used for training the extracted features to obtain a recognition model;
the identification module is used for identifying the collected image by adopting the identification model to obtain an identification result; and
a scoring module used for scoring the identification results.
4. The monitoring system for kitchen hygiene of claim 3, wherein the deep neural network module includes a first deep neural network and a second deep neural network, the first deep neural network and the second deep neural network being the same or different.
5. The system for monitoring kitchen hygiene of claim 4, wherein the models obtained by the training of the first deep neural network and the training of the second deep neural network are a first recognition model and a second recognition model, respectively, and the first recognition result and the second recognition result correspond to each other.
6. The system for monitoring kitchen hygiene of claim 5, wherein said scoring module scores said first recognition result with a first score X and said second recognition result with a second score Y, and computes A = (X + Y)/2, wherein A, X, Y ∈ [0, 100].
7. The kitchen hygiene monitoring system of any one of claims 3 to 6, characterized in that the monitoring system further comprises a terminal, the terminal comprises a display unit, and the A is input into the terminal and displayed.
8. The kitchen hygiene monitoring system of claim 7, characterized in that at least one of said terminals is provided.
9. A computer device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the method of any one of claims 1 to 2.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911217505.2A CN111079806A (en) | 2019-12-03 | 2019-12-03 | Monitoring method and system for kitchen sanitation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111079806A true CN111079806A (en) | 2020-04-28 |
Family
ID=70312504
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911217505.2A Pending CN111079806A (en) | 2019-12-03 | 2019-12-03 | Monitoring method and system for kitchen sanitation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111079806A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101426150A (en) * | 2008-12-08 | 2009-05-06 | 青岛海信电子产业控股股份有限公司 | Video image quality evaluation method and system |
CN110163788A (en) * | 2019-05-15 | 2019-08-23 | 东喜和仪(珠海市)数据科技有限公司 | Kitchen monitoring method based on artificial intelligence |
CN110473130A (en) * | 2019-07-30 | 2019-11-19 | 五邑大学 | A kind of garbage classification evaluation method, device and storage medium based on deep learning |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20220113243A (en) * | 2021-02-05 | 2022-08-12 | 이승민 | Accredited system restaurant store information system |
KR102515418B1 (en) * | 2021-02-05 | 2023-03-30 | 이승민 | Accredited system restaurant store information system |
CN113038082A (en) * | 2021-03-24 | 2021-06-25 | 安徽超视野智能科技有限公司 | Kitchen environment monitoring equipment and method based on image recognition |
CN113177519A (en) * | 2021-05-25 | 2021-07-27 | 福建帝视信息科技有限公司 | Density estimation-based method for evaluating messy differences of kitchen utensils |
CN113177519B (en) * | 2021-05-25 | 2021-12-14 | 福建帝视信息科技有限公司 | Density estimation-based method for evaluating messy differences of kitchen utensils |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111079806A (en) | Monitoring method and system for kitchen sanitation | |
Damen et al. | Scaling egocentric vision: The epic-kitchens dataset | |
Butterworth et al. | Performance profiling in sports coaching: a review | |
Prättälä et al. | Consistency and variation in unhealthy behaviour among Finnish men, 1982–1990 | |
Guàrdia et al. | Sensory characterization of dry-cured ham using free-choice profiling | |
WO2018209834A1 (en) | Questionnaire survey system and method used for health management | |
Back et al. | Social networks and psychological conditions in diet preferences: Gourmets and vegetarians | |
CN109409635B (en) | Agricultural product sensory data processing and acquiring method, device and system | |
MARCHISANO et al. | Consumers report preferences when they should not: A cross‐cultural study | |
CN110310142A (en) | Elevator card put-on method and device based on crowd's value analysis | |
Jacobusse et al. | An interval scale for development of children aged 0–2 years | |
Warrington et al. | A circumscribed refractory access disorder: A verbal semantic impairment sparing visual semantics | |
CN107607419A (en) | A kind of discrimination method of donkey-hide gelatin jujube texture quality | |
Haiek et al. | Understanding breastfeeding behavior: Rates and shifts in patterns in Quebec | |
WO2018209833A1 (en) | Questionnaire survey system and method having incentive model | |
CN116758607A (en) | Person identification method based on eye and facial features | |
Lipowska et al. | Children's Awareness of Healthy Behaviours-validity of Beauty & Health and Dietary Knowledge & Habits Scales | |
CN114679455B (en) | Distributed cloud service system | |
Bárcenas et al. | An international ring trial for the sensory evaluation of raw ewes’ milk cheese texture | |
CN109961220A (en) | Cherry organoleptic quality data processing system and processing method | |
Walsh et al. | Feeling the pulse | |
US11797630B2 (en) | Method for providing information, method for controlling communication terminal, communication terminal, and non-transitory computer-readable recording medium storing program | |
CN104706373A (en) | Heart vital index calculating method based on heart sounds | |
Olsen | Moderate alcohol consumption in pregnancy and subsequent left-handedness. A follow-up study | |
Matlabi et al. | A study on the fish consumption according to health education models constructs in 2012 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
Effective date of registration: 20211028 Address after: 223809 Room 201, building B19, insurance Town, Hubin new area, Suqian City, Jiangsu Province Applicant after: Suqian silicon based Intelligent Technology Co.,Ltd. Address before: No.66-1, software Avenue, Yuhuatai District, Nanjing City, Jiangsu Province, 210012 Applicant before: NANJING SILICON INTELLIGENCE TECHNOLOGY Co.,Ltd. |