CN110909689A - Kitchen monitoring method and system

Kitchen monitoring method and system

Info

Publication number
CN110909689A
Authority
CN
China
Prior art keywords
kitchen
classification model
judging whether
background template
judging
Prior art date
Legal status
Pending
Application number
CN201911174584.3A
Other languages
Chinese (zh)
Inventor
李世林
房爱印
陈萌
刘宝祥
刘泽昊
Current Assignee
Inspur Software Co Ltd
Original Assignee
Inspur Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Inspur Software Co Ltd
Priority to CN201911174584.3A
Publication of CN110909689A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Biology (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a kitchen monitoring method and a kitchen monitoring system, belongs to the field of monitoring analysis, and aims to solve the technical problem of how to monitor a kitchen based on intelligent monitoring technology. The method monitors the kitchen by combining a target detection algorithm with a trained classification model, and comprises the following steps: extracting an initial background template from a kitchen monitoring video through a target detection algorithm; performing parameter optimization on the constructed classification model by taking collected training samples as input to obtain a trained classification model, wherein the classification model is constructed based on a convolutional neural network; extracting a foreground target from the kitchen monitoring video through background subtraction based on the initial background template; and discriminating the foreground target through the trained classification model and outputting a discrimination result. The system comprises an acquisition module, a background template extraction module, a model training module, a foreground target extraction module and a discrimination module.

Description

Kitchen monitoring method and system
Technical Field
The invention relates to the field of monitoring analysis, in particular to a kitchen monitoring method and a kitchen monitoring system.
Background
Customers care a great deal about the sanitary conditions and quality of food preparation. Customers who dine in a restaurant can observe the environment and the staff for themselves, but customers who order online can only judge from the display pictures provided by the store, which gives only a one-sided impression; real-time monitoring of the kitchen is therefore particularly important.
With the development of science and technology and the maturing of computer software and hardware, more and more artificial intelligence technologies have been successfully applied in production and daily life with good results, greatly liberating productivity. Intelligent monitoring technology uses computer vision analysis to separate the background from the targets in a camera scene and to analyze and track those targets.
Given these advantages, how to monitor a kitchen based on intelligent monitoring technology is a technical problem to be solved.
Disclosure of Invention
In view of the above, the invention provides a kitchen monitoring method and system to solve the problem of how to monitor a kitchen based on intelligent monitoring technology.
In a first aspect, the present invention provides a kitchen monitoring method, which monitors a kitchen by combining a trained classification model with a target detection algorithm, and comprises the following steps:
extracting an initial background template from a kitchen monitoring video through a target detection algorithm;
performing parameter optimization on the constructed classification model by taking the collected training sample as input to obtain a trained classification model, wherein the classification model is constructed based on a convolutional neural network;
extracting a foreground target from the kitchen monitoring video through background subtraction based on the initial background template;
and discriminating the foreground target through the trained classification model and outputting a discrimination result.
Preferably, the method further comprises:
and analyzing the judgment result and generating alarm information for giving a real-time alarm on any foreground target that does not comply with regulations.
Preferably, the initial background template is a binary background template.
Preferably, the method for extracting the initial background template from the kitchen monitoring video through the target detection algorithm comprises the following steps:
acquiring a kitchen monitoring video through a camera;
creating a binary background template by a binary background template modeling method;
and continuously updating the binary background template by an interframe difference method until the binary background template is unchanged to obtain an initial background template.
Preferably, the foreground objects include staff attire, kitchen waste and live animals;
the classification models comprise a staff classification model, a kitchen waste classification model and a live animal classification model;
the staff classification model is used for judging whether staff attire is in compliance, including judging whether the clothing, mask and hat of a staff member are compliant;
the kitchen waste classification model is used for judging whether waste exists in the kitchen, including judging whether waste is present on the kitchen floor or worktops;
and the live animal classification model is used for judging whether live animals are present in the kitchen.
Preferably, the method for distinguishing the foreground target by the trained classification model comprises the following steps:
judging the foreground target through the trained staff classification model, generating a corresponding judgment result, and determining whether staff attire is in compliance;
judging the foreground target through the trained kitchen waste classification model, generating a corresponding judgment result, and determining whether waste exists in the kitchen;
and judging the foreground target through the trained live animal classification model, generating a corresponding judgment result, and determining whether a live animal is present in the kitchen.
Preferably, the foreground target is judged through the trained staff classification model and a corresponding judgment result is generated, whether staff attire is in compliance is determined, and if the judgment result shows non-compliant attire, corresponding alarm information is generated;
the foreground target is judged through the trained kitchen waste classification model and a corresponding judgment result is generated, whether waste exists in the kitchen is determined, and if the judgment result shows waste in the kitchen, corresponding alarm information is generated;
and the foreground target is judged through the trained live animal classification model and a corresponding judgment result is generated, whether a live animal is present in the kitchen is determined, and if the judgment result shows a live animal in the kitchen, corresponding alarm information is generated.
In a second aspect, the present invention provides a kitchen monitoring system comprising:
the acquisition module is used for acquiring a kitchen monitoring video;
the background template extraction module is used for extracting an initial background template from the kitchen monitoring video through a target detection algorithm;
the model training module is used for acquiring training samples, constructing a classification model based on a convolutional neural network, performing parameter optimization on the constructed classification model by taking the acquired training samples as input, and outputting the trained classification model;
the foreground target extraction module is used for extracting a foreground target from the kitchen monitoring video through background subtraction based on an initial background template;
and the judging module is used for judging the foreground target through the trained classification model and outputting a judging result.
Preferably, the system further comprises an analysis module, wherein the analysis module is used for analyzing the judgment result and generating alarm information, and a real-time alarm is given through the alarm information for any foreground target that does not comply with regulations.
Preferably, the foreground objects include staff attire, kitchen waste and live animals;
the classification models comprise a staff classification model, a kitchen waste classification model and a live animal classification model;
the staff classification model is used for judging whether staff attire is in compliance, including judging whether the clothing, mask and hat of a staff member are compliant;
the kitchen waste classification model is used for judging whether waste exists in the kitchen, including judging whether waste is present on the kitchen floor or worktops;
and the live animal classification model is used for judging whether live animals are present in the kitchen.
The kitchen monitoring method and the kitchen monitoring system have the following advantages:
1. A background template is extracted based on a target detection algorithm, a classification model is trained based on deep learning, foreground targets in the kitchen monitoring video are extracted based on the background template, and the foreground targets are then discriminated through the trained classification model, so that conditions inside the kitchen can be monitored and recognized automatically in real time, reducing manpower and leaving no monitoring blind spots;
2. The classification models cover staff attire, kitchen waste and live animals, giving a wide monitoring range and safeguarding kitchen hygiene and safety.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
The invention is further described below with reference to the accompanying drawings.
Fig. 1 is a flow chart of a kitchen monitoring method according to embodiment 1.
Detailed Description
The present invention is further described in the following with reference to the drawings and the specific embodiments so that those skilled in the art can better understand the present invention and can implement the present invention, but the embodiments are not to be construed as limiting the present invention, and the embodiments and the technical features of the embodiments can be combined with each other without conflict.
The embodiment of the invention provides a kitchen monitoring method and system, which are used for solving the technical problem of how to monitor a kitchen based on an intelligent monitoring technology.
Example 1:
As shown in Fig. 1, the kitchen monitoring method of the present invention monitors a kitchen by combining a trained classification model with a target detection algorithm, and includes the following steps:
s100, extracting an initial background template from a kitchen monitoring video through a target detection algorithm;
s200, performing parameter optimization on the constructed classification model by taking the collected training samples as input to obtain a trained classification model, wherein the classification model is constructed based on a convolutional neural network;
s300, extracting a foreground target from the kitchen monitoring video through background subtraction based on the initial background template;
and S400, distinguishing the foreground target through the trained classification model, and outputting a distinguishing result.
In this embodiment, the initial background template is a binarized background template. In step S100, an initial background template is extracted from the kitchen surveillance video by using a target detection algorithm, including the following steps:
s110, acquiring a kitchen monitoring video through a camera;
s120, creating a binary background template by a binary background template modeling method;
s130, continuously updating the binary background template by an interframe difference method until the binary background template is unchanged to obtain an initial background template.
Since the monitoring video shot by a camera has a fixed, limited viewing angle and the staff, waste and live animals in the kitchen keep moving, the binary background template is continuously updated by the inter-frame difference method after it has been created by the binary background template modeling method, so that a stable background template is obtained through long-term comparative analysis of each frame of the monitoring video.
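Purely as an illustration of steps S110-S130, the following sketch builds a binary background template by inter-frame differencing in Python with OpenCV. The motion threshold, the convergence criterion (the template staying identical for a number of consecutive frames), and all function and parameter names are assumptions of this sketch, not the patent's own implementation.

```python
import cv2
import numpy as np

def extract_initial_background(video_path, diff_thresh=25, stable_frames=30):
    """Build a binary background template by inter-frame differencing.

    The template keeps being updated until it stays unchanged for
    `stable_frames` consecutive frames, a simple stand-in for the
    "until unchanged" condition; threshold and counts are assumptions.
    """
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise RuntimeError("cannot read kitchen monitoring video")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    template = np.zeros_like(prev_gray)          # 1 = background pixel, 0 = moving pixel
    unchanged = 0
    while unchanged < stable_frames:
        ok, frame = cap.read()
        if not ok:
            break                                # video exhausted before convergence
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)      # inter-frame difference
        _, motion = cv2.threshold(diff, diff_thresh, 1, cv2.THRESH_BINARY)
        updated = (1 - motion).astype(np.uint8)  # static pixels are treated as background
        if np.array_equal(updated, template):
            unchanged += 1                       # template no longer changing
        else:
            unchanged = 0
            template = updated
        prev_gray = gray
    cap.release()
    return template                              # binary background template
```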
Kitchen monitoring is mainly concerned with staff attire and with identifying waste and live animals in the kitchen, so the foreground targets extracted in step S300 of this embodiment include staff attire, kitchen waste and live animals; correspondingly, the classification models constructed in step S200 include a staff classification model, a kitchen waste classification model and a live animal classification model. The staff classification model is used for judging whether staff attire is in compliance, including judging whether the clothing, mask and hat of a staff member are compliant; the kitchen waste classification model is used for judging whether waste exists in the kitchen, including judging whether waste is present on the kitchen floor or worktops; and the live animal classification model is used for judging whether live animals are present in the kitchen.
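As one possible realization of the convolutional-neural-network classification models described above, the sketch below defines a small binary classifier in PyTorch together with a simple parameter-optimization loop; the same class could be instantiated three times (staff attire, kitchen waste, live animals), each trained on its own collected samples. The layer sizes, optimizer, and data-loader format are illustrative assumptions.

```python
import torch
import torch.nn as nn

class KitchenClassifier(nn.Module):
    """Small binary CNN; one instance per task (attire / waste / animal)."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):                          # x: (N, 3, H, W) foreground crops
        h = self.features(x).flatten(1)
        return self.classifier(h)

def train_classifier(model, loader, epochs=10, lr=1e-3):
    """Parameter optimization on collected training samples (assumed DataLoader)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in loader:              # labels: 0 = compliant, 1 = violation
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()
    return model
```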
The method for distinguishing the foreground target through the trained classification model comprises the following steps:
s410, judging the foreground target through the trained worker classification model, generating a corresponding judgment result, and judging whether the wearing of workers is in compliance;
s420, judging the foreground target through a training kitchen garbage classification model, generating a corresponding judgment result, and judging whether garbage exists in a kitchen;
and S430, judging the foreground target through the trained living animal classification model, generating a corresponding judgment result, and judging whether a living animal exists in the kitchen.
The kitchen monitoring method thus identifies staff attire, kitchen waste and live animals in the kitchen: it can monitor in real time whether the clothing, masks and hats of staff are compliant, judge in real time whether waste is present on the kitchen floor or worktops, and monitor in real time whether live animals are present in the kitchen.
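A minimal sketch of how steps S300 and S410-S430 might fit together is given below: foreground regions are obtained by background subtraction against the initial template and each cropped region is passed to the three trained classifiers (for example, instances of the KitchenClassifier sketch above). The grayscale reference frame, OpenCV 4 findContours signature, crop size, area filter, and result keys are all assumptions of this sketch.

```python
import cv2
import numpy as np
import torch

def extract_foreground_boxes(frame, reference_bg, template, thresh=30, min_area=400):
    """Background subtraction (S300): difference the current frame against a
    grayscale reference background, keep changes in regions the binary
    template marks as background, and return bounding boxes of the blobs."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, reference_bg)
    _, fg = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    fg = (fg * template).astype(np.uint8)          # suppress habitually moving pixels
    contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

def judge_foreground(frame, boxes, attire_model, waste_model, animal_model):
    """Apply the three trained classifiers to each foreground crop (S410-S430)."""
    results = []
    for (x, y, w, h) in boxes:
        crop = cv2.resize(frame[y:y + h, x:x + w], (64, 64))
        tensor = torch.from_numpy(crop).permute(2, 0, 1).float().unsqueeze(0) / 255.0
        with torch.no_grad():
            results.append({
                "box": (x, y, w, h),
                "attire_violation": attire_model(tensor).argmax(1).item() == 1,
                "waste_present": waste_model(tensor).argmax(1).item() == 1,
                "animal_present": animal_model(tensor).argmax(1).item() == 1,
            })
    return results
```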
As an improvement of this embodiment, the method further includes step S500: analyzing the judgment result and generating alarm information for giving a real-time alarm on any foreground target that does not comply with regulations.
In the improved embodiment, the foreground target is judged through the trained staff classification model and a corresponding judgment result is generated, whether staff attire is in compliance is determined, and if the judgment result shows non-compliant attire, corresponding alarm information is generated; the foreground target is judged through the trained kitchen waste classification model and a corresponding judgment result is generated, whether waste exists in the kitchen is determined, and if the judgment result shows waste in the kitchen, corresponding alarm information is generated; and the foreground target is judged through the trained live animal classification model and a corresponding judgment result is generated, whether a live animal is present in the kitchen is determined, and if the judgment result shows a live animal in the kitchen, corresponding alarm information is generated.
The alarm information allows the responsible persons to be notified promptly so that conditions in the kitchen can be corrected in time, and also allows consumers to be kept informed of the kitchen's sanitary conditions, helping them make an informed choice.
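Purely as an illustration of this alarm step (S500), the sketch below turns the per-region judgment dictionaries from the previous sketch into alarm records; the message texts, timestamp format and notify callback are assumptions.

```python
import time

def generate_alarms(judgements, notify=print):
    """Turn non-compliant judgment results into real-time alarm records (S500)."""
    messages = {
        "attire_violation": "staff attire (clothing/mask/hat) not in compliance",
        "waste_present": "waste detected on the kitchen floor or worktop",
        "animal_present": "live animal detected in the kitchen",
    }
    alarms = []
    for result in judgements:
        for key, text in messages.items():
            if result.get(key):
                alarm = {
                    "time": time.strftime("%Y-%m-%d %H:%M:%S"),
                    "region": result["box"],
                    "alert": text,
                }
                notify(alarm)          # e.g. push to the responsible person in real time
                alarms.append(alarm)
    return alarms
```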
Example 2:
the invention discloses a kitchen monitoring system which comprises an acquisition module, a background template extraction module, a model training module, a foreground object extraction module and a discrimination module. The acquisition module is used for acquiring a kitchen monitoring video; the background template extraction module is used for extracting an initial background template from the kitchen monitoring video through a target detection algorithm; the model training module is used for acquiring training samples, constructing a classification model based on the convolutional neural network, performing parameter optimization on the constructed classification model by taking the acquired training samples as input, and outputting the trained classification model; the foreground target extraction module is used for extracting a foreground target from the kitchen monitoring video through background subtraction based on the initial background template; and the discrimination module is used for discriminating the foreground target through the trained classification model and outputting a discrimination result.
The acquisition module comprises at least one high-definition camera, and real-time kitchen monitoring videos are obtained through the high-definition camera.
The background template extraction module constructs an initial background template by the following method: acquiring a kitchen monitoring video from an acquisition module, and creating a binary background template by a binary background template modeling method; and continuously updating the binary background template by an interframe difference method until the binary background template is unchanged to obtain an initial background template.
Kitchen monitoring is mainly concerned with staff attire and with identifying waste and live animals in the kitchen, so in this embodiment the classification models constructed by the model training module include a staff classification model, a kitchen waste classification model and a live animal classification model. The staff classification model is used for judging whether staff attire is in compliance, including judging whether the clothing, mask and hat of a staff member are compliant; the kitchen waste classification model is used for judging whether waste exists in the kitchen, including judging whether waste is present on the kitchen floor or worktops; and the live animal classification model is used for judging whether live animals are present in the kitchen. Correspondingly, the foreground targets extracted by the foreground target extraction module include staff attire, kitchen waste and live animals.
Correspondingly, the main functions of the discrimination module are as follows: judging the foreground target through the trained staff classification model, generating a corresponding judgment result, and determining whether staff attire is in compliance; judging the foreground target through the trained kitchen waste classification model, generating a corresponding judgment result, and determining whether waste exists in the kitchen; and judging the foreground target through the trained live animal classification model, generating a corresponding judgment result, and determining whether a live animal is present in the kitchen.
The kitchen monitoring system can realize the kitchen monitoring method disclosed in embodiment 1, and specifically comprises the following steps:
(1) extracting an initial background template from the kitchen monitoring video through a target detection algorithm based on a background template extraction module;
(2) in the model training module, performing parameter optimization on a classification model constructed based on a convolutional neural network by taking the collected training samples as input to obtain a trained classification model;
(3) in the foreground target extraction module, extracting a foreground target from the kitchen monitoring video through background subtraction based on the initial background template;
(4) in the discrimination module, judging the foreground target through the trained classification model and outputting a judgment result.
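Putting these modules together, a minimal end-to-end loop might look like the sketch below; it reuses the illustrative helpers defined earlier in this description (extract_initial_background, extract_foreground_boxes, judge_foreground, generate_alarms), all of which are assumptions of the sketches rather than the patent's own implementation.

```python
import cv2

def monitor_kitchen(video_path, attire_model, waste_model, animal_model):
    """End-to-end loop combining the module sketches above (all names are illustrative)."""
    template = extract_initial_background(video_path)        # background template module
    cap = cv2.VideoCapture(video_path)
    ok, first = cap.read()
    if not ok:
        raise RuntimeError("cannot read kitchen monitoring video")
    reference_bg = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)   # reference frame for subtraction
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        boxes = extract_foreground_boxes(frame, reference_bg, template)      # foreground module
        results = judge_foreground(frame, boxes,
                                   attire_model, waste_model, animal_model)  # discrimination module
        generate_alarms(results)                                             # analysis module
    cap.release()
```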
As an improvement of this embodiment, the system further includes an analysis module, where the analysis module is configured to analyze the determination result and generate warning information, and perform real-time warning on the foreground object that does not meet the specification through the warning information.
Specifically, when the analysis module analyzes the judgment result, corresponding alarm information is generated if the judgment result shows non-compliant staff attire, if it shows waste in the kitchen, or if it shows a live animal in the kitchen.
The improved kitchen monitoring system of the embodiment can realize the improved kitchen monitoring method disclosed in embodiment 1, and specifically includes:
(1) extracting an initial background template from the kitchen monitoring video through a target detection algorithm based on a background template extraction module;
(2) in the model training module, performing parameter optimization on a classification model constructed based on a convolutional neural network by taking the collected training samples as input to obtain a trained classification model;
(3) in the foreground target extraction module, extracting a foreground target from the kitchen monitoring video through background subtraction based on the initial background template;
(4) in a judging module, judging the foreground target through the trained classification model, and outputting a judging result;
(5) in the analysis module, analyzing the judgment result and generating corresponding alarm information if the judgment result shows non-compliant staff attire, waste in the kitchen, or a live animal in the kitchen.
The above embodiments are merely preferred embodiments given to fully illustrate the present invention, and the scope of the present invention is not limited to them. Equivalent substitutions or modifications made by those skilled in the art on the basis of the invention all fall within its protection scope. The protection scope of the invention is defined by the claims.

Claims (10)

1. A kitchen monitoring method, characterized in that a kitchen is monitored by combining a target detection algorithm with a trained classification model, the method comprising the following steps:
extracting an initial background template from a kitchen monitoring video through a target detection algorithm;
performing parameter optimization on the constructed classification model by taking the collected training sample as input to obtain a trained classification model, wherein the classification model is constructed based on a convolutional neural network;
extracting a foreground target from the kitchen monitoring video through background subtraction based on the initial background template;
and discriminating the foreground target through the trained classification model and outputting a discrimination result.
2. A kitchen monitoring method according to claim 1, characterized in that the method further comprises:
and analyzing the judgment result and generating alarm information for giving a real-time alarm on any foreground target that does not comply with regulations.
3. The kitchen monitoring method according to claim 1 or 2, wherein said initial background template is a binary background template.
4. The kitchen monitoring method according to claim 3, wherein the extracting of the initial background template from the kitchen monitoring video by the object detection algorithm comprises the following steps:
acquiring a kitchen monitoring video through a camera;
creating a binary background template by a binary background template modeling method;
and continuously updating the binary background template by an interframe difference method until the binary background template is unchanged to obtain an initial background template.
5. A kitchen monitoring method according to claim 1 or 2, characterized in that said foreground objects comprise staff attire, kitchen waste and live animals;
the classification models comprise a staff classification model, a kitchen waste classification model and a live animal classification model;
the staff classification model is used for judging whether staff attire is in compliance, including judging whether the clothing, mask and hat of a staff member are compliant;
the kitchen waste classification model is used for judging whether waste exists in the kitchen, including judging whether waste is present on the kitchen floor or worktops;
and the live animal classification model is used for judging whether live animals are present in the kitchen.
6. The kitchen monitoring method according to claim 5, wherein the identification of the foreground object by the trained classification model comprises the following steps:
judging the foreground target through the trained staff classification model, generating a corresponding judgment result, and determining whether staff attire is in compliance;
judging the foreground target through the trained kitchen waste classification model, generating a corresponding judgment result, and determining whether waste exists in the kitchen;
and judging the foreground target through the trained live animal classification model, generating a corresponding judgment result, and determining whether a live animal is present in the kitchen.
7. The kitchen monitoring method according to claim 6, characterized in that the foreground target is judged through the trained staff classification model and a corresponding judgment result is generated, whether staff attire is in compliance is determined, and if the judgment result shows non-compliant attire, corresponding alarm information is generated;
the foreground target is judged through the trained kitchen waste classification model and a corresponding judgment result is generated, whether waste exists in the kitchen is determined, and if the judgment result shows waste in the kitchen, corresponding alarm information is generated;
and the foreground target is judged through the trained live animal classification model and a corresponding judgment result is generated, whether a live animal is present in the kitchen is determined, and if the judgment result shows a live animal in the kitchen, corresponding alarm information is generated.
8. A kitchen monitoring system, comprising:
the acquisition module is used for acquiring a kitchen monitoring video;
the background template extraction module is used for extracting an initial background template from the kitchen monitoring video through a target detection algorithm;
the model training module is used for acquiring training samples, constructing a classification model based on a convolutional neural network, performing parameter optimization on the constructed classification model by taking the acquired training samples as input, and outputting the trained classification model;
the foreground target extraction module is used for extracting a foreground target from the kitchen monitoring video through background subtraction based on an initial background template;
and the judging module is used for judging the foreground target through the trained classification model and outputting a judging result.
9. The kitchen monitoring system according to claim 8, further comprising an analysis module, wherein the analysis module is configured to analyze the judgment result and generate alarm information, and a real-time alarm is given through the alarm information for any foreground target that does not comply with regulations.
10. A kitchen monitoring system according to claim 8 or 9, characterized in that said foreground objects comprise staff attire, kitchen waste and live animals;
the classification models comprise a staff classification model, a kitchen waste classification model and a live animal classification model;
the staff classification model is used for judging whether staff attire is in compliance, including judging whether the clothing, mask and hat of a staff member are compliant;
the kitchen waste classification model is used for judging whether waste exists in the kitchen, including judging whether waste is present on the kitchen floor or worktops;
and the live animal classification model is used for judging whether live animals are present in the kitchen.
CN201911174584.3A 2019-11-26 2019-11-26 Kitchen monitoring method and system Pending CN110909689A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911174584.3A CN110909689A (en) 2019-11-26 2019-11-26 Kitchen monitoring method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911174584.3A CN110909689A (en) 2019-11-26 2019-11-26 Kitchen monitoring method and system

Publications (1)

Publication Number Publication Date
CN110909689A true CN110909689A (en) 2020-03-24

Family

ID=69819602

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911174584.3A Pending CN110909689A (en) 2019-11-26 2019-11-26 Kitchen monitoring method and system

Country Status (1)

Country Link
CN (1) CN110909689A (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030058111A1 (en) * 2001-09-27 2003-03-27 Koninklijke Philips Electronics N.V. Computer vision based elderly care monitoring system
CN107273880A (en) * 2017-07-31 2017-10-20 秦皇岛玥朋科技有限公司 A kind of multi-storied garage safety-protection system and method based on intelligent video monitoring
CN109117827A (en) * 2018-09-05 2019-01-01 武汉市蓝领英才科技有限公司 Work clothes work hat wearing state automatic identifying method and alarm system based on video
CN110287787A (en) * 2019-05-21 2019-09-27 平安国际智慧城市科技股份有限公司 Image-recognizing method, device and computer readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
陈震; 张紫涵; 曾希萌: "Research on Video Foreground Detection Methods in Complex Backgrounds" (复杂背景下的视频前景检测方法研究) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111523488A (en) * 2020-04-26 2020-08-11 上海集光安防科技股份有限公司 Real-time monitoring method for kitchen staff behaviors

Similar Documents

Publication Publication Date Title
CN103839085B (en) A kind of detection method of compartment exception crowd density
JP4668978B2 (en) Flame detection method and apparatus
CN107133564B (en) Tooling cap detection method
CN109377713B (en) Fire early warning method and system
CN109672863A (en) A kind of construction personnel's safety equipment intelligent monitoring method based on image recognition
CN111783744A (en) Operation site safety protection detection method and device
CN109697716B (en) Identification method and equipment of cyan eye image and screening system
CN110633697A (en) Intelligent monitoring method for kitchen sanitation
KR101196678B1 (en) Real-time fire detection device and method
CN110096945B (en) Indoor monitoring video key frame real-time extraction method based on machine learning
CN101316371B (en) Flame detecting method and device
CN111223263A (en) Full-automatic comprehensive fire early warning response system
CN111401310B (en) Kitchen sanitation safety supervision and management method based on artificial intelligence
CN117035419B (en) Intelligent management system and method for enterprise project implementation
CN114579791A (en) Construction safety violation identification method and system based on operation ticket
CN116580350A (en) Laboratory safety monitoring and early warning method and system
CN110634557B (en) Medical care resource auxiliary allocation method and system based on deep neural network
CN111177468A (en) Laboratory personnel unsafe behavior safety inspection method based on machine vision
CN110909689A (en) Kitchen monitoring method and system
CN116416281A (en) Grain depot AI video supervision and analysis method and system
CN109740527A (en) Image processing method in a kind of video frame
CN113191273A (en) Oil field well site video target detection and identification method and system based on neural network
CN111582149A (en) Production safety management and control system based on machine vision
CN116682034A (en) Dangerous behavior detection method under complex production operation scene
CN116419059A (en) Automatic monitoring method, device, equipment and medium based on behavior label

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination