CN111046849A - Kitchen safety implementation method and device, intelligent terminal and storage medium - Google Patents

Kitchen safety implementation method and device, intelligent terminal and storage medium

Info

Publication number
CN111046849A
Authority
CN
China
Prior art keywords
image
kitchen
dish
person
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911402520.4A
Other languages
Chinese (zh)
Other versions
CN111046849B (en)
Inventor
宋德超
陈翀
陈勇
郑威
李斌山
李雨铭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai, Zhuhai Lianyun Technology Co Ltd filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN201911402520.4A priority Critical patent/CN111046849B/en
Publication of CN111046849A publication Critical patent/CN111046849A/en
Application granted granted Critical
Publication of CN111046849B publication Critical patent/CN111046849B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 Status alarms
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B40/00 Technologies aiming at improving the efficiency of home appliances, e.g. induction cooking or efficient technologies for refrigerators, freezers or dish washers

Abstract

The application discloses a kitchen safety implementation method and device, an intelligent terminal and a storage medium, relating to the field of computer technology. The method comprises the following steps: firstly, acquiring an image of a kitchen to obtain an image to be analyzed; identifying, based on a trained first neural network, the dish state category corresponding to the stir-frying image content of the cooking bench area in the image to be analyzed, the dish state category being used for representing the state of the dish from being stir-fried to pan pasting (i.e. the dish sticking to and burning on the pan); and if the identified dish state category is the pan-pasting state category, turning off the stove fire and giving an alarm. The method ensures kitchen safety and avoids dishes being burnt, or even fires being caused, because a person has left the kitchen or fails to attend to the dish in time.

Description

Kitchen safety implementation method and device, intelligent terminal and storage medium
Technical Field
The application relates to the technical field of computers, in particular to a method and a device for realizing kitchen safety, an intelligent terminal and a storage medium.
Background
When cooking, many families may leave the kitchen to fetch ingredients elsewhere, or may fail to stir the dish in the wok in time because they are chatting with someone, so the dish in the wok is left unattended for a long time, which spoils the dish or even causes a fire.
Therefore, how to avoid dishes being spoiled, or even fires being caused, because a person does not attend to the dish on the cooking bench in time is a problem of concern to those skilled in the art.
Disclosure of Invention
The embodiments of the application provide a kitchen safety implementation method and device, an intelligent terminal and a storage medium, which are used to ensure kitchen safety and to avoid dishes being burnt, or even fires being caused, because a person has left the kitchen or fails to attend to the dish in time.
In a first aspect, an embodiment of the present application provides a method for implementing kitchen safety, where the method includes:
acquiring an image of a kitchen to obtain an image to be analyzed;
identifying dish state categories corresponding to stir-frying image contents of a cooking bench area in the image to be analyzed based on a trained first neural network; the dish state category is used for representing the state of dishes from frying to pan pasting;
and if the identified dish state type is the pan pasting state type, closing the stove fire and giving an alarm.
Optionally, before identifying, based on the trained first neural network, a dish state category corresponding to the cooking image content of the cooktop area in the image to be analyzed, the method further includes:
identifying the image to be analyzed based on a pre-trained second neural network; and determining that the identification result is the cooking image content of the cooking range area in the image to be analyzed.
Optionally, the identifying, by the trained first neural network, the dish state category corresponding to the content of the cooking image of the cooking bench region in the image to be analyzed includes:
dividing the monitored video stream into a plurality of video segments;
processing each video segment into a plurality of image frames respectively;
the first neural network is used for taking the characteristics of the current frame and the image frame before the current frame as input aiming at each current frame so as to extract the characteristic information of the current frame;
and determining the dish state category corresponding to the current frame according to the characteristic information of the current frame.
Optionally, the method further includes:
detecting whether a person exists in the kitchen based on millimeter wave radar technology; or,
detecting whether a person exists in the kitchen based on an infrared induction technology;
if no person is detected in the kitchen, turning down the cooking bench heat and executing the step of recognizing dish state categories corresponding to the dish cooking image contents of the cooking bench area in the image to be analyzed based on the trained first neural network;
and if the person is detected in the kitchen, executing the step of identifying the dish state category corresponding to the dish cooking image content of the cooking bench area in the image to be analyzed based on the trained first neural network.
Optionally, the detecting whether someone is in the kitchen based on the millimeter wave radar wave technology includes:
sending a millimeter wave radar signal to position a person;
performing cluster analysis after receiving echo signals corresponding to the millimeter wave radar signals;
if the cluster analysis result does not include a cluster corresponding to a person, determining that no person is in the kitchen;
and if the cluster analysis result includes a cluster corresponding to a person, determining that a person is in the kitchen.
Optionally, the method further includes:
if the identified dish state category does not reach the pan-pasting state category, closing the stove fire and giving an alarm when a preset trigger event is received, wherein the preset trigger event is at least one of, or a combination of, the following events:
receiving an event of reaching the timing duration of a preset timer;
determining the oil smoke increment as a designated increment;
and determining that the temperature of the stove fire reaches a preset temperature threshold value.
Optionally, the turning off the stove fire and alarming includes:
sending alarm information to a designated terminal device; or,
and alarming is carried out through an installed alarm.
In a second aspect, an embodiment of the present invention further provides an apparatus for implementing kitchen safety, where the apparatus includes:
the acquisition module is used for acquiring an image of the kitchen to obtain an image to be analyzed;
the identification module is used for identifying dish state categories corresponding to the stir-frying image contents of the cooking bench area in the image to be analyzed based on the trained first neural network; the dish state category is used for representing the state of dishes from frying to pan pasting;
and the processing module is used for closing the stove fire and giving an alarm if the identified dish state type is the pan-burnt state type.
Optionally, the apparatus further comprises:
the pre-recognition module is used for recognizing the image to be analyzed based on a pre-trained second neural network before recognizing the dish state category corresponding to the stir-frying image content of the cooking bench area in the image to be analyzed based on the trained first neural network; and determining that the identification result is the cooking image content of the cooking range area in the image to be analyzed.
Optionally, the identification module is specifically configured to:
dividing the monitored video stream into a plurality of video segments;
processing each video segment into a plurality of image frames respectively;
the first neural network is used for taking the characteristics of the current frame and the image frame before the current frame as input aiming at each current frame so as to extract the characteristic information of the current frame;
and determining the dish state category corresponding to the current frame according to the characteristic information of the current frame.
Optionally, the apparatus further comprises:
the detection module is used for detecting whether a person exists in the kitchen based on millimeter wave radar technology; or,
detecting whether a person exists in the kitchen based on an infrared induction technology;
the adjusting module is used for reducing the cooking bench temperature and executing the step of recognizing the dish state category corresponding to the dish cooking image content of the cooking bench area in the image to be analyzed based on the trained first neural network if the fact that no person exists in the kitchen is detected;
and if the person is detected in the kitchen, executing the step of identifying the dish state category corresponding to the dish cooking image content of the cooking bench area in the image to be analyzed based on the trained first neural network.
Optionally, the detection module is specifically configured to:
sending a millimeter wave radar signal to position a person;
performing cluster analysis after receiving echo signals corresponding to the millimeter wave radar signals;
if the cluster analysis result does not include a cluster corresponding to a person, determining that no person is in the kitchen;
and if the cluster analysis result includes a cluster corresponding to a person, determining that a person is in the kitchen.
Optionally, the apparatus further comprises:
the waiting module is used for, if the identified dish state category does not reach the pan-pasting state category, closing the stove fire and giving an alarm when a preset trigger event is received, wherein the preset trigger event is at least one of, or a combination of, the following:
receiving an event of reaching the timing duration of a preset timer;
determining the oil smoke increment as a designated increment;
and determining that the temperature of the stove fire reaches a preset temperature threshold value.
Optionally, the turning off the stove fire and alarming includes:
sending alarm information to a designated terminal device; or,
and alarming is carried out through an installed alarm.
In a third aspect, an embodiment of the present invention further provides an intelligent terminal, including:
a memory and a processor;
a memory for storing program instructions;
and the processor is used for calling the program instructions stored in the memory and executing, according to the obtained program, the kitchen safety implementation method of any one of the first aspect.
In a fourth aspect, the embodiment of the present invention further provides a computer storage medium, where the computer storage medium stores computer-executable instructions, and the computer-executable instructions are configured to cause a computer to execute the method for implementing kitchen security according to any one of the embodiments of the present application.
The application discloses a kitchen safety implementation method and device, an intelligent terminal and a storage medium, relating to the field of computer technology. The method comprises the following steps: firstly, acquiring an image of a kitchen to obtain an image to be analyzed; identifying, based on a trained first neural network, the dish state category corresponding to the stir-frying image content of the cooking bench area in the image to be analyzed, the dish state category being used for representing the state of the dish from being stir-fried to pan pasting; and if the identified dish state category is the pan-pasting state category, turning off the stove fire and giving an alarm. The method ensures kitchen safety and avoids dishes being burnt, or even fires being caused, because a person has left the kitchen or fails to attend to the dish in time.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments of the present invention will be briefly described below, and it is obvious that the drawings described below are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic flowchart of a method for implementing kitchen security according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a neural network model provided in an embodiment of the present application;
FIG. 3 is a schematic flow chart of a method for implementing kitchen safety according to another embodiment of the present application;
fig. 4 is a schematic structural diagram of an implementation device for kitchen safety provided by an embodiment of the present application;
fig. 5 is a schematic structural diagram of an intelligent terminal provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the descriptions so used are interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein.
Many families leave the kitchen for a moment while cooking to fetch ingredients elsewhere, or fail to turn over the dish in the wok in time because they are chatting with someone, so the dish is spoiled or a fire is even caused.
In view of this, the present application provides a kitchen safety implementation method whose design concept is divided into three main parts: the first part is a human body detection part, which detects whether a human body is in the kitchen area so that different operations can be performed according to the detection result; the second part is a dish detection part, which judges the state category of the dish from images of the dish so that the dish can be handled in time according to that category; the third part is an alarm part, which gives an alarm when the stove fire is turned off because an abnormal condition is detected, reminding people to deal with the related problem in time.
To further illustrate the technical solutions provided by the embodiments of the present application, the following detailed description is made with reference to the accompanying drawings and the detailed description. Although the embodiments of the present application provide the method steps described in the embodiments below or shown in the drawings, more or less steps may be included in the method based on conventional or non-inventive efforts. In steps where no necessary causal relationship exists logically, the order of execution of the steps is not limited to that provided by the embodiments of the present application.
First, according to the first part of the design concept, human body detection is performed to detect whether a person is in the kitchen area. In one possible embodiment, the presence of a person in the kitchen may be detected based on millimeter wave radar technology. In specific implementation, a millimeter wave radar signal is first sent to locate a person; cluster analysis is performed after the echo signals corresponding to the millimeter wave radar signal are received; if the cluster analysis result does not include a cluster corresponding to a person, it is determined that no person is in the kitchen; if the cluster analysis result includes a cluster corresponding to a person, it is determined that a person is in the kitchen. The millimeter wave radar technology used in this embodiment has high sensitivity, so the detection of the target position is highly accurate; it also has the advantage of being little disturbed by environmental factors such as kitchen oil smoke. In another possible embodiment, whether a person is in the kitchen can be detected based on infrared sensing technology: a detection result is obtained by sensing objects with infrared light, which cannot be perceived by the human eye.
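By way of a non-limiting illustration (not part of the original disclosure), the following Python sketch shows one possible way to implement the clustering-based person detection described above; the library (scikit-learn's DBSCAN), the point-cloud format and the thresholds are assumptions of this sketch rather than details taken from the patent.

import numpy as np
from sklearn.cluster import DBSCAN

def person_in_kitchen(points_xyz: np.ndarray, min_cluster_points: int = 20) -> bool:
    """points_xyz: (N, 3) array of echo positions from one radar frame."""
    if len(points_xyz) == 0:
        return False
    # Cluster the radar returns; label -1 marks noise points.
    labels = DBSCAN(eps=0.3, min_samples=5).fit_predict(points_xyz)
    # Treat any sufficiently large cluster as a person-sized target.
    for label in set(labels) - {-1}:
        if np.sum(labels == label) >= min_cluster_points:
            return True
    return False

In this sketch, "no person in the kitchen" simply corresponds to the clustering producing no sufficiently large cluster, mirroring the determination described above.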
Based on the human body detection, if no person is detected in the kitchen, the cooking bench heat is turned down and the dish detection part is executed; if a person is detected in the kitchen, the person may nevertheless be busy with other things and ignoring the cooking bench area, so the dish detection part still needs to be executed, but the cooking bench heat is not turned down at this time. By detecting the human body, the states of the person and the kitchen are monitored: when no one is in the kitchen, turning down the stove fire in time largely ensures that the dish is not ruined; when a person is in the kitchen but is not attending to the dish, detecting the state of the dish likewise ensures that the dish is not ruined.
After the above-described detection of the human body, the dish detecting section will be described with reference to specific examples. Referring to fig. 1, a method for implementing kitchen security provided by an embodiment of the present application includes:
step 101: and acquiring an image of the kitchen to obtain an image to be analyzed.
To obtain an image to be analyzed, image acquisition needs to be performed on the kitchen. Optionally, captured images of the kitchen are first obtained from the surveillance video stream. In specific implementation, the acquired surveillance video stream may be divided into a plurality of video segments, and each video segment is then processed into a plurality of image frames; the different image frames are then used as images to be analyzed for judging the dish state category, as illustrated by the sketch below.
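As a non-limiting illustration (not part of the original disclosure), the following Python sketch shows one way to split a surveillance video into segments and sample frames from it; the use of OpenCV, the segment length and the sampling stride are assumptions of this sketch.

import cv2

def split_into_frames(video_path: str, seg_len: int = 60, stride: int = 5):
    """Yield (segment_index, frame) pairs from the video at video_path.

    Every `stride`-th frame is kept, and a new segment starts every
    `seg_len` frames, so each segment yields several frames to analyze.
    """
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % stride == 0:
            yield idx // seg_len, frame  # frame is a BGR numpy array
        idx += 1
    cap.release()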
Further, the determined image to be analyzed is identified based on a pre-trained second neural network, and the identification result is determined to be the cooking image content of the cooking range area in the image to be analyzed. In specific implementation, after a plurality of image frames of the kitchen are acquired, the target image is detected based on the pre-trained neural network, that is, the cooking image content in the image frame is identified, in preparation for detecting the dish state category.
Step 102: identifying dish state categories corresponding to stir-frying image contents of a cooking bench area in the image to be analyzed based on a trained first neural network; the dish state category is used for representing the state of dishes from frying to pan pasting.
It should be noted that "first" in the trained first neural network and "second" in the pre-trained second neural network described above are used only for differentiation; the first neural network and the second neural network may employ the same neural network model.
The dish state category is used for representing the state of the dish from being stir-fried to pan pasting. For example, the dish state categories may include a "raw" state, a "half-cooked" state and a "cooked" state; when the detected dish state category is the "cooked" state, the dish may be determined to have already been burnt.
In one embodiment, referring to fig. 2, which is a schematic diagram of a neural network model provided in an embodiment of the present application, the surveillance video stream is first processed into a series of image frames V = {I0, I1, …, In}; for example, image frame I1, image frame I2 and image frame I3 in fig. 2 are three image frames obtained from the surveillance video stream. The purpose of the model is to recover, at the frame level, the detected target objects {O0, O1, …, On}, where each Ok represents the bounding box and the list of predicted classes corresponding to image frame Ik; the target object in this application is the image of the dish in an image frame. The dish classification is then determined from the determined dish image. For example, there are three dish categories in fig. 2: "raw", "half-cooked" and "cooked"; after recognition by the neural network model, the dish in image frame I1 is classified as "raw", the dish in image frame I2 is classified as "half-cooked", and the dish in image frame I3 is classified as "cooked". Fig. 2 is an example provided for the purpose of illustrating the present application and is not intended to limit it.
In order to construct the model shown in fig. 2, an SSD (Single Shot MultiBox Detector) framework based on the MobileNet architecture (a lightweight neural network) may first be adopted. To reduce the number of parameters and speed up the computation, all convolution layers in the SSD feature layers are optionally replaced with depthwise separable convolutions, and the last layer of the original MobileNet used for classification is removed; a convolutional LSTM (Long Short-Term Memory) layer is then injected directly into the single-frame detector. The convolutional LSTM layer allows the network to encode both spatial and temporal information, thereby creating a unified model for processing the temporal stream of images.
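As a non-limiting illustration (not part of the original disclosure), the following PyTorch sketch shows the general shape of such a single-frame detector with a convolutional LSTM injected into it; the layer sizes, the tiny backbone and the class/box heads are illustrative assumptions and are not the patent's actual network.

import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    """Convolutional LSTM cell: the gates are convolutions, so the hidden
    state keeps its spatial layout (channel x height x width feature maps)."""
    def __init__(self, in_ch: int, hid_ch: int, k: int = 3):
        super().__init__()
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, (h, c)

class FrameDetector(nn.Module):
    """Single-frame feature extractor + ConvLSTM + per-location heads, in the
    spirit of an SSD-style detector that carries temporal state."""
    def __init__(self, num_classes: int = 3, feat_ch: int = 32):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, feat_ch, 3, stride=2, padding=1),
            nn.ReLU(),
            # Depthwise separable convolution: depthwise then pointwise.
            nn.Conv2d(feat_ch, feat_ch, 3, stride=2, padding=1, groups=feat_ch),
            nn.Conv2d(feat_ch, feat_ch, 1),
            nn.ReLU(),
        )
        self.lstm = ConvLSTMCell(feat_ch, feat_ch)
        self.cls_head = nn.Conv2d(feat_ch, num_classes, 1)  # class scores per location
        self.box_head = nn.Conv2d(feat_ch, 4, 1)            # box offsets per location

    def forward(self, frame, state):
        feat = self.backbone(frame)          # spatial features of this frame
        h, state = self.lstm(feat, state)    # fuse with the carried LSTM state
        return self.cls_head(h), self.box_head(h), state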
Then, for each frame, feature extraction is performed by both the Conv (convolution) layers and the ConvLSTM (convolutional LSTM) layers, and each ConvLSTM layer passes the extracted features to the SSD for computation and prediction. The state variable Ct of the LSTM is passed between image frames: the state of the LSTM layer corresponding to the previous frame is transmitted to the LSTM layer corresponding to the next frame and takes part in the feature computation and prediction for that next frame, so information is propagated from one frame to the next and the accuracy of target object detection can be improved. For example, as shown in fig. 2, the feature information of the ConvLSTM layer is extracted and added to the computation for the next frame, and multi-scale feature information can be obtained in this way.
In specific implementation, the prediction model provided herein can be described by the equation F(It, St-1) = (Ot, St), where Sk = {Sk0, Sk1, …, Skm-1} is a vector of feature maps representing the video segment up to the current frame k. In addition, an LSTM comprising m layers may be used, where the feature vector St-1 serves as the state input to the LSTM and each feature map of the feature vector St comes from the state output of the LSTM. In the embodiment of the present application, to obtain the detection result for the whole video, each image frame only needs to be passed through the neural network in sequence; the last image frame then carries the information of all preceding image frames, and the detection result is determined from that image frame.
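As a non-limiting illustration (not part of the original disclosure), the following Python sketch shows the recurrent inference loop F(It, St-1) = (Ot, St) over one video segment; it assumes a model with the FrameDetector interface sketched above, and the state shape and the use of PyTorch are assumptions of this sketch.

import torch

def detect_clip(model, frames, feat_hw=(32, 32), feat_ch=32):
    """frames: iterable of (1, 3, H, W) tensors from one video segment.

    feat_hw must match the backbone's output resolution (e.g. 32 x 32 for
    128 x 128 inputs with the backbone sketched above)."""
    h = torch.zeros(1, feat_ch, *feat_hw)   # initial hidden state S0
    c = torch.zeros(1, feat_ch, *feat_hw)   # initial cell state
    state, outputs = (h, c), []
    with torch.no_grad():
        for frame in frames:                # pass the frames through in order
            cls_scores, boxes, state = model(frame, state)
            outputs.append((cls_scores, boxes))
    # The last output carries information from the whole segment.
    return outputs[-1], outputs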
Step 103: and if the identified dish state type is the pan pasting state type, closing the stove fire and giving an alarm.
In addition, when dish detection starts, detection of a preset trigger event also starts; during dish detection, if a preset trigger event is received, the stove fire is turned off and an alarm is given. The preset trigger event is at least one of, or a combination of, the following events: an event that a preset timer reaches its timing duration; a determination that the oil smoke increment is a designated increment; and a determination that the stove fire temperature reaches a preset temperature threshold.
In one embodiment, the preset trigger event is the event that the preset timer reaches its timing duration. In specific implementation, when dish detection starts, a safety timer is also started, and different safety durations are set for different dishes; if the safety timer for the dish reaches the correspondingly set duration, the stove fire is turned off and an alarm is given even if the dish state category has not been detected as the pan-pasting state.
In another embodiment, the preset trigger event is the determination that the oil smoke increment is the designated increment. In specific implementation, the increase of the oil smoke is monitored continuously, a threshold on the oil smoke increment per unit time is set as the designated increment, and when the acquired oil smoke increment in the current unit time reaches or exceeds the designated increment, the stove fire is turned off and an alarm is given.
In another embodiment, if the predetermined trigger event is a determination that the temperature of the stove fire reaches a predetermined temperature threshold; in particular, the temperature detector is arranged near the stove fire to determine the temperature of the stove fire; or acquiring a stove fire image, and determining the temperature of the stove fire according to the stove fire image; and then comparing the acquired stove fire temperature with a preset temperature threshold, and when the stove fire temperature reaches the preset temperature threshold, closing the stove fire and giving an alarm.
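As a non-limiting illustration (not part of the original disclosure), the following Python sketch combines the three preset trigger events described above into a single check; the function and variable names, units and thresholds are illustrative assumptions, and turn_off_stove and raise_alarm stand for hypothetical actuator and alarm interfaces.

import time

def should_shut_off(started_at: float, safe_seconds: float,
                    smoke_delta_per_min: float, smoke_delta_limit: float,
                    fire_temp_c: float, fire_temp_limit_c: float) -> bool:
    """Return True if any of the preset trigger events has occurred."""
    timer_expired = (time.time() - started_at) >= safe_seconds   # preset timer reached
    smoke_spike = smoke_delta_per_min >= smoke_delta_limit       # oil smoke increment
    too_hot = fire_temp_c >= fire_temp_limit_c                   # stove fire temperature
    return timer_expired or smoke_spike or too_hot

# Example: a stir-fry with a 10-minute safety timer.
# if should_shut_off(started_at, 600, smoke_delta, 5.0, fire_temp, 300.0):
#     turn_off_stove(); raise_alarm()   # hypothetical helpers, not library calls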
It should be noted that one implementation of the alarm is to send alarm information to a designated terminal device, for example sending a prompt short message to, or ringing, a preset designated terminal device such as a mobile phone or a computer; another implementation is to give the alarm through an installed alarm device, where one possible scenario is that the alarm device is installed in the kitchen and the alarm is given by its bell.
With the kitchen safety implementation method provided by the application, millimeter wave radar technology is combined with a neural network learning model to safeguard the dish and the kitchen cooking bench area; abnormal conditions can be handled in time and an alarm given, reminding people to deal with them promptly, so that the dish is prevented from being spoiled and kitchen safety problems are avoided.
For a clearer understanding of the method provided by the present application, referring to fig. 3, the flow of a kitchen safety implementation method provided by another embodiment of the present application is further described, and includes:
First, the human body detection part is carried out, with the following steps:
step 301: and sending the radar signal.
Step 302: an echo signal is received.
Step 303: and processing the echo signals.
Step 304: And acquiring the point cloud data and time-frequency information.
The point cloud data and the time-frequency information are processed through Kalman filtering, so that the detected targets are clustered and tracked, and classification is performed through a classification algorithm, so that the category of the detected target can be determined. When there is no person, no person class is obtained during clustering, and it can thus be judged that no person is in the kitchen.
Step 305: and carrying out human body detection and tracking.
And the state of the person is detected and tracked in time based on the millimeter wave radar technology.
Step 306: it is detected whether a person is in the kitchen area.
If a person is detected in the kitchen area, go to step 307 a; if it is detected that the person is not in the kitchen area, step 307b is performed.
Step 307 a: And determining the dish state category for the dish detection image.
Step 307 b: And turning down the stove fire, and determining the dish state category for the dish detection image.
Further, when determining the dish state category for the dish detection image, the analysis may be combined with smell. For example, if it is determined from the dish detection image that the dish has not yet reached the burnt (pan-pasting) state, but a burnt odor is detected and analyzed by an odor detector, the current state of the dish may still be determined to be the pan-pasting state, and different operations are then performed according to the determination result, as in step 308 below and as sketched after this paragraph.
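As a non-limiting illustration (not part of the original disclosure), the following Python sketch shows one way to let an odor reading override the image-based dish state, as described above; the category names and the boolean odor flag are illustrative assumptions.

def fuse_dish_state(image_state: str, burnt_odor_detected: bool) -> str:
    """Return the final dish state, letting a burnt odor override the image."""
    if image_state != "pan-pasting" and burnt_odor_detected:
        return "pan-pasting"   # odor evidence overrides the image result
    return image_state

# Example: the image says "cooked" but the odor detector reports a burnt smell,
# so fuse_dish_state("cooked", True) returns "pan-pasting" and the stove fire
# will be turned off.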
Step 308: And judging whether the dish is burnt onto the pan (pan pasting) according to the dish detection image.
If the dish detection image is classified into the pan-pasting state, step 310 is executed; if not, step 309 is executed.
Step 309: And detecting whether a preset trigger event occurs.
If a preset trigger event is detected, step 310 is executed; otherwise, the detection is continued.
Step 310: the range is turned off and an alarm is given.
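As a non-limiting illustration (not part of the original disclosure), the following Python sketch strings steps 301-310 together into a simple monitoring loop; the objects and the classify_dish function passed in are hypothetical placeholders for the radar, the detector, the stove control, the alarm and the trigger-event check, not interfaces defined by the patent.

def kitchen_safety_loop(frames, classify_dish, radar, stove, alarm, trigger):
    """frames: image frames from the surveillance video stream."""
    for frame in frames:
        if not radar.detect_person():             # steps 301-306: person detection
            stove.turn_down()                     # step 307b: no one in the kitchen
        state = classify_dish(frame)              # step 307a/307b: dish state category
        if state == "pan-pasting":                # step 308: dish is burnt onto the pan
            stove.turn_off()
            alarm.raise_alarm()                   # step 310
            break
        if trigger.fired():                       # step 309: timer / oil smoke / temperature
            stove.turn_off()
            alarm.raise_alarm()                   # step 310
            break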
Based on the same inventive concept, referring to fig. 4, a device for implementing kitchen safety provided by the embodiment of the present application includes: an acquisition module 401, an identification module 402 and a processing module 403.
The acquisition module 401 is used for acquiring an image of a kitchen to obtain an image to be analyzed;
the identification module 402 is configured to identify, based on a trained first neural network, a dish state category corresponding to a stir-frying image content of a cooking bench region in the image to be analyzed; the dish state category is used for representing the state of dishes from frying to pan pasting;
and the processing module 403 is configured to turn off the stove fire and alarm if the identified dish state type is the pan-burnt state type.
Optionally, the apparatus further comprises:
the pre-recognition module is used for recognizing the image to be analyzed based on a pre-trained second neural network before recognizing the dish state category corresponding to the stir-frying image content of the cooking bench area in the image to be analyzed based on the trained first neural network; and determining that the identification result is the cooking image content of the cooking range area in the image to be analyzed.
Optionally, the identifying module 402 is specifically configured to:
dividing the monitored video stream into a plurality of video segments;
processing each video segment into a plurality of image frames respectively;
the first neural network is used for taking the characteristics of the current frame and the image frame before the current frame as input aiming at each current frame so as to extract the characteristic information of the current frame;
and determining the dish state category corresponding to the current frame according to the characteristic information of the current frame.
Optionally, the apparatus further comprises:
the detection module is used for detecting whether a person exists in the kitchen based on millimeter wave radar technology; or,
detecting whether a person exists in the kitchen based on an infrared induction technology;
the adjusting module is used for reducing the cooking bench temperature and executing the step of recognizing the dish state category corresponding to the dish cooking image content of the cooking bench area in the image to be analyzed based on the trained first neural network if the fact that no person exists in the kitchen is detected;
and if the person is detected in the kitchen, executing the step of identifying the dish state category corresponding to the dish cooking image content of the cooking bench area in the image to be analyzed based on the trained first neural network.
Optionally, the detection module is specifically configured to:
sending a millimeter wave radar signal to position a person;
performing cluster analysis after receiving echo signals corresponding to the millimeter wave radar signals;
if the cluster analysis result does not include a cluster corresponding to a person, determining that no person is in the kitchen;
and if the cluster analysis result includes a cluster corresponding to a person, determining that a person is in the kitchen.
Optionally, the apparatus further comprises:
the waiting module is used for, if the identified dish state category does not reach the pan-pasting state category, closing the stove fire and giving an alarm when a preset trigger event is received, wherein the preset trigger event is at least one of, or a combination of, the following:
receiving an event of reaching the timing duration of a preset timer;
determining the oil smoke increment as a designated increment;
and determining that the temperature of the stove fire reaches a preset temperature threshold value.
Optionally, the turning off the stove fire and alarming includes:
sending alarm information to a designated terminal device; or,
and alarming is carried out through an installed alarm.
After the method and the device for implementing kitchen security in the exemplary embodiment of the present application are introduced, a smart terminal in another exemplary embodiment of the present application is introduced next.
As will be appreciated by one skilled in the art, aspects of the present application may be embodied as a system, method or program product. Accordingly, various aspects of the present application may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects, all of which may generally be referred to herein as a "circuit," "module" or "system."
In some possible embodiments, a smart terminal according to the present application may include at least one processor, and at least one memory. Wherein the memory stores a computer program which, when executed by the processor, causes the processor to perform the steps of the kitchen security implementation method according to various exemplary embodiments of the present application described above in the present specification. For example, the processor may perform steps 101-103 as shown in FIG. 1.
The smart terminal 130 according to this embodiment of the present application is described below with reference to fig. 5. The smart terminal 130 shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 5, the smart terminal 130 is represented in the form of a general smart terminal. The components of the intelligent terminal 130 may include, but are not limited to: the at least one processor 131, the at least one memory 132, and a bus 133 that connects the various system components (including the memory 132 and the processor 131).
Bus 133 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus architectures.
The memory 132 may include readable media in the form of volatile memory, such as Random Access Memory (RAM)1321 and/or cache memory 1322, and may further include Read Only Memory (ROM) 1323.
Memory 132 may also include a program/utility 1325 having a set (at least one) of program modules 1324, such program modules 1324 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The intelligent terminal 130 may also communicate with one or more external devices 134 (e.g., keyboard, pointing device, etc.) and/or any device (e.g., router, modem, etc.) that enables the intelligent terminal 130 to communicate with one or more other intelligent terminals. Such communication may occur via input/output (I/O) interfaces 135. Also, the intelligent terminal 130 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 136. As shown, the network adapter 136 communicates with other modules for the intelligent terminal 130 over the bus 133. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the smart terminal 130, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
In some possible embodiments, various aspects of the control method of the smart terminal provided in the present application may also be implemented in the form of a program product including a computer program, when the program product is run on a computer device, for causing the computer device to execute the steps in the implementation method of kitchen security according to various exemplary embodiments of the present application described above in this specification, for example, the computer device may execute the steps 101 to 103 shown in fig. 1.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product for kitchen security implementation of embodiments of the present application may employ a portable compact disc read only memory (CD-ROM) and include a computer program, and may run on a smart terminal. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with a readable computer program embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer program embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer programs for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer program may execute entirely on the target object's smart terminal, partly on the target object's device, as a stand-alone software package, partly on the target object's smart terminal and partly on a remote smart terminal, or entirely on the remote smart terminal or server. In the case of remote smart terminals, the remote smart terminal may be connected to the target object's smart terminal through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external smart terminal (for example, through the Internet using an Internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such division is merely exemplary and not mandatory. Indeed, the features and functions of two or more units described above may be embodied in one unit, according to embodiments of the application. Conversely, the features and functions of one unit described above may be further divided into embodiments by a plurality of units.
Further, while the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having a computer-usable computer program embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (16)

1. A method for realizing kitchen safety is characterized by comprising the following steps:
acquiring an image of a kitchen to obtain an image to be analyzed;
identifying dish state categories corresponding to stir-frying image contents of a cooking bench area in the image to be analyzed based on a trained first neural network; the dish state category is used for representing the state of dishes from frying to pan pasting;
and if the identified dish state type is the pan pasting state type, closing the stove fire and giving an alarm.
2. The method of claim 1, wherein prior to identifying, based on the trained first neural network, a dish state category corresponding to pan image content of a cooktop region in the image to be analyzed, the method further comprises:
identifying the image to be analyzed based on a pre-trained second neural network; and determining that the identification result is the cooking image content of the cooking range area in the image to be analyzed.
3. The method of claim 1 or 2, wherein the identifying, based on the trained first neural network, a dish state category corresponding to a pan image content of a cooktop region in the image to be analyzed comprises:
dividing the monitored video stream into a plurality of video segments;
processing each video segment into a plurality of image frames respectively;
the first neural network is used for taking the characteristics of the current frame and the image frame before the current frame as input aiming at each current frame so as to extract the characteristic information of the current frame;
and determining the dish state category corresponding to the current frame according to the characteristic information of the current frame.
4. The method of claim 1, further comprising:
detecting whether a person exists in the kitchen based on millimeter wave radar technology; or,
detecting whether a person exists in the kitchen based on an infrared induction technology;
if no person is detected in the kitchen, turning down the cooking bench heat and executing the step of recognizing dish state categories corresponding to the dish cooking image contents of the cooking bench area in the image to be analyzed based on the trained first neural network;
and if the person is detected in the kitchen, executing the step of identifying the dish state category corresponding to the dish cooking image content of the cooking bench area in the image to be analyzed based on the trained first neural network.
5. The method of claim 4, wherein the detecting whether a person is in the kitchen based on millimeter wave radar wave technology comprises:
sending a millimeter wave radar signal to position a person;
performing cluster analysis after receiving echo signals corresponding to the millimeter wave radar signals;
if the cluster analysis result does not include a cluster corresponding to a person, determining that no person is in the kitchen;
and if the cluster analysis result includes a cluster corresponding to a person, determining that a person is in the kitchen.
6. The method of claim 1, further comprising:
if the identified dish state category does not reach the pan-pasting state category, closing the stove fire and giving an alarm when a preset trigger event is received, wherein the preset trigger event is at least one of, or a combination of, the following events:
receiving an event of reaching the timing duration of a preset timer;
determining the oil smoke increment as a designated increment;
and determining that the temperature of the stove fire reaches a preset temperature threshold value.
7. The method of claim 1 or 6, wherein the turning off the stove fire and alarming comprises:
sending alarm information to a designated terminal device; or,
and alarming is carried out through an installed alarm.
8. An apparatus for realizing kitchen safety, which is characterized in that the apparatus comprises:
the acquisition module is used for acquiring an image of the kitchen to obtain an image to be analyzed;
the identification module is used for identifying dish state categories corresponding to the stir-frying image contents of the cooking bench area in the image to be analyzed based on the trained first neural network; the dish state category is used for representing the state of dishes from frying to pan pasting;
and the processing module is used for closing the stove fire and giving an alarm if the identified dish state type is the pan-burnt state type.
9. The apparatus of claim 8, further comprising:
the pre-recognition module is used for recognizing the image to be analyzed based on a pre-trained second neural network before recognizing the dish state category corresponding to the stir-frying image content of the cooking bench area in the image to be analyzed based on the trained first neural network; and determining that the identification result is the cooking image content of the cooking range area in the image to be analyzed.
10. The apparatus according to claim 8 or 9, wherein the identification module is specifically configured to:
dividing the monitored video stream into a plurality of video segments;
processing each video segment into a plurality of image frames respectively;
the first neural network is used for taking the characteristics of the current frame and the image frame before the current frame as input aiming at each current frame so as to extract the characteristic information of the current frame;
and determining the dish state category corresponding to the current frame according to the characteristic information of the current frame.
11. The apparatus of claim 8, further comprising:
the detection module is used for detecting whether a person exists in the kitchen based on millimeter wave radar technology; or,
detecting whether a person exists in the kitchen based on an infrared induction technology;
the adjusting module is used for reducing the cooking bench temperature and executing the step of recognizing the dish state category corresponding to the dish cooking image content of the cooking bench area in the image to be analyzed based on the trained first neural network if the fact that no person exists in the kitchen is detected;
and if the person is detected in the kitchen, executing the step of identifying the dish state category corresponding to the dish cooking image content of the cooking bench area in the image to be analyzed based on the trained first neural network.
12. The apparatus according to claim 11, wherein the detection module is specifically configured to:
sending a millimeter wave radar signal to position a person;
performing cluster analysis after receiving echo signals corresponding to the millimeter wave radar signals;
if the cluster analysis result does not include a cluster corresponding to a person, determining that no person is in the kitchen;
and if the cluster analysis result includes a cluster corresponding to a person, determining that a person is in the kitchen.
13. The apparatus of claim 8, further comprising:
a waiting module, configured to, if the identified dish state category has not reached the burnt state category, wait for a preset triggering event and, upon receiving it, shut off the stove fire and raise an alarm, wherein the preset triggering event is at least one of, or a combination of, the following:
an event that a preset timer reaches its timing duration;
determining that the oil smoke increment reaches a designated increment;
and determining that the temperature of the stove fire reaches a preset temperature threshold value.
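An illustrative sketch of the waiting module's trigger check in claim 13; the threshold values and sensor readings (timer_expired, smoke_increment, fire_temperature) are hypothetical placeholders, since the patent only enumerates the trigger events without fixing concrete values.

```python
from dataclasses import dataclass


@dataclass
class TriggerConfig:
    smoke_increment_threshold: float = 0.5   # assumed, unit-less smoke increment
    temperature_threshold: float = 300.0     # assumed stove-fire temperature in °C


def should_shut_off(timer_expired: bool,
                    smoke_increment: float,
                    fire_temperature: float,
                    cfg: TriggerConfig = TriggerConfig()) -> bool:
    """Any single preset trigger event is enough to shut off the fire and alarm."""
    return (timer_expired
            or smoke_increment >= cfg.smoke_increment_threshold
            or fire_temperature >= cfg.temperature_threshold)


print(should_shut_off(False, 0.1, 320.0))    # True: the temperature threshold is reached
```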
14. The apparatus according to claim 8 or 13, wherein shutting off the stove fire and alarming comprises:
sending alarm information to a designated terminal device; or
alarming through an installed alarm device.
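An illustrative sketch of the two alarm paths in claims 7 and 14; the push endpoint URL is hypothetical, requests.post is shown only as one ordinary way to notify a designated terminal device, and sound_local_alarm stands in for driving the installed alarm hardware.

```python
import requests

PUSH_URL = "https://example.com/kitchen/alerts"   # hypothetical push endpoint


def notify_terminal(message: str) -> None:
    """Send alarm information to the designated terminal device."""
    requests.post(PUSH_URL, json={"alert": message}, timeout=5)


def sound_local_alarm() -> None:
    """Placeholder for driving the installed alarm device."""
    print("installed alarm activated")


def shut_off_and_alarm(message: str = "dish burnt: stove fire shut off") -> None:
    try:
        notify_terminal(message)
    except requests.RequestException:
        pass                                      # fall back to the local alarm only
    sound_local_alarm()
```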
15. An intelligent terminal, comprising: a memory and a processor;
wherein the memory is configured to store program instructions;
and the processor is configured to call the program instructions stored in the memory to execute the method of any one of claims 1 to 7 according to the obtained program.
16. A computer storage medium storing computer-executable instructions for performing the method of any one of claims 1-7.
CN201911402520.4A 2019-12-30 2019-12-30 Kitchen safety realization method and device, intelligent terminal and storage medium Active CN111046849B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911402520.4A CN111046849B (en) 2019-12-30 2019-12-30 Kitchen safety realization method and device, intelligent terminal and storage medium

Publications (2)

Publication Number Publication Date
CN111046849A 2020-04-21
CN111046849B 2023-07-21

Family

ID=70241397

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911402520.4A Active CN111046849B (en) 2019-12-30 2019-12-30 Kitchen safety realization method and device, intelligent terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111046849B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111565303A (en) * 2020-05-29 2020-08-21 深圳市易链信息技术有限公司 Video monitoring method, system and readable storage medium based on fog calculation and deep learning
CN111594881A (en) * 2020-06-01 2020-08-28 朱永凤 Intelligent gas stove detection and control system based on big data
CN111706884A (en) * 2020-05-13 2020-09-25 宁波方太厨具有限公司 Working method of intelligent kitchen

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104955187A (en) * 2014-03-24 2015-09-30 美的集团股份有限公司 Electromagnetic heating device as well as control assembly and control method thereof
US20160307459A1 (en) * 2015-04-20 2016-10-20 NSF International Computer-implemented techniques for interactively training users to perform food quality, food safety, and workplace safety tasks
CN106383452A (en) * 2016-11-24 2017-02-08 北京地平线机器人技术研发有限公司 Smart control module and kitchen appliances employing same
CN108256474A (en) * 2018-01-17 2018-07-06 百度在线网络技术(北京)有限公司 For identifying the method and apparatus of vegetable
CN108932814A (en) * 2018-07-18 2018-12-04 东华大学 A kind of embedded image type cooking fire warning device
CN109948488A (en) * 2019-03-08 2019-06-28 上海达显智能科技有限公司 A kind of intelligence smoke eliminating equipment and its control method
CN110135342A (en) * 2019-05-15 2019-08-16 东喜和仪(珠海市)数据科技有限公司 Kitchen monitoring device based on artificial intelligence
CN110163788A (en) * 2019-05-15 2019-08-23 东喜和仪(珠海市)数据科技有限公司 Kitchen monitoring method based on artificial intelligence
WO2019223361A1 (en) * 2018-05-23 2019-11-28 北京国双科技有限公司 Video analysis method and apparatus

Also Published As

Publication number Publication date
CN111046849B (en) 2023-07-21

Similar Documents

Publication Publication Date Title
CN110097037B (en) Intelligent monitoring method and device, storage medium and electronic equipment
CN108228705B (en) Automatic object and activity tracking device, method and medium in live video feedback
CN111046849B (en) Kitchen safety realization method and device, intelligent terminal and storage medium
EP3407200B1 (en) Method and device for updating online self-learning event detection model
US20200019887A1 (en) Data-driven activity prediction
US20170155877A1 (en) System and method for predicting patient falls
CN111680535B (en) Method and system for real-time prediction of one or more potential threats in video surveillance
CN112836676B (en) Abnormal behavior detection method and device, electronic equipment and storage medium
CN107005679A (en) A kind of intelligent Target identifying device based on cloud service, system and method
CN109614906A (en) A kind of security system and security alarm method based on deep learning
US11145174B2 (en) Methods and system for monitoring an environment
CN109544862A (en) Activity recognition method, apparatus, storage medium and equipment based on smart home
US11295167B2 (en) Automated image curation for machine learning deployments
CN111414829B (en) Method and device for sending alarm information
Hao et al. An end-to-end human abnormal behavior recognition framework for crowds with mentally disordered individuals
US20220139180A1 (en) Custom event detection for surveillance cameras
CN116030370A (en) Behavior recognition method and device based on multi-target tracking and electronic equipment
CN115766401B (en) Industrial alarm information analysis method and device, electronic equipment and computer medium
CN113642509A (en) Garbage bin overflow state detection method and device, storage medium and electronic equipment
CN116915958B (en) One-time operation video monitoring and analyzing method and related device
US20240153312A1 (en) Providing assistance based on machine learning model analysis of video data
US11984011B2 (en) Systems and methods for disturbance detection and identification based on disturbance analysis
US11620895B2 (en) Systems and methods for disturbance detection and identification based on disturbance analysis
US20240153275A1 (en) Determining incorrect predictions by, and generating explanations for, machine learning models
US20230360402A1 (en) Video-based public safety incident prediction system and method therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant