CN113465265B - Intelligent refrigerator and food material detection method
- Publication number
- CN113465265B (application CN202010342779.0A)
- Authority
- CN
- China
- Prior art keywords
- food material
- material detection
- detection frame
- currently selected
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F25—REFRIGERATION OR COOLING; COMBINED HEATING AND REFRIGERATION SYSTEMS; HEAT PUMP SYSTEMS; MANUFACTURE OR STORAGE OF ICE; LIQUEFACTION OR SOLIDIFICATION OF GASES
- F25D—REFRIGERATORS; COLD ROOMS; ICE-BOXES; COOLING OR FREEZING APPARATUS NOT OTHERWISE PROVIDED FOR
- F25D11/00—Self-contained movable devices, e.g. domestic refrigerators
- F25D11/02—Self-contained movable devices, e.g. domestic refrigerators with cooling compartments at different temperatures
- F25D29/00—Arrangement or mounting of control or safety devices
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
Abstract
The application discloses an intelligent refrigerator and a food material detection method. In the application, a food material image is detected to obtain the position and size of each food material detection frame and the food material type to which each frame belongs; the covering relation among the food material detection frames is determined according to their positions and sizes, and the frames for clustering are selected according to that covering relation; the selected frames are then clustered, and the food material type of each corresponding food material group is determined according to the food material types of the detection frames contained in the group obtained by clustering.
Description
Technical Field
The application relates to the technical field of smart home, in particular to an intelligent refrigerator and a food material detection method.
Background
Artificial intelligence has penetrated all industries, and the intellectualization of home appliances has become an important direction for the development of the home appliance industry. The refrigerator is one of the core household appliances in the kitchen and even the home, and can provide intelligent services for family members. Food material management is one of the basic functions of an intelligent refrigerator and the basis of its other intelligent management functions.
In the aspect of food material management, how to effectively detect and identify aggregated food materials, such as food materials placed in the form of boxes, piles, trays and the like, is a problem to be solved at present.
Disclosure of Invention
An exemplary embodiment of the application provides an intelligent refrigerator and a food material detection method, which are used for realizing effective detection and identification of aggregated food materials.
According to an aspect of exemplary embodiments, there is provided an intelligent refrigerator including: a box body and a refrigerating part;
the box body is provided with a camera module used for collecting food material images;
the camera module is connected with a controller, and the controller is configured to:
detecting the food material image to obtain the position and the size of a food material detection frame and the food material type of the food material detection frame;
determining the covering relation among the food material detection frames according to the positions and the sizes of the food material detection frames, and selecting the food material detection frames for clustering according to the covering relation among the food material detection frames;
and clustering the food material detection frames selected for clustering, and determining the food material type of the corresponding food material group according to the food material types of the food material detection frames contained in each food material group obtained by clustering.
In some embodiments of the present application, the controller is further configured to: performing the following operations at least once until the candidate food material detection box set is empty:
selecting a food material detection frame with the largest area from a candidate food material detection frame set according to the size of the food material detection frame, determining the currently selected food material detection frame as a food material detection frame for clustering, and deleting the currently selected food material detection frame from the candidate food material detection frame set; wherein the candidate food material detection box set is initially set to include all detected food material detection boxes;
and if the currently selected food material detection frame covers other food material detection frames, deleting the food material detection frames covered by the currently selected food material detection frame from the candidate food material detection frame set.
In some embodiments of the present application, the controller is further configured to:
if the position relation between the center point of the currently selected food material detection frame and the center point of a first food material detection frame in the candidate food material detection frame set meets the following conditions, judging that the first food material detection frame is covered by the currently selected food material detection frame:
the horizontal distance between the center point of the first food material detection frame and the center point of the currently selected food material detection frame is smaller than a first numerical value, and the vertical distance is smaller than a second numerical value, wherein the first numerical value is equal to the sum of half of the width of the currently selected food material detection frame and a first set value, and the second numerical value is equal to the sum of half of the height of the currently selected food material detection frame and a second set value.
In some embodiments of the present application, the controller is further configured to: if the food materials covered by the currently selected food material detection frame are judged to be the aggregated food materials according to the number of the food material detection frames covered by the currently selected food material detection frame, and the currently selected food material detection frame and the food material types covered by the currently selected food material detection frame are inconsistent, determining the food material type to which each food material detection frame belongs in the currently selected food material detection frame and the food material detection frames covered by the currently selected food material detection frame, calculating the average value of the detection probability of each food material type, and determining the food material type with the highest detection probability average value as the food material type to which the currently selected food material detection frame belongs.
In some embodiments of the present application, the controller is further configured to: determining the food material type of each food material detection box in one clustered food material group, calculating the average value of the detection probability of each food material type, and determining the food material type with the highest average value of the detection probability as the food material type of the food material group.
According to an aspect of the exemplary embodiments, there is provided a food material detection method applied to an intelligent refrigerator, including:
detecting the food material image to obtain the position and the size of the food material detection frame and the food material type of the food material detection frame;
determining the covering relation among the food material detection frames according to the positions and the sizes of the food material detection frames, and selecting the food material detection frames for clustering according to the covering relation among the food material detection frames;
and clustering the food material detection frames selected for clustering, and determining the food material type of the corresponding food material group according to the food material types of the food material detection frames contained in each food material group obtained by clustering.
In some embodiments of the present application, determining a coverage relationship between food material detection frames according to positions and sizes of the food material detection frames, and determining a food material detection frame for clustering according to the coverage relationship between the food material detection frames includes:
performing the following operations at least once until the candidate food material detection box set is empty:
selecting a food material detection frame with the largest area from a candidate food material detection frame set according to the size of the food material detection frame, determining the currently selected food material detection frame as a food material detection frame for clustering, and deleting the currently selected food material detection frame from the candidate food material detection frame set; wherein the candidate food material detection box set is initially set to include all detected food material detection boxes;
and if the currently selected food material detection frame covers other food material detection frames, deleting the food material detection frames covered by the currently selected food material detection frame from the candidate food material detection frame set.
In some embodiments of the present application, whether the currently selected food material detection frame covers other food material detection frames is determined by the following method:
if the position relation between the center point of the currently selected food material detection frame and the center point of a first food material detection frame in the candidate food material detection frame set meets the following conditions, judging that the first food material detection frame is covered by the currently selected food material detection frame:
the horizontal distance between the center point of the first food material detection frame and the center point of the currently selected food material detection frame is smaller than a first numerical value, and the vertical distance is smaller than a second numerical value, wherein the first numerical value is equal to the sum of half of the width of the currently selected food material detection frame and a first set value, and the second numerical value is equal to the sum of half of the height of the currently selected food material detection frame and a second set value.
Some embodiments of the present application further comprise: if the food materials covered by the currently selected food material detection frame are judged to be the aggregated food materials according to the number of the food material detection frames covered by the currently selected food material detection frame, and the currently selected food material detection frame and the food material types covered by the currently selected food material detection frame are inconsistent, determining the food material type to which each food material detection frame belongs in the currently selected food material detection frame and the food material detection frames covered by the currently selected food material detection frame, calculating the average value of the detection probability of each food material type, and determining the food material type with the highest detection probability average value as the food material type to which the currently selected food material detection frame belongs.
In some embodiments of the present application, determining the food material type to which the corresponding food material group belongs according to the food material type to which the food material detection box included in the food material group obtained by clustering belongs includes: determining the food material type of each food material detection box in one clustered food material group, calculating the average value of the detection probability of each food material type, and determining the food material type with the highest average value of the detection probability as the food material type of the food material group.
In the above embodiments of the application, after the food material detection frames in an image are detected, the covering relation among the frames is first determined according to their positions and sizes, and the frames for clustering are selected according to that relation; the selected frames are then clustered, and the food material type of each corresponding food material group is determined according to the food material types of the detection frames contained in the group, thereby realizing detection and identification of aggregated food materials.
On the basis of common knowledge in the art, the above preferred conditions can be combined arbitrarily to obtain the preferred embodiments of the application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 schematically illustrates a schematic diagram of an intelligent refrigerator in an embodiment of the present application;
fig. 2 schematically illustrates a front side region of a refrigerator in an embodiment of the present application;
fig. 3 is a schematic diagram illustrating a functional structure of a controller of the intelligent refrigerator in the embodiment of the present application;
fig. 4 schematically illustrates a flow chart of a food material detection method provided in an embodiment of the present application;
fig. 5 is a schematic diagram illustrating a food material detection result in an embodiment of the application;
fig. 6 exemplarily shows a food material detection flow in a specific application scenario in the embodiment of the present application;
fig. 7a to 7c schematically illustrate a food material detection process in the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise stated, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
The terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature.
In daily life, the food materials stored in a refrigerator are varied, and they are not stored only as single items: many food materials are stored in an aggregated form, such as a box, a pile, a tray, a bunch or a bag, for example a box of cherry tomatoes, a pile of strawberries, a tray of cherries, or a bunch of bananas. In the embodiments of the present application, such food material units are collectively referred to as a group; that is, one group of food materials may include a plurality of food material individuals, for example a group of strawberries is formed by a plurality of strawberries stacked together.
Due to the complex diversity of food materials, certain errors inevitably exist when food materials are detected and identified from images; in particular, for aggregated food materials, the attributes of the food materials may not be detected or identified accurately. In order to improve the accuracy of food material detection and identification, especially for aggregated food materials, an embodiment of the present application provides an image-based food material detection method; with this method, the type and quantity of stored food materials can be detected and identified regardless of their form (single or aggregated).
The embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Fig. 1 schematically illustrates a structure of an intelligent refrigerator provided in an embodiment of the present application.
As shown in fig. 1, the intelligent refrigerator includes a cabinet 10, a cooling portion (not shown in the figure), and other accessories (for example, an illumination lamp, a thermometer, etc. may be provided in the cabinet, not shown in the figure). The refrigerating system mainly comprises a compressor, a condenser, an evaporator, a capillary throttling element and other components, which together form a closed circulation system. The evaporator can be installed above the interior of the intelligent refrigerator, and the other components on its back.
The cabinet 10 is provided with a door 20, and the door 20 may further be provided with a display screen 50, and the display screen 50 is coupled with the controller (e.g., connected via a circuit).
A camera module 30 may also be provided on the cabinet 10, which may capture images in the front region of the cabinet 10. Taking the plane of the refrigerator door as a first plane, the front region of the cabinet 10 at least comprises the region extending a certain distance outward from that first plane. The camera module can acquire images of this region; that is, after a user opens the door body 20, it can capture hand motion images and images of the stored food materials during food material access.
Fig. 2 shows an exemplary illustration of the front region of the cabinet 10. As shown in the figure, the plane of the intelligent refrigerator door in the closed state is called plane H2, and the plane at distance d from plane H2 is called plane H1; plane H1 is parallel to plane H2, and the space between plane H1 and plane H2 is the front region of the cabinet 10. The camera module is at least capable of acquiring images of this region. The value of d can be set according to factors such as the length of the human arm and the image recognition precision, and can be set to 25 cm to 35 cm, for example.
In some embodiments, the camera module 30 may be disposed at an upper portion of the cabinet 10 near the door 20 so as to be able to photograph an image in a front region of the cabinet 10.
In some embodiments, the camera module 30 may be disposed on a fixing part, which may enable the lens of the camera module to protrude a certain distance from the plane of the door body 20 so as to better capture images in the front region of the cabinet 10. The camera module can be triggered to turn on when a user opens the door body and to turn off when the user closes it. The shape, connection mode and material of the fixing component are not limited in the embodiments of the present application.
In other embodiments, the camera module 30 may be disposed on a movable part, and the movable part may eject the camera module 30 when the door 20 is opened, so that the camera module 30 protrudes out of the plane where the door 20 is located by a certain distance, and when the door 20 is closed, the movable part retracts the camera module 30. The movable part can be linked with the door body 20 in a mechanical connection mode or other modes, so that the door body 20 triggers the control camera module of the movable part when being opened and closed. In some embodiments, the movable component may further include a camera protection cover, the protection cover is opened and the camera module is started after the camera module is ejected, and the protection cover is closed and the camera module is closed after the camera module is retracted, so that the camera module is protected on one hand, and power consumption is saved on the other hand.
It should be noted that the structure of the intelligent refrigerator shown in fig. 1 is only an example, and the size of the intelligent refrigerator and the number of door bodies (for example, a single door body or a plurality of door bodies) are not limited in the embodiment of the present application.
The intelligent refrigerator provided by the embodiment of the application comprises a controller (not shown in fig. 1), the controller is coupled with the camera module (for example, connected through a circuit), and the controller can detect an image acquired by the camera module when a door body of the intelligent refrigerator is opened, so as to obtain the position and size of the food material detection frame and the food material type to which the food material detection frame belongs; then, determining the covering relation among the food material detection frames according to the positions and the sizes of the food material detection frames, and selecting the food material detection frames for clustering according to the covering relation among the food material detection frames; and clustering the selected food material detection boxes, and determining the food material type of the corresponding food material group according to the food material type of the food material detection boxes contained in the food material group obtained by clustering.
Based on the functions implemented by the above-described controller, fig. 3 exemplarily shows a functional structure of the controller.
Fig. 3 schematically shows a functional structure of the intelligent refrigerator controller in the embodiment of the present application. As shown, the controller 300 may include the following functional modules: the food material detection module 301, the food material information decision module 302 and the food material information output module 303.
The camera module shoots the images of the food materials accessed by the user. The food material detection module 301 obtains the image, detects the food material in the image to obtain the position and size of the food material detection frame and the food material type to which the food material detection frame belongs, and transmits the detection result to the food material information decision module 302.
The food material information decision module 302 determines a covering relationship between the food material detection frames according to the positions and sizes of the food material detection frames, selects the food material detection frames for clustering according to the covering relationship between the food material detection frames, clusters the selected food material detection frames for clustering, and determines the food material type to which the corresponding food material group belongs according to the food material type to which the food material detection frames included in the food material group obtained by clustering belong.
The food material information output module 303 outputs the information of the food material groups determined by the food material information decision module 302, which may specifically include the number of food material groups and the food material types to which they belong.
Fig. 4 exemplarily shows a flowchart of the food material detection method provided by the embodiment of the present application.
As shown, the process may include the following steps:
s401: the method comprises the steps of detecting images collected by the intelligent refrigerator camera module to obtain the position and the size of the food material detection frame and the food material type to which the food material detection frame belongs.
In some scenarios, when the door body of the intelligent refrigerator is opened, the camera module disposed on the cabinet of the intelligent refrigerator is started; the camera module collects images of the front region of the cabinet and transmits them to the controller for food material detection.
In this step, the food material detection may be performed using a mature object detection algorithm, for example the SSD (Single Shot MultiBox Detector) or YOLOv3 algorithm. The image to be detected is input into the detection model, which outputs the detection result. It should be noted that the embodiment of the present application does not limit the detection method of the food materials.
The output detection result may include information of one or more food material detection frames; one detection frame may contain a single food material individual or multiple individuals, depending mainly on factors such as the placement form and degree of aggregation of the food materials. The information of one food material detection frame may include the position, size and food material type of the frame. For example, the information of the i-th food material detection frame can be represented as (x_i, y_i, w_i, h_i, c_i, p_i), where i denotes the i-th frame, i = 1, 2, ...; (x_i, y_i) are the coordinates of the center point of the frame; (w_i, h_i) are the width and height of the frame; c_i is the food material category to which the frame belongs; and p_i is the probability (or confidence) with which that food material category was identified.
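For concreteness, the following minimal Python sketch models one such detection record; the DetectionBox name and field layout are illustrative assumptions mirroring the (x_i, y_i, w_i, h_i, c_i, p_i) tuple above, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class DetectionBox:
    """One food material detection frame: (x, y) is the center point,
    (w, h) the width and height, c the food material category, and
    p the detection probability (confidence) for that category."""
    x: float
    y: float
    w: float
    h: float
    c: str
    p: float

    @property
    def area(self) -> float:
        # Used when picking the largest frame as the reference frame (S402).
        return self.w * self.h
```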
Fig. 5 schematically illustrates a food material detection result. As shown in the figure, a number of crabapple fruits stacked together lie on the left side, a few scattered crabapple fruits lie on the right side, and the food material detection frames obtained after detection are drawn in the figure. The detection frame 501 has a large area and almost covers the stacked crabapple fruits on the left; the detection frames 502 to 504 have small areas, each containing only a single complete crabapple fruit, and are covered by the detection frame 501 (shown as hatched frames in the figure); the detection frames 505 and 506 each contain only a single complete crabapple fruit and are not covered by the detection frame 501. The food material type identified for each detection frame in the figure is "crabapple fruit".
S402: and determining the covering relation among the food material detection frames according to the positions and the sizes of the food material detection frames, and selecting the food material detection frames for clustering according to the covering relation among the food material detection frames.
In some embodiments, a candidate food material detection frame set may be maintained, with all detected food material detection frames included in the set at initialization. Based on the candidate set, the following iterative process can be used to select the food material detection frames for clustering:
selecting, according to the sizes of the food material detection frames, the frame with the largest area from the candidate food material detection frame set as the current reference frame, and, once it is selected, deleting it from the candidate set;

if the current reference frame covers other food material detection frames, deleting the food material detection frames covered by the current reference frame from the candidate set.
This process is repeated until the candidate food material detection frame set is empty, at which point all the food material detection frames selected as reference frames have been obtained.

Each time the detection frame with the largest remaining area is selected as a reference frame, it cannot be chosen from among the frames already determined as reference frames or the frames covered by them (that is, a frame that has been selected as a reference frame, or a frame covered by a reference frame, cannot be selected as a reference frame in subsequent iterations). When no food material detection frame remains to be selected as a reference frame, the iteration ends, and the frames selected as reference frames are exactly the frames to be clustered subsequently. (A code sketch combining this selection loop with the coverage condition is given after the condition below.)
Still taking fig. 5 as an example, after the food material detection frames are obtained by detection, since the detection frames 502 to 504 are covered by the detection frame 501, the food material detection frames for clustering obtained after the processing of S402 include: the detection frame 501, the detection frame 505, and the detection frame 506.
In some embodiments, whether one food material detection frame is covered by the reference frame may be determined by:
if the positional relationship between the center point of a certain food material detection frame (referred to herein as a first food material detection frame for descriptive purposes) and the center point of the reference frame satisfies the following condition, it is determined that the first food material detection frame is covered by the reference frame:
the horizontal distance between the center point of the first food material detection frame and the center point of the reference frame is smaller than a first numerical value, and the vertical distance is smaller than a second numerical value, wherein the first numerical value is equal to the sum of half of the width of the reference frame and a first set value, and the second numerical value is equal to the sum of half of the height of the reference frame and a second set value. The first set value and the second set value can be the same or different.
For example, suppose the information of the i-th food material detection frame (the current reference frame) is (x_i, y_i, w_i, h_i, c_i, p_i) and the information of the j-th food material detection frame is (x_j, y_j, w_j, h_j, c_j, p_j). The j-th frame is judged to be covered by the current reference frame if the following condition holds:

|x_j − x_i| < w_i/2 + ε and |y_j − y_i| < h_i/2 + ε    (1)

where ε is a set tolerance parameter (here the first set value and the second set value are both taken as ε).
Each food material detection frame other than the i-th frame is judged in this way to determine whether it is covered by the i-th frame (i.e., the current reference frame).
With this covering judgment method, food material detection frames that are not completely covered by the current reference frame but are partially covered by it can still be judged as covered, so the method has a certain error tolerance.
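The sketch below shows how the covering test of formula (1) and the iterative selection of S402 could be composed, reusing the DetectionBox sketch above; the function names and the single tolerance eps (standing in for both set values) are illustrative assumptions.

```python
from typing import List, Tuple

def is_covered(ref: DetectionBox, box: DetectionBox, eps: float) -> bool:
    # Formula (1): box is covered by ref when the horizontal and vertical
    # distances between center points are below half of ref's width/height
    # plus the tolerance eps.
    return (abs(box.x - ref.x) < ref.w / 2 + eps
            and abs(box.y - ref.y) < ref.h / 2 + eps)

def select_reference_frames(
        boxes: List[DetectionBox], eps: float
) -> List[Tuple[DetectionBox, List[DetectionBox]]]:
    """Iterative selection of S402: repeatedly take the largest remaining
    frame as a reference frame and remove the frames it covers, until the
    candidate set is empty. Returns (reference frame, covered frames) pairs."""
    candidates = list(boxes)  # candidate set, initially all detected frames
    references = []
    while candidates:
        ref = max(candidates, key=lambda b: b.area)  # largest-area frame
        candidates.remove(ref)
        covered = [b for b in candidates if is_covered(ref, b, eps)]
        for b in covered:  # covered frames cannot become reference frames
            candidates.remove(b)
        references.append((ref, covered))
    return references
```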
In some embodiments, after the reference frame is selected and the food material detection frame covered by the reference frame is determined, the food material type to which the reference frame belongs may be determined according to the food material type to which the reference frame and the food material detection frame covered by the reference frame belong.
In some embodiments, for a current reference frame, the food material type to which the reference frame belongs and the food material type to which the food material detection frame covered by the reference frame belongs may be obtained, and corresponding processing is performed according to the following conditions:
case 1: if the food material covered by the reference frame is a single food material type, the food material type of the reference frame is the food material type contained in the information of the reference frame;
case 2: if the food materials covered by the reference frame are the gathered food materials and the reference frame and the food material detection frame covered by the reference frame belong to the same food material types, determining the food material type as the food material type of the reference frame, namely determining the food material type of the reference frame as the food material type contained in the information of the reference frame;
case 3: if the food materials covered by the reference frame are the aggregated food materials and the types of the food materials to which the reference frame and the food material detection frames covered by the reference frame belong are not consistent, determining the type of the food material to which each food material detection frame belongs in the reference frame and the food material detection frames covered by the reference frame, calculating the average value of the detection probability of each food material type, and determining the food material type with the highest detection probability average value as the food material type to which the reference frame belongs.
Whether the food material covered by the reference frame is a single food material or an aggregated food material can be judged from the number of food material detection frames covered by the reference frame. For example, if that number is greater than or equal to a set threshold, the food material covered by the reference frame is judged to be an aggregated food material; otherwise, it is judged to be a single food material.
For example, if the reference frame U_i and the food material detection frames U_{i-j} it covers all have the same food material type parameter c in their information, it is determined that the reference frame and the detection frames it covers constitute an aggregated food material of type c; that is, the food material covered by the reference frame is one group (such as a box, pile, tray, bundle or bag) of food materials of type c.

If the food material type parameters in the information of the reference frame U_i and the detection frames U_{i-j} it covers are not all the same, for example N different food material types are detected in total, the corresponding probability averages are calculated for each of the N types, and the type with the largest average is taken as the food material type of the reference frame. For example, suppose the reference frame U_i and the detection frames U_{i-j} it covers number 5 frames in total, with 2 food material types (type A and type B) in their information: the type parameter in the information of 3 frames is type A with probabilities (or confidences) (0.8, 0.7, 0.6), and that of the other 2 frames is type B with probabilities (0.8, 0.8). The probability mean identified as type A is P_A = (0.8 + 0.7 + 0.6)/3 = 0.7 and that identified as type B is P_B = (0.8 + 0.8)/2 = 0.8; since P_A < P_B, the food material type of the reference frame is determined as type B.
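The per-category probability averaging used in case 3 (and reused for group typing in S403) can be sketched as follows; decide_category is an assumed helper name, and the assertion reproduces the worked example above.

```python
from collections import defaultdict
from typing import List

def decide_category(boxes: List[DetectionBox]) -> str:
    """Return the category with the highest mean detection probability
    over a reference frame and the frames it covers (case 3 above)."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for b in boxes:
        sums[b.c] += b.p
        counts[b.c] += 1
    return max(sums, key=lambda c: sums[c] / counts[c])

# Worked example above: P_A = 0.7 < P_B = 0.8, so type B is chosen.
example = ([DetectionBox(0, 0, 1, 1, "A", p) for p in (0.8, 0.7, 0.6)]
           + [DetectionBox(0, 0, 1, 1, "B", 0.8) for _ in range(2)])
assert decide_category(example) == "B"
```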
S403: and clustering the food material detection boxes selected in the step S402, and determining the food material type of the corresponding food material group according to the food material type of the food material detection boxes contained in the food material group obtained by clustering.
In this step, a K-means clustering algorithm can be adopted for the clustering calculation. K-means is an iteratively solved cluster analysis algorithm: K objects are randomly selected as initial cluster centers (seed centers), the distance between each object and each seed cluster center is calculated, and each object is assigned to the nearest cluster center. A cluster center together with the objects assigned to it represents one cluster. After each assignment, the cluster center of each cluster is recalculated from the objects currently in the cluster. This process repeats until a termination condition is met; the termination condition may be that no (or a minimum number of) objects are reassigned to different clusters, that no (or a minimum number of) cluster centers change, or that the sum of squared errors reaches a local minimum. In the embodiment of the application, the most appropriate K value for the K-means clustering algorithm can be found using the silhouette coefficient method.
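As one possible realization of this step, the sketch below clusters the reference-frame center points with scikit-learn's K-means and selects K by the silhouette coefficient; the use of scikit-learn, the k_max bound, and the feature choice (center coordinates only) are assumptions, not specified by the embodiment.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def cluster_reference_frames(refs: list, k_max: int = 5) -> np.ndarray:
    """Cluster reference frames by their center points, choosing the K
    with the best silhouette coefficient. Returns one cluster label
    (food material group index) per reference frame."""
    centers = np.array([[r.x, r.y] for r in refs])
    n = len(centers)
    if n < 3:
        # Too few frames for a silhouette score; treat them as one group.
        return np.zeros(n, dtype=int)
    best_labels, best_score = np.zeros(n, dtype=int), -1.0
    for k in range(2, min(k_max, n - 1) + 1):  # silhouette needs k <= n - 1
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(centers)
        score = silhouette_score(centers, labels)
        if score > best_score:
            best_labels, best_score = labels, score
    return best_labels
```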
After the food material detection boxes are clustered, the food material detection boxes are divided into one or more clusters to form one or more food material groups, and each food material group comprises one or more food material detection boxes.
For each food material group, the food material type of the food material group can be determined according to the food material type of each clustered food material detection box in the food material group.
In some embodiments, the average of the detection probabilities of each food material category in the food material group can be calculated, and the category with the highest average probability determined as the food material category to which the group belongs. For example, a clustered food material group contains 5 food material detection frames, of which 3 are identified as food material type A with identification probabilities (0.8, 0.7, 0.6), and the other 2 are identified as food material type B with identification probabilities (0.8, 0.8). The probability average of type A is P_A = (0.8 + 0.7 + 0.6)/3 = 0.7 and that of type B is P_B = (0.8 + 0.8)/2 = 0.8; since P_A < P_B, the food material category to which the group belongs is determined as category B.
If a clustered food material group contains only one food material detection frame (a reference frame), the food material type of the group is the food material type of that detection frame.
Further, after determining the food material group and the food material type to which the food material group belongs, the method may further include the following steps:
s404: and outputting the number of the food material groups and the belonging food material type information so that a user can know the information and further know the food material access condition.
In some embodiments, the food material group information may be displayed on the display screen of the intelligent refrigerator, or pushed to a user terminal; for example, the information may be sent to a server, which then sends it to the user terminal through a mobile communication network. Of course, if the intelligent refrigerator supports a voice function, the information can be broadcast by voice through the intelligent refrigerator, or sent to a smart speaker so that the speaker performs the voice broadcast. The embodiment of the present application does not limit the output manner of the above information.
According to one or a combination of the foregoing embodiments, fig. 6 exemplarily shows a food material detection process in a specific application scenario, and as shown in the figure, the process may include the following steps:
s601: and acquiring and detecting the food material image to obtain a food material detection frame.
The information of the food material detection frame is represented by (x, y, w, h, c, p), wherein (x, y) represents the center point coordinates of the food material detection frame, (w, h) represents the width and height of the food material detection frame, c represents the food material type to which the detection frame belongs, and p represents the probability of being identified as the food material type c.
S602: judging whether the candidate food material detection box set is empty or not, if so, indicating that the food material detection boxes are processed, and then switching to S607 for clustering; if not, it indicates that there is an unprocessed detection frame, and therefore the process proceeds to S603 to perform the overlay process. Wherein the candidate food material detection box set is initially set to include all detected food material detection boxes.
S603 to S606: the covering process.
In S603, the area of each detection frame is calculated; taking the i-th detection frame as an example, its area is S_i = w_i × h_i. According to the areas of the detection frames, the detection frame with the largest area is selected as the reference frame U_1.
In S604, the coverage calculation is performed. The coverage relation between the reference frame and the other detection frames can be determined by formula (1), and the detection frames covered by the reference frame are marked as U_{1-1}, U_{1-2}, ....
According to the number of detection frames covered by the reference frame, it can be determined whether the food material covered by the reference frame is an aggregated food material or a single food material. Specifically, let N_i denote the number of detection frames covered by the reference frame U_i. If N_i ≥ η, the food material covered by the reference frame U_i is considered to be an aggregated food material; otherwise, it is determined to be a single food material. Here η is a selected quantity parameter, with η ≥ 0.
For example, the following shows how to judge whether the food material covered by the reference frame is an aggregated food material or a single food material under several values of η:

Case 1: η = 0: regardless of whether the reference frame U_i covers other detection frames, the food material covered by U_i is an aggregated food material;

Case 2: η = 1: if the reference frame U_i covers at least one detection frame, the food material covered by U_i is an aggregated food material; if U_i covers no other detection frame, the food material covered by U_i is a single food material;

Case 3: η = 2: if the reference frame U_i covers at least two detection frames, the food material covered by U_i is an aggregated food material; if U_i covers fewer than two detection frames, the food material covered by U_i is a single food material.
And so on. With this method, setting or adjusting the value of η for different conditions can reduce judgment errors and improve detection precision.
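The aggregation test itself reduces to a threshold comparison; a short sketch covering the three η cases above follows (the function name is an assumption):

```python
def is_aggregated(num_covered: int, eta: int) -> bool:
    # N_i >= eta: the reference frame covers an aggregated food material.
    return num_covered >= eta

# The three cases above: eta = 0 treats every reference frame as aggregated;
# eta = 1 requires at least one covered frame; eta = 2 at least two.
assert is_aggregated(0, 0) and not is_aggregated(0, 1) and not is_aggregated(1, 2)
```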
In S605, the food material category to which the reference frame U_i belongs is determined.
Specifically, if the food material covered by the reference frame U_i is a single food material, the food material category to which U_i belongs is the food material category c_i in its information (x_i, y_i, w_i, h_i, c_i, p_i).
If the food material covered by the reference frame U_i is an aggregated food material, the kind of the aggregated food material is judged as follows:

If the reference frame U_i and the detection frames U_{i-j} it covers all have the same food material type c, the reference frame U_i and the detection frames it covers are considered to represent an aggregated food material of type c, counted as one group; if the food material types of the reference frame U_i and the detection frames U_{i-j} it covers are not all the same, the probability average of each identified food material type is calculated, and the type with the largest average is taken as the type of the aggregated food material.
In S606, the reference frame U_i and the detection frames covered by it are deleted from the candidate food material detection frame set, to avoid processing them again in the subsequent processing.
The above steps S603 to S606 are repeated until the set of candidate food material detection boxes is empty (i.e., all detection boxes are processed).
Taking the second execution of S603 to S606 as an example: the detection frame with the largest area is selected from the candidate food material detection frame set as the second reference frame U_2, the coverage calculation against the other detection frames in the set is performed, the detection frames covered by U_2 are marked as U_{2-1}, U_{2-2}, ..., and the food material types of the reference frame U_2 and the detection frames it covers are judged.
S607: and clustering all the reference frames, and determining the food material type of each food material group obtained by clustering.
The detection frames remaining after the covering process are exactly the reference frames, and all of them participate in the clustering.
S608: and outputting the food material detection result, specifically including the final clustered food material group and the type of each group of food materials.
In the above flow shown in fig. 6, the detailed implementation of some steps may refer to the description of the foregoing embodiments, and will not be repeated here.
Next, the embodiments of the present application will be described by taking fig. 7a to 7c as examples.
As shown in FIG. 7a, after the image is detected, detection frames 701-709 are obtained through detection, and the food material type of each detection frame is strawberry. The set of candidate food material detection boxes is initialized so that the detection boxes 701-709 are all included in the set, i.e. the set is denoted as { detection box 701, detection box 702, detection box 703, detection box 704, detection box 705, detection box 706, detection box 707, detection box 708, detection box 709 }.
In the first covering process, the detection frame 701 with the largest area is selected as the first reference frame; it covers the detection frames 702 to 705, so the detection frames 701 to 705 are deleted from the candidate food material detection frame set, which is then denoted {detection frame 706, detection frame 707, detection frame 708, detection frame 709}. The food material covered by the first reference frame is an aggregated food material, and the reference frame and the detection frames it covers are all of type strawberry, so the food material covered by the first reference frame is strawberry.

In the second covering process, the detection frame 706 with the largest area is selected as the second reference frame; it covers the detection frame 707, so the detection frames 706 and 707 are deleted from the set, which becomes {detection frame 708, detection frame 709}. The food material covered by the second reference frame is an aggregated food material of type strawberry.

In the third covering process, the detection frame 709 with the largest area is selected as the third reference frame; it covers no other detection frames, so only the detection frame 709 is deleted, leaving {detection frame 708}. The food material covered by the third reference frame is a single food material of type strawberry.

In the fourth covering process, the detection frame 708 is selected as the fourth reference frame; it covers no other detection frames, so the detection frame 708 is deleted and the candidate food material detection frame set is empty. The food material covered by the fourth reference frame is a single food material of type strawberry.
Since the candidate food material detection frame set is now empty, all detection frames have been processed. In the four covering processes, the detection frames 701, 706, 709 and 708 are selected in turn as reference frames. As shown in fig. 7b, a clustering algorithm is performed on these detection frames, finally yielding 2 clusters, and the food material type corresponding to both clusters is determined to be "strawberry"; that is, the first strawberry group 710 and the second strawberry group 720 shown in fig. 7c are obtained.
In the above embodiments of the application, after the food material detection frames in an image are detected, the covering relation among the frames is first determined according to their positions and sizes, and the frames for clustering are selected according to that relation; the selected frames are then clustered, and the food material type of each corresponding food material group is determined according to the food material types of the detection frames contained in the group, thereby realizing detection and identification of aggregated food materials.
According to yet another aspect of the exemplary embodiments, the present application further provides a computer storage medium, in which computer program instructions are stored, and when the instructions are run on a computer, the instructions cause the computer to execute the processing method as described above.
Since the intelligent terminal and the computer storage medium in the embodiment of the present application can be applied to the processing method, the technical effect that can be obtained by the intelligent terminal and the computer storage medium can also refer to the embodiment of the method, and the embodiment of the present application is not described herein again.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
While specific embodiments of the present application have been described above, it will be appreciated by those skilled in the art that these are by way of example only, and that the scope of the present application is defined by the appended claims. Various changes and modifications to these embodiments may be made by those skilled in the art without departing from the spirit and principles of this application, and these changes and modifications are intended to be included within the scope of this application.
Claims (8)
1. An intelligent refrigerator, comprising: a box body and a refrigerating part;
the box body is provided with a camera module used for collecting food material images;
the camera module is connected with a controller, and the controller is configured to:
detecting the food material image to obtain the position and the size of a food material detection frame and the food material type of the food material detection frame;
determining the covering relation among the food material detection frames according to the positions and the sizes of the food material detection frames, and selecting the food material detection frames for clustering according to the covering relation among the food material detection frames;
clustering the food material detection frames selected for clustering, and determining the food material types of the corresponding food material groups according to the food material types to which the food material detection frames contained in the food material groups obtained by clustering belong;
the controller is further configured to:
performing the following operations at least once until the candidate food material detection box set is empty:
selecting a food material detection frame with the largest area from a candidate food material detection frame set according to the size of the food material detection frame, determining the currently selected food material detection frame as a food material detection frame for clustering, and deleting the currently selected food material detection frame from the candidate food material detection frame set; wherein the candidate food material detection box set is initially set to include all detected food material detection boxes;
and if the currently selected food material detection frame covers other food material detection frames, deleting the food material detection frames covered by the currently selected food material detection frame from the candidate food material detection frame set.
2. The intelligent refrigerator of claim 1, wherein the controller is further configured to:
if the position relation between the center point of the currently selected food material detection frame and the center point of a first food material detection frame in the candidate food material detection frame set meets the following condition, determining that the first food material detection frame is covered by the currently selected food material detection frame:
the horizontal distance between the center point of the first food material detection frame and the center point of the currently selected food material detection frame is smaller than a first numerical value, and the vertical distance is smaller than a second numerical value, wherein the first numerical value is equal to the sum of half of the width of the currently selected food material detection frame and a first set value, and the second numerical value is equal to the sum of half of the height of the currently selected food material detection frame and a second set value.
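A hedged sketch of this coverage test follows; `margin_x` and `margin_y` stand for the first and second set values, which the claim leaves as free parameters:

```python
def covers(selected, first, margin_x=0.0, margin_y=0.0):
    """Claim-2 style test: the first frame counts as covered when its
    center lies within the selected frame enlarged by the set values."""
    first_value = selected.w / 2 + margin_x    # first numerical value
    second_value = selected.h / 2 + margin_y   # second numerical value
    return (abs(first.cx - selected.cx) < first_value and
            abs(first.cy - selected.cy) < second_value)
```

With both set values at zero, the test reduces to the first frame's center point lying inside the currently selected frame.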
3. The intelligent refrigerator of claim 1, wherein the controller is further configured to:
if the food materials covered by the currently selected food material detection frame are determined to be aggregated food materials according to the number of food material detection frames covered by the currently selected food material detection frame, and the food material types of the currently selected food material detection frame and the food material detection frames it covers are inconsistent, determining the food material type to which each of the currently selected food material detection frame and the food material detection frames it covers belongs, calculating the average detection probability of each food material type, and determining the food material type with the highest average detection probability as the food material type to which the currently selected food material detection frame belongs.
4. The intelligent refrigerator of claim 1, wherein the controller is further configured to:
determining the food material type to which each food material detection frame in a clustered food material group belongs, calculating the average detection probability of each food material type, and determining the food material type with the highest average detection probability as the food material type of the food material group.
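Claims 3 and 4 share the same voting step: average the detection probability per food material type over a set of frames and keep the type with the highest average. A minimal sketch, assuming each frame carries the illustrative `food_type` and `prob` fields from the earlier sketch:

```python
from collections import defaultdict

def dominant_food_type(frames):
    """Average the detection probability per food material type and
    return the type with the highest average."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for f in frames:
        sums[f.food_type] += f.prob
        counts[f.food_type] += 1
    return max(sums, key=lambda t: sums[t] / counts[t])
```

For claim 3 the input would be the currently selected frame together with the frames it covers (e.g. `dominant_food_type([f] + f.covered)`); for claim 4 it would be the frames of one clustered food material group.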
5. A food material detection method applied to an intelligent refrigerator, characterized by comprising:
detecting a food material image to obtain the position and the size of a food material detection frame and the food material type of the food material detection frame;
determining the covering relation among the food material detection frames according to the positions and the sizes of the food material detection frames, and selecting the food material detection frames for clustering according to the covering relation among the food material detection frames;
clustering the food material detection frames selected for clustering, and determining the food material type of the corresponding food material group according to the food material types to which the food material detection frames contained in the clustered food material group belong;
wherein determining the covering relation among the food material detection frames according to the positions and the sizes of the food material detection frames, and selecting the food material detection frames for clustering according to the covering relation among the food material detection frames, comprises:
performing the following operations at least once until the candidate food material detection frame set is empty:
selecting the food material detection frame with the largest area from a candidate food material detection frame set according to the sizes of the food material detection frames, determining the currently selected food material detection frame as a food material detection frame for clustering, and deleting the currently selected food material detection frame from the candidate food material detection frame set; wherein the candidate food material detection frame set initially includes all detected food material detection frames;
and if the currently selected food material detection frame covers other food material detection frames, deleting the food material detection frames covered by the currently selected food material detection frame from the candidate food material detection frame set.
6. The method of claim 5, wherein whether the currently selected food material detection frame covers other food material detection frames is determined by:
if the position relation between the center point of the currently selected food material detection frame and the center point of a first food material detection frame in the candidate food material detection frame set meets the following condition, determining that the first food material detection frame is covered by the currently selected food material detection frame:
the horizontal distance between the center point of the first food material detection frame and the center point of the currently selected food material detection frame is smaller than a first numerical value, and the vertical distance is smaller than a second numerical value, wherein the first numerical value is equal to the sum of half of the width of the currently selected food material detection frame and a first set value, and the second numerical value is equal to the sum of half of the height of the currently selected food material detection frame and a second set value.
7. The method of claim 5, further comprising:
if the food materials covered by the currently selected food material detection frame are determined to be aggregated food materials according to the number of food material detection frames covered by the currently selected food material detection frame, and the food material types of the currently selected food material detection frame and the food material detection frames it covers are inconsistent, determining the food material type to which each of the currently selected food material detection frame and the food material detection frames it covers belongs, calculating the average detection probability of each food material type, and determining the food material type with the highest average detection probability as the food material type to which the currently selected food material detection frame belongs.
8. The method of claim 5, wherein determining the food material type of the corresponding food material group according to the food material types to which the food material detection frames contained in the clustered food material group belong comprises:
determining the food material type to which each food material detection frame in a clustered food material group belongs, calculating the average detection probability of each food material type, and determining the food material type with the highest average detection probability as the food material type of the food material group.
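Putting the pieces together, the method of claims 5 to 8 could run end to end as sketched below, reusing the helper sketches above. The claims do not fix a clustering algorithm, so a naive center-distance grouping (with an assumed `cluster_dist` threshold) stands in for it here:

```python
def detect_food_materials(detections, margin_x=10.0, margin_y=10.0,
                          cluster_dist=50.0):
    """End-to-end sketch: select frames greedily, group nearby frames,
    and label each group by its dominant food material type."""
    cov = lambda a, b: covers(a, b, margin_x, margin_y)
    frames = select_frames_for_clustering(detections, cov)

    groups = []                        # naive center-distance clustering
    for frame in frames:
        for group in groups:
            if any(abs(frame.cx - g.cx) + abs(frame.cy - g.cy) < cluster_dist
                   for g in group):
                group.append(frame)
                break
        else:                          # no nearby group found
            groups.append([frame])

    # per claim 8: each group takes the type with the highest
    # average detection probability among its frames
    return [(dominant_food_type(group), group) for group in groups]
```

The margins and the clustering threshold are tuning parameters; the patent only requires that frames selected via the covering relation are clustered and that each group is labeled by the averaged detection probabilities.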
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010342779.0A CN113465265B (en) | 2020-04-27 | 2020-04-27 | Intelligent refrigerator and food material detection method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113465265A CN113465265A (en) | 2021-10-01 |
CN113465265B (en) | 2022-04-01
Family
ID=77865832
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010342779.0A (Active) | Intelligent refrigerator and food material detection method | 2020-04-27 | 2020-04-27
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113465265B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106907890A (en) * | 2017-02-23 | 2017-06-30 | 海信(山东)冰箱有限公司 | The food management method and intelligent refrigerator of a kind of intelligent refrigerator |
CN107886028A (en) * | 2016-09-29 | 2018-04-06 | 九阳股份有限公司 | The food materials input method and food materials input device of a kind of refrigerator |
CN108960266A (en) * | 2017-05-22 | 2018-12-07 | 阿里巴巴集团控股有限公司 | Image object detection method and device |
CN109725117A (en) * | 2017-10-31 | 2019-05-07 | 青岛海尔智能技术研发有限公司 | The method and device that foodstuff calories detect in refrigerator |
WO2019177343A1 (en) * | 2018-03-13 | 2019-09-19 | 삼성전자주식회사 | Refrigerator, and system and method for controlling same |
CN110674789A (en) * | 2019-10-12 | 2020-01-10 | 海信集团有限公司 | Food material management method and refrigerator |
Also Published As
Publication number | Publication date |
---|---|
CN113465265A (en) | 2021-10-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200218888A1 (en) | Target Re-Identification | |
CN112767389B (en) | Gastroscope image focus identification method and device based on FCOS algorithm | |
WO2019007253A1 (en) | Image recognition method, apparatus and device, and readable medium | |
CN111310821B (en) | Multi-view feature fusion method, system, computer equipment and storage medium | |
CN110942091A (en) | Semi-supervised few-sample image classification method for searching reliable abnormal data center | |
CN110727756A (en) | Management method and device of space-time trajectory data | |
WO2007099495A1 (en) | Identifying set of image characteristics for assessing similarity of images | |
CN113465266B (en) | Food material storage management method, intelligent refrigerator and server | |
JP2013097645A (en) | Recognition support device, recognition support method and program | |
KR20120112293A (en) | Apparatus and method for detecting position of moving unit | |
CN113947770B (en) | Method for identifying object placed in different areas of intelligent cabinet | |
CN113465264A (en) | Intelligent refrigerator and food material management method | |
CN113465265B (en) | Intelligent refrigerator and food material detection method | |
CN110443267A (en) | Erroneous detection filter method, device, filming apparatus and storage medium | |
CN111738184B (en) | Commodity picking and placing identification method, device, system and equipment | |
CN114743195A (en) | Thyroid cell pathology digital image recognizer training method and image recognition method | |
CN113124635B (en) | Refrigerator with a door | |
CN118114985A (en) | Customer service risk assessment method and system based on bidirectional feedback mechanism | |
Adak | Identification of plant species by deep learning and providing as a mobile application | |
CN115424054A (en) | Image identification method, storage medium and system for refrigerator and refrigerator | |
CN113496245A (en) | Intelligent refrigerator and method for identifying food material storing and taking | |
CN106557523A (en) | Presentation graphics system of selection and equipment and object images search method and equipment | |
CN113465251B (en) | Intelligent refrigerator and food material identification method | |
KR20180090724A (en) | Method for controlling refrigerator and refrigerator controller | |
CN114359957A (en) | Human body image classification method, device and equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |