CN113780182A - Method and system for detecting food freshness - Google Patents

Method and system for detecting food freshness

Info

Publication number
CN113780182A
Authority
CN
China
Prior art keywords
food
image
freshness
storage device
image acquisition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202111068990.9A
Other languages
Chinese (zh)
Inventor
刘艳芳
黄玉全
郭斌
田雪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fuyang Institute Of Technology
Original Assignee
Fuyang Institute Of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuyang Institute Of Technology filed Critical Fuyang Institute Of Technology
Priority to CN202111068990.9A priority Critical patent/CN113780182A/en
Publication of CN113780182A publication Critical patent/CN113780182A/en
Withdrawn legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods

Abstract

The embodiments of this specification provide a method and a system for detecting food freshness. The method comprises: acquiring a first image of food stored in a food storage device based on an image acquisition device, wherein the image acquisition device captures images at a plurality of positions within the food storage device by moving along a slide rail; and determining, based on the first image, the food type of the food and the freshness information corresponding to each food item.

Description

Method and system for detecting food freshness
Technical Field
The specification relates to the field of food safety detection, in particular to a method and a system for detecting food freshness.
Background
With the improvement of people's quality of life, food safety has become a topic of growing concern. The freshness of sealed, packaged food can be judged from its production date and shelf life. However, it is difficult for people to judge the freshness of unsealed or unpackaged foods, such as fruits and vegetables, which have no exact production date or shelf life, and this difficulty may pose a hazard to public health.
Therefore, a method for detecting the freshness of food is needed to help people detect how fresh their food is.
Disclosure of Invention
One of the embodiments of the present specification provides a method for detecting food freshness. The method comprises the following steps: acquiring a first image of food stored in a food storage device based on an image acquisition device, wherein the image acquisition device captures images at a plurality of positions within the food storage device by moving along a slide rail; and determining, based on the first image, the food type of the food and the freshness information corresponding to each food item.
One of the embodiments of the present specification provides a system for detecting food freshness, including: a first image acquisition module, which acquires a first image of food stored in a food storage device based on an image acquisition device, wherein the image acquisition device captures images at a plurality of positions within the food storage device by moving along a slide rail; and a first determination module, which determines, based on the first image, the food type of the food and the freshness information corresponding to each food item.
One of the embodiments of the present specification provides an apparatus for detecting food freshness, including a processor configured to execute the method for detecting food freshness according to any one of the above embodiments.
One of the embodiments of the present specification provides a computer-readable storage medium storing computer instructions; when a computer reads the computer instructions in the storage medium, the computer executes the method for detecting food freshness according to any one of the above embodiments.
Drawings
The present description will be further explained by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals are used to indicate like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of a method for detecting food freshness according to some embodiments of the present description;
FIG. 2 is a block diagram of a system 200 for detecting food freshness in accordance with some embodiments of the present description;
FIG. 3 is an exemplary flow chart illustrating the determination of food product type and freshness information corresponding to each food product according to some embodiments of the present description;
FIG. 4 is an exemplary flow chart for adjusting freshness information of food products, shown in accordance with some embodiments of the present description;
FIG. 5 is a schematic illustration of determining a reminder time for each food item according to some embodiments of the present description;
FIG. 6 is a schematic illustration of determining a reminder time for each food item according to some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly described below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that for a person skilled in the art, the present description can also be applied to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "apparatus", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; these steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Flow charts are used in this description to illustrate operations performed by a system according to embodiments of the present description. It should be understood that the operations are not necessarily performed in the exact order shown. Rather, the various steps may be processed in reverse order or simultaneously. Other operations may also be added to these processes, or one or more steps may be removed from them.
Fig. 1 is a schematic diagram of an application scenario of a method for detecting food freshness according to some embodiments of the present description.
The application scenario of the method for detecting food freshness may include a server 110, a network 120, a food storage device 130, a storage device 140, a user terminal 150, and the like.
The server 110 can process data and/or information from at least one component in an application scenario of a method of detecting food freshness. The server may communicate with the food storage device to provide various functions of the service. For example, the server may retrieve data (e.g., a first image) from a food storage device and output the data (e.g., a reminder time) to the user terminal via the network. The server may be used to process data and/or information from at least one component in an application scenario or external data source (e.g., a cloud data center) of the method of detecting food freshness. For example, the server may process the second image to obtain the number of uses of the food. In some embodiments, the server may be a single server or a group of servers. The set of servers can be centralized or distributed (e.g., the servers can be a distributed system). In some embodiments, the server may be regional or remote. In some embodiments, the server may be implemented on a cloud platform, or provided in a virtual manner. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an internal cloud, a multi-tiered cloud, and the like, or any combination thereof.
The network 120 may connect the various components of the system and/or connect the system with external portions. The network enables communication between the system components and with the system and external components, facilitating the exchange of data and/or information. In some embodiments, the network may be any one or more of a wired network or a wireless network. For example, the network may include a cable network, a fiber optic network, a telecommunications network, the internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a bluetooth network, a ZigBee network (ZigBee), Near Field Communication (NFC), an in-device bus, an in-device line, a cable connection, and the like, or any combination thereof. In some embodiments, the network connections between the various parts of the system may be in one of the manners described above, or in multiple manners. In some embodiments, the network may be a point-to-point, shared, centralized, etc. variety of topologies or a combination of topologies. In some embodiments, the network may include one or more network access points. For example, the network 120 may include wired or wireless network access points, such as base stations and/or network switching points 120-1, 120-2, …, through which one or more components in an application scenario of the method of detecting food freshness may connect to the network to exchange data and/or information.
The food storage device 130 may be used to store food such as radish 130-3, cabbage 130-4, milk 130-5, etc. The food storage device 130 may further include an image capture device 130-1 and a slide rail 130-2. In some embodiments, the food storage device may further acquire the first image and the second image based on the image capture device and send to the server over the network. In some embodiments, the food storage device may also store temperature data. The food storage unit may include one or more storage assemblies, each of which may be a separate device or part of another device.
Storage device 140 may store data and/or instructions. In some embodiments, the storage device may store data/information obtained from a server or the like. In some embodiments, a storage device may store data and/or instructions for execution by, or for use by, a server to perform the exemplary methods described in this specification. In some embodiments, the storage device may include mass storage, removable storage, Random Access Memory (RAM), Read Only Memory (ROM), and the like, or any combination thereof. Exemplary mass storage may include magnetic disks, optical disks, solid state disks, and the like. Exemplary removable memory may include flash drives, floppy disks, optical disks, memory cards, compact disks, magnetic tape, and the like. Exemplary random access memories may include Dynamic RAM (DRAM), double-data-rate synchronous dynamic RAM (DDR SDRAM), Static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitance RAM (Z-RAM), and the like. Exemplary read-only memories may include Mask ROM (MROM), Programmable ROM (PROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), Compact Disk ROM (CD-ROM), Digital Versatile Disk ROM, and the like. In some embodiments, the storage device may be implemented by a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an internal cloud, a multi-tiered cloud, and the like, or any combination thereof. In some embodiments, the storage device may be connected to a network to communicate with one or more components in an application scenario of the method of detecting food freshness. One or more components in an application scenario of the method of detecting food freshness may access data or instructions stored in a storage device over a network.
In some embodiments, the storage device may be directly connected or in communication with one or more components in an application scenario of the method of detecting food freshness. In some embodiments, the storage device may be part of a server.
The user terminal 150 refers to a terminal device used by a user. In some embodiments, a user may query information about the food in the food storage device via the user terminal. In some embodiments, the server obtains information about food in the food storage device through the network and then reminds the user through the user terminal. In some embodiments, the user terminal 150 may be a mobile device 150-1, a tablet computer 150-2, a laptop computer 150-3, a desktop computer 150-4, or another device having input and/or output capabilities, or the like, or any combination thereof. The above examples are intended only to illustrate the broad scope of user terminal devices, not to limit it.
It should be noted that the above description of the application scenario of the method for detecting food freshness is only for convenience of description, and does not limit the present specification to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, any combination of components or sub-system may be configured to connect to other components without departing from such teachings.
FIG. 2 is a block diagram of a system 200 for detecting food freshness in accordance with some embodiments of the present description.
In some embodiments, the system 200 for detecting food freshness can include a first image acquisition module 210 and a first determination module 220.
The first image acquisition module can be used to acquire a first image of food stored in the food storage device based on the image acquisition device, wherein the image acquisition device captures images at a plurality of positions within the food storage device by moving along the slide rail. For more details on the image acquisition device, the food storage device, the slide rail, and the first image, refer to fig. 3 and its related description, which are not repeated herein.
The first determining module can be used for determining the food type of the food and the freshness information corresponding to each food based on the first image. For more details on the type of food and the freshness information, refer to fig. 3 and its related description, which are not repeated herein.
In some embodiments, the system 200 for detecting food freshness can further include a second image acquisition module 230, a second determination module 240, and an adjustment module 250.
The second image acquisition module can be used for acquiring a second image of the food storage device in a door-opening state based on the image acquisition device. Further details regarding the door open state and the second image are provided in fig. 4 and its related description, which are not repeated herein.
The second determination module may be configured to determine a number of uses of the food item based on the second image. For more details on the number of uses, refer to fig. 4 and its related description, which are not repeated herein.
The adjustment module may be configured to adjust freshness information of the corresponding food based on the number of uses.
In some embodiments, the system 200 for detecting food freshness can further include a first reminder time determination module 260. The first reminder time determination module may be configured to take an image sequence as the input of the time determination model and obtain the reminder time corresponding to each food item, where the image sequence includes first images of the same food item at multiple time points. For more details on the image sequence, the time determination model, the reminder time, and the multiple time points, refer to fig. 5 and its related description, which are not repeated herein.
In some embodiments, the system 200 for detecting food freshness can further include a second reminder time determination module 270. The second reminder time determination module can be used to determine the reminder time corresponding to each food item based on the freshness information of each food item.
It should be understood that the system and its modules shown in FIG. 2 may be implemented in a variety of ways. For example, in some embodiments, a processing device and its modules may be implemented in hardware, software, or a combination of software and hardware. Wherein the hardware portion may be implemented using dedicated logic; the software portions may then be stored in a memory and executed by a suitable instruction execution system (e.g., a microprocessor or specially designed hardware). Those skilled in the art will appreciate that the processing devices and modules thereof described above may be implemented via computer-executable instructions. The system and its modules of the present specification may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips, transistors, or programmable hardware devices such as field programmable gate arrays, programmable logic devices, or the like, but also by software executed by various types of processors, for example, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above descriptions of the system for detecting food freshness and its modules are only for convenience of description, and do not limit the present specification to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, any combination of modules or connection of a sub-system to other modules may be configured without departing from such teachings. In some embodiments, each module disclosed in fig. 2 may be a different module in a system, or a single module may implement the functions of two or more of the modules described above. For example, the modules may share one memory module, or each module may have its own memory module. Such variations are within the scope of the present disclosure.
FIG. 3 is an exemplary flow chart for detecting food freshness in accordance with some embodiments of the present description. As shown in fig. 3, the process 300 includes the following steps. In some embodiments, the process 300 may be performed by the first image acquisition module 210 and the first determination module 220.
Step 310: acquiring a first image of food stored in the food storage device based on an image acquisition device, wherein the image acquisition device captures images at a plurality of positions within the food storage device by moving along a slide rail.
The image capture device refers to a device that can take photographs. In some embodiments, the image capture device may be a camera, which captures a first image of the food in the food storage device. In some embodiments, the image capture device may also be another photographing device, such as a video camera. The video camera can record video of the food in the food storage device, and video frames can be selected from the video as first images.
In some embodiments, there may be a single image capture device within the food storage device; in other embodiments, there may be multiple image capture devices.
The food storage device may refer to a device that stores food. In some embodiments, the food storage device may be a refrigerator. In some embodiments, the food storage device may also be other devices capable of storing food, such as a freezer, a fresh food cabinet.
The first image is an image of the food item contained in the food storage device. For example, the first image is an image of an apple contained in a refrigerator.
In some embodiments, the image capture device may be moved to a plurality of different positions for capturing the first image using a slide rail mounted within the food storage device. For example, the camera may be moved between different levels of the refrigerator by a slide rail mounted within the refrigerator to acquire the first image.
In some embodiments, the image capture device may acquire a first image of the food within the food storage device at a plurality of time points. The time points may be preset; for example, a first image may be acquired every 10 minutes.
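The multi-position, multi-time-point capture described above can be sketched as a small scheduling loop. This is only an illustrative sketch: the `move_to` and `take_photo` callbacks and the rail positions are hypothetical stand-ins for the slide-rail and camera drivers, not an API defined in this specification.

```python
import time

# Hypothetical rail positions (e.g., shelf heights in cm); the
# specification does not define concrete coordinates.
RAIL_POSITIONS = [10, 40, 70]

def capture_first_images(move_to, take_photo, rounds=1, interval_s=600):
    """Move the camera to each rail position and capture a first image,
    repeating every `interval_s` seconds (600 s matches the 10-minute
    interval mentioned in the text).

    `move_to(position)` and `take_photo()` are assumed driver callbacks.
    Returns a list of (round, position, image) tuples.
    """
    images = []
    for r in range(rounds):
        for pos in RAIL_POSITIONS:
            move_to(pos)  # slide the camera to this position
            images.append((r, pos, take_photo()))
        if r < rounds - 1:
            time.sleep(interval_s)
    return images
```

A driver for a real device would replace the two callbacks with motor and camera control.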
In step 320, the food type of the food and the freshness information corresponding to each food are determined based on the first image.
The food type may refer to the type of food in the food storage device. For example, the food type may be apple, cabbage, milk, etc.
Freshness information may refer to information indicating the degree of freshness of food in the food storage device. In some embodiments, freshness information can be represented by a score. For example, in the food storage device the freshness score of a banana may be 80 and the freshness score of spinach 40; the higher the freshness score, the fresher the food. In some embodiments, freshness information can also be represented in other ways, for example by a grade, where freshness grades range from I to V, with I being least fresh and V most fresh.
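A score-based representation can be reduced to the grade-based one with a simple mapping. The equal 20-point bands below are an illustrative assumption; the text only fixes the endpoints (grade I least fresh, grade V most fresh):

```python
def freshness_grade(score):
    """Map a 0-100 freshness score to a grade I-V (V = most fresh).

    The equal 20-point bands are an assumption for illustration.
    """
    if not 0 <= score <= 100:
        raise ValueError("freshness score must be in [0, 100]")
    grades = ["I", "II", "III", "IV", "V"]
    return grades[min(int(score // 20), 4)]
```

Under this mapping, the banana scoring 80 falls in grade V and the spinach scoring 40 in grade III, consistent with the banana being fresher.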
In some embodiments, the first image may be input into a classification and discrimination model that determines the food types of the food in the food storage device and the freshness information corresponding to each food item. For more details on determining food types and freshness information based on the classification and discrimination model, refer to fig. 5 and its related description, which are not repeated herein.
By moving along the slide rail and shooting at multiple positions within the food storage device, the image acquisition device can acquire more numerous and more comprehensive first images of the food, and the food types and per-item freshness information obtained from these images are accordingly more accurate. Each food item in the food storage device has its own freshness information, such as a freshness score, which makes the freshness of the food obvious and easy for a user to understand.
FIG. 4 is an exemplary flow diagram illustrating adjusting food freshness information according to some embodiments of the present description. As shown in fig. 4, the process 400 includes the following steps. In some embodiments, the flow 400 may be performed by the second image acquisition module 230, the second determination module 240, and the adjustment module 250.
Step 410: acquiring a second image of the food storage device in the door-open state based on the image acquisition device.
The door-open state refers to a state in which the door of the food storage device is open. In this state, the user can take food out of the food storage device and can also place food into it. For example, in the door-open state, the user may use the butter in the food storage device and put the used butter back into it.
The second image is an image of food being used while the food storage device is open. For example, the second image may show the user taking milk from the refrigerator in the door-open state.
In some embodiments, the second image may be acquired with the image acquisition device. Specifically, a sensor provided in the food storage device can detect that the door is open; the lens of the image acquisition device is then switched to a wide-angle lens to shoot the food in the device and acquire a wide-angle image. The wide-angle image is input into a convolutional neural network model to determine the position at which the user will use food. The image acquisition device is moved along the slide rail to near that position, and the lens is switched to a normal lens to acquire the second image.
In some embodiments, the convolutional neural network model outputs a predicted position where the user will use food based on the input wide-angle image. For example, the model may output that the user will use food on the first shelf of the refrigerator. In some embodiments, the convolutional neural network model may be obtained by training. Specifically, labeled training samples are input into the convolutional neural network model, and its parameters are updated through training. A training sample may be a wide-angle image, and its label the position where food was used. Training samples can be obtained from historically captured wide-angle images, and labels can be obtained through manual annotation of those images.
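The prediction step can be illustrated with a stand-in classifier. The convolutional layers are omitted here; a linear softmax head over the flattened wide-angle image takes their place, and the position labels are made up, so this is a sketch of the interface rather than the model the text describes:

```python
import numpy as np

# Hypothetical position labels; the specification does not enumerate them.
POSITIONS = ["shelf_1", "shelf_2", "shelf_3"]

def predict_use_position(wide_angle_image, weights, bias):
    """Predict the position where the user will use food.

    `weights` (n_pixels, n_positions) and `bias` (n_positions,) play the
    role of the trained CNN's parameters. Returns the predicted position
    label and the softmax probabilities.
    """
    x = np.asarray(wide_angle_image, dtype=float).ravel()
    logits = x @ weights + bias
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    probs /= probs.sum()
    return POSITIONS[int(np.argmax(probs))], probs
```

A real implementation would train convolutional features end to end; only the output interface (image in, position label out) is kept here.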
Step 420: determining the number of uses of the food based on the second image.
The number of uses may refer to the number of times the food in the food storage device is used. For example, the milk in the refrigerator is used twice.
In some embodiments, the second image may be input into an image classification model, which outputs the type of food used. Based on the type of food used and the historical usage count of the corresponding food, the usage count of the corresponding food is determined. For example, the second image is input into the image classification model, and the type of food used at the position is determined to be milk. The historical usage count of the milk is 2, so the usage count of the milk is determined to be 3. For more details on determining the type of food used based on the image classification model, refer to fig. 6 and its related description, which are not repeated herein.
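The count update itself is a one-line bookkeeping step. In this sketch, `usage_history` is a hypothetical mapping from food type to its historical usage count; the milk example in the text (history 2, one more use observed) yields 3:

```python
def update_usage_count(used_food_type, usage_history):
    """Increment and return the usage count of the food type that the
    image classification model recognized in the second image.

    `usage_history` maps food type -> historical usage count; unseen
    types start at 0.
    """
    usage_history[used_food_type] = usage_history.get(used_food_type, 0) + 1
    return usage_history[used_food_type]
```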
In some embodiments, each food item within the food storage device may have a corresponding usage-count threshold. When the user exceeds the usage-count threshold of a food item, the user can be reminded of the item's usage count and asked whether the food has actually been replaced; if so, the user is prompted to record the usage count of the replacement food, and the freshness information of the food is updated. The usage-count threshold corresponding to each food type can be preset. For example, if the preset usage-count threshold for milk is 10 and the user's usage count for the milk exceeds 10, the user is asked whether the milk has been replaced. If the user confirms that the milk has been replaced, the usage count of the replacement milk is recorded as 0. The adjustment module 250 can then update the freshness information of the milk according to the data input by the user.
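The threshold-and-confirm behavior can be sketched as follows; `confirm_replaced` is a hypothetical callback standing in for the question pushed to the user terminal, and is not an API defined in this specification:

```python
def check_usage_threshold(food_type, usage_history, thresholds, confirm_replaced):
    """If the food's usage count exceeds its preset threshold, ask the
    user whether the food was replaced; reset the count to 0 if so.

    Returns True when the count was reset, False otherwise.
    """
    count = usage_history.get(food_type, 0)
    if count <= thresholds.get(food_type, float("inf")):
        return False
    if confirm_replaced(food_type):  # e.g., prompt via the user terminal
        usage_history[food_type] = 0  # replacement food starts at 0 uses
        return True
    return False
```

After a reset, the adjustment step below would recompute the food's freshness information from the fresh count.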
Step 430: adjusting the freshness information of the corresponding food based on the usage count.
Specifically, the freshness information of the corresponding food can be adjusted by formula (1):
F = F0 - R * T (1)
where F is the adjusted freshness information of the corresponding food, F0 is the freshness information of the food before adjustment, T is the number of uses of the food, and R is a preset adjustment coefficient for the food. For example, if the original freshness information of milk is a freshness score of 90, the milk has been used 2 times, and the adjustment coefficient is 3.5, the adjusted freshness score of the milk is 83.
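Formula (1), F = F0 - R * T, is a direct computation. The sketch below reproduces the milk example (score 90, 2 uses, coefficient 3.5):

```python
def adjust_freshness(f0, uses, r):
    """Formula (1): adjusted freshness F = F0 - R * T.

    f0: freshness score before adjustment; uses: number of uses (T);
    r: preset adjustment coefficient for this food type (R).
    """
    return f0 - r * uses
```

`adjust_freshness(90, 2, 3.5)` evaluates to 83.0, matching the example in the text.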
For some sealed foods, the image acquisition device cannot capture information about the food inside the packaging, so the accuracy of freshness information determined from the first image alone is low. In this case, an embodiment of the present disclosure additionally uses the usage count of the food in the food storage device to adjust its freshness information, which extends the range of detectable food types and improves the accuracy of freshness detection.
It should be noted that the above descriptions regarding the processes 300 and 400 are only for illustration and description, and do not limit the applicable scope of the present specification. Various modifications and changes to flow 300 and flow 400 will be apparent to those skilled in the art in light of this disclosure. However, such modifications and variations are intended to be within the scope of the present description.
FIG. 5 is a schematic illustration of determining a reminder time for each food item according to some embodiments of the present description. In some embodiments, fig. 5 may be performed by the first determination module 220 and the second reminder time determination module 270.
In some embodiments, a reminder time corresponding to each food item may be determined based on freshness information of each food item.
As shown in FIG. 5, the first image 510 can be used as an input to the classification and discrimination model 520 to obtain the food type 530 of each food and the corresponding freshness information 540. Based on the freshness information 540 of each food, the reminder time 550 for each food can be determined. The classification and discrimination model 520 includes an image classification model 520-1 and a freshness discrimination model 520-2 connected in sequence.
In some embodiments, the first image is used as an input to the image classification model to obtain the types of food in the first image. Illustratively, based on the input first image, the image classification model may output the food types apple, spinach, and milk.
In some embodiments, the output of the image classification model further includes the region corresponding to each food type in the first image. For example, when the first image is input into the image classification model, the model may output that the food type in the first image is apple, together with the region the apple occupies in the first image.
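One plausible shape for this output (food type plus region) is a list of detection records; the field names and coordinates below are assumptions for illustration, not part of the disclosure.

```python
# Hypothetical output format: one record per detected food item, giving the
# food type and the region (bounding box) it occupies in the first image.
detections = [
    {"type": "apple",   "region": (120, 40, 260, 180)},  # (x1, y1, x2, y2)
    {"type": "spinach", "region": (300, 60, 420, 200)},
    {"type": "milk",    "region": (10, 220, 150, 400)},
]

# The food types alone can be read off for downstream model selection.
food_types = [d["type"] for d in detections]
print(food_types)
```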
In some embodiments, the image classification model may be a YOLO (You Only Look Once) model. In some embodiments, the image classification model may also be another model, such as a convolutional neural network model.
In some embodiments, the image classification model may be trained on historical first images. Specifically, labeled training samples are input into the image classification model, and the parameters of the model are updated through training. A training sample may be a first image, and its label may be the food types and the region corresponding to each food type. Training samples can be obtained from historically captured first images, and training labels can be obtained by manually annotating those images.
In some embodiments, different freshness discrimination models can be selected for freshness identification based on the food types in the first image output by the image classification model. For example, suppose the image classification model outputs apple and banana as the food types in the first image. When the freshness information of the apple is to be identified, the freshness discrimination model corresponding to the apple is selected; when the freshness information of the banana is to be identified, the freshness discrimination model corresponding to the banana is selected.
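The per-type model selection can be sketched as a simple lookup; the registry and the lambda stand-ins for trained models are illustrative assumptions.

```python
# Hypothetical registry mapping each food type to its own freshness
# discrimination model; the callables here stand in for trained models.
freshness_models = {
    "apple":  lambda image: 85,   # placeholder for the apple model
    "banana": lambda image: 70,   # placeholder for the banana model
}

def score_freshness(food_type, image):
    # Select the discrimination model matching the detected food type,
    # then apply it to the (cropped) first image.
    model = freshness_models[food_type]
    return model(image)

print(score_freshness("apple", None))
```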
In some embodiments, the first image, together with the food types and the region information for each food type output by the image classification model, can be input into the freshness discrimination model to output the freshness information corresponding to each food. For example, the first image with the apple type and region information output by the image classification model may be input into the freshness discrimination model corresponding to apples, which may output a freshness score of 85 points for the apple.
In some embodiments, the freshness discrimination model may include a feature extraction layer and a prediction layer connected in sequence, where the feature extraction layer may be implemented based on a convolutional neural network algorithm and the prediction layer based on a deep neural network algorithm. The feature extraction layer extracts, from the first image, the image features of the food whose freshness is to be identified; the extracted features are then input into the prediction layer to obtain the freshness information of the food.
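The two-stage structure (feature extraction followed by prediction) can be sketched as composed functions. Both stages below are plain stand-ins for the CNN and DNN layers named in the text, under the assumption that the input is a 2-D list of pixel intensities.

```python
def extract_features(image_region):
    # Stand-in for the CNN feature extraction layer: reduce the cropped
    # food region (a 2-D list of pixel intensities) to a small feature vector.
    flat = [px for row in image_region for px in row]
    return [sum(flat) / len(flat), max(flat), min(flat)]

def predict_freshness(features):
    # Stand-in for the DNN prediction layer: map the feature vector to a
    # freshness score clamped to the 0-100 range.
    return max(0.0, min(100.0, features[0]))

def freshness_score(image_region):
    # The two layers are connected in sequence, as in the disclosure.
    return predict_freshness(extract_features(image_region))

print(freshness_score([[80, 90], [85, 95]]))  # 87.5
```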
In some embodiments, the freshness discrimination model can be trained on historical first images. Specifically, historically captured first images can be manually annotated to obtain first images with food types and the region information for each food type, which serve as training samples. The freshness information of each food type in the training samples is then manually annotated to obtain the training labels. The labeled training samples are input into the freshness discrimination model, and its parameters are updated through training to obtain the trained freshness discrimination model.
In some embodiments, the classification and discrimination model can be obtained by jointly training the image classification model and the freshness discrimination model: the two models are trained together on training samples, and their parameters are updated jointly. In some embodiments, the joint training comprises: acquiring training samples, where each training sample is a first image and can be obtained from historically captured first images; the label of a training sample comprises the food types, the region corresponding to each food type, and the freshness information corresponding to each food type in the first image, and the labels can be obtained by manually annotating historically captured first images. The training samples are input into the image classification model, and the parameters of each layer of the classification and discrimination model are updated based on the predicted values output by the freshness discrimination model and the labels, yielding the trained image classification model and the trained freshness discrimination model. In some embodiments, the training data and labels of the model may also include other information.
Obtaining the parameters of the freshness discrimination model through this joint training avoids, in some cases, the large amount of tedious manual annotation of training samples that training the freshness discrimination model alone would require.
In some embodiments, the food types in the food storage device and the freshness information corresponding to each food may be adjusted based on the first images at a plurality of time points.
In some embodiments, the reminder time corresponding to each food item may be determined based on the freshness information of each food item output by the classification and discrimination model.
The reminder time may refer to the time at which the user is reminded to eat the food. In some embodiments, the reminder time corresponding to the freshness information of each food can be determined according to a preset rule, where the preset rule may be a preset correspondence between the freshness information of each food type and the reminder time. For example, when the freshness information of an apple is a freshness score of 85 points, it can be determined according to the preset rule that the reminder time for the apple is 8 hours later.
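A preset rule of this kind can be sketched as a threshold table; the thresholds and delays other than the apple case (score 85 → remind in 8 hours) are illustrative assumptions.

```python
def reminder_delay_hours(score):
    # Hypothetical preset rule mapping a freshness score to a reminder
    # delay in hours. Only the 85 -> 8 hours pair comes from the text.
    if score >= 90:
        return 24
    if score >= 75:
        return 8
    if score >= 60:
        return 2
    return 0  # not fresh: remind immediately

print(reminder_delay_hours(85))  # 8
```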
In some embodiments, reminder information may be sent to the user terminal at the reminder time. In some embodiments, the reminder information can include the freshness information, a freshness evaluation, and an eating suggestion for the food. The freshness evaluation may refer to an evaluation of the degree of freshness indicated by the freshness information. The eating suggestion may be determined from the freshness score according to preset rules. For example, when the freshness score is 90-100 points, the food is evaluated as fresh and the eating suggestion is to eat it within 5 days; when the freshness score is 75-90 points, the food is evaluated as relatively fresh and the eating suggestion is to eat it within 3 days; when the freshness score is 60-75 points, the food is evaluated as not too fresh and the eating suggestion is to finish it today; when the freshness score is 0-60 points, the food is evaluated as not fresh and eating it is not recommended.
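The score bands above can be sketched directly; how the exact boundary values (90, 75, 60) are assigned to the upper or lower band is an assumption.

```python
def freshness_advice(score):
    # Score bands follow the preset rule in the text; boundary handling
    # at 90, 75, and 60 is an assumption.
    if score >= 90:
        return ("fresh", "eat within 5 days")
    if score >= 75:
        return ("relatively fresh", "eat within 3 days")
    if score >= 60:
        return ("not too fresh", "finish it today")
    return ("not fresh", "not recommended to eat")

evaluation, suggestion = freshness_advice(50)
print(f"apple freshness 50, {evaluation}, {suggestion}")
```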
Illustratively, the reminder sent to the user terminal may be "apple: freshness score 50, not fresh, eating not recommended".
Determining the reminder time from the freshness information of the food and reminding the user accordingly allows the user to learn the degree of freshness of the food more intuitively and conveniently. It also prevents the user from forgetting food stored in a hidden corner of the food storage device.
FIG. 6 is a schematic illustration of determining a reminder time for each food item according to some embodiments of the present specification. In some embodiments, the process of FIG. 6 may be performed by the first image acquisition module 210, the second image acquisition module 230, the second determination module 240, and the first reminder time determination module 260.
In some embodiments, an image sequence is used as an input to the time determination model to obtain the reminder time corresponding to each food, where the image sequence includes first images of the same food at a plurality of time points.
The image sequence may refer to a sequence of first images of the same food at multiple time points, where the first images of the same food may be identified using the image classification model. For more details on the image classification model, refer to FIG. 5 and its related description, which are not repeated here.
Illustratively, the image sequence of first images containing the same apple captured at 17:00, 18:00, and 19:00 is {a, b, c}, where a represents the first image containing the apple at 17:00, b represents the first image containing the same apple at 18:00, and c represents the first image containing the same apple at 19:00.
As shown in FIG. 6, the image sequence 610, the usage-count sequence 670, and the temperature data 680 may be input into the time determination model 620 to obtain the freshness information 540 of each food. Based on the freshness information 540 of each food, the reminder time 550 for each food can be determined. The image sequence 610 may include first images 510 of the same food at a plurality of time points, and the time determination model 620 may include an image feature extraction layer 620-1, a sequence feature extraction layer 620-2, and a prediction layer 620-3 connected in sequence.
In some embodiments, the image sequence may be used as an input to the image feature extraction layer to obtain an image feature sequence. The image feature sequence may refer to a sequence of image features of the same food extracted at multiple time points. For example, the extracted image feature sequence may be a sequence of image features of a banana peel across the first images at a plurality of time points. In some embodiments, the image feature extraction layer of the time determination model may be implemented based on a convolutional neural network algorithm.
In some embodiments, the output of the image feature extraction layer may be used as an input to the sequence feature extraction layer.
In some embodiments, the input to the sequence feature extraction layer 620-2 may also include temperature data 680 of the food storage device. In some embodiments, the temperature data may be a temperature sequence at a plurality of time points. For example, the temperature sequence at 17:00, 18:00, and 19:00 is {1.1, 1.2, 1.0}, where 1.1 represents a food storage device temperature of 1.1 °C at 17:00, 1.2 represents a temperature of 1.2 °C at 18:00, and 1.0 represents a temperature of 1.0 °C at 19:00. In some embodiments, the temperature data of the food storage device may be acquired by a temperature sensor.
In some embodiments, the input to the sequence feature extraction layer 620-2 may also include a usage-count sequence 670 of the corresponding food. Specifically, the second image 630 may be input into the recognition model 640, which outputs the food being used 650. The number of uses 660 of the food is determined from its historical number of uses, and the usage-count sequence 670 of the food is determined from the capture time of the second image. The historical number of uses is the number of times the food has been used before; the initial number of uses is 0, and each time the food is determined to have been used, the historical number of uses is increased by 1. Illustratively, the second image is input into the recognition model, which determines that the food being used is butter. Since the historical number of uses of the butter is 0, its number of uses is determined to be 1. The second image was captured at 17:58, so the usage-count sequence of the butter at 17:00, 18:00, and 19:00 is {0, 1, 1}. In some embodiments, the recognition model may be trained on historical second images. Specifically, labeled training samples are input into the recognition model, and its parameters are updated through training. A training sample may be a second image, and its label may be the food being used. Training samples can be obtained from historically captured second images, and training labels can be obtained by manually annotating those images.
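Building the usage-count sequence from use events can be sketched as cumulative counting at the sampling times; representing 17:58 as the decimal hour 17.97 is an illustrative assumption.

```python
def usage_sequence(use_hours, sample_hours):
    # Cumulative number of uses of one food item, sampled at the same time
    # points (hours of day) as the image sequence. Each entry counts the
    # use events observed up to and including that sample time.
    return [sum(1 for u in use_hours if u <= t) for t in sample_hours]

# Butter example from the text: used once at 17:58, so the count becomes 1
# from the 18:00 sample onward.
print(usage_sequence([17.97], [17, 18, 19]))  # [0, 1, 1]
```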
In some embodiments, the sequence feature extraction layer of the time determination model may be implemented based on a recurrent neural network algorithm.
In some embodiments, the output of the sequence feature extraction layer may be used as the input to the prediction layer, which outputs the freshness information of each food. In some embodiments, the prediction layer of the time determination model may be implemented based on a deep neural network algorithm.
In some embodiments, the time determination model may be obtained by machine-learning training of an initial model on a large number of labeled training samples. The training samples may be image sequences, temperature data, and usage-count sequences, and the labels may be the freshness information of each food. The image sequences in the training samples can be obtained from historically captured first images, the temperature data from historical temperature data of the food storage device, and the usage-count sequences from historically captured second images; the training labels can be obtained from historically captured first images. In some embodiments, when training the time determination model, its image feature extraction layer may share parameters with the feature extraction layer of the image classification model. Sharing parameters may reduce the complexity of model training.
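The parameter-sharing idea can be illustrated with plain Python objects: the two models hold references to one and the same parameter object, so an update through either is visible through the other. The dictionary layout below is purely illustrative, not the disclosed architecture.

```python
# One shared parameter object for the feature extraction layer.
shared_extractor = {"weights": [0.1, 0.2, 0.3]}  # placeholder parameters

# Both models reference the same object rather than holding copies.
time_determination_model = {"image_feature_layer": shared_extractor}
image_classification_model = {"feature_layer": shared_extractor}

# An update made while training one model is seen by the other model.
time_determination_model["image_feature_layer"]["weights"][0] = 0.5
print(image_classification_model["feature_layer"]["weights"][0])  # 0.5
```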
In some embodiments, the reminder time corresponding to each food item may be determined based on the freshness information for each food item type output by the time determination model. For more details on determining the reminding time based on the freshness information, refer to fig. 5 and the related description thereof, which are not repeated herein.
By taking an image sequence as input, one embodiment of the present specification considers not only the current state of the food but also the influence of change over time, thereby improving the accuracy of food freshness detection. In addition, the temperature in the food storage device and the number of uses of the food both affect its freshness; processing this information and data with the model makes the judgment more effective and further improves detection accuracy.
An embodiment of the present specification further provides an apparatus for detecting food freshness, comprising a storage medium storing computer instructions and a processor executing at least part of the computer instructions to implement the aforementioned method for detecting food freshness.
An embodiment of the present specification further provides a computer-readable storage medium. The storage medium stores computer instructions, and after a computer reads the computer instructions in the storage medium, the computer implements the aforementioned method for detecting food freshness.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, the description uses specific words to describe embodiments of the description. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the specification is included. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as implying that more features are required than are expressly recited in each claim. Indeed, claimed embodiments may have fewer than all of the features of a single embodiment disclosed above.
Some embodiments use numerals describing quantities of components and attributes; it should be understood that such numerals used in the description of the embodiments are, in some instances, modified by the terms "about", "approximately", or "substantially". Unless otherwise indicated, "about", "approximately", or "substantially" indicates that the stated number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending on the desired properties of individual embodiments. In some embodiments, numerical parameters should take into account the specified significant digits and employ a general method of preserving digits. Notwithstanding that the numerical ranges and parameters setting forth the broad scope in some embodiments are approximations, in specific examples such numerical values are set forth as precisely as practicable.
For each patent, patent application publication, and other material cited in this specification, such as articles, books, specifications, publications, and documents, the entire contents are hereby incorporated by reference into this specification, except for application history documents that are inconsistent with or conflict with the contents of this specification and documents that limit the broadest scope of the claims of this specification (currently or later appended). It should be understood that if the descriptions, definitions, and/or use of terms in the accompanying materials of this specification are inconsistent with or contrary to those in this specification, the descriptions, definitions, and/or use of terms in this specification shall prevail.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present disclosure. Other variations are also possible within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.

Claims (10)

1. A method of detecting food freshness, comprising:
acquiring a first image of food stored in a food storage device based on an image acquisition device, wherein the image acquisition device captures images at a plurality of positions in the food storage device by moving along a sliding rail;
determining a food type of the food and freshness information corresponding to each food based on the first image.
2. The method of claim 1, wherein the method further comprises:
acquiring a second image of the food storage device in a door-open state based on the image acquisition device;
determining a number of uses of the food item based on the second image;
and adjusting the freshness information of the corresponding food based on the number of uses.
3. The method of claim 1, wherein the method further comprises:
and using an image sequence as an input of a time determination model to obtain a reminder time corresponding to each food, wherein the image sequence comprises first images of the same food at a plurality of time points.
4. The method of claim 1, wherein the method further comprises:
and determining a reminder time corresponding to each food based on the freshness information of each food.
5. A system for detecting freshness of a food product, comprising:
a first image acquisition module configured to acquire a first image of food stored in a food storage device based on an image acquisition device, wherein the image acquisition device captures images at a plurality of positions in the food storage device by moving along a sliding rail;
a first determination module configured to determine a food type of the food and freshness information corresponding to each food based on the first image.
6. The system of claim 5, wherein the system further comprises:
a second image acquisition module configured to acquire a second image of the food storage device in a door-open state based on the image acquisition device;
a second determination module for determining the number of uses of the food based on the second image;
and an adjusting module configured to adjust the freshness information of the corresponding food based on the number of uses.
7. The system of claim 5, wherein the system further comprises:
and a first reminder time determination module configured to use an image sequence as an input of a time determination model to obtain a reminder time corresponding to each food, wherein the image sequence comprises first images of the same food at a plurality of time points.
8. The system of claim 5, wherein the system further comprises:
and a second reminder time determination module configured to determine a reminder time corresponding to each food based on the freshness information of each food.
9. An apparatus for detecting food freshness comprising a processor, wherein the processor is configured to perform the method of detecting food freshness of any one of claims 1-4.
10. A computer-readable storage medium storing computer instructions, which when read by a computer, perform a method for detecting freshness of food according to any one of claims 1 to 4.
CN202111068990.9A 2021-09-13 2021-09-13 Method and system for detecting food freshness Withdrawn CN113780182A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111068990.9A CN113780182A (en) 2021-09-13 2021-09-13 Method and system for detecting food freshness


Publications (1)

Publication Number Publication Date
CN113780182A true CN113780182A (en) 2021-12-10

Family

ID=78843119



Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114359895A (en) * 2021-12-30 2022-04-15 广东哈士奇制冷科技股份有限公司 Method and system for detecting freshness of food
CN114663379A (en) * 2022-03-17 2022-06-24 触媒净化技术(南京)有限公司 Method and system for determining regeneration capacity of denitration catalyst
CN114971933A (en) * 2022-05-25 2022-08-30 连云港银丰食用菌科技有限公司 Edible mushroom preservation method and system

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105184719A (en) * 2015-08-28 2015-12-23 小米科技有限责任公司 Foodstuff monitoring method and device
CN106871570A (en) * 2017-01-22 2017-06-20 浙江大学 A kind of device based on various dietary regimens in single multispectral imaging unit detection refrigerating chamber
CN107576126A (en) * 2017-09-05 2018-01-12 惠州Tcl家电集团有限公司 Mobile photographic device, control method and the refrigerator of refrigerator
CN108362073A (en) * 2018-02-05 2018-08-03 上海康斐信息技术有限公司 The management method and system of food in a kind of refrigerator
CN108663331A (en) * 2017-03-27 2018-10-16 青岛海尔智能技术研发有限公司 Detect the method and refrigerator of food freshness in refrigerator
CN110503314A (en) * 2019-08-02 2019-11-26 Oppo广东移动通信有限公司 A kind of freshness appraisal procedure and device, storage medium
US20190370602A1 (en) * 2018-06-04 2019-12-05 Olympus Corporation Learning management device, learning management method, and imaging device
CN112923655A (en) * 2021-03-01 2021-06-08 合肥美菱物联科技有限公司 Refrigerator fresh-keeping index monitoring system and method
CN113124631A (en) * 2019-12-31 2021-07-16 青岛海高设计制造有限公司 Prompting method and device for refrigerator and refrigerator




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20211210