WO2022073430A1 - Refrigeration appliance and method of user identification - Google Patents

Refrigeration appliance and method of user identification

Info

Publication number
WO2022073430A1
Authority
WO
WIPO (PCT)
Prior art keywords
appendage
score
image
user
determining
Prior art date
Application number
PCT/CN2021/120626
Other languages
English (en)
French (fr)
Inventor
Michael Goodman Schroeder
Sarah Virginia Morris
Original Assignee
Haier Smart Home Co., Ltd.
Qingdao Haier Refrigerator Co., Ltd.
Haier US Appliance Solutions, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Haier Smart Home Co., Ltd., Qingdao Haier Refrigerator Co., Ltd., and Haier US Appliance Solutions, Inc.
Priority to EP21876948.7A priority Critical patent/EP4206595A4/en
Priority to CN202180068477.1A priority patent/CN116348727A/zh
Publication of WO2022073430A1 publication Critical patent/WO2022073430A1/zh

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F25: REFRIGERATION OR COOLING; COMBINED HEATING AND REFRIGERATION SYSTEMS; HEAT PUMP SYSTEMS; MANUFACTURE OR STORAGE OF ICE; LIQUEFACTION OR SOLIDIFICATION OF GASES
    • F25D: REFRIGERATORS; COLD ROOMS; ICE-BOXES; COOLING OR FREEZING APPARATUS NOT OTHERWISE PROVIDED FOR
    • F25D29/00: Arrangement or mounting of control or safety devices
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/25: Determination of region of interest [ROI] or a volume of interest [VOI]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107: Static hand or arm
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F25: REFRIGERATION OR COOLING; COMBINED HEATING AND REFRIGERATION SYSTEMS; HEAT PUMP SYSTEMS; MANUFACTURE OR STORAGE OF ICE; LIQUEFACTION OR SOLIDIFICATION OF GASES
    • F25D: REFRIGERATORS; COLD ROOMS; ICE-BOXES; COOLING OR FREEZING APPARATUS NOT OTHERWISE PROVIDED FOR
    • F25D2500/00: Problems to be solved
    • F25D2500/06: Stock management
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F25: REFRIGERATION OR COOLING; COMBINED HEATING AND REFRIGERATION SYSTEMS; HEAT PUMP SYSTEMS; MANUFACTURE OR STORAGE OF ICE; LIQUEFACTION OR SOLIDIFICATION OF GASES
    • F25D: REFRIGERATORS; COLD ROOMS; ICE-BOXES; COOLING OR FREEZING APPARATUS NOT OTHERWISE PROVIDED FOR
    • F25D2700/00: Means for sensing or measuring; Sensors therefor
    • F25D2700/04: Sensors detecting the presence of a person
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F25: REFRIGERATION OR COOLING; COMBINED HEATING AND REFRIGERATION SYSTEMS; HEAT PUMP SYSTEMS; MANUFACTURE OR STORAGE OF ICE; LIQUEFACTION OR SOLIDIFICATION OF GASES
    • F25D: REFRIGERATORS; COLD ROOMS; ICE-BOXES; COOLING OR FREEZING APPARATUS NOT OTHERWISE PROVIDED FOR
    • F25D2700/00: Means for sensing or measuring; Sensors therefor
    • F25D2700/06: Sensors detecting the presence of a product
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/60: Type of objects
    • G06V20/68: Food, e.g. fruit or vegetables

Definitions

  • the present invention generally relates to a technique for tracking a user of a storage enclosure and items within the storage enclosure, such as a technique for tracking a user of a refrigeration appliance and items within the refrigeration appliance.
  • Storage enclosures such as refrigeration appliances and food storage cabinets, typically provide an enclosed chamber for receiving multiple items or objects.
  • refrigeration appliances typically include a box that defines a refrigeration compartment. The user can place food or objects in the refrigerated compartment to prevent the spoilage of such food. Thereby, the usable life of the perishable item or object can be increased.
  • a large number of stored items may accumulate in the refrigeration compartment of the refrigerator.
  • users of the refrigeration appliance may have difficulty identifying items located within the refrigeration appliance. Additionally, the user may have difficulty determining the quantity of certain items within the refrigeration appliance. This is especially true when multiple users add/remove items to/from a common refrigeration appliance without communicating with other users.
  • the user may accidentally purchase excessive or unwanted items. For example, certain foods do not perish easily in the refrigerated compartment and may not be consumed often, so the period of time such food can be kept in the refrigerated compartment is extended. The user may forget the food item and purchase a substitute despite already having an acceptable item of the same type, causing inconvenience to the user or unnecessary expense.
  • some users may be unaware that certain items have been retrieved or consumed. As such, the user may not be able to replace or replenish such items.
  • a function for tracking which items are consumed or withdrawn by a particular user may also be attractive. Such tracking can help users determine consumption habits or caloric intake.
  • Some existing appliances have attempted to address these problems by requiring the user to manually input each item being stored.
  • Other appliances have used various methods, such as rulers, to estimate or guess the quantity or identifying information of items stored or consumed.
  • however, such attempts are cumbersome and prone to inaccuracy.
  • a refrigeration appliance with features that assist a user in tracking the contents of a refrigeration compartment of the refrigeration appliance would be useful. Additionally or alternatively, a refrigeration appliance with features to identify multiple users, and optionally features to identify items added to or removed from a refrigerated compartment, would be beneficial.
  • a refrigeration appliance may include a box body, a door body, a camera module and a controller.
  • the enclosure may define a refrigerated compartment.
  • the door may be rotatably hinged to the box to provide selective access to the refrigerated compartment.
  • the camera module can be mounted to the case.
  • the controller may be operably coupled to the camera module.
  • the controller may be configured to initiate operational routines.
  • the operational routine may include initiating an image capture sequence at the camera module.
  • the image capture sequence may include two-dimensional images captured at the camera module.
  • the operational routine may further include: determining an appendage occupancy area within the two-dimensional image; and, in response to determining the appendage occupancy area, analyzing the appendage occupancy area.
  • the operational routine may further include: assigning a confidence score to a user profile stored in the controller based on the analysis of the appendage occupancy area; comparing the assigned confidence score to a threshold score; and recording metadata about the two-dimensional image based on the comparison of the assigned confidence score.
  • a method of operating a refrigeration appliance may include initiating an image capture sequence at the camera module.
  • the image capture sequence may include two-dimensional images captured at the camera module.
  • the method may further include: determining an appendage occupancy area within the two-dimensional image; and, in response to determining the appendage occupancy area, analyzing the appendage occupancy area.
  • the method may further include: assigning a confidence score to a user profile stored in the controller based on the analysis of the appendage occupancy area; comparing the assigned confidence score to a threshold score; and recording metadata about the two-dimensional image based on the comparison of the assigned confidence score.
  • FIG. 1 provides a front elevational view of a refrigeration appliance formed in accordance with an exemplary embodiment of the present invention.
  • FIG. 2 provides a front elevational view of a refrigeration appliance formed in accordance with an exemplary embodiment of the present invention, with the refrigerated door shown in an open position.
  • FIG. 3 provides a schematic diagram of a refrigeration appliance formed in accordance with an exemplary embodiment of the present invention.
  • FIG. 4 illustrates an exemplary two-dimensional image of a drawer of a refrigeration appliance captured at a camera assembly of a refrigeration appliance formed in accordance with an exemplary embodiment of the present invention.
  • FIG. 5 shows an exemplary two-dimensional image of a drawer of a refrigeration appliance captured at a camera assembly of a refrigeration appliance formed in accordance with an exemplary embodiment of the present invention.
  • FIG. 6 provides a flowchart illustrating a method of operation of a refrigeration appliance formed in accordance with an exemplary embodiment of the present invention.
  • the present invention provides a method to assist in identifying the process of user interaction or engagement with a storage enclosure, such as a refrigeration appliance or a food storage cabinet.
  • the method may include one or more steps for automatically (e.g., without direct user input) identifying which user added items to or removed items from the appliance (e.g., to identify such actions in association with that user).
  • FIG. 1 provides a front elevation view of a refrigeration appliance 100 formed in accordance with an exemplary embodiment of the present invention, with a refrigeration door 128 of the refrigeration appliance 100 shown in a closed position.
  • FIG. 2 provides a front elevation view of the refrigeration appliance 100 with the refrigerator door 128 shown in an open position to reveal the fresh food compartment 122 of the refrigeration appliance 100 .
  • the refrigeration appliance 100 includes a housing or case 120 extending along a vertical direction V between a top 101 and a bottom 102 .
  • the case 120 defines a refrigerated compartment for receiving food items for storage.
  • specifically, the case 120 defines a food preservation compartment 122 and a freezer compartment 124 , the food preservation compartment 122 being disposed at or adjacent the top 101 of the case 120 , and the freezer compartment 124 being disposed at or adjacent the bottom 102 of the case 120 .
  • the refrigeration appliance 100 is generally referred to as a bottom-mounted refrigerator.
  • the refrigerator door 128 is rotatably hinged to the edge of the box 120 for selective access to the food preservation compartment 122 .
  • a freezer door 130 is arranged below the refrigerator door 128 to provide selective access to the freezer compartment 124 .
  • the freezer door 130 is coupled to a freezer drawer (not shown), which is slidably mounted within the freezer compartment 124. As noted above, the refrigerator door 128 and the freezer door 130 are shown in the closed position in FIG. 1 , and the refrigerator door 128 is shown in the open position in FIG. 2 .
  • the storage components include a box 140 installed in the food preservation compartment 122 , a drawer 142 and a shelf 144 .
  • Box 140, drawer 142, and shelf 144 are configured to receive stored items (eg, beverages or solid foods) and may help organize such foods.
  • the drawer 142 may receive fresh food (eg, vegetables, fruit, or cheese) and increase the useful life of such fresh food.
  • Refrigeration appliance 100 also includes features to assist a user in identifying food items located within food preservation compartment 122 or freezer compartment 124 .
  • a user may utilize these features, for example, to view food items (ie, stored items) stored in food preservation compartment 122 or freezer compartment 124, or to create an inventory of such stored items. This feature is discussed in more detail below.
  • FIG. 3 provides a schematic diagram of the refrigeration appliance 100 .
  • the refrigeration appliance 100 includes a controller 150 operatively coupled or in communication with components of a refrigeration system (not shown) of the refrigeration appliance 100 configured to cool the food preservation compartment 122 or the freezer compartment 124 .
  • Components include compressor 170 , evaporator fan 172 and condenser fan 174 .
  • Controller 150 may selectively operate such components to cool food preservation compartment 122 or freezer compartment 124 .
  • the controller 150 is also in communication with a thermostat 152 (eg, a thermocouple or thermistor).
  • Thermostat 152 may be disposed in food preservation compartment 122 or freezer compartment 124 (FIG. 2).
  • the controller 150 may receive a signal corresponding to the temperature of the food preservation compartment 122 or the freezer compartment 124 from the thermostat 152 .
  • the controller 150 may also include an internal timer for calculating the elapsed time period.
  • the controller 150 may include memory and one or more microprocessors, CPUs, etc., such as a general-purpose or special-purpose microprocessor, for executing programmed instructions or micro-control code associated with the operation of the refrigeration appliance 100 .
  • the memory may represent random access memory such as DRAM, or read only memory such as ROM or FLASH.
  • the processor executes programmed instructions stored in the memory.
  • the instructions include a software package configured to operate the appliance 100, or to execute an operational routine (eg, see the example method 600 of FIG. 6 in the description below).
  • the memory may be a separate component from the processor, or may be contained on a board within the processor.
  • alternatively, the controller 150 may be configured to perform control functions without using a microprocessor (e.g., using a combination of discrete analog or digital logic circuits, such as switches, amplifiers, integrators, comparators, flip-flops, AND gates, etc.), such that the above control functions are implemented without relying on software.
  • the controller 150 may be provided at various locations throughout the refrigeration appliance 100 . Input/output (“I/O”) signals may be routed between the controller 150 and various operating components of the refrigeration appliance 100 .
  • One or more components of refrigeration appliance 100 may be in operative communication (eg, in electrical communication) with controller 150 via one or more conductive signal lines or a shared communication bus. Additionally or alternatively, one or more components of refrigeration appliance 100 may be in operative communication (e.g., wirelessly) with controller 150 via one or more wireless signal bands.
  • the refrigeration appliance 100 also includes a camera or camera module 160 .
  • Camera 160 may be any type of device suitable for capturing a two-dimensional photograph or image, such as the image shown in FIG. 4 or FIG. 5 .
  • the camera 160 may be a video camera or a digital camera with an electronic image sensor (eg, a charge coupled device (CCD) or CMOS sensor).
  • the controller 150 may be in operative communication (e.g., electrical or wireless communication) with the camera 160 , such that the controller 150 can receive signals from the camera 160 corresponding to the images captured by the camera 160 .
  • camera 160 is disposed within refrigeration appliance 100 and directed toward one or more refrigeration compartments (eg, food preservation compartment 122—FIG. 2).
  • camera 160 is mounted at the top (eg, adjacent top 101 ) within food preservation compartment 122 .
  • camera 160 may be fixed to or positioned to point through the top wall defining the interior of food preservation compartment 122 .
  • the cameras 160 may be directed downward; additionally or alternatively, a plurality of independent cameras 160 may be provided (e.g., shown in dotted lines in the figures).
  • camera 160 When assembled, camera 160 may be directed at at least a portion of any particular one of drawer 142 and shelf 144 (FIG. 2), or may be directed at at least a portion of the combination of drawer 142 and shelf 144 (FIG. 2). As such, camera 160 may capture images of one of drawers 142, all drawers 142, one of shelves 144, all shelves 144, or any suitable combination thereof.
  • the refrigeration appliance 100 includes an integrated display 180 .
  • the integrated display 180 may be mounted on the refrigerator door 128 ( FIG. 1 ) or at any other suitable location on the refrigeration appliance 100 .
  • the integrated display 180 is in operative communication with the controller 150 such that the integrated display 180 can receive signals from the controller 150 corresponding to images captured by the camera 160 .
  • the integrated display 180 may receive such signals from the controller 150 and visually present the above-mentioned images to the user.
  • the integrated display 180 may include, for example, a liquid crystal display panel (LCD), a plasma display panel (PDP), or any other suitable mechanism for displaying images (eg, a projector).
  • the refrigeration appliance 100 includes a network interface (not shown) that couples the refrigeration appliance 100 (e.g., the controller 150 ) to a network 190 so that the refrigeration appliance 100 can transmit and receive information via the network 190 .
  • Network 190 may be any wired or wireless network, such as a WAN, LAN or HAN.
  • refrigeration appliance 100 communicates with mobile display 182 via network 190 .
  • Mobile display 182 may be any device configured to communicate over network 190 and display images received from network 190 .
  • mobile display 182 may be a computer, smartphone or tablet.
  • the mobile display 182 is in communication with the controller 150 so that the mobile display 182 can receive signals from the controller 150 (via the network 190 ) corresponding to the user interface, or so that the mobile display 182 can receive from the controller 150 (via the network 190 ) and The signal corresponding to the image captured by the camera 160 .
  • Mobile display 182 may receive such signals from controller 150 and visually present one or more of the above-described images to the user.
  • Mobile display 182 may include, for example, a liquid crystal display panel (LCD), plasma display panel (PDP), or any other suitable mechanism for displaying images (eg, a projector). Mobile display 182 may also include an interface (eg, a tactile input such as a button, or a graphical user interface) to allow mobile display 182 to initiate communication with refrigeration appliance 100 over network 190 .
  • in operation, one or more cameras 160 may capture one or more two-dimensional images (e.g., as a video feed or a series of consecutive still images) that may be transmitted to controller 150 (e.g., in the form of data signals). From the captured images, items within the field of view (e.g., the set field of view) of the one or more cameras 160 (e.g., stored items such as food, or non-stored items such as user appendages, shelves, and movable drawers) may be automatically recognized by the controller 150 .
  • identifying or analyzing such items may be accomplished by edge matching, divide-and-conquer search, grayscale matching, histograms of receptive field responses, or any other suitable routine (e.g., a routine executed at controller 150 based on one or more images captured by one or more cameras 160 ).
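As an illustrative sketch of the grayscale-matching option named above (not the patent's actual routine; all names and values are invented), a small grayscale template can be slid over an image, scoring each offset by the sum of absolute pixel differences:

```python
def grayscale_match(image, template):
    """Slide `template` over `image` (both 2D lists of grayscale values)
    and return (score, row, col) for the offset with the smallest sum of
    absolute differences, i.e. the best grayscale match."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best = None  # (score, row, col); stays None if template is too large
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = sum(
                abs(image[r + i][c + j] - template[i][j])
                for i in range(th)
                for j in range(tw)
            )
            if best is None or score < best[0]:
                best = (score, r, c)
    return best

image = [
    [0, 0, 0, 0],
    [0, 9, 8, 0],
    [0, 7, 9, 0],
    [0, 0, 0, 0],
]
template = [[9, 8], [7, 9]]
print(grayscale_match(image, template))  # (0, 1, 1): exact match at row 1, col 1
```

A production controller would typically use an optimized library routine (e.g., template matching in an image-processing library) rather than this nested-loop sketch.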
  • turning now to FIGS. 4 and 5, various exemplary two-dimensional images related to drawer storage are shown, such as images that may be captured at camera 160 (FIG. 2) and viewed at the integrated display 180 (FIG. 3) or the mobile display 182 (FIG. 3).
  • camera 160 may be selectively removably mounted or fixedly mounted (eg, on or within appliance 100). When assembled, camera 160 may have a set field of view (eg, an area of appliance 100 or its surroundings that may be captured at camera 160 to be included within a two-dimensional image).
  • FIG. 4 illustrates an exemplary two-dimensional image, such as may be captured at camera 160 (FIG. 2), that is part of an image capture sequence.
  • the image at FIG. 4 shows a set field of view (or a sub-region of the set field of view) of camera 160 directed toward food preservation compartment 122 .
  • the image at FIG. 4 may be viewed at display 180 or 182 once captured (eg, as part of an image capture sequence).
  • FIG. 4 provides a view/image of the drawer 142 in an open state, empty (e.g., not holding any of the above-described stored items), and with no line-of-sight obstruction (e.g., the user appendage 230 shown in FIG. 5 ) within the set field of view between the camera 160 and the drawer 142 .
  • FIG. 5 shows another exemplary two-dimensional image, such as may be captured at camera 160, as part of an image capture sequence.
  • the image at FIG. 5 shows a set field of view (or a sub-region of the set field of view) of camera 160 ( FIG. 2 ) directed toward food preservation compartment 122 .
  • the image at FIG. 5 may be viewed at display 180 or 182 once captured (eg, as part of an image capture sequence).
  • Figure 5 provides a view/image of a drawer 142 in an open state, containing one or more stored items, and including a user's user appendages 230 (eg, hands).
  • controller 150 is configured to evaluate the content of one or more two-dimensional images from camera 160 to help identify a particular user or item.
  • for example, the controller 150 may be configured to identify a particular user based on signals or images received from the camera 160 (e.g., during or before the image capture sequence) and to associate a stored user profile with the stored items 234 that the user engages with (e.g., adds to or removes from the compartment 122 ).
  • controller 150 may identify drawers 142 that have been positioned within a predetermined sub-region (e.g., boundary zone 216 ) of the set field of view of camera 160 .
  • each two-dimensional image includes a plurality of pixels (eg, arranged in a predefined grid).
  • the predetermined boundary zone 216 establishes a two-dimensional pixel grid or sub-region that is fixed relative to (e.g., forward of) the food preservation compartment 122 .
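A minimal sketch of restricting analysis to such a fixed sub-region of the pixel grid (the zone coordinates below are invented for illustration, not taken from the patent):

```python
# Hypothetical boundary zone, fixed relative to the camera's set field of view.
BOUNDARY_ZONE = {"top": 2, "left": 1, "bottom": 5, "right": 4}

def crop_to_zone(frame, zone):
    """Return the sub-grid of pixels inside the predetermined boundary zone,
    so that later analysis only considers that region of the frame."""
    return [
        row[zone["left"]:zone["right"]]
        for row in frame[zone["top"]:zone["bottom"]]
    ]

# A 6x6 frame whose pixel value encodes its (row, col) position.
frame = [[r * 10 + c for c in range(6)] for r in range(6)]
sub = crop_to_zone(frame, BOUNDARY_ZONE)
print(sub)  # 3x3 sub-grid: rows 2-4, columns 1-3
```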
  • another sub-portion of pixels may be identified as containing user appendages 230 or stored items 234.
  • in certain instances, it is determined that an appendage occupancy area 232 bounding a typical user appendage 230 has been captured in the two-dimensional image. It will be appreciated that this determination may be accomplished by detecting the outline or general shape of the user appendage 230 in the two-dimensional image using any suitable detection routine (e.g., executed at the controller 150 ). Once such an appendage occupancy area 232 is determined to exist, the pixels therein may be further analyzed (e.g., using an appropriate identification routine executed at controller 150 ).
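One simple detection sketch (assuming a pre-segmented grayscale image where appendage pixels are bright; a real detector would match the outline or shape of a hand) returns the bounding box of above-threshold pixels as the appendage occupancy area:

```python
def appendage_occupancy_area(image, threshold=128):
    """Return the (top, left, bottom, right) bounding box of pixels whose
    value exceeds `threshold`, or None if no such pixels exist.
    Threshold and segmentation scheme are illustrative assumptions."""
    coords = [
        (r, c)
        for r, row in enumerate(image)
        for c, v in enumerate(row)
        if v > threshold
    ]
    if not coords:
        return None  # no appendage present in this frame
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return (min(rows), min(cols), max(rows) + 1, max(cols) + 1)

image = [
    [0, 0, 0, 0],
    [0, 200, 210, 0],
    [0, 190, 205, 0],
    [0, 0, 0, 0],
]
print(appendage_occupancy_area(image))  # (1, 1, 3, 3)
```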
  • the analysis described above may include comparing the appendage occupancy area 232 to one or more stored user profiles, where each user profile corresponds to a different user and includes a corresponding data set (e.g., a data set describing metadata associated with records or images associated with that user). From the analysis of the appendage occupancy area 232 , one or more scores (e.g., confidence scores) may be generated. For example, a unique confidence score may be generated for each user profile, generally indicating the arithmetic likelihood that the user appendage 230 captured in the appendage occupancy area 232 is associated with that particular user profile.
  • in other words, a confidence score may indicate the degree to which the captured user appendage 230 (i.e., data for the user appendage 230 ) matches the appendage data corresponding to a particular user profile.
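The scoring and thresholding described here might be sketched as follows, where the profile feature vectors, the similarity measure, and the threshold value are purely illustrative assumptions:

```python
THRESHOLD_SCORE = 0.8  # illustrative threshold score

def confidence(observed, profile):
    """Toy similarity between an observed appendage feature vector and a
    profile's stored features: 1 / (1 + mean absolute difference)."""
    diffs = [abs(a - b) for a, b in zip(observed, profile)]
    return 1.0 / (1.0 + sum(diffs) / len(diffs))

def identify_user(observed, profiles):
    """Assign a confidence score to each user profile, compare the best
    score against the threshold, and return metadata to record (or None
    when no profile matches confidently enough)."""
    scores = {name: confidence(observed, feats) for name, feats in profiles.items()}
    best_user = max(scores, key=scores.get)
    if scores[best_user] >= THRESHOLD_SCORE:
        return {"user": best_user, "score": round(scores[best_user], 3)}
    return None

profiles = {"alice": [0.2, 0.5, 0.9], "bob": [0.8, 0.1, 0.3]}
print(identify_user([0.21, 0.52, 0.88], profiles))  # confidently matches "alice"
```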
  • further, the stored items 234 in the two-dimensional image may be identified by an appropriate identification routine (e.g., a routine executed at the controller 150 based on one or more images captured by one or more cameras 160 ), such as edge matching, divide-and-conquer search, grayscale matching, histograms of receptive field responses, or any other suitable routine.
  • a common user appendage 230 or a common stored item 234 may be matched across multiple images captured by one or more cameras 160 .
  • a particular item may first be determined to be present in multiple images by determining that a certain occupancy area 232 or 236 (i.e., the same occupancy area) is present within both images. The occupancy area 232 or 236 in the multiple images can then be matched to determine that it is occupied by a common item (e.g., appendage 230 or stored item 234 ) in the multiple images.
  • the multiple images can then be analyzed individually at the common occupancy area 232 or 236 (e.g., by analyzing the appendage occupancy area within the multiple two-dimensional images) to help identify a particular item (e.g., identify the item as a specific user appendage 230 or a specific stored item 234 ).
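One common way to sketch this cross-image matching (an assumption for illustration, not the patent's stated method) is an intersection-over-union test on the occupancy areas, treating strongly overlapping boxes as the same item:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (top, left, bottom, right)."""
    top, left = max(a[0], b[0]), max(a[1], b[1])
    bottom, right = min(a[2], b[2]), min(a[3], b[3])
    if bottom <= top or right <= left:
        return 0.0  # boxes do not overlap
    inter = (bottom - top) * (right - left)
    area = lambda box: (box[2] - box[0]) * (box[3] - box[1])
    return inter / (area(a) + area(b) - inter)

def same_item(box_img1, box_img2, threshold=0.5):
    """Heuristic: occupancy areas that overlap strongly across two images
    are treated as the same common item (appendage or stored item).
    The 0.5 threshold is an illustrative assumption."""
    return iou(box_img1, box_img2) >= threshold

print(same_item((1, 1, 4, 4), (1, 2, 4, 5)))  # True: strongly overlapping boxes
print(same_item((0, 0, 2, 2), (5, 5, 7, 7)))  # False: disjoint boxes
```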
  • in some embodiments, the same item 230 or 234 may be identified as being present in multiple images captured by the same camera 160 . In this way, the single item can be identified across successive images from one camera.
  • in additional or alternative embodiments, the same item may be identified as being present in multiple images captured by separate cameras 160 . Thus, a single item (e.g., user appendage 230 or stored item 234 ) can be identified even though it is captured in multiple different fields of view.
  • FIG. 6 provides a flowchart of a method 600 in accordance with an exemplary embodiment of the present invention.
  • method 600 provides a method of operating a refrigeration appliance 100 (FIG. 1) that includes a camera 160 or a digital display (eg, integrated display 180 or mobile display 182), as previously described.
  • Method 600 may be performed by a device such as by controller 150 (FIG. 3).
  • controller 150 may communicate with camera 160, integrated display 180 (FIG. 3), or mobile display 182 (FIG. 3), as previously described.
  • the controller 150 may send signals to and receive signals from the camera 160 , the integrated display 180 , or the mobile display 182 .
  • the controller 150 may further communicate with other suitable components of the appliance 100 to facilitate the operation of the appliance 100 .
  • the method provided in accordance with the present invention may allow efficient processing of one or more two-dimensional images to identify a particular user engaged with an appliance at a given moment.
  • FIG. 6 depicts steps performed in a particular order for purposes of illustration and description. Those of ordinary skill in the art will appreciate, when implementing the disclosure provided herein, that the steps of any method described herein may be modified, adjusted, rearranged, omitted, or expanded in various ways without departing from the scope of the present invention (unless otherwise described).
  • at step 610, the method 600 includes initiating an image capture sequence at the camera module.
  • the image capture sequence may include the capture of a plurality of two-dimensional images (e.g., a first two-dimensional image, a second two-dimensional image, etc.), such as images contained in a video feed or a series of consecutive still images (e.g., taken or captured according to a predetermined rate or condition).
  • in certain embodiments, the image capture sequence includes the first image and a second image captured at the camera module after the first image.
  • the two-dimensional image can be transmitted to the controller (e.g., in the form of a data signal).
  • the two-dimensional image can then be recorded (eg, temporarily recorded) for comparison or evaluation.
  • the image capture sequence described above may be initiated in response to detection of motion within a refrigeration compartment (eg, a food preservation compartment) of the refrigeration appliance.
  • optionally, recording or evaluation of the two-dimensional images from the camera module may be prevented until motion is detected within the refrigerated compartment.
  • motion detection may be performed at a camera module directed towards a refrigerated compartment.
  • for example, changes in light or pixels captured by the camera module (e.g., between multiple images captured over time) may indicate motion within the refrigerated compartment.
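Frame-differencing motion detection of this kind can be sketched as follows (the per-pixel delta and the changed-pixel count thresholds are illustrative assumptions):

```python
def motion_detected(prev_frame, frame, pixel_delta=25, min_changed=3):
    """Return True if enough pixels changed between two grayscale frames
    (2D lists of equal shape) to indicate motion in the compartment."""
    changed = sum(
        1
        for prev_row, row in zip(prev_frame, frame)
        for p, q in zip(prev_row, row)
        if abs(p - q) >= pixel_delta
    )
    return changed >= min_changed

still = [[10] * 4 for _ in range(4)]
moved = [row[:] for row in still]
for r in range(3):            # simulate an appendage entering the view
    moved[r][1] = 200
print(motion_detected(still, moved))  # True: three pixels changed sharply
print(motion_detected(still, still))  # False: nothing changed
```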
  • additionally or alternatively, motion detection may be performed in response to signals received from a separate sensor, such as a signal from a switch selectively engaged by the door.
  • such door switches are generally understood and may, for example, also control activation of a lamp that illuminates the refrigerated compartment.
  • in other words, opening the refrigerator door may activate the above-described lamp and transmit a signal indicating motion within the refrigerated compartment.
  • the image capture sequence may continue until one or more end conditions are met.
  • optionally, the end condition may include a cutoff point some predetermined period of time (e.g., a set time span) after the corresponding motion detection begins.
  • additionally or alternatively, the end condition may include a point at which no further changes are detected in sequential images of the image capture sequence. In other words, the image capture sequence may end after successive images cease to change or no further movement is detected.
  • optionally, the end condition may specifically require a preset number or preset duration of unchanged sequential images.
  • the end condition may include detection of the closing of the door of the refrigerated compartment. In other words, the image capture sequence may end in response to the door being moved to the closed position.
  • the method 600 includes determining an appendage occupancy area within the two-dimensional image of the image capture sequence (e.g., determining a hand presence area).
  • the appendage occupancy area may be smaller than the entirety of the two-dimensional image (i.e., it may be a sub-portion of all pixels of the two-dimensional image).
  • step 620 may further include determining corresponding appendage occupancy areas within multiple images (e.g., the first image and the second image) in order to track or capture different portions of the same user's appendage.
  • step 620 may include determining a first appendage occupancy area within the first image (e.g., capturing a particular user appendage) and a second appendage occupancy area within the second image (e.g., capturing the same user appendage).
  • the general outline, shape, or presence of a user's appendage in a two-dimensional image can be detected (e.g., automatically, or without direct user instruction) by executing an appropriate detection routine.
  • the determination described above is carried out whenever (e.g., in response to) the user places an appendage within the camera's field of view during the image capture sequence. Since the determination is made only on the appendage occupancy area, the user need not be a previous or established user of the appliance, and a user profile corresponding to the user need not be created before step 620.
  • at step 630, method 600 includes analyzing the appendage occupancy area (e.g., in response to the determination of the appendage occupancy area at step 620) (e.g., analyzing the hand presence area).
  • the pixels within the appendage occupancy area may be processed according to suitable identification routines such as edge matching, divide-and-conquer search, greyscale matching, histograms of receptive field responses, and the like.
  • step 630 includes comparing the appendage occupancy area of the two-dimensional image to one or more previously established user profiles.
  • the pixels of the appendage occupancy area can be compared to recorded metadata or images associated with particular user profiles (e.g., each of multiple user profiles) stored within the appliance.
  • at step 640, method 600 includes assigning a confidence score to the user profile based on the analysis at step 630.
  • the analysis may be based on a comparison between the appendage occupancy area and the recorded data (e.g., metadata or images) in a particular user profile. If multiple user profiles are provided, multiple confidence scores may be assigned (e.g., each confidence score corresponding to an individual user profile).
  • step 640 includes assigning multiple temporal probability scores (i.e., provisional single-image confidence scores) in order to calculate a confidence score for a particular user profile.
  • a first temporal probability score may be assigned to the first image based on analysis of the first image.
  • the first temporal probability score may indicate the arithmetic likelihood that the user appendage captured in the appendage occupancy area of the first image is associated with a particular user profile (e.g., the percentage match between the captured user appendage in the first image and the appendage corresponding to the particular user profile).
  • a second temporal probability score may be assigned to the second image based on analysis of the second image.
  • the second temporal probability score may indicate the arithmetic likelihood that the user appendage captured in the appendage occupancy area of the second image is associated with a particular user profile (e.g., the percentage match between the captured user appendage in the second image and the appendage corresponding to the particular user profile).
  • the temporal probability scores can be further analyzed to determine the confidence score.
  • the temporal probability scores can be applied to a predetermined function to calculate the confidence score.
  • the predetermined function may be an averaging function, such that the confidence score is calculated as the average of the temporal probability scores (e.g., the average of the first temporal probability score and the second temporal probability score).
  • the confidence score may be selected as the maximum (i.e., the highest numerical value) of the multiple temporal probability scores. For example, if the first temporal probability score is greater than the second temporal probability score, the first temporal probability score may be used as (e.g., substituted for) the confidence score.
  • method 600 includes comparing the assigned confidence score to a threshold score.
  • the threshold score may, for example, be predetermined and stored within the appliance. Moreover, the threshold score may be set as the lowest score sufficient for the appliance to associate the user appendage within the appendage occupancy area with a particular user profile. In other words, the threshold score may determine whether the appliance can identify the user appendage as belonging to a particular user (e.g., absent any contrary or overriding data).
  • an assigned confidence score greater than the threshold score may generally indicate that the user appendage in the captured appendage occupancy area can be identified as belonging to a particular user or user profile.
  • an assigned confidence score less than or equal to the threshold score may generally indicate that the user appendage in the captured appendage occupancy area cannot be identified as belonging to a particular user or user profile.
  • method 600 may include assigning, based on the analysis of the appendage occupancy area, an independent confidence score to each of a plurality of user profiles, wherein the plurality of user profiles includes a first profile and the confidence score assigned to the first profile is a first confidence score.
  • the multiple independent confidence scores corresponding to the multiple user profiles can be compared with one another.
  • the first confidence score is associated with the first profile based on the comparison of the independent confidence scores.
  • the highest confidence score may indicate that the user appendage in the captured appendage occupancy area can be identified as belonging to the particular user or user profile to which that score corresponds (e.g., regardless of whether a lower confidence score is greater than the threshold score).
  • the method 600 includes recording metadata regarding the two-dimensional image based on the comparison of the assigned confidence score.
  • metadata may be recorded, such as data generated in the analysis at step 630 regarding one or more two-dimensional images (e.g., the first image and the second image). Recording the metadata and associating it with a particular user profile (if any) can generally indicate that the captured user appendage belongs to a particular user.
  • step 660 may include associating the recorded metadata with a particular user profile.
  • step 660 may include associating the recorded metadata with a separate profile, distinct from the particular user profile described above.
  • this separate profile may, for example, be associated with another known user, or be generated in response to step 650 (e.g., if no known user can correspond to the user appendage captured in the two-dimensional image).
  • method 600 may include identifying stored items (e.g., food items) engaged by the user appendage within the two-dimensional image. For example, within the two-dimensional image, the user appendage may be determined to overlap or engage (e.g., contact) a particular stored item. It will be appreciated that the stored items themselves may be identified based on analysis of an item occupancy area, such as by edge matching, divide-and-conquer search, greyscale matching, histograms of receptive field responses, or another suitable routine. Subsequent or prior sequential images can be further analyzed to track the engagement of the user's appendage with the stored item. In other words, method 600 may include tracking the engagement to determine whether a user has added or removed a stored item.
  • method 600 may include associating the stored item with the same particular user profile associated with the user appendage (e.g., at step 660). Engagement by a particular user can thereby be automatically determined or recorded.
  • associating the stored item includes recording the stored item in a user database (e.g., a caloric intake database) of the user profile.
  • the user's engagement with the stored item, or the user's caloric intake from the stored item, is recorded.
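The temporal-score aggregation and threshold comparison summarized in the bullets above can be sketched in a few lines. This is a minimal illustration, not the patented implementation: the function names and example scores are hypothetical, while the mean/max aggregation and the strict greater-than comparison follow the description.

```python
def confidence_score(temporal_scores, aggregate="mean"):
    # Collapse per-image (temporal) probability scores into one
    # confidence score for a single user profile.
    if not temporal_scores:
        raise ValueError("at least one temporal probability score is required")
    if aggregate == "mean":
        return sum(temporal_scores) / len(temporal_scores)
    if aggregate == "max":
        return max(temporal_scores)
    raise ValueError(f"unknown aggregation: {aggregate}")


def identify(confidence, threshold):
    # The appendage is attributed to the profile only when the score
    # strictly exceeds the threshold; a score <= threshold is treated
    # as "not identified", matching the bullets above.
    return confidence > threshold
```

For example, `confidence_score([0.82, 0.74])` yields the average of the two temporal scores, while passing `aggregate="max"` selects the larger one.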

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Thermal Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Cold Air Circulating Systems And Constructional Details In Refrigerators (AREA)

Abstract

A refrigeration appliance may include a cabinet, a door, a camera module, and a controller. The camera module may be mounted to the cabinet. The controller may be operably coupled to the camera module and configured to initiate an operating routine. The operating routine may include initiating an image capture sequence at the camera module, the image capture sequence including a two-dimensional image captured at the camera module. The operating routine may further include determining an appendage occupancy area within the two-dimensional image and, in response to determining the appendage occupancy area, analyzing the appendage occupancy area. The operating routine may still further include assigning, based on the analysis of the appendage occupancy area, a confidence score to a user profile stored in the controller; comparing the assigned confidence score to a threshold score; and recording metadata regarding the two-dimensional image based on the comparison of the assigned confidence score.

Description

Refrigeration Appliance and Method of User Identification
Technical Field
The present invention relates generally to techniques for tracking the users of a storage enclosure and the items within it, for example tracking the users of a refrigeration appliance and the items inside the appliance.
Background Art
Storage enclosures, such as refrigeration appliances and food storage cabinets, generally provide an enclosed chamber for receiving a number of items or objects. For example, a refrigeration appliance typically includes a cabinet defining a refrigerated compartment. A user may place food or objects within the refrigerated compartment to prevent such food from spoiling, thereby extending the usable life of perishable items or objects.
Over time, a large number of stored items (e.g., food items) may accumulate within the refrigerated compartment of a refrigerator. As stored items accumulate, it may become difficult for users of the appliance to identify the items located inside it, or to determine the quantity of certain items. This is especially true when multiple users add items to, or remove items from, a common refrigeration appliance without communicating with one another. As a result, a user may accidentally purchase excess or unneeded items. Some food items, for example, do not spoil easily in the refrigerated compartment and may not be consumed often, so the period over which they can be kept is extended. A user may forget such food items and buy replacements despite already having an acceptable supply of the same items, causing inconvenience or unnecessary expense. Additionally or alternatively, some users may not know that certain items have been removed or consumed, and may therefore fail to replace or restock them.
In addition to tracking the items within an appliance, the ability to track which items a particular user consumes or removes may also be attractive. Such tracking can help a user determine consumption habits or caloric intake.
Some existing appliances have attempted to address these issues by requiring users to manually enter each item being stored. Other appliances have used various methods, such as scales, to estimate or guess the quantity or identity of the items stored or consumed. Such attempts, however, are unsophisticated and prone to inaccuracy, and these drawbacks may be magnified when multiple users interact with a particular appliance.
Accordingly, a refrigeration appliance with features that assist a user in tracking the contents of its refrigerated compartment would be beneficial. Additionally or alternatively, a refrigeration appliance with features for identifying multiple users, and optionally for identifying items added to or removed from the refrigerated compartment, would be beneficial.
Summary of the Invention
Aspects and advantages of the invention will be set forth in part in the following description, may be apparent from the description, or may be learned through practice of the invention.
In one exemplary aspect of the present invention, a refrigeration appliance is provided. The refrigeration appliance may include a cabinet, a door, a camera module, and a controller. The cabinet may define a refrigerated compartment. The door may be rotatably hinged to the cabinet to provide selective access to the refrigerated compartment. The camera module may be mounted to the cabinet. The controller may be operably coupled to the camera module and configured to initiate an operating routine. The operating routine may include initiating an image capture sequence at the camera module, the image capture sequence including a two-dimensional image captured at the camera module. The operating routine may further include determining an appendage occupancy area within the two-dimensional image and, in response, analyzing the appendage occupancy area. The operating routine may still further include assigning, based on the analysis of the appendage occupancy area, a confidence score to a user profile stored in the controller; comparing the assigned confidence score to a threshold score; and recording metadata regarding the two-dimensional image based on the comparison of the assigned confidence score.
In another exemplary aspect of the present invention, a method of operating a refrigeration appliance is provided. The method may include initiating an image capture sequence at a camera module, the image capture sequence including a two-dimensional image captured at the camera module. The method may further include determining an appendage occupancy area within the two-dimensional image and, in response, analyzing the appendage occupancy area. The method may still further include assigning, based on the analysis of the appendage occupancy area, a confidence score to a user profile stored in a controller; comparing the assigned confidence score to a threshold score; and recording metadata regarding the two-dimensional image based on the comparison of the assigned confidence score.
These and other features, aspects, and advantages of the present invention will become better understood with reference to the following description and appended claims. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Brief Description of the Drawings
A full and enabling disclosure of the present invention, including the best mode thereof, directed to one of ordinary skill in the art, is set forth in the specification, which makes reference to the appended figures.
FIG. 1 provides a front elevation view of a refrigeration appliance formed according to an exemplary embodiment of the present invention.
FIG. 2 provides a front elevation view of a refrigeration appliance formed according to an exemplary embodiment of the present invention, with the refrigerator door shown in an open position.
FIG. 3 provides a schematic view of a refrigeration appliance formed according to an exemplary embodiment of the present invention.
FIG. 4 illustrates an exemplary two-dimensional image of a drawer of a refrigeration appliance, captured at a camera assembly of a refrigeration appliance formed according to an exemplary embodiment of the present invention.
FIG. 5 illustrates another exemplary two-dimensional image of a drawer of a refrigeration appliance, captured at a camera assembly of a refrigeration appliance formed according to an exemplary embodiment of the present invention.
FIG. 6 provides a flow chart illustrating a method of operating a refrigeration appliance formed according to an exemplary embodiment of the present invention.
Detailed Description
Reference now will be made in detail to embodiments of the invention, one or more examples of which are illustrated in the drawings. Each example is provided by way of explanation of the invention, not limitation of the invention. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from its scope. For instance, features illustrated or described as part of one embodiment can be used with another embodiment to yield a still further embodiment. Thus, it is intended that the present invention covers all such modifications and variations as fall within the scope of the appended claims and their equivalents.
As used herein, the term "or" is generally intended to be inclusive (i.e., "A or B" is intended to mean "A or B or both"). The terms "first," "second," and "third" may be used interchangeably to distinguish one component from another and are not intended to signify the location or importance of the individual components.
Generally, the present invention provides methods that assist in identifying a user interacting or engaging with a storage enclosure, such as a refrigeration appliance or a food storage cabinet. The methods may include one or more steps for automatically (e.g., without direct user input) discerning which user has added an item to, or removed an item from, the appliance (e.g., so that such an action can be identified in association with that user).
Turning now to the figures, FIG. 1 provides a front elevation view of a refrigeration appliance 100 formed according to an exemplary embodiment of the present invention, with a refrigerator door 128 of the appliance shown in a closed position. FIG. 2 provides a front elevation view of the refrigeration appliance 100 with the refrigerator door 128 shown in an open position, exposing a food preservation compartment 122 of the appliance.
The refrigeration appliance 100 includes a housing or cabinet 120 that extends along a vertical direction V between a top 101 and a bottom 102. The cabinet 120 defines refrigerated compartments for receiving food items for storage. In particular, the cabinet 120 defines a food preservation compartment 122 positioned at or adjacent the top 101 of the cabinet 120, and a freezer compartment 124 positioned at or adjacent the bottom 102 of the cabinet 120. As such, the refrigeration appliance 100 is generally referred to as a bottom-mount refrigerator. It is understood, however, that the benefits of the present invention apply to other types and styles of storage enclosures, such as top-mount refrigeration appliances, side-by-side refrigeration appliances, or non-refrigerated food storage cabinet enclosures. Consequently, the description set forth herein is for illustrative purposes only and is not intended to limit the invention in any aspect to any particular storage enclosure or refrigerator chamber configuration.
The refrigerator door 128 is rotatably hinged to an edge of the cabinet 120 for selective access to the food preservation compartment 122. In addition, a freezer door 130 is arranged below the refrigerator door 128 for selective access to the freezer compartment 124. The freezer door 130 is coupled to a freezer drawer (not shown) slidably mounted within the freezer compartment 124. As noted above, the refrigerator door 128 and the freezer door 130 are shown in the closed position in FIG. 1, and the refrigerator door 128 is shown in the open position in FIG. 2.
Turning now to FIG. 2, various storage components are mounted within the food preservation compartment 122 to facilitate the storage of food items therein, as will be understood by those skilled in the art. In particular, the storage components include bins 140, drawers 142, and shelves 144 mounted within the food preservation compartment 122. The bins 140, drawers 142, and shelves 144 are configured to receive stored items (e.g., beverages or solid food items) and may help organize such items. As an example, the drawers 142 may receive fresh items (e.g., vegetables, fruits, or cheeses) and increase the usable life of such fresh items.
The refrigeration appliance 100 also includes features for assisting a user in identifying food items located within the food preservation compartment 122 or the freezer compartment 124. A user may utilize these features, for example, to view the food items (i.e., stored items) held in the food preservation compartment 122 or the freezer compartment 124, or to create an inventory of such stored items. These features are discussed in greater detail below.
FIG. 3 provides a schematic view of the refrigeration appliance 100. The refrigeration appliance 100 includes a controller 150 operably coupled to, or in communication with, components of a refrigeration system (not shown) of the appliance that is configured to cool the food preservation compartment 122 or the freezer compartment 124. The components include a compressor 170, an evaporator fan 172, and a condenser fan 174. The controller 150 may selectively operate these components in order to cool the food preservation compartment 122 or the freezer compartment 124. The controller 150 is also in communication with a thermostat 152 (e.g., a thermocouple or thermistor). The thermostat 152 may be positioned in the food preservation compartment 122 or the freezer compartment 124 (FIG. 2). The controller 150 may receive a signal from the thermostat 152 corresponding to the temperature of the food preservation compartment 122 or the freezer compartment 124. The controller 150 may also include an internal timer for calculating elapsed time periods.
The controller 150 may include memory and one or more microprocessors, CPUs, or the like, such as general- or special-purpose microprocessors, for executing programming instructions or micro-control code associated with operation of the refrigeration appliance 100. The memory may represent random access memory such as DRAM, or read-only memory such as ROM or FLASH. In some embodiments, the processor executes programming instructions stored permanently in the memory. For certain embodiments, the instructions include a software package configured to operate the appliance 100 or to execute an operating routine (e.g., the exemplary method 600 described below with reference to FIG. 6). The memory may be a separate component from the processor, or may be included on-board within the processor. Alternatively, the controller 150 may be constructed without using a microprocessor, using a combination of discrete analog or digital logic circuitry (such as switches, amplifiers, integrators, comparators, flip-flops, AND gates, and the like) sufficient to perform the control functions, so that those control functions are realized without reliance on software.
The controller 150 may be positioned in a variety of locations throughout the refrigeration appliance 100. Input/output ("I/O") signals may be routed between the controller 150 and various operational components of the refrigeration appliance 100. One or more components of the appliance may be in operable communication (e.g., electrical communication) with the controller 150 via one or more conductive signal lines or a shared communication bus. Additionally or alternatively, one or more components of the appliance may be in operable communication (e.g., wireless communication) with the controller 150 via one or more wireless signal bands.
The refrigeration appliance 100 further includes a camera or camera module 160. The camera 160 may be any type of device suitable for capturing two-dimensional pictures or images, such as those illustrated in FIG. 4 or FIG. 5. As an example, the camera 160 may be a video camera or a digital camera with an electronic image sensor (e.g., a charge-coupled device (CCD) or a CMOS sensor). When assembled, the camera 160 is in communication (e.g., electrical or wireless communication) with the controller 150 such that the controller 150 may receive signals from the camera 160 corresponding to images captured by the camera 160.
Generally, the camera 160 is positioned within the refrigeration appliance 100 and directed at one or more refrigerated compartments (e.g., the food preservation compartment 122 of FIG. 2). In some embodiments, the camera 160 is mounted at the top of the food preservation compartment 122 (e.g., adjacent the top 101). For instance, the camera 160 may be fixed to, or directed through, the top wall defining the liner of the food preservation compartment 122. In such embodiments, as shown in FIG. 2, the camera 160 may be directed downward. Additionally or alternatively, multiple discrete cameras 160 (e.g., represented by dashed lines in FIG. 2) may be directed inward toward the food preservation compartment 122 from mutually separate sides or regions of the appliance 100 (e.g., the left side of the compartment 122, the right side of the compartment 122, the door 128, the drawers 142, the shelves 144, etc.). Multiple cameras 160 with different fields of view of the compartment 122 may thus be provided within the appliance.
When assembled, the camera 160 may be directed at at least a portion of any particular one of the drawers 142 and shelves 144 (FIG. 2), or at at least a portion of a combination of the drawers 142 and shelves 144 (FIG. 2). Thus, the camera 160 may capture images of one of the drawers 142, all of the drawers 142, one of the shelves 144, all of the shelves 144, or any suitable combination thereof.
In certain embodiments, the refrigeration appliance 100 includes an integrated display 180. The integrated display 180 may be mounted on the refrigerator door 128 (FIG. 1) or at any other suitable location on the refrigeration appliance 100. The integrated display 180 is in operable communication with the controller 150 such that it may receive signals from the controller 150 corresponding to images captured by the camera 160. The integrated display 180 may receive such signals from the controller 150 and visually present the images to a user. The integrated display 180 may include, for example, a liquid crystal display panel (LCD), a plasma display panel (PDP), or any other suitable mechanism for displaying an image (e.g., a projector).
In additional or alternative embodiments, the refrigeration appliance 100 includes a network interface (not shown) that couples the refrigeration appliance 100 (e.g., the controller 150) to a network 190 such that the appliance can transmit and receive information over the network 190. The network 190 may be any wired or wireless network, such as a WAN, LAN, or HAN.
In some such embodiments, the refrigeration appliance 100 (e.g., the controller 150) communicates with a mobile display 182 via the network 190. The mobile display 182 may be any device configured to communicate over the network 190 and display images received from it. For example, the mobile display 182 may be a computer, a smartphone, or a tablet. The mobile display 182 is in communication with the controller 150 such that it may receive (via the network 190) signals from the controller 150 corresponding to a user interface, or signals corresponding to images captured by the camera 160. The mobile display 182 may receive such signals and visually present one or more of the images to a user. The mobile display 182 may include, for example, a liquid crystal display panel (LCD), a plasma display panel (PDP), or any other suitable mechanism for displaying an image (e.g., a projector). The mobile display 182 may also include an interface (e.g., a tactile input such as a button, or a graphical user interface) allowing the mobile display 182 to initiate communication with the refrigeration appliance 100 over the network 190.
During use, such as during an image capture sequence, it is generally understood that one or more cameras 160 may capture one or more two-dimensional images (e.g., as a video feed or a series of sequential static images) that may be transmitted to the controller 150 (e.g., in the form of a data signal). From the captured images, items within the field of view (e.g., the set field of view) of the one or more cameras 160 (e.g., stored items such as food, or non-stored items such as user appendages, shelves, removable drawers, etc.) may be automatically identified by the controller 150. It is understood that recognition or analysis of such items may be performed by edge matching, divide-and-conquer search, greyscale matching, histograms of receptive field responses, or any other suitable routine (e.g., a routine executed at the controller 150 based on one or more images captured by the one or more cameras 160).
Turning now to FIGS. 4 and 5, various exemplary two-dimensional images relating to drawer storage are illustrated, such as images that may be captured at the camera 160 (FIG. 2), viewed at the integrated display 180 (FIG. 3), or viewed at the mobile display 182 (FIG. 3). Optionally, the camera 160 may be selectively movably mounted or fixedly mounted (e.g., on or within the appliance 100). When assembled, the camera 160 may have a set field of view (e.g., the region of the appliance 100 or its surroundings that can be captured at the camera 160 and contained within a two-dimensional image).
As an example, FIG. 4 illustrates an exemplary two-dimensional image, such as may be captured at the camera 160 (FIG. 2) as part of an image capture sequence. In other words, the image of FIG. 4 shows one set field of view (or a sub-region of that field of view) of a camera 160 directed at the food preservation compartment 122. Optionally, once captured (e.g., as part of the image capture sequence), the image of FIG. 4 may be viewed at the display 180 or 182. Generally, FIG. 4 provides a view/image of a drawer 142 that is open, empty (e.g., holding none of the stored items described above), and unobstructed (e.g., the space in front of and behind where the user appendage 230 of FIG. 5 would be is contained within the set field of view between the camera 160 and the drawer 142).
As an additional or alternative example, FIG. 5 illustrates another exemplary two-dimensional image, such as may be captured at the camera 160 as part of an image capture sequence. In other words, the image of FIG. 5 shows one set field of view (or a sub-region of that field of view) of a camera 160 (FIG. 2) directed at the food preservation compartment 122. Optionally, once captured (e.g., as part of the image capture sequence), the image of FIG. 5 may be viewed at the display 180 or 182. Generally, FIG. 5 provides a view/image of a drawer 142 that is open, contains one or more stored items, and includes a user appendage 230 (e.g., a hand) of a user.
In certain embodiments, the controller 150 is configured to evaluate the content of one or more two-dimensional images from the camera 160 to help identify a particular user or item. As an example, the controller 150 may be configured to identify, based on the signals or images received from the camera 160 (e.g., during or prior to formation of an image capture sequence), a particular user associated with a stored user profile, or a stored item 234 that the user engages (e.g., for addition to or removal from the compartment 122). For instance, from a two-dimensional image captured at the camera 160, the controller 150 may identify a drawer 142 that has been positioned within a predetermined sub-region (e.g., a boundary zone 216) of the set field of view at the camera 160. Generally, it is understood that each two-dimensional image includes a plurality of pixels (e.g., arranged in a predefined grid). In some embodiments, the predetermined boundary zone 216 establishes a two-dimensional pixel grid or sub-region that is relatively fixed with respect to the food preservation compartment 122 (e.g., forward of it).
Within the field of view of the camera 160, another sub-portion of the pixels may generally be identified as containing a user appendage 230 or a stored item 234. As an example, it may be determined that an appendage occupancy area 232 bounding a general user appendage 230 has been captured in the two-dimensional image. It is understood that this determination may be made, for example, by detecting the outline or general shape of the user appendage 230 in the two-dimensional image using any suitable detection routine (e.g., a detection routine executed at the controller 150). Once such an appendage occupancy area 232 is determined to exist, the pixels within it may be further analyzed (e.g., using a suitable identification routine executed at the controller 150). The analysis may include comparing the appendage occupancy area 232 to one or more stored user profiles, each corresponding to a different user and including a corresponding data set (e.g., a data set of recorded metadata or images associated with that user profile). From the analysis of the appendage occupancy area 232, one or more scores (e.g., confidence scores) may be generated. For instance, a unique confidence score may be generated for each user profile, generally indicating the arithmetic likelihood that the user appendage 230 captured in the appendage occupancy area 232 is associated with that particular user profile. In some embodiments, the confidence score may indicate the percentage match between the captured user appendage 230 (i.e., data of the user appendage 230) and the appendage corresponding to a particular user profile (i.e., data of that profile's appendage).
Additionally or alternatively, a stored item 234 in a two-dimensional image (e.g., detected or determined within an item occupancy area 236) may be identified by a suitable identification routine (e.g., a routine executed at the controller 150 based on one or more images captured by the one or more cameras 160), such as edge matching, divide-and-conquer search, greyscale matching, histograms of receptive field responses, or any other suitable routine.
Optionally, a common user appendage 230 or a common stored item 234 may be matched across multiple images captured by one or more cameras 160. In some such embodiments, a particular object may first be determined to be present in multiple images by determining that a certain occupancy area 232 or 236 (i.e., the same occupancy area) exists in each of the images. That occupancy area 232 or 236 may then be matched across the images to establish that it is occupied by a common object (e.g., an appendage 230 or a stored item 234) in the multiple images. Subsequently, the images may be analyzed one by one at the common occupancy area 232 or 236 (e.g., analyzing the appendage occupancy areas within multiple two-dimensional images) to help identify the particular object (e.g., to identify it as a particular user appendage 230 or a particular stored item 234). As an example, the same object 230 or 234 may be identified as being present in multiple images captured by the same camera 160, so that a single object (e.g., an appendage 230 or a stored item 234) can still be identified as it moves through the field of view of a single camera 160. As an additional or alternative example, the same object may be identified as being present in multiple images captured by separate cameras 160, so that a single object (e.g., a user appendage 230 or a stored item 234) can be identified as it is captured in multiple distinct fields of view.
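The cross-image matching described above (deciding that occupancy areas in different images bound a common object) can be sketched with a simple overlap test. This is a minimal illustration under assumed conventions, not the patent's routine: boxes are (top, left, bottom, right) tuples of inclusive pixel coordinates, and the 0.5 overlap threshold is a hypothetical tuning value.

```python
def iou(a, b):
    # Intersection-over-union of two (top, left, bottom, right) boxes
    # with inclusive pixel coordinates.
    def area(box):
        return (box[2] - box[0] + 1) * (box[3] - box[1] + 1)

    top, left = max(a[0], b[0]), max(a[1], b[1])
    bottom, right = min(a[2], b[2]), min(a[3], b[3])
    if bottom < top or right < left:
        return 0.0
    inter = (bottom - top + 1) * (right - left + 1)
    return inter / (area(a) + area(b) - inter)


def same_object(box_a, box_b, min_iou=0.5):
    # Treat two occupancy areas from different images as bounding a
    # common object (appendage or stored item) when they overlap enough.
    return iou(box_a, box_b) >= min_iou
```

For two sequential frames of the same camera, `same_object` would link the appendage box of one frame to the slightly shifted box of the next, so per-frame analyses can be pooled for one object.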
Turning now to FIG. 6, a flow chart of a method 600 according to an exemplary embodiment of the present invention is provided. Generally, the method 600 provides a method of operating a refrigeration appliance 100 (FIG. 1) that includes a camera 160 or a digital display (e.g., the integrated display 180 or the mobile display 182), as described above. The method 600 may be performed, for instance, by the controller 150 (FIG. 3). For example, as discussed above, the controller 150 may be in communication with the camera 160, the integrated display 180 (FIG. 3), or the mobile display 182 (FIG. 3). During operation, the controller 150 may send signals to, and receive signals from, the camera 160, the integrated display 180, or the mobile display 182. The controller 150 may further be in communication with other suitable components of the appliance 100 to facilitate operation of the appliance 100.
Advantageously, methods provided in accordance with the present invention may permit efficient processing of one or more two-dimensional images in order to identify the particular user engaging the appliance at a given moment.
FIG. 6 depicts steps performed in a particular order for purposes of illustration and discussion. Those of ordinary skill in the art, using the disclosure provided herein, will understand that the steps of any method described herein may be modified, adapted, rearranged, omitted, or expanded in various ways without deviating from the scope of the present invention (except as otherwise described).
At step 610, the method includes initiating an image capture sequence at the camera module. The image capture sequence may include capturing multiple two-dimensional images (e.g., a first two-dimensional image, a second two-dimensional image, etc.), such as images contained within a video feed or a series of sequential static images (e.g., taken or captured according to a predetermined rate or condition). Optionally, the multiple images (e.g., a first image and a second image; for instance, the image capture sequence may include the first image and a second image captured at the camera module after the first image) may be captured at the same camera module or, alternatively, at another camera module (e.g., the first image may be captured at a first camera module and the second image at a second camera module). As they are captured at the camera module, the two-dimensional images may be transmitted to the controller (e.g., in the form of a data signal). The two-dimensional images may then be recorded (e.g., temporarily recorded) for comparison or evaluation.
In certain embodiments, the image capture sequence is initiated in response to detection of motion within a refrigerated compartment (e.g., the food preservation compartment) of the refrigeration appliance. Recording or evaluation of the two-dimensional images from the camera module may thereby be prevented until motion is detected within the refrigerated compartment. As an example, motion detection may be performed at a camera module directed at the refrigerated compartment. Specifically, as is generally understood, changes in the light or pixels captured by the camera module (e.g., between multiple images captured over time) may be detected and taken to indicate movement of one or more objects within the camera module's field of view. As another example, motion detection may be performed in response to a signal received from a separate sensor, such as a switch signal selectively engaged by the door. Such switches are generally understood and may, for example, also control activation of the lamp that lights the refrigerated compartment. Opening the refrigerator door may thus activate the lamp and transmit a signal indicating motion within the refrigerated compartment.
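The pixel-change trigger described above can be sketched as a frame difference over grayscale frames. This is a minimal sketch under stated assumptions, not the appliance's actual routine: frames are 2-D lists of 0-255 intensities, and both thresholds are hypothetical tuning values.

```python
def motion_detected(prev_frame, frame, pixel_delta=10, min_changed=5):
    # Count pixels whose intensity changed by more than pixel_delta
    # between consecutive frames; a sufficient number of changed pixels
    # is treated as movement within the camera's field of view.
    changed = 0
    for prev_row, row in zip(prev_frame, frame):
        for prev_px, px in zip(prev_row, row):
            if abs(px - prev_px) > pixel_delta:
                changed += 1
    return changed >= min_changed
```

A controller loop could poll this test on successive frames and only begin recording the capture sequence once it returns true, mirroring the gating behavior described above.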
Optionally, the image capture sequence may continue until one or more end conditions are met. As an example, an end condition may include a cutoff point after some predetermined period of time (e.g., a time span) following the start of the corresponding motion detection. As another example, an end condition may include a failure point at which no further changes are detected in the sequential images of the image capture sequence. In other words, the image capture sequence may end after successive images cease to change or no further movement is detected. The end condition may specifically require a preset number, or a preset duration, of unchanging sequential images. As yet another example, an end condition may include detection of the closing of the door of the refrigerated compartment. In other words, the image capture sequence may end in response to the door being moved to the closed position.
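The three end conditions above can be combined in a single check. This is an illustrative sketch, not the claimed implementation; the timeout and unchanged-frame limit are hypothetical defaults.

```python
def sequence_should_end(elapsed_s, unchanged_frames, door_open,
                        timeout_s=30.0, unchanged_limit=10):
    # Any one of the three illustrative end conditions stops the capture
    # sequence: a timeout measured from the start of motion detection,
    # a preset run of unchanged sequential images, or the compartment
    # door returning to the closed position.
    return (elapsed_s >= timeout_s
            or unchanged_frames >= unchanged_limit
            or not door_open)
```

A capture loop would evaluate this after each frame and stop recording as soon as it returns true.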
At step 620, the method 600 includes determining an appendage occupancy area within a two-dimensional image of the image capture sequence (e.g., determining a hand presence area). In particular, the appendage occupancy area may be smaller than the entirety of the two-dimensional image (i.e., it may be a sub-portion of all the pixels of the two-dimensional image). Optionally, step 620 may further include determining corresponding appendage occupancy areas within multiple images (e.g., the first image and the second image) in order to track or capture different portions of the same user appendage. Thus, step 620 may include determining a first appendage occupancy area within the first image (e.g., capturing a particular user appendage) and a second appendage occupancy area within the second image (e.g., capturing the same particular user appendage).
Generally, it is understood that the general outline, shape, or presence of a user appendage in the two-dimensional image may be detected (e.g., automatically, or without direct user instruction) by executing a suitable detection routine. The determination may thus be carried out whenever (e.g., in response to) a user places an appendage within the camera's field of view during the image capture sequence. Since the determination is made only on the appendage occupancy area, the user need not be a previous or established user of the appliance, and a user profile corresponding to the user need not have been created prior to step 620.
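Given a detection routine's output, the occupancy area itself reduces to a bounding box over the detected pixels. The sketch below is illustrative only: the binary mask is assumed to come from an upstream outline/shape detector, and here it is hand-built.

```python
def occupancy_area(mask):
    # Bounding box (top, left, bottom, right) of the foreground pixels
    # in a binary mask, or None when no appendage-like region was
    # detected. The mask would come from an upstream detection routine.
    rows = [r for r, row in enumerate(mask) if any(row)]
    if not rows:
        return None
    cols = [c for c in range(len(mask[0])) if any(row[c] for row in mask)]
    return (rows[0], cols[0], rows[-1], cols[-1])
```

Returning None when no foreground pixel exists matches the idea that step 620 simply produces no occupancy area until an appendage enters the field of view.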
At step 630, the method 600 includes analyzing the appendage occupancy area (e.g., in response to the determination of the appendage occupancy area at step 620) (e.g., analyzing the hand presence area). In particular, the pixels within the appendage occupancy area may be processed according to a suitable identification routine, such as edge matching, divide-and-conquer search, greyscale matching, histograms of receptive field responses, and the like. Optionally, step 630 includes comparing the appendage occupancy area of the two-dimensional image to one or more previously established user profiles. For instance, the pixels of the appendage occupancy area (or the results of the identification routine) may be compared to recorded metadata or images associated with particular user profiles (e.g., each of multiple user profiles) stored within the appliance.
At step 640, the method 600 includes assigning a confidence score to a user profile based on the analysis of step 630. For example, the analysis may be based on a comparison between the appendage occupancy area and the recorded data (e.g., metadata or images) of a particular user profile. If multiple user profiles are provided, multiple confidence scores may be assigned (e.g., each confidence score corresponding to an individual user profile).
In some embodiments, such as those in which multiple images are captured, step 640 includes assigning multiple temporal probability scores (i.e., provisional single-image confidence scores) in order to calculate the confidence score for a particular user profile. As an example, a first temporal probability score may be assigned to the first image based on the analysis of the first image. The first temporal probability score may indicate the arithmetic likelihood that the user appendage captured in the appendage occupancy area of the first image is associated with a particular user profile (e.g., the percentage match between the captured user appendage in the first image and the appendage corresponding to the particular user profile). As another example, a second temporal probability score may be assigned to the second image based on the analysis of the second image. The second temporal probability score may indicate the arithmetic likelihood that the user appendage captured in the appendage occupancy area of the second image is associated with the particular user profile (e.g., the percentage match between the captured user appendage in the second image and the appendage corresponding to the particular user profile). Once the multiple temporal probability scores are assigned, they may be further analyzed to determine the confidence score. Optionally, the temporal probability scores may be applied to a predetermined function to calculate the confidence score. As one example, the predetermined function may be an averaging function, such that the confidence score is calculated as the average of the temporal probability scores (e.g., the average of the first temporal probability score and the second temporal probability score). As another example, the confidence score may be selected as the maximum (i.e., the highest numerical value) of the multiple temporal probability scores. For instance, if the first temporal probability score is greater than the second temporal probability score, the first temporal probability score may be used as (e.g., substituted for) the confidence score.
At step 650, the method 600 includes comparing the assigned confidence score to a threshold score. The threshold score may, for example, be predetermined and stored within the appliance. Moreover, the threshold score may be set as the lowest score sufficient for the appliance to associate the user appendage within the appendage occupancy area with a particular user profile. In other words, the threshold score may determine whether the appliance can identify the user appendage as belonging to a particular user (e.g., absent any contrary or overriding data). An assigned confidence score greater than the threshold may generally indicate that the user appendage in the captured appendage occupancy area can be identified as belonging to a particular user or user profile. Conversely, an assigned confidence score less than or equal to the threshold score may generally indicate that the user appendage in the captured appendage occupancy area cannot be identified as belonging to a particular user or user profile.
Optionally, the method 600 may include assigning, based on the analysis of the appendage occupancy area, an independent confidence score to each of multiple user profiles, where the multiple user profiles include a first profile and the confidence score assigned to the first profile is a first confidence score. Further, the multiple independent confidence scores corresponding to the multiple user profiles may be compared with one another, and the first confidence score may be associated with the first profile based on that comparison. The highest confidence score may thus indicate that the user appendage in the captured appendage occupancy area can be identified as belonging to the particular user or user profile to which that score corresponds (e.g., regardless of whether a lower confidence score exceeds the threshold score).
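The multi-profile selection just described amounts to an argmax over independent scores, gated by the threshold of step 650. A minimal sketch under assumptions: the profile names and default threshold are hypothetical, and a None result stands for "record under a separate profile".

```python
def best_profile(profile_scores, threshold=0.75):
    # profile_scores maps each stored user profile to its independent
    # confidence score. The highest-scoring profile wins, but only if it
    # strictly exceeds the threshold; otherwise the caller records the
    # metadata under a separate (possibly newly created) profile.
    if not profile_scores:
        return None
    name = max(profile_scores, key=profile_scores.get)
    if profile_scores[name] > threshold:
        return name
    return None
```

For example, `best_profile({"alice": 0.9, "bob": 0.6})` selects "alice", while a score table where no profile clears the threshold yields None.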
At step 660, the method 600 includes recording metadata regarding the two-dimensional image based on the comparison of the assigned confidence score. In particular, metadata may be recorded, such as data generated in the analysis of step 630 regarding one or more two-dimensional images (e.g., the first image and the second image). Recording the metadata and associating it with a particular user profile (if any) may generally indicate that the captured user appendage belongs to a particular user. As one example, if or when the confidence score assigned to a particular user profile is determined to be greater than the threshold score (e.g., or greater than the confidence scores assigned to every other stored user profile), step 660 may include associating the recorded metadata with that particular user profile. It may thereby be established that the user appendage captured in the two-dimensional image is identified as the particular user associated with that particular user profile. As an additional or alternative example, if or when the confidence score assigned to a particular user profile is determined to be less than or equal to the threshold score, step 660 may include associating the recorded metadata with a separate profile, distinct from the particular user profile. The separate profile may, for example, be associated with another known user, or be generated in response to step 650 (e.g., if no known user can correspond to the user appendage captured in the two-dimensional image).
Following, or in conjunction with, the recording of the metadata, the method 600 may include identifying a stored item (e.g., a food item) engaged by the user appendage within the two-dimensional image. For instance, within the two-dimensional image, the user appendage may be determined to overlap or engage (e.g., contact) a particular stored item. It is understood that the stored item itself may be identified based on an analysis of an item occupancy area, such as by edge matching, divide-and-conquer search, greyscale matching, histograms of receptive field responses, or another suitable routine. Subsequent or prior sequential images may be further analyzed to track the engagement of the user appendage with the stored item. In other words, the method 600 may include tracking the engagement to determine whether the user has added or removed the stored item.
After identifying the stored item, the method 600 may include associating the stored item with the same particular user profile with which the user appendage was associated (e.g., at step 660). The engagement of a particular user may thereby be automatically determined or recorded. In some such embodiments, associating the stored item includes recording the stored item in a user database (e.g., a caloric intake database) of the user profile. Advantageously, the user's engagement with stored items, or the user's caloric intake from stored items, may be recorded easily or automatically (e.g., at least partially automatically) based on the user's own interactions with stored items supplied to, or removed from, the appliance.
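The per-profile recording just described can be sketched as an event log keyed by profile. This is an illustration only: the dictionary layout, field names, and calorie figures are hypothetical stand-ins for the user database and caloric intake database mentioned above.

```python
def record_engagement(databases, profile, item, action, calories=None):
    # Append an add/remove event for a stored item to the per-profile
    # database, and tally calories on removal as a stand-in for the
    # caloric intake database mentioned above.
    db = databases.setdefault(profile, {"events": [], "calories": 0})
    db["events"].append((action, item))
    if action == "remove" and calories:
        db["calories"] += calories
    return db
```

Each identified engagement (step 660 plus item identification) would produce one call, so a profile's history and running intake accumulate automatically.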
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they include structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.

Claims (20)

  1. A refrigeration appliance, characterized in that the refrigeration appliance comprises:
    a cabinet defining a refrigerated compartment;
    a door rotatably hinged to the cabinet to provide selective access to the refrigerated compartment;
    a camera module mounted to the cabinet; and
    a controller operably coupled to the camera module, the controller being configured to initiate an operating routine comprising:
    initiating an image capture sequence at the camera module, the image capture sequence comprising a two-dimensional image captured at the camera module;
    determining an appendage occupancy area within the two-dimensional image;
    analyzing the appendage occupancy area in response to determining the appendage occupancy area;
    assigning, based on the analysis of the appendage occupancy area, a confidence score to a user profile stored in the controller;
    comparing the assigned confidence score to a threshold score; and
    recording metadata regarding the two-dimensional image based on the comparison of the assigned confidence score.
  2. The refrigeration appliance according to claim 1, characterized in that comparing the assigned confidence score comprises:
    determining that the assigned confidence score is greater than the threshold score; and
    wherein recording the metadata regarding the two-dimensional image comprises:
    associating the recorded metadata with the user profile.
  3. The refrigeration appliance according to claim 1, characterized in that comparing the assigned confidence score comprises:
    determining that the assigned confidence score is less than or equal to the threshold score; and
    wherein recording the metadata regarding the two-dimensional image comprises:
    associating the recorded metadata with a separate profile distinct from the user profile.
  4. The refrigeration appliance according to claim 1, characterized in that the two-dimensional image is a first image, wherein the image capture sequence further comprises a second image captured at the camera module after the first image, and wherein the operating routine further comprises:
    determining an appendage occupancy area within the second image; and
    analyzing the appendage occupancy area within the second image in response to determining the appendage occupancy area within the second image;
    wherein assigning the confidence score comprises:
    determining a first temporal probability score for the user profile from the first image;
    determining a second temporal probability score for the user profile from the second image; and
    calculating the confidence score based on the first temporal probability score and the second temporal probability score.
  5. The refrigeration appliance according to claim 4, characterized in that calculating the confidence score comprises:
    selecting the confidence score as the maximum of a plurality of temporal probability scores, the plurality of temporal probability scores comprising the first temporal probability score and the second temporal probability score.
  6. The refrigeration appliance according to claim 1, characterized in that the operating routine further comprises:
    identifying a stored item engaged by a user appendage within the two-dimensional image.
  7. The refrigeration appliance according to claim 6, characterized in that the operating routine further comprises:
    associating the stored item with the user profile.
  8. The refrigeration appliance according to claim 1, characterized in that the user profile is a first profile of a plurality of user profiles stored in the controller, wherein the assigned confidence score is a first confidence score, and wherein the operating routine further comprises:
    assigning, based on the analysis of the appendage occupancy area, an independent confidence score to each of the plurality of user profiles.
  9. The refrigeration appliance according to claim 8, characterized in that the operating routine further comprises:
    comparing the independent confidence scores; and
    associating the first confidence score with the first profile based on the comparison of the independent confidence scores.
  10. The refrigeration appliance according to claim 1, characterized in that determining the appendage occupancy area within the two-dimensional image comprises:
    matching appendage occupancy areas within a plurality of two-dimensional images; and
    wherein analyzing the appendage occupancy area comprises:
    analyzing the appendage occupancy areas within the plurality of two-dimensional images.
  11. A method of operating a refrigeration appliance, the refrigeration appliance comprising a camera module mounted within a cabinet at a refrigerated compartment, the method comprising:
    initiating an image capture sequence at the camera module, the image capture sequence comprising a two-dimensional image captured at the camera module;
    determining an appendage occupancy area within the two-dimensional image;
    analyzing the appendage occupancy area in response to determining the appendage occupancy area;
    assigning, based on the analysis of the appendage occupancy area, a confidence score to a user profile stored in a controller;
    comparing the assigned confidence score to a threshold score; and
    recording metadata regarding the two-dimensional image based on the comparison of the assigned confidence score.
  12. The method according to claim 11, characterized in that comparing the assigned confidence score comprises:
    determining that the assigned confidence score is greater than the threshold score; and
    wherein recording the metadata regarding the two-dimensional image comprises:
    associating the recorded metadata with the user profile.
  13. The method according to claim 11, characterized in that comparing the assigned confidence score comprises:
    determining that the assigned confidence score is less than or equal to the threshold score; and
    wherein recording the metadata regarding the two-dimensional image comprises:
    associating the recorded metadata with a separate profile distinct from the user profile.
  14. The method according to claim 11, characterized in that the two-dimensional image is a first image, wherein the image capture sequence further comprises a second image captured at the camera module after the first image, and wherein the method comprises:
    determining an appendage occupancy area within the second image; and
    analyzing the appendage occupancy area within the second image in response to determining the appendage occupancy area within the second image;
    wherein assigning the confidence score comprises:
    determining a first temporal probability score for the user profile from the first image;
    determining a second temporal probability score for the user profile from the second image; and
    calculating the confidence score based on the first temporal probability score and the second temporal probability score.
  15. The method according to claim 14, characterized in that calculating the confidence score comprises:
    selecting the confidence score as the maximum of a plurality of temporal probability scores, the plurality of temporal probability scores comprising the first temporal probability score and the second temporal probability score.
  16. The method according to claim 11, characterized in that the method further comprises:
    identifying a stored item engaged by a user appendage within the two-dimensional image.
  17. The method according to claim 16, characterized in that the method further comprises:
    associating the stored item with the user profile.
  18. The method according to claim 11, characterized in that the user profile is a first profile of a plurality of user profiles stored in the controller, wherein the assigned confidence score is a first confidence score, and wherein the method further comprises:
    assigning, based on the analysis of the appendage occupancy area, an independent confidence score to each of the plurality of user profiles.
  19. The method according to claim 18, characterized in that the method further comprises:
    comparing the independent confidence scores; and
    associating the first confidence score with the first profile based on the comparison of the independent confidence scores.
  20. The method according to claim 11, characterized in that determining the appendage occupancy area within the two-dimensional image comprises:
    matching appendage occupancy areas within a plurality of two-dimensional images; and
    wherein analyzing the appendage occupancy area comprises:
    analyzing the appendage occupancy areas within the plurality of two-dimensional images.
PCT/CN2021/120626 2020-10-07 2021-09-26 Refrigeration appliance and method of user identification WO2022073430A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21876948.7A EP4206595A4 (en) 2020-10-07 2021-09-26 REFRIGERATION APPARATUS AND USER IDENTIFICATION METHOD
CN202180068477.1A CN116348727A (zh) 2020-10-07 2021-09-26 制冷电器和用户识别的方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/064,778 2020-10-07
US17/064,778 US11692767B2 (en) 2020-10-07 2020-10-07 Refrigerator appliance and methods of user identification

Publications (1)

Publication Number Publication Date
WO2022073430A1 true WO2022073430A1 (zh) 2022-04-14

Family

ID=80930724

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/120626 WO2022073430A1 (zh) 2020-10-07 2021-09-26 Refrigeration appliance and method of user identification

Country Status (4)

Country Link
US (1) US11692767B2 (zh)
EP (1) EP4206595A4 (zh)
CN (1) CN116348727A (zh)
WO (1) WO2022073430A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130002903A1 (en) * 2006-04-13 2013-01-03 Manico Joseph A Camera user input based image value index
US20140313328A1 (en) * 2013-04-18 2014-10-23 Sehwan Park Refrigerator and operating method thereof
CN109792478A (zh) * 2016-09-01 2019-05-21 Duelight LLC Systems and methods for adjusting focus based on focus target information
CN110472515A (zh) * 2019-07-23 2019-11-19 Alibaba Group Holding Limited Shelf goods detection method and system
CN111503991A (zh) * 2020-04-15 2020-08-07 Hisense Group Co., Ltd. Method for identifying storage and retrieval positions of food items in a refrigerator, and refrigerator
US10785456B1 (en) * 2019-09-25 2020-09-22 Haier Us Appliance Solutions, Inc. Methods for viewing and tracking stored items

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160085958A1 (en) 2014-09-22 2016-03-24 Intel Corporation Methods and apparatus for multi-factor user authentication with two dimensional cameras
US20160132642A1 (en) * 2014-11-06 2016-05-12 Raz Carmi Device and method for monitoring food intake
US10395764B2 (en) 2015-01-06 2019-08-27 Aic Innovations Group, Inc. Method and apparatus for recognition of patient activity
US10956856B2 (en) 2015-01-23 2021-03-23 Samsung Electronics Co., Ltd. Object recognition for a storage structure
CN105783413A (zh) * 2016-05-03 2016-07-20 Qingdao Haier Co., Ltd. Method for acquiring information on items stored in a refrigerator, and refrigerator
CN106403488B (zh) * 2016-09-08 2019-03-15 Samsung Electronics (China) R&D Center Smart refrigerator and management method thereof
US10762641B2 (en) 2016-11-30 2020-09-01 Whirlpool Corporation Interaction recognition and analysis system
CA3088166A1 (en) * 2018-01-10 2019-07-18 Durgesh TIWARI Method for monitoring temperature-controlled units in a store
US10535146B1 (en) * 2018-07-16 2020-01-14 Accel Robotics Corporation Projected image item tracking system
US10902237B1 (en) * 2019-06-19 2021-01-26 Amazon Technologies, Inc. Utilizing sensor data for automated user identification
KR102245911B1 (ko) * 2019-08-09 2021-04-30 LG Electronics Inc. Refrigerator providing item information using artificial intelligence, and operating method thereof
CN110807363A (zh) 2019-09-26 2020-02-18 Qingdao Haier Intelligent Technology R&D Co., Ltd. Food material management method and apparatus, and refrigeration device
US11521248B2 (en) * 2019-12-13 2022-12-06 AiFi Inc. Method and system for tracking objects in an automated-checkout store based on distributed computing


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4206595A4 *

Also Published As

Publication number Publication date
US20220107131A1 (en) 2022-04-07
US11692767B2 (en) 2023-07-04
CN116348727A (zh) 2023-06-27
EP4206595A1 (en) 2023-07-05
EP4206595A4 (en) 2024-01-24

Similar Documents

Publication Publication Date Title
US11335010B2 (en) Methods for viewing and tracking stored items
US9719720B2 (en) Refrigerator and control method for the same
JP6307698B2 (ja) Refrigerator
JP6877734B2 (ja) Inventory management device for an article storage compartment
WO2021212993A1 (zh) Refrigeration appliance camera module and method for preventing lens fogging
WO2021057769A1 (zh) Method for viewing and tracking stored items
JP7164296B2 (ja) Refrigerator
JP7281755B2 (ja) Refrigerator
US20140168396A1 (en) Method for viewing contents of a refrigerator appliance
JP7012261B2 (ja) Refrigerator
WO2021057820A1 (zh) Refrigeration appliance and method for tracking stored items
JP2022103368A (ja) Refrigerator
WO2022073430A1 (zh) Refrigeration appliance and method of user identification
WO2022048564A1 (zh) Refrigeration appliance with a movable camera
WO2023193635A1 (zh) Refrigeration appliance and method for tracking stored items in a freezer compartment
WO2023185834A1 (zh) Refrigerator camera module and method for resolving a persistent condition on a camera lens
WO2022095995A1 (zh) Refrigeration appliance and method for tracking stored items
JP7289084B2 (ja) Refrigerator
WO2023193756A1 (zh) Refrigeration device and control method therefor
JP7289083B2 (ja) Refrigerator
JP2023099664A (ja) Refrigerator

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21876948

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021876948

Country of ref document: EP

Effective date: 20230330

NENP Non-entry into the national phase

Ref country code: DE