CN116888419A - Image pickup apparatus - Google Patents


Info

Publication number
CN116888419A
Authority
CN
China
Prior art keywords
food
image
refrigerator
information
unit
Prior art date
Legal status
Pending
Application number
CN202180094495.7A
Other languages
Chinese (zh)
Inventor
堀井慎一
Current Assignee
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Panasonic Intellectual Property Management Co Ltd
Priority date
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co Ltd
Publication of CN116888419A

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F25: REFRIGERATION OR COOLING; COMBINED HEATING AND REFRIGERATION SYSTEMS; HEAT PUMP SYSTEMS; MANUFACTURE OR STORAGE OF ICE; LIQUEFACTION OR SOLIDIFICATION OF GASES
    • F25D: REFRIGERATORS; COLD ROOMS; ICE-BOXES; COOLING OR FREEZING APPARATUS NOT OTHERWISE PROVIDED FOR
    • F25D23/00: General constructional features
    • F25D29/00: Arrangement or mounting of control or safety devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Thermal Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Cold Air Circulating Systems And Constructional Details In Refrigerators (AREA)
  • Devices That Are Associated With Refrigeration Equipment (AREA)

Abstract

Provided is a photographing device capable of photographing, with high accuracy, articles that are put into and taken out of a storage. The photographing device includes: a main body provided on an upper surface of the storage; and a photographing member extending toward the front of the storage. The photographing member includes a camera that photographs from the front upper side of the storage, and the main body includes a position regulating member that abuts against a front end edge of the storage to regulate the position of the photographing member in the front-rear direction of the storage.

Description

Image pickup apparatus
Technical Field
The present invention relates to an imaging device.
Background
Patent document 1 discloses a technique in which a camera is disposed at the upper part of a refrigerator, photographing is executed when an object passing through an opening of the refrigerator is detected, and food stored in the refrigerator is managed based on the photographing result.
Prior art literature
Patent literature
Patent document 1: Japanese Patent Laid-Open No. 2019-070476
Disclosure of Invention
Problems to be solved by the invention
The present invention provides a photographing device capable of photographing, with high accuracy, articles that are put into and taken out of a storage.
Means for solving the problems
The photographing device of the present invention includes: a main body provided on an upper surface of the storage; and a photographing member extending toward the front of the storage. The photographing member includes a camera that photographs from the front upper side of the storage, and the main body includes a position regulating member that abuts against a front end edge of the storage to regulate the position of the photographing member in the front-rear direction of the storage.
In addition, the entire contents of Japanese Patent Application No. 2021-029014, filed on February 25, 2021, are incorporated in this specification.
Effects of the invention
In the imaging device of the present invention, the position of the imaging member in the front-rear direction of the storage is regulated by the position regulating member, so that the camera is positioned to photograph downward from the front upper side of the storage. Therefore, articles that are put into and taken out of the storage can be reliably included in the imaging result of the camera, and such articles can be photographed with high accuracy.
Drawings
Fig. 1 is a diagram showing a configuration of a food management system according to embodiment 1.
Fig. 2 is a side view of a camera unit provided in a refrigerator as seen from the right side in embodiment 1.
Fig. 3 is a perspective view of the position regulating member of embodiment 1, as seen from the left front side, in contact with the front end edge of the main casing.
Fig. 4 is a front view of the refrigerator and the camera unit in embodiment 1 as seen from the front side.
Fig. 5 is a side view of the refrigerator and the camera unit in embodiment 1 as seen from the right side.
Fig. 6 is a block diagram showing the structure of the camera unit according to embodiment 1.
Fig. 7 is a block diagram showing the configuration of the terminal device and the food management server according to embodiment 1.
Fig. 8 is a diagram showing an example of the refrigerating compartment management database according to embodiment 1.
Fig. 9 is a view showing an example of an image in the refrigerator compartment according to embodiment 1.
Fig. 10 is a diagram showing an example of a drawer management database according to embodiment 1.
Fig. 11 is a view showing an example of the adjustment screen according to embodiment 1.
Fig. 12 is a diagram showing an example of the center position recording sheet according to embodiment 1.
Fig. 13 is a flowchart showing the operation of the camera unit according to embodiment 1.
Fig. 14 is a flowchart showing the operation of the food management system according to embodiment 1.
Fig. 15 is a diagram for explaining an access determination image according to embodiment 1.
Fig. 16 is a view showing an example of a refrigerator home screen in embodiment 1.
Fig. 17 is a diagram showing a remark input flow in embodiment 1.
Fig. 18 is a diagram showing an example of a storage list screen according to embodiment 1.
Fig. 19 is a diagram showing an example of the new additional list screen according to embodiment 1.
Fig. 20 is a diagram for explaining an additional flow of food information according to embodiment 1.
Fig. 21 is a diagram showing an example of the deletion object list screen according to embodiment 1.
Fig. 22 is a flowchart showing the operation of the food management system according to embodiment 1.
Fig. 23 is a view showing an example of a vegetable room home screen according to embodiment 1.
Fig. 24 is a diagram showing an example of a food image list screen according to embodiment 1.
Fig. 25 is a flowchart showing the operation of the food management system according to embodiment 1.
Fig. 26 is a block diagram showing the configuration of a terminal device and a food management server according to embodiment 2.
Detailed Description
(Findings and the like that form the basis of the present invention)
At the time of conceiving the present invention, the inventors were aware of techniques for photographing articles that are put into and taken out of a storage such as a refrigerator. In such techniques, photographing is sometimes performed using a photographing device separate from the storage, and in that case the photographing device is usually provided on the upper surface of the storage. However, the inventors found that, when the photographing device is provided on the upper surface of the storage, for example at a position close to the rear of the storage, articles that are put into and taken out of the storage may not be reliably included in the imaging result of the camera.
Accordingly, the present invention provides a photographing device capable of photographing, with high accuracy, articles that are put into and taken out of a storage.
The embodiments are described in detail below with reference to the drawings. However, unnecessarily detailed description may be omitted; for example, detailed description of well-known matters or repeated description of substantially identical structures may be omitted.
Furthermore, the drawings and the following description are provided for a full understanding of the present invention by those skilled in the art, and are not intended to limit the subject matter recited in the claims.
(embodiment 1)
First, embodiment 1 will be described.
[1-1. Structure ]
[1-1-1. Structure of food management System ]
Fig. 1 is a diagram showing a configuration of a food management system 1000.
The food management system 1000 is a system for managing, for food stored in the refrigerator 1, information such as the date and time at which the food was stored in the refrigerator 1.
The refrigerator 1 corresponds to an example of the storage of the present invention. The food corresponds to one example of the article of the present invention.
The food management system 1000 includes the refrigerator 1. The refrigerator 1 includes a main casing 10 having an open front surface. A refrigerating compartment 11, an ice making compartment 12, a fresh freezing compartment 13, a freezing compartment 14, and a vegetable compartment 15 are formed in the main casing 10. A hinged left door 11A and a hinged right door 11B are provided in the opening in the front surface of the refrigerating compartment 11. Drawers 12A, 13A, 14A, 15A for accommodating food are provided in the ice making compartment 12, the fresh freezing compartment 13, the freezing compartment 14, and the vegetable compartment 15, respectively. The drawers 14A, 15A each include two cases CA: an upper case JCA and a lower case GCA.
The upper case JCA corresponds to one example of the 1 st case of the present invention. The lower case GCA corresponds to an example of the 2 nd case of the present invention.
In the following description, the left door 11A and the right door 11B are referred to collectively as the "door" with the reference numeral "11C" when they are not distinguished. Similarly, the drawers 12A, 13A, 14A, 15A are referred to collectively as the "drawer" with the reference numeral "16A" when they are not distinguished.
The food management system 1000 comprises a camera unit 2. The camera unit 2 is a device that performs shooting and detects food that enters and exits the refrigerator 1 from a shot image. The camera unit 2 is provided on the upper surface 10B of the main casing 10 of the refrigerator 1.
The camera unit 2 corresponds to an example of the imaging device of the present invention.
The food management system 1000 includes a terminal device 3. The terminal device 3 is, for example, a smart phone or a tablet terminal. The terminal device 3 is provided with an application program for managing foods in the refrigerator 1, and communicates with the food management server 4 by the function of the application program.
In the following description, this application program is referred to as a "food management application", and a symbol of "311" is given.
The food management application 311 corresponds to one example of the program and application program of the present invention.
In fig. 1, a user P at home H is indicated by a solid line, and a user P who has gone out from home H is indicated by a broken line. When used by the user P at home, the terminal device 3 communicates with the food management server 4, either via the communication device 5 or not, by the function of the food management application 311. When used by the user P who has gone out from home H and cannot establish a communication connection with the communication device 5, the terminal device 3 communicates with the food management server 4 by the function of the food management application 311 without going through the communication device 5.
The communication device 5 is connected to a global network GN including the internet, a telephone network, and other communication networks, and communicates with a food management server 4 connected to the global network GN. The communication device 5 is an interface device for connecting the terminal device 3 to the global network GN.
The food management system 1000 has a food management server 4. The food management server 4 is a server device that manages food in the refrigerator 1, and is connected to the global network GN. In each drawing, the food management server 4 is represented by one block, but this does not mean that the food management server 4 is constituted by a single server device. For example, the food management server 4 may be configured to include a plurality of server devices having different processing contents.
Next, the configuration of the camera unit 2, the terminal device 3, and the food management server 4 will be described.
[1-1-2. Structure of Camera Unit ]
The camera unit 2 will be described with reference to fig. 2 to 4.
In fig. 2 to 4, X-axis, Y-axis and Z-axis are shown. The X-axis, Y-axis and Z-axis are orthogonal to each other. The Z axis represents the up-down direction. The X-axis and Y-axis are parallel to the horizontal direction. The X-axis represents the left-right direction, and the width direction of the refrigerator 1 and the camera unit 2. The Y-axis represents the front-rear direction, and the depth direction of the refrigerator 1 and the camera unit 2. The positive direction of the X-axis indicates the right direction. The positive direction of the Y-axis indicates the forward direction. The positive direction of the Z axis indicates the upward direction.
Fig. 2 is a side view of the camera unit 2 provided in the refrigerator 1 as seen from the right side. Fig. 3 is a perspective view of the position regulating member 205 abutting against the front edge 10A of the main casing 10 when viewed from the left front side.
As shown in fig. 2 and 3, the camera unit 2 includes a box-shaped main body 201 provided on the upper surface 10B of the main casing 10. The camera unit 2 further includes a photographing member 202 extending forward from the upper front portion of the main body 201. The photographing member 202 includes a refrigerator camera 203 and a drawer camera 204, each provided at the lower part of the photographing member 202, forward of the main body 201.
The main body 201 includes a position regulating member 205 that regulates the position of the photographing member 202 in the front-rear direction of the refrigerator 1 by abutting against the front end edge 10A of the main casing 10. The position regulating member 205 is a piece that extends downward from the bottom surface 206 of the main body 201 in the installed state. It extends in the left-right direction of the main body 201 along its front surface and has a mark 207 at the center of the main body 201 in the left-right direction; the mark 207 indicates the center position of the camera unit 2 in the left-right direction.
The length of the position regulating member 205 is shorter than the distance, in the installed state, from the upper surface 10B of the main casing 10 to the gasket provided between the refrigerator 1 and the door 11C. More specifically, its length is shorter than the distance from the upper surface 10B of the main casing 10 to a door gasket provided substantially parallel to the upper surface 10B at the upper portion of the inner surface of the door 11C.
The main body 201 is fixed, at the position regulated by the position regulating member 205, by a built-in magnet 208. The means for fixing the main body 201 is not limited to the magnet 208 and may be another means such as double-sided tape.
Fig. 4 is a front view of the refrigerator 1 and the camera unit 2 as seen from the front side. The left door 11A and the right door 11B of the refrigerator 1 shown in fig. 4 are in the open state. Fig. 5 is a side view of the refrigerator 1 and the camera unit 2 as seen from the right side. The refrigerator 1 shown in fig. 5 is in a state in which the drawer 15A is open (i.e., pulled out). In the camera unit 2 shown in fig. 4 and 5, the mark 207 is located at substantially the center of the refrigerator 1 in the left-right direction, and the position regulating member 205 abuts against the front end edge 10A.
The camera unit 2 has a refrigerator camera 203 and a drawer camera 204.
The refrigerator camera 203 photographs food items that are put into and taken out of the refrigerating compartment 11 by photographing downward from the front upper side of the refrigerator 1.
For example, in a front view, the imaging range of the refrigerator camera 203 includes a range A1 shown in fig. 4. The range A1 includes the door shelf 111 provided on the door 11C in the open state and the opening in the front surface of the refrigerating compartment 11. In a side view, the imaging range of the refrigerator camera 203 includes a range A3 shown in fig. 5. In the front-rear direction, the range A3 extends from the door 11C in the open state to the front end portion of the shelf 112, and in the up-down direction it includes the opening of the refrigerating compartment 11.
The drawer camera 204 photographs food that is put into and taken out of the drawer-type storage compartments by photographing downward from the front upper side of the refrigerator 1.
For example, in a front view, the imaging range of the drawer camera 204 includes a range A2 shown in fig. 4. The range A2 includes the drawer 16A when the refrigerator 1 is viewed from the front. In a side view, the imaging range of the drawer camera 204 includes a range A4 shown in fig. 5. The range A4 includes, in the front-rear direction, the drawer 16A in its maximally pulled-out state.
The camera unit 2 has a left door ranging sensor 209, a right door ranging sensor 210, and a drawer ranging sensor 211.
The left door ranging sensor 209 is a ranging sensor for detecting the open/closed state of the left door 11A; it detects the separation distance from the left door 11A as the open/closed state of the left door 11A.
The right door ranging sensor 210 is a ranging sensor for detecting the open/closed state of the right door 11B; it likewise detects the separation distance from the right door 11B as the open/closed state of the right door 11B.
The drawer ranging sensor 211 is a ranging sensor for detecting the open/closed state of the drawer 16A. The drawer 16A being in the open state means that it is pulled out from the corresponding drawer-type storage compartment, and being in the closed state means that it is housed in that storage compartment. The drawer ranging sensor 211 detects the separation distance from the drawer 16A as the open/closed state of the drawer 16A.
In the present embodiment, the drawer distance measuring sensor 211 detects the open/close states of the drawers 14A, 15A.
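The distance-based open/closed determination described above could be sketched as follows. This is purely an illustrative sketch, not part of the patent; the threshold value, units, and function name are assumptions.

```python
# Illustrative sketch of distance-based open/closed detection.
# The threshold and names below are assumptions, not taken from the patent.

OPEN_THRESHOLD_MM = 50  # assumed: distance beyond which a door/drawer counts as open

def is_open(measured_distance_mm: float) -> bool:
    """Treat the door or drawer as open when it has moved away from the sensor."""
    return measured_distance_mm > OPEN_THRESHOLD_MM

# Example: a closed drawer sits a few millimetres from the sensor,
# while a pulled-out drawer is much farther away.
assert is_open(5.0) is False   # closed
assert is_open(320.0) is True  # pulled out
```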
The camera unit 2 includes a human sensor 212. The human sensor 212 is provided at a predetermined position of the photographing member 202 and outputs its detection value to the camera unit control section 20.
The human sensor 212 corresponds to one example of the "sensor" of the present invention.
In addition, the arrangement order and positions of the sensors and cameras shown in fig. 4 and 5 are illustrated for convenience of explaining the camera unit 2, and are not limited to those shown in fig. 4 and 5.
Fig. 6 is a block diagram showing the structure of the camera unit 2.
The camera unit 2 includes a camera unit control section 20, a camera unit communication section 21, a sensor section 22, an imaging section 23, and a power supply section 24.
The camera unit control section 20 includes a camera unit processor 220, which is a processor such as a CPU (central processing unit) and an MPU (micro processing unit), and a camera unit storage section 230. The camera unit control unit 20 reads and executes a control program stored in the camera unit storage unit 230 by the camera unit processor 220 to control each part of the camera unit 2. The camera unit processor 220 functions as a camera unit communication control section 221, a detection value processing section 222, an imaging control section 223, a video recording control section 224, and a food detection section 225 by executing a control program stored in the camera unit storage section 230.
The camera unit storage section 230 includes a memory that stores a program executed by the camera unit processor 220 and data processed by the camera unit processor 220. The camera unit storage unit 230 stores a control program executed by the camera unit processor 220, camera unit ID4112 which is identification information of the camera unit 2, and other various data. The camera unit storage section 230 has a nonvolatile storage area. Further, the camera unit storage 230 may include a volatile storage area, constituting a working area of the camera unit processor 220.
The camera unit communication unit 21 is a communication interface including a communication circuit, a connector, and the like conforming to a predetermined communication standard, and communicates with the food management server 4 and the terminal device 3 under the control of the camera unit control unit 20. In the present embodiment, the communication standard used by the camera unit communication unit 21 is exemplified as a wireless communication standard, but a wired communication standard may also be used.
The sensor unit 22 includes the left door ranging sensor 209, the right door ranging sensor 210, the drawer ranging sensor 211, and the human sensor 212, and outputs the detection value of each sensor to the camera unit control unit 20.
The imaging unit 23 includes a refrigerator camera 203 and a drawer camera 204. The imaging unit 23 outputs the imaging results of the respective refrigerator camera 203 and drawer camera 204 to the camera unit control unit 20. The refrigerator camera 203 and the drawer camera 204 capture moving images.
Hereinafter, the reference numeral "213" is given to the "camera" without distinguishing the refrigerator camera 203 and the drawer camera 204.
The power supply unit 24 includes hardware such as a power supply circuit, and supplies power to each part of the camera unit 2. In the present embodiment, the power cable 7 of the camera unit 2 is connected to the commercial ac power supply 8, and the power supply unit 24 supplies power to each part of the camera unit 2 based on the power supplied from the commercial ac power supply 8.
The power supply source for the power supply unit 24 is not limited to the external commercial ac power supply 8, and may be the refrigerator 1 or a battery. In the former case, the power supply unit 24 and the refrigerator 1 each include hardware such as a port conforming to a standard capable of USB power supply; the USB cable of the camera unit 2 is connected to the USB port of the refrigerator 1, and the refrigerator 1 supplies electric power to the power supply unit 24 via the USB cable. In the latter case, the camera unit 2 includes a battery.
As described above, the camera unit control section 20 functions as the camera unit communication control section 221, the detection value processing section 222, the imaging control section 223, the video recording control section 224, and the food detection section 225.
The food detection unit 225 corresponds to one example of the article detection unit of the present invention.
The camera unit communication control section 221 communicates with the food management server 4 via the camera unit communication section 21.
The detection value processing unit 222 determines, based on the detection values output from the sensor unit 22, whether each of the left door 11A, the right door 11B, and the drawers 14A, 15A is in the open state or the closed state. When it determines that either the left door 11A or the right door 11B is in the open state, the detection value processing unit 222 outputs door open state information, indicating that the door 11C is in the open state, to the video recording control unit 224 and the food detection unit 225. When it determines that both the left door 11A and the right door 11B are in the closed state, it outputs door closed state information, indicating that the door 11C is in the closed state, to the video recording control unit 224 and the food detection unit 225. Likewise, when it determines that either of the drawers 14A, 15A is in the open state, it outputs drawer open state information to the video recording control unit 224 and the food detection unit 225, and when it determines that both drawers 14A, 15A are in the closed state, it outputs drawer closed state information to the video recording control unit 224 and the food detection unit 225.
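The way the per-door and per-drawer states are combined into the state information described above could be sketched as follows. This is an illustration only; the function names and string labels are assumptions, not taken from the patent.

```python
# Sketch of combining per-door / per-drawer states into the state
# information for door 11C and drawers 14A/15A (names are assumptions).

def door_state_info(left_open: bool, right_open: bool) -> str:
    """Door 11C counts as open when either the left or the right door is open."""
    return "door_open" if (left_open or right_open) else "door_closed"

def drawer_state_info(drawer_14a_open: bool, drawer_15a_open: bool) -> str:
    """The drawers count as open when either drawer 14A or 15A is open."""
    return "drawer_open" if (drawer_14a_open or drawer_15a_open) else "drawer_closed"
```

For example, `door_state_info(True, False)` yields `"door_open"`, matching the rule that the door 11C is open whenever one of the two doors is open.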
The detection value processing unit 222 also determines, based on the detection value of the human sensor 212 output from the sensor unit 22, whether or not a person is present around the refrigerator 1. When it determines that a person is present, it outputs information indicating the presence of a person to the imaging control unit 223; when it determines that no person is present, it outputs information indicating the absence of a person to the imaging control unit 223.
When the detection value processing unit 222 outputs information indicating the presence of a person, the imaging control unit 223 starts imaging by the refrigerator camera 203 and the drawer camera 204. When the detection value processing unit 222 outputs information indicating the absence of a person, the imaging control unit 223 ends imaging by the refrigerator camera 203 and the drawer camera 204.
When the detection value processing unit 222 outputs the door open state information, the video recording control unit 224 starts recording the imaging result of the refrigerator camera 203, and when the door closed state information is output, it ends the recording that was started. Similarly, when the drawer open state information is output, the video recording control unit 224 starts recording the imaging result of the drawer camera 204, and when the drawer closed state information is output, it ends that recording. On finishing a recording, the video recording control unit 224 generates the moving image from the start to the end of the recording as one video recording file 231 and stores it in the camera unit storage unit 230.
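The start/stop behavior described above resembles a small state machine: recording begins on an open event, ends on a close event, and each open/close cycle yields one stored file. The following is a hedged sketch under assumed names; the patent does not specify any implementation.

```python
# Illustrative state machine for recording control (assumed names; one
# controller per camera). Each open/close cycle yields one recording file,
# loosely mirroring the video recording file 231 in the text.

class RecordingController:
    def __init__(self) -> None:
        self.recording = False
        self.frames: list = []   # frames of the recording in progress
        self.files: list = []    # completed recordings

    def on_event(self, event: str) -> None:
        if event in ("door_open", "drawer_open") and not self.recording:
            self.recording = True
            self.frames = []
        elif event in ("door_closed", "drawer_closed") and self.recording:
            self.recording = False
            self.files.append(list(self.frames))  # close out one file

    def on_frame(self, frame) -> None:
        if self.recording:           # frames are kept only while recording
            self.frames.append(frame)
```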
The food detection unit 225 detects food that is put into and taken out of the refrigerator 1 based on the video recording file 231 stored in the camera unit storage unit 230.
Fig. 7 is a block diagram showing the configuration of the terminal device 3 and the food management server 4.
[1-1-3. Structure of food management Server]
First, the structure of the food management server 4 will be described.
The food management server 4 includes a server control unit 40 and a server communication unit 41.
The server control unit 40 includes a server processor 400, which is a processor such as a CPU or MPU, and a server storage unit 410. The server control unit 40 reads and executes a control program stored in the server storage unit 410 by the server processor 400, and controls each part of the food management server 4. The server processor 400 functions as a server communication control section 401 and an information processing section 402 by executing a control program stored in a server storage section 410.
The server storage unit 410 has a memory for storing a program executed by the server processor 400 and data processed by the server processor 400. The server storage unit 410 stores control programs executed by the server processor 400, a refrigerator compartment food management database 411, a drawer food management database 412, and other various data. The server storage unit 410 has a nonvolatile storage area. The server storage unit 410 may have a volatile storage area, and may constitute an operation area of the server processor 400.
Fig. 8 is a diagram showing an example of one record stored in the refrigerator compartment food management database 411.
The refrigerator compartment food management database 411 is a database that stores various information about food stored in the refrigerator compartment 11.
In the following description, one record stored in the refrigerator compartment food management database 411 is referred to as "refrigerator compartment food management record", and a symbol of "RR" is given.
The refrigerator compartment food management record RR has an account ID4111, a camera unit ID4112, refrigerator model information 4113, in-refrigerator image data 4114, in-and-out presence information 4115, food image data 4116, storage date and time information 4117, a divided area number 4118, remark input presence information 4119, remark information 4130, deletion information 4131, and deletion date and time information 4132.
The account ID4111 is identification information identifying an account allocated to the user P who utilizes the food management application 311.
The refrigerator model information 4113 is information indicating the model of the refrigerator 1 used by the user P identified by the account ID4111 associated with the same record.
The in-refrigerator image data 4114 is image data representing an image of the interior of the refrigerating compartment 11 of the refrigerator 1. Hereinafter, this image is referred to as the "in-refrigerator image" with the reference symbol "RG". The in-refrigerator image data 4114 represents the in-refrigerator image RG corresponding to the model indicated by the refrigerator model information 4113 associated with the same record.
The in-refrigerator image RG corresponds to one example of the in-storage image of the present invention.
Fig. 9 is a view showing an example of the image RG in the refrigerator compartment.
The in-refrigerator image RG shown in fig. 9 includes a left door shelf image RG1, a refrigerator front surface image RG2, and a right door shelf image RG3, arranged in this order from the left side in the figure. The left door shelf image RG1 is an image showing the door shelf 111 formed on the left door 11A. The refrigerator front surface image RG2 is an image of the refrigerating compartment 11 viewed from the front with the left door 11A and the right door 11B in the open state. The right door shelf image RG3 is an image showing the door shelf 111 formed on the right door 11B.
The image area of the in-refrigerator image RG is divided into a plurality of areas. Each of these areas is referred to as a "divided area", and the symbol "BA" is given. The in-refrigerator image RG shown in fig. 9 is divided into 15 divided areas BA, and mutually different numbers are assigned to the respective divided areas BA. In the following description, this number is referred to as a divided area number, and the symbol "4118" is given. In fig. 9, the divided area number assigned to each divided area BA is shown for convenience; the divided area numbers 4118 are not shown in the in-refrigerator image RG displayed on the in-refrigerator home screen GM2 described later.
Returning to the description of fig. 8, the in-and-out presence information 4115 is information indicating whether or not a new storage into or retrieval from the refrigerator has occurred.
The food item image data 4116 is image data of food items. In the present embodiment, the image represented by the food image data 4116 is a captured image of the refrigerator compartment camera 203.
The storage date and time information 4117 is information indicating the storage date and time of food stored in the refrigerator 1.
The refrigerating compartment food management record RR has a divided area number 4118 for each piece of food image data 4116.
The remark input presence information 4119 is information indicating whether or not there is an input of a remark.
The remark information 4130 is information indicating an input remark.
The deletion information 4131 is information indicating whether or not the associated sub-record is a sub-record to be deleted from the refrigerating compartment food management record RR. A sub-record is a record stored in the refrigerating compartment food management record RR, and includes food image data 4116, storage date and time information 4117, a divided area number 4118, remark input presence information 4119, and remark information 4130.
The deletion date and time information 4132 is information indicating the date and time at which the associated sub-record was set as a deletion target. When the deletion information 4131 indicates a deletion target, the information processing unit 402, which will be described later, deletes the sub-record, together with the deletion information 4131 and the deletion date and time information 4132 associated with it, from the refrigerating compartment food management record RR at a predetermined timing.
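The record layout described above can be sketched as a data structure. The following is a minimal illustration assuming Python dataclasses; every class and field name here is hypothetical and is not taken from the actual implementation, and the purge method only mirrors the deletion behavior described for the information processing unit 402.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SubRecord:
    # Hypothetical names for the sub-record fields described above.
    food_image_data: bytes                  # 4116: cut-out food image
    storage_datetime: str                   # 4117: storage date and time
    divided_area_number: int                # 4118: divided area BA number
    has_remark: bool = False                # 4119: remark input presence
    remark: str = ""                        # 4130: remark text
    marked_deleted: bool = False            # 4131: deletion target flag
    deleted_datetime: Optional[str] = None  # 4132: when flagged for deletion

@dataclass
class RefrigeratingCompartmentRecord:
    # Hypothetical sketch of the refrigerating compartment food management record RR.
    account_id: str                         # 4111
    camera_unit_id: str                     # 4112
    model_info: str                         # 4113: refrigerator model
    in_refrigerator_image: bytes            # 4114: in-refrigerator image RG
    new_in_out: bool = False                # 4115: in-and-out presence
    sub_records: List[SubRecord] = field(default_factory=list)

    def purge_deleted(self) -> None:
        # Remove sub-records whose deletion information 4131 marks them
        # as deletion targets, as the information processing unit 402 does
        # at a predetermined timing.
        self.sub_records = [s for s in self.sub_records if not s.marked_deleted]
```

A usage sketch: a sub-record flagged via `marked_deleted` disappears from the record after `purge_deleted()` runs, while unflagged sub-records remain.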
Fig. 10 is a diagram showing one record stored in the drawer food management database 412.
The drawer food management database 412 is a database that stores various information related to food in the drawer-type storage rooms. In the description with reference to fig. 23 to 25, the drawer food management database 412 is a database related to the vegetable compartment 15.
In the following description, one record stored in the drawer food management database 412 is referred to as a "drawer food management record", and a symbol of "HR" is given.
The drawer food management record HR has an account ID4111, a camera unit ID4112, update date and time information 4121, upper-layer box information 4122, and lower-layer box information 4124.
The update date and time information 4121 is information indicating the date and time at which the content of the drawer food management record HR was updated.
The upper-layer cassette information 4122 includes upper-layer cassette image data 4123, food image data 4116, remark input presence information 4119, and remark information 4130.
The upper-layer cassette image data 4123 is image data of an upper-layer cassette image JCG, which is an image of the upper-layer cassette JCA. The upper-layer box image JCG is a captured image of the drawer camera 204.
The upper-layer cassette image JCG corresponds to an example of the 1 st cassette image of the present invention.
In the present embodiment, the image represented by the food image data 4116 included in the upper-layer cassette information 4122 is an image cut out from the upper-layer cassette image JCG.
The lower-layer cassette information 4124 includes lower-layer cassette image data 4125, food image data 4116, remark input presence information 4119, and remark information 4130.
The lower-layer cassette image data 4125 is image data of the lower-layer cassette image GCG, which is an image of the lower-layer cassette GCA. The lower cassette image GCG is a captured image of the drawer camera 204.
The lower-layer cassette image GCG corresponds to an example of the 2 nd cassette image of the present invention.
Hereinafter, the upper-layer cassette image JCG and the lower-layer cassette image GCG are not distinguished from each other, and are referred to as "cassette images", and are denoted by the symbol "CG".
In the present embodiment, the image represented by the food image data 4116 included in the lower-layer cassette information 4124 is an image cut out from the lower-layer cassette image GCG.
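The drawer food management record HR and its upper-layer and lower-layer cassette information can likewise be sketched as a data structure. As before, this is an illustrative assumption: the class and field names below are hypothetical and are not from the actual implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CassetteInfo:
    # Hypothetical shape of the upper-layer cassette information 4122
    # and lower-layer cassette information 4124.
    cassette_image_data: bytes                                   # 4123 / 4125
    food_image_data: List[bytes] = field(default_factory=list)   # 4116: cut-outs
    has_remark: bool = False                                     # 4119
    remark: str = ""                                             # 4130

@dataclass
class DrawerRecord:
    # Hypothetical sketch of the drawer food management record HR.
    account_id: str            # 4111
    camera_unit_id: str        # 4112
    update_datetime: str       # 4121: last update of this record
    upper_cassette: CassetteInfo   # 4122: for the upper-layer cassette JCA
    lower_cassette: CassetteInfo   # 4124: for the lower-layer cassette GCA
```

Note that, per the description above, the food images in each `CassetteInfo` are cut out from the corresponding cassette image rather than captured separately.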
Returning to the explanation of fig. 7, the server communication unit 41 is a communication interface including communication hardware such as a radio circuit and an antenna conforming to a predetermined communication standard, and communicates with devices connected to the global network GN according to the predetermined communication standard. In the present embodiment, the server communication unit 41 communicates with the camera unit 2 and the terminal device 3.
As described above, the server control section 40 functions as the server communication control section 401 and the information processing section 402.
The server communication control unit 401 communicates with the camera unit 2 and the terminal device 3 via the server communication unit 41.
The information processing unit 402 performs various information processing based on information received by the server communication control unit 401 from the camera unit 2 and the terminal device 3.
[1-1-4. Structure of terminal device ]
Next, the functional configuration of the terminal device 3 will be described.
The terminal device 3 includes a terminal control unit 30, a terminal communication unit 31, and a touch panel 32.
The touch panel 32 corresponds to an example of a display unit.
The terminal control unit 30 includes a terminal processor 300, which is a processor such as a CPU or MPU, and a terminal storage unit 310. The terminal control unit 30 reads and executes a control program stored in the terminal storage unit 310 by the terminal processor 300, and controls each part of the terminal device 3. A food management application 311 is installed in the terminal device 3. The terminal processor 300 serves as an application execution unit 301 by reading and executing the food management application 311.
The terminal processor 300 corresponds to an example of a computer of the present invention. The application execution unit 301 corresponds to an example of the display control unit of the present invention.
The terminal storage unit 310 has a memory that stores programs executed by the terminal processor 300 and data processed by the terminal processor 300. The terminal storage unit 310 stores the control program executed by the terminal processor 300, the food management application 311, the account ID4111, and other various data. The terminal storage unit 310 has a nonvolatile storage area, and may also include a volatile storage area that constitutes a working area of the terminal processor 300.
The terminal communication unit 31 is a communication interface including a radio circuit, an antenna, and the like conforming to a predetermined communication standard, and communicates with devices connected to the global network GN according to the predetermined communication standard. In the present embodiment, a wireless communication standard is exemplified as the communication standard used by the terminal communication unit 31.
The touch panel 32 includes a display panel such as a liquid crystal display panel, and a touch sensor that is provided so as to overlap with the display panel or integrally therewith.
As described above, the terminal control section 30 functions as the application execution section 301.
The application execution unit 301 transmits and receives various information to and from the food management server 4 via the terminal communication unit 31.
The application execution unit 301 causes the touch panel 32 to display a user interface related to the management of food in the refrigerator 1. In the following description, this user interface is referred to as an "application UI", and the reference numeral "320" is given. By causing the touch panel 32 to display the application UI320, the application execution unit 301 provides the user P with various information related to the food in the refrigerator 1 and receives various inputs related to the food in the refrigerator 1 from the user P.
[1-2. Action ]
Next, operations of each part of the food management system 1000 will be described.
[1-2-1. Arrangement of camera Unit ]
The camera unit 2 is separate from the refrigerator 1, and is therefore placed by the user P on the upper surface 10B of the refrigerator 1. As described above, the position restricting member 205 restricts the position of the camera unit 2 in the front-rear direction of the refrigerator 1, so when setting up the camera unit 2 the user P adjusts its position in the left-right direction of the refrigerator 1.
Hereinafter, a plurality of methods of adjusting the position of the camera unit 2 in the left-right direction of the refrigerator 1 will be described.
[1-2-1-1. 1 st position adjustment ]
In the 1 st position adjustment, the user P uses the application UI320.
In the 1 st position adjustment, the application execution unit 301 causes the touch panel 32 to display the application UI320 of the adjustment screen GM 1.
Fig. 11 is a view showing an example of the adjustment screen GM 1.
The adjustment screen GM1 is provided with an imaging result display area HA1. The imaging result display area HA1 is a rectangular area in which the imaging result of either the refrigerator compartment camera 203 or the drawer camera 204 is displayed. In the imaging result display area HA1, a line L1 extending in the up-down direction is provided at substantially the center in the left-right direction in the drawing.
The adjustment screen GM1 includes adjustment method information J1. The adjustment method information J1 prompts the user to adjust the position of the camera unit 2 so that the center in the left-right direction of the refrigerator 1 displayed in the imaging result display area HA1 coincides with the line L1 provided in the imaging result display area HA1.
The user P sets the camera unit 2 on the refrigerator 1 according to the adjustment method indicated by the adjustment method information J1. Thus, food entering and exiting the refrigerator 1 can be contained in the imaging range of each of the refrigerator camera 203 and the drawer camera 204 of the camera unit 2. The camera unit 2 is also configured to be able to detect opening and closing of the corresponding door 11C or drawer 16A by each of the distance measuring sensors included therein.
In the 1 st position adjustment, the camera unit 2 transmits the imaging result of either the refrigerator camera 203 or the drawer camera 204 to the terminal device 3. In the 1 st position adjustment, the camera unit 2 and the terminal device 3 communicate either via the food management server 4 or directly by short-range wireless communication or the like.
The camera unit communication control unit 221 of the camera unit 2 transmits the imaging result of either the refrigerator camera 203 or the drawer camera 204 to the terminal device 3. Then, the application execution unit 301 displays the received imaging result in the imaging result display area HA1.
[1-2-1-2. 2 nd position adjustment ]
In the 2 nd position adjustment, the user P uses the center position recording paper CY.
Fig. 12 is a diagram showing an example of the center position recording paper CY.
The center position recording paper CY is a sheet on which a center mark MR2 indicating the position of the center of the refrigerator 1 in the left-right direction is printed. In addition to the center mark MR2, the center position recording paper CY also has a left end mark MR1 indicating the position of the left end of the refrigerator 1 in the left-right direction and a right end mark MR3 indicating the position of the right end of the refrigerator 1 in the left-right direction.
The center position recording paper CY is bundled with the refrigerator 1 when it reaches the user P. The camera unit 2 may or may not be bundled together with the center position recording paper CY and the refrigerator 1. On the center position recording paper CY bundled with the refrigerator 1, the length from the left end mark MR1 to the right end mark MR3 corresponds to the width of the refrigerator 1.
In the 2 nd position adjustment, the user P uses the center position recording paper CY to locate the center of the refrigerator 1 in the left-right direction. For example, the user P attaches the center position recording paper CY to the refrigerator 1 so that the left end mark MR1 aligns with the left end of the refrigerator 1 and the right end mark MR3 aligns with the right end of the refrigerator 1, and locates the center of the refrigerator 1 in the left-right direction using the center mark MR2 as a guide. The user P then adjusts the position of the camera unit 2 so that the mark 207 of the position restricting member 205 of the camera unit 2 comes to the located center position.
By the 2 nd position adjustment, the camera unit 2 can perform various kinds of detection in the same manner as the 1 st position adjustment.
[1-2-1-3. 3 rd position adjustment ]
In the 3 rd position adjustment, a mark indicating the position of the center of the refrigerator 1 in the left-right direction is provided at a predetermined position on the refrigerator 1. In the 3 rd position adjustment, the mark 207 provided on the position restricting member 205 and the mark provided on the refrigerator 1 are used.
In the 3 rd position adjustment, the user P adjusts the position of the camera unit 2 in the left-right direction of the refrigerator 1 so that the position of the mark given to the refrigerator 1 coincides with the position of the mark 207 provided in the position restricting member 205.
By the 3 rd position adjustment, the camera unit 2 can perform various kinds of detection in the same manner as the 1 st position adjustment.
Next, the operation of the food management system 1000 in which the camera unit 2 is provided in the refrigerator 1 will be described.
[1-2-2. Video-recording-related action ]
First, an action related to video recording will be described.
Fig. 13 is a flowchart FA showing the operation of the camera unit 2.
In the operation shown in fig. 13, the sensor unit 22 outputs the detection values of the respective sensors to the camera unit control unit 20. In the operation shown in fig. 13, the detection value processing unit 222 performs processing based on various detection values received from the sensor unit 22.
The detection value processing unit 222 determines whether or not a person is present around the refrigerator 1 based on the detection value of the person sensor 212 (step SA 1).
When it is determined that a person is present around the refrigerator 1 (yes in step SA 1), the detection value processing unit 222 outputs presence information indicating that a person is present to the imaging control unit 223 (step SA 2).
When this presence information is input from the detection value processing unit 222, the imaging control unit 223 starts imaging by the refrigerator camera 203 and the drawer camera 204 (step SA 3).
Next, the detection value processing unit 222 determines whether or not any of the left door 11A, the right door 11B, and the drawers 14A, 15A is in the open state based on the detection values of the various sensors output from the sensor unit 22 (step SA 4).
When it is determined that none of the left door 11A, the right door 11B, and the drawers 14A, 15A is open (step SA4: no), the detection value processing unit 222 performs the processing of step SA 16.
When it is determined that any one of the left door 11A, the right door 11B, and the drawers 14A, 15A is in the open state (yes in step SA 4), the detection value processing unit 222 determines whether the door 11C is in the open state or whether any one of the drawers 14A, 15A is in the open state (step SA 5).
When it is determined that the door 11C is in the open state (step SA5: door), the detection value processing unit 222 outputs door open state information to the video recording control unit 224 (step SA 6).
Next, when the door open state information is input from the detection value processing unit 222, the recording control unit 224 starts recording the imaging result of the refrigerator camera 203 (step SA 7).
Next, the detection value processing unit 222 determines whether or not the door 11C is in the closed state (step SA 8).
When it is determined that the door 11C is in the closed state (yes in step SA 8), the detection value processing unit 222 outputs the door-closed state information to the video recording control unit 224 (step SA 9).
Next, when the door-closed state information is input from the detection value processing unit 222, the recording control unit 224 ends the recording of the imaging result of the refrigerator compartment camera 203 (step SA 10). Thereby, one video file 231 of the refrigerator compartment camera 203 is generated in the camera unit storage unit 230.
Returning to the explanation of step SA5, when it is determined that either one of the drawers 14A, 15A is in the open state (step SA5: drawer), the detection value processing unit 222 outputs drawer open state information to the video recording control unit 224 (step SA 11).
Next, when the drawer open state information is input from the detection value processing unit 222, the recording control unit 224 starts recording the imaging result of the drawer camera 204 (step SA 12).
Next, the detection value processing unit 222 determines whether or not any of the opened drawers 14A, 15A is in the closed state (step SA 13).
When it is determined that either one of the opened drawers 14A, 15A is in the closed state (yes in step SA 13), the detection value processing unit 222 outputs drawer closed state information to the video recording control unit 224 (step SA 14).
Next, when the drawer closed state information is input from the detection value processing unit 222, the recording control unit 224 ends the recording of the imaging result of the drawer camera 204 (step SA 15). Thereby, one video file 231 of the drawer camera 204 is generated in the camera unit storage unit 230.
The detection value processing unit 222 determines whether or not a person is present around the refrigerator 1 (step SA 16).
When it is determined that a person is present in the vicinity of the refrigerator 1 (yes in step SA 16), the detection value processing unit 222 performs the processing of step SA4 and subsequent steps again.
On the other hand, when it is determined that no person is present around the refrigerator 1 (step SA16: no), the detection value processing unit 222 outputs presence information indicating that no person is present to the imaging control unit 223 (step SA 17).
When the presence information indicating that no person is present is input from the detection value processing unit 222, the imaging control unit 223 ends the imaging by the refrigerator camera 203 and the drawer camera 204 (step SA 18).
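The recording operation of flowchart FA can be summarized as a small state machine driven by sensor events. The following sketch models it as a pure function over an ordered event list; the event and action names are hypothetical stand-ins for the sensor outputs and camera/recorder operations described above, not the actual interfaces.

```python
def recording_actions(events):
    """Model of flowchart FA: map an ordered list of sensor events to the
    imaging/recording actions the camera unit takes (illustrative only)."""
    actions = []
    imaging = False      # whether SA3 has started imaging
    recording = None     # which camera is currently being recorded
    for ev in events:
        if ev == "person_detected" and not imaging:
            imaging = True
            actions.append("start_imaging")                  # SA2-SA3
        elif ev == "door_open" and imaging:
            recording = "refrigerator"
            actions.append("start_record_refrigerator")      # SA6-SA7
        elif ev == "door_closed" and recording == "refrigerator":
            recording = None
            actions.append("save_refrigerator_video")        # SA9-SA10
        elif ev == "drawer_open" and imaging:
            recording = "drawer"
            actions.append("start_record_drawer")            # SA11-SA12
        elif ev == "drawer_closed" and recording == "drawer":
            recording = None
            actions.append("save_drawer_video")              # SA14-SA15
        elif ev == "person_left" and imaging:
            imaging = False
            actions.append("stop_imaging")                   # SA17-SA18
    return actions
```

For example, a person approaching, opening and closing the door, then leaving yields one saved video file 231 of the refrigerator camera, matching the loop SA4 to SA16.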
[1-2-3. Action related to detection of food ]
Next, an operation related to the detection of food will be described.
Fig. 14 is a flowchart showing the operation of the food management system 1000.
In fig. 14, a flowchart FB shows the operation of the camera unit 2, and a flowchart FC shows the operation of the food management server 4.
The food detection unit 225 of the camera unit 2 determines whether or not the video file 231 is stored in the camera unit storage unit 230 (step SB 1).
When it is determined that the video file 231 is stored in the camera unit storage unit 230 (yes in step SB 1), the food detection unit 225 determines whether the stored video file 231 is the video file 231 of the refrigerator camera 203 or the video file 231 of the drawer camera 204 (step SB 2). The judgment in step SB2 is based on the information output from the detection value processing unit 222.
When the food detection unit 225 determines that the stored video file 231 is the video file 231 of the refrigerator camera 203 (step SB2: refrigerator camera), it detects food based on the stored video file 231 (step SB 3).
In step SB3, for example, the food detection unit 225 detects food from the video file 231 of the refrigerator camera 203 based on feature quantities such as shape and color. The data required for food detection is stored in a storage area accessible to the food detection unit 225.
The food detection unit 225 determines whether or not food can be detected from the video file 231 of the refrigerator camera 203 (step SB 4).
When it is determined that the food cannot be detected from the video file 231 of the refrigerator camera 203 (no at step SB 4), the food detection unit 225 advances the process to step SB10.
On the other hand, when the food detection unit 225 can detect food from the video file 231 of the refrigerator camera 203 (yes in step SB 4), it performs the in-and-out determination process (step SB 5).
The in-and-out determination process is a process of determining whether the food detected in step SB3 is food that is put in the refrigerator 1 or food that is taken out of the refrigerator 1. In the in-and-out determination process, the in-and-out determination image HG is superimposed on the moving image represented by the video file 231 of the refrigerator camera 203.
Fig. 15 is a diagram for explaining the entry and exit determination image HG.
The in-and-out determination image HG shown in fig. 15 is superimposed on a moving image of the user P putting a pack of eggs into the refrigerator 1.
The in-and-out determination image HG is provided with a boundary line L2.
The boundary line L2 consists of boundary lines L21, L22, L23, L24, and L25. The boundary line L21 is a line extending rightward from the left end of the in-and-out determination image HG, in the upper portion of the image. The boundary line L22 is a line connecting the right end of the boundary line L21 and the left end of the boundary line L23. The boundary line L23 is a line extending rightward from the position in the left-right direction of the right end of the boundary line L21, at substantially the center in the up-down direction of the in-and-out determination image HG. The boundary line L24 is a line connecting the right end of the boundary line L23 and the left end of the boundary line L25. The boundary line L25 is a line extending rightward from the position in the left-right direction of the right end of the boundary line L23 to the right end of the in-and-out determination image HG, in the upper portion of the image.
The image area of the in-and-out determination image HG is divided by the boundary line L2 into a concave region HG1 and a convex region HG2.
The food detection unit 225 performs image analysis on the moving image on which the in-and-out determination image HG is superimposed, thereby determining whether the food detected in step SB3 has moved from the convex region HG2 to the concave region HG1 or from the concave region HG1 to the convex region HG2. When it determines that the food has moved from the convex region HG2 to the concave region HG1, the food detection unit 225 determines that the food detected in step SB3 has been stored in the refrigerator 1. On the other hand, when it determines that the food has moved from the concave region HG1 to the convex region HG2, the food detection unit 225 determines that the food detected in step SB3 has been taken out of the refrigerator 1.
Returning to the description of the flowchart FB of fig. 14, the food detection unit 225 performs the divided area number determination process (step SB 6).
The divided area number determination process is a process of determining the divided area BA corresponding to the storage area into which the food detected in step SB3 was put or from which it was taken out, and of determining the divided area number 4118 assigned to the determined divided area BA.
For example, the food detection unit 225 identifies, by image analysis, the storage area of the refrigerator compartment 11 into which the food detected in step SB3 was put or from which it was taken out. For example, a mark for identifying the storage area is given to the front end of each shelf 112 and each door shelf 111, and the food detection unit 225 determines the storage area based on the positional relationship between the marks and the detected food in the moving image. Next, the food detection unit 225 determines which divided area BA of the in-refrigerator image RG corresponds to the specified storage area, and specifies the divided area number 4118 assigned to the determined divided area BA. The correspondence between the storage areas, the divided areas BA, and the divided area numbers 4118 is stored as data in the camera unit storage unit 230.
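The final lookup step can be illustrated as a table query. The mapping below is a placeholder: the in-refrigerator image RG of fig. 9 has 15 divided areas BA, but the actual storage-area identifiers and number assignment stored in the camera unit storage unit 230 are not given in the source, so the keys here are invented for illustration.

```python
# Hypothetical correspondence table between storage areas and divided
# area numbers 4118 (15 areas, as in the image RG of fig. 9).
DIVIDED_AREAS = {f"area_{n}": n for n in range(1, 16)}

def divided_area_number(storage_area: str) -> int:
    """Return the divided area number 4118 for an identified storage area,
    mirroring the lookup against data held in the camera unit storage 230."""
    try:
        return DIVIDED_AREAS[storage_area]
    except KeyError:
        raise ValueError(f"unknown storage area: {storage_area}")
```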
When the divided area number determination process is performed, the food detection unit 225 cuts out an image of the food for each food detected in step SB3 from the video file 231 of the refrigerator camera 203 (step SB 7). Hereinafter, an image of the food cut out from the imaging result of the camera 213 is referred to as a "captured food image", and a symbol of "FG" is given.
The captured food image FG corresponds to an example of the food image of the present invention.
Next, the food detection unit 225 generates 1 st update information (step SB 8).
The 1 st update information generated at step SB8 includes the camera unit ID4112 of the camera unit storage section 230.
The 1 st update information generated in step SB8 includes date and time information indicating the date and time when the video file 231 of the refrigerator camera 203 was generated.
The 1 st update information generated in step SB8 includes, for each food item detected in step SB3, food item image data 4116 indicating the captured food item image FG cut out in step SB7, in-and-out determination result information indicating the determination result of the in-and-out determination process, and the divided area number 4118 determined by the divided area number determination process.
Next, the camera unit communication control unit 221 transmits the 1 st update information generated by the food detection unit 225 to the food management server 4 (step SB 9).
The food detection unit 225 deletes the video file 231 from the camera unit storage unit 230 (step SB 10).
Returning to the explanation of step SB2, when it is determined that the stored video file 231 is the video file 231 of the drawer camera 204 (step SB2: drawer camera), the food detection unit 225 determines whether or not, in the moving image stored in that video file 231, at least one of the upper-layer cassette JCA and the lower-layer cassette GCA is pulled out by a predetermined threshold or more (step SB 11).
For example, the food detection unit 225 identifies the cassettes CA in the moving image based on feature quantities such as shape and color, and determines by image analysis of the video file 231 whether or not the upper-layer cassette JCA and the lower-layer cassette GCA are pulled out from the refrigerator 1 by a predetermined threshold or more.
Alternatively, for example, marks for identifying the cassettes CA are given at predetermined positions of the upper-layer cassette JCA and the lower-layer cassette GCA, and the food detection unit 225 reads these marks from the moving image to identify the cassettes CA. The food detection unit 225 then calculates the moving distance of each mark in the moving image, and determines whether or not the corresponding cassette is pulled out from the refrigerator 1 by a predetermined threshold or more.
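The mark-based variant reduces to a displacement-against-threshold test. A minimal sketch, assuming the mark position is tracked as (x, y) pixel coordinates per frame (the actual units and threshold value are not specified in the source):

```python
def is_pulled_out(mark_positions, threshold):
    """Return True when the cassette's identifying mark has moved by at
    least `threshold` pixels between the first and last frames of the
    moving image (illustrative model of the step SB11 check)."""
    (x0, y0), (x1, y1) = mark_positions[0], mark_positions[-1]
    distance = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return distance >= threshold
```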
When it is determined that neither the upper-layer cassette JCA nor the lower-layer cassette GCA has been pulled out by the predetermined threshold or more (step SB11: no), the food detection unit 225 performs the processing of step SB 18.
On the other hand, when it is determined that at least one of the upper-layer cassette JCA and the lower-layer cassette GCA is pulled out by the predetermined threshold or more (yes in step SB 11), the food detection unit 225 cuts out the cassette image CG from the video file 231 (step SB 12).
In step SB12, when only the upper-layer cassette JCA is pulled out by the predetermined threshold or more, the food detection unit 225 cuts out the upper-layer cassette image JCG from the video file 231.
In step SB12, when only the lower-layer cassette GCA is pulled out by the predetermined threshold or more, the food detection unit 225 cuts out the lower-layer cassette image GCG from the video file 231.
In step SB12, when both the upper-layer cassette JCA and the lower-layer cassette GCA are pulled out by the predetermined threshold or more, the food detection unit 225 cuts out the upper-layer cassette image JCG and the lower-layer cassette image GCG from the video file 231.
Next, the food detection unit 225 detects food from the cut-out cassette image CG (step SB 13).
Next, the food detection unit 225 determines whether or not food can be detected from the cassette image CG (step SB 14).
When it is determined that the food cannot be detected (step SB14: NO), the food detection unit 225 performs the processing of step SB 16.
On the other hand, when it is determined that the food item can be detected (yes in step SB 14), the food item detecting unit 225 cuts out the captured food item image FG for each detected food item from the cassette image CG (step SB 15).
Next, the food detection unit 225 generates the 2 nd update information (step SB 16).
In the case where only the upper-layer cassette image JCG is cut out, the 2 nd update information generated at step SB16 includes the camera unit ID4112 of the camera unit storage section 230, the upper-layer cassette image data 4123 indicating the cut-out upper-layer cassette image JCG, and the food image data 4116 indicating the captured food image FG cut out from the upper-layer cassette image JCG. If no is determined in step SB14, the food image data 4116 is not included in the 2 nd update information.
In addition, in the case where only the lower-layer cassette image GCG is cut out, the 2 nd update information generated in step SB16 includes the camera unit ID4112 of the camera unit storage section 230, lower-layer cassette image data 4125 indicating the cut-out lower-layer cassette image GCG, and food image data 4116 indicating the captured food image FG cut out from the lower-layer cassette image GCG. If no is determined in step SB14, the food image data 4116 is not included in the 2 nd update information.
Further, in the case where the upper layer cassette image JCG and the lower layer cassette image GCG are cut out, the 2 nd update information generated at step SB16 includes the camera unit ID4112 of the camera unit storage section 230, the upper layer cassette image data group, and the lower layer cassette image data group. The upper-layer cassette image data set includes upper-layer cassette image data 4123 representing a cut-out upper-layer cassette image JCG, and food image data 4116 representing a captured food image FG cut out from the upper-layer cassette image JCG. The lower-layer cassette image data set includes lower-layer cassette image data 4125 representing a cut-out lower-layer cassette image GCG, and food image data 4116 representing a captured food image FG cut out from the lower-layer cassette image GCG. If no is determined in step SB14, the food image data 4116 is not included in the 2 nd update information.
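The three cases of the 2 nd update information (upper only, lower only, or both, with food images omitted when step SB14 found none) can be condensed into one builder. As with the earlier sketches, the dictionary field names are assumptions, not the actual format.

```python
def build_second_update_info(camera_unit_id, upper=None, lower=None):
    """Assemble a 2nd-update-information payload (hypothetical field names).

    upper / lower: (cassette_image_bytes, [food_image_bytes, ...]) or None
    when that cassette was not cut out in step SB12. An empty food list
    models the SB14 'no' case, where food image data 4116 is omitted."""
    info = {"camera_unit_id": camera_unit_id}        # camera unit ID 4112
    if upper is not None:
        image, foods = upper
        info["upper_cassette"] = {"cassette_image_data": image}   # 4123
        if foods:
            info["upper_cassette"]["food_image_data"] = foods     # 4116
    if lower is not None:
        image, foods = lower
        info["lower_cassette"] = {"cassette_image_data": image}   # 4125
        if foods:
            info["lower_cassette"]["food_image_data"] = foods     # 4116
    return info
```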
Next, the camera unit communication control unit 221 transmits the 2nd update information generated by the food detection unit 225 to the food management server 4 (step SB17).
Next, the food detection unit 225 deletes the video file 231 from the camera unit storage section 230 (step SB18).
As shown in the flowchart FC, the server communication control unit 401 of the food management server 4 determines whether either the 1st update information or the 2nd update information has been received (step SC1).
When the server communication control unit 401 determines that update information has been received (yes in step SC1), the information processing unit 402 updates the database based on the update information received by the server communication control unit 401 (step SC2).
Step SC2 will be described in detail.
In the case where the update information received by the server communication control unit 401 is the 1st update information, the information processing unit 402 determines the refrigerating compartment food management record RR having the camera unit ID4112 included in the 1st update information. The information processing unit 402 then updates the in/out presence information 4115 of the determined refrigerating compartment food management record RR.
Next, when the in/out determination result information included in the 1st update information indicates that food has been stored, the information processing unit 402 generates a sub-record and stores it in the determined refrigerating compartment food management record RR, associating blank deletion information 4131 and blank deletion date/time information 4132 with the generated sub-record. The generated sub-record includes the food image data 4116 included in the 1st update information, storage date/time information 4117 indicating the same date and time as the date/time information included in the 1st update information, the divided area number 4118 included in the 1st update information, blank remark input presence information 4119, and blank remark information 4130.
Further, when the in/out determination result information included in the 1st update information indicates that food has been taken out, the information processing unit 402 determines, from the refrigerating compartment food management record RR, the sub-record whose food image data 4116 represents a captured food image FG with a high degree of similarity to the captured food image FG represented by the food image data 4116 included in the 1st update information. This determination is made for each food image data 4116 included in the 1st update information. The information processing unit 402 then updates the deletion information 4131 corresponding to the determined sub-record to information indicating a deletion target, and updates the deletion date/time information 4132 corresponding to the determined sub-record to information indicating the current date and time as the deletion date and time.
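The branch on the in/out determination result can be sketched as follows. This is a hypothetical illustration: the record layout, key names, and the `same_food` similarity test are assumptions, since the patent does not specify a data representation:

```python
import datetime

def apply_first_update(record, update, same_food, now=None):
    """Sketch of step SC2 for the 1st update information.

    record: dict with a "sub_records" list (the refrigerating compartment
    food management record RR); update: dict with "result" ("stored" or
    "removed"), "food_images", and, for storage, "date_time" and "area_no";
    same_food(a, b): assumed similarity test between two food images."""
    now = now or datetime.datetime.now()
    if update["result"] == "stored":
        # Storage: append a sub-record with blank deletion fields.
        for img in update["food_images"]:
            record["sub_records"].append({
                "food_image": img,                 # food image data 4116
                "stored_at": update["date_time"],  # storage date/time 4117
                "area_no": update["area_no"],      # divided area number 4118
                "has_remark": False,               # 4119 (blank)
                "remark": "",                      # 4130 (blank)
                "delete_flag": None,               # deletion info 4131 (blank)
                "deleted_at": None,                # deletion date/time 4132 (blank)
            })
    else:
        # Removal: mark the matching live sub-record as a deletion target.
        for img in update["food_images"]:
            for sub in record["sub_records"]:
                if sub["delete_flag"] is None and same_food(sub["food_image"], img):
                    sub["delete_flag"] = "delete"
                    sub["deleted_at"] = now
                    break
```

The per-image inner loop mirrors the text's "this determination is made for each food image data 4116 included in the 1st update information".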
Further, step SC2 will be described in detail.
In the case where the update information received by the server communication control unit 401 is the 2nd update information, the information processing unit 402 determines the drawer food management record HR having the camera unit ID4112 included in the 2nd update information.
For the 2nd update information generated when only the upper box image JCG was cut out, the information processing unit 402 updates the upper box image data 4123 and the food image data 4116 included in the upper box information 4122 of the determined drawer food management record HR to the data included in the 2nd update information.
For the 2nd update information generated when only the lower box image GCG was cut out, the information processing unit 402 updates the lower box image data 4125 and the food image data 4116 included in the lower box information 4124 of the determined drawer food management record HR to the data included in the 2nd update information.
For the 2nd update information generated when both the upper box image JCG and the lower box image GCG were cut out, the information processing unit 402 updates the upper box image data 4123 and the food image data 4116 included in the upper box information 4122 of the determined drawer food management record HR to the data included in the upper box image data group, and updates the lower box image data 4125 and the food image data 4116 included in the lower box information 4124 to the data included in the lower box image data group.
The information processing unit 402 also updates the update date/time information 4121 of the determined drawer food management record HR to information indicating the date/time at which the drawer food management record HR was updated.
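Put together, the handling of the 2nd update information is a lookup by camera unit ID followed by a selective overwrite of whichever box entries the update contains, plus a refresh of the update date/time. A sketch under the same assumed dictionary layout as above (names are not from the patent):

```python
def apply_second_update(records, update, now):
    """Sketch of step SC2 for the 2nd update information.

    records: list of drawer food management records HR; update: dict as
    produced by a builder like build_second_update_info; now: value stored
    as the update date/time information 4121."""
    # Determine the drawer food management record HR by camera unit ID.
    hr = next(r for r in records
              if r["camera_unit_id"] == update["camera_unit_id"])
    # Overwrite only the box entries that the update actually carries.
    for key in ("upper_box", "lower_box"):
        if key in update:
            hr[key]["box_image"] = update[key]["box_image"]
            if "food_images" in update[key]:
                hr[key]["food_images"] = update[key]["food_images"]
    hr["updated_at"] = now  # update date/time information 4121
    return hr
```

When only one box image was cut out, the other box's stored data is left untouched, matching the case analysis above.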
[1-2-4. Operations related to the application UI]
Next, operations related to the application UI320 will be described.
In the present embodiment, the application UI320 displays screens related to the refrigerator compartment 11 and screens related to the drawer-type storage compartments.
[1-2-4-1. Screens related to the refrigerator compartment]
The screens related to the refrigerator compartment 11 include a refrigerator home screen GM2, a refrigerator compartment list screen GM3, and a food information addition list screen GM4.
Fig. 16 is a diagram showing an example of the refrigerator home screen GM2.
The refrigerator home screen GM2 has a food image display area HA2.
The food image display area HA2 is an area in which captured food images FG are displayed. The food image display area HA2 displays the in-refrigerator image RG and displays the captured food images FG superimposed on it. Each captured food image FG is superimposed on the in-refrigerator image RG in the divided area BA indicated by the divided area number 4118 corresponding to its food image data 4116.
The food image display area HA2 of fig. 16 displays the in-refrigerator image RG shown in fig. 9. For example, suppose that in the refrigerating compartment food management record RR, the divided area number 4118 of "4" is associated with the food image data 4116 representing the captured food image FG1 of fig. 16. In this case, the captured food image FG1 is displayed superimposed on the divided area BA assigned the divided area number "4".
By default, the food image display area HA2 displays the in-refrigerator image RG with the left door shelf image RG1 and the right door shelf image RG3 out of view. The food image display area HA2 receives scroll operations in the left-right direction of the figure from the user P, and changes the display range of the in-refrigerator image RG according to the scroll amount and scroll direction of the received scroll operation.
Suppose, for example, that the scroll direction is rightward and the scroll amount is large enough to display the left end of the left door shelf image RG1. In this case, the food image display area HA2 displays the in-refrigerator image RG with the right door shelf image RG3 entirely out of view. The captured food images FG and the remark input icons MIC follow the change of the display range of the in-refrigerator image RG; in other words, they move together with the divided areas BA on which they are superimposed. The remark input icon MIC is described later.
When a plurality of captured food images FG are displayed in one divided area BA, the food image display area HA2 arranges them side by side in the left-right direction of the figure. At most five captured food images FG can be displayed side by side in one divided area BA at the same time. When six or more captured food images FG are to be displayed in one divided area BA, the food image display area HA2 receives a scroll operation from the user P, and the divided area BA displays the sixth and subsequent captured food images FG in response. While a left-right scroll operation is received, the food image display area HA2 keeps at most five images side by side and shifts which images are shown based on the scroll amount: depending on the scroll amount and the number of captured food images FG to be displayed, the first through fifth captured food images FG are hidden and the sixth and subsequent ones are shown. Each remark input icon MIC moves following the movement of its associated captured food image FG; if the associated captured food image FG is hidden by the scroll operation, the remark input icon MIC is hidden as well.
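The side-by-side limit and scrolling behavior described above can be sketched as a simple windowing function. The function and parameter names are hypothetical; the patent does not define an implementation:

```python
def visible_food_images(images, scroll_offset, max_visible=5):
    """Return the captured food images currently shown side by side in one
    divided area BA: a window of at most max_visible images, shifted by
    the scroll offset and clamped to the bounds of the list."""
    start = max(0, min(scroll_offset, max(0, len(images) - max_visible)))
    return images[start:start + max_visible]
```

With seven images and a scroll offset of two, the window shows the third through seventh images while the first two are hidden, matching the behavior described for six or more captured food images FG.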
The captured food images FG displayed in the food image display area HA2 each receive input of a remark. A remark is information that the user P wants to attach to a food item in the refrigerator 1.
Fig. 17 is a diagram showing a remark input flow.
On the refrigerator home screen GM2, the user P performs a predetermined touch operation, such as a long press, on the captured food image FG to which a remark is to be input. When the predetermined touch operation is received for one of the captured food images FG, the application execution unit 301 displays a remark editing image MHG in association with the captured food image FG on which the predetermined touch operation was performed.
The remark editing image MHG has a remark input area MNA, a remark saving button B1, a remark deletion button B2, and a food deletion button B3.
The remark input area MNA is an area for inputting a remark. When the remark input area MNA is touch-operated, the application execution unit 301 causes the touch panel 32 to display a soft keyboard, and inputs into the remark input area MNA the remark corresponding to operations on the soft keyboard.
The remark saving button B1 is a software button for saving the remark input in the remark input area MNA. The remark saving button B1 cannot be touch-operated while no remark is input in the remark input area MNA. When the remark saving button B1 is touched, the application execution unit 301 displays, on the screens related to the refrigerator compartment 11, the captured food image FG to which the remark was input in a display manner different from that of captured food images FG to which no remark was input. In the present embodiment, the application execution unit 301 displays the frame of a captured food image FG with a remark in a different color, and displays a remark input icon MIC, an icon indicating that a remark has been input, in association with that captured food image FG. The remark input icon MIC can be touch-operated; when it is touched, the input remark is displayed in the application UI320.
The remark input icon MIC corresponds to an example of a remark input image of the present invention.
The remark deletion button B2 is a button for deleting an input remark. When the remark deletion button B2 is touch-operated, the remark input for the captured food image FG is deleted.
The food deletion button B3 is a button for deleting the captured food image FG from the food image display area HA2.
Returning to the explanation of the refrigerator home screen GM2 of fig. 16, the food image display area HA2 has a list display button B4. The list display button B4 is a software button that can be touch-operated.
When a touch operation is performed on the list display button B4, in the application UI320, the screen transitions to the refrigerating compartment list screen GM3.
The refrigerator compartment list screen GM3 includes a storage list screen GM31, a new addition list screen GM32, and a deletion object list screen GM33.
The storage list screen GM31 will be described.
Fig. 18 is a diagram showing the storage list screen GM31.
The storage list screen GM31 has a stored food information display area HA3. The stored food information display area HA3 displays a list of 1st food information J21 for the foods stored in the refrigerator compartment 11. The 1st food information J21 includes a captured food image FG, storage date/time information 4117, and divided area information J3. The divided area information J3 is an in-refrigerator image RG in which the divided area BA is indicated in color.
When a remark has been input, the captured food image FG of the 1st food information J21 is displayed with a frame in a different color and in association with the remark input icon MIC. The stored food information display area HA3 can be scrolled in the vertical direction of the figure. By performing a scroll operation in the stored food information display area HA3, the user P can display 1st food information J21 that does not fit in the stored food information display area HA3.
The storage list screen GM31 has a 1st switching button B5 and a 2nd switching button B6.
The 1st switching button B5 is a software button for switching the arrangement order of the 1st food information J21 displayed in the stored food information display area HA3 to the following arrangement order.
That is, when the 1st switching button B5 is touched, the stored food information display area HA3 displays 1st food information J21 having a captured food image FG with a remark above 1st food information J21 having a captured food image FG without a remark. Here, being displayed in an upper position means being displayed so that the scroll amount needed to bring the entry into view is smaller. When there are a plurality of 1st food information J21 having captured food images FG with remarks, the stored food information display area HA3 displays those whose storage date/time indicated by the storage date/time information 4117 is earlier in the upper positions. The same applies when there are a plurality of 1st food information J21 having captured food images FG without remarks.
The storage list screen GM31 on the left side of fig. 18 shows the storage list screen GM31 when the 1st switching button B5 is touched.
The 2nd switching button B6 is a software button for switching the arrangement order of the 1st food information J21 displayed in the stored food information display area HA3 to the following arrangement order.
That is, when the 2nd switching button B6 is touched, the stored food information display area HA3 displays the 1st food information J21 grouped by divided area BA. The arrangement order of the divided areas BA may be an order predetermined by the user P. Within one divided area BA, the stored food information display area HA3 displays the 1st food information J21 in the same order as when the 1st switching button B5 is touched.
The storage list screen GM31 on the right side of fig. 18 shows the storage list screen GM31 when the 2nd switching button B6 is touched.
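The two arrangement orders reduce to sort keys: for the 1st switching button B5, entries with remarks first and earlier storage date/time first within each group; for the 2nd switching button B6, an additional grouping by divided area BA. A sketch with assumed field names (the patent does not define a data structure):

```python
def sort_for_button_1(items):
    """1st switching button B5: entries with a remark come first, then by
    earlier storage date/time within each group (False sorts before True,
    so `not has_remark` puts remark-bearing entries on top)."""
    return sorted(items, key=lambda i: (not i["has_remark"], i["stored_at"]))

def sort_for_button_2(items, area_order):
    """2nd switching button B6: grouped by divided area BA in a
    user-chosen order, with the button-B5 ordering inside each area."""
    rank = {a: n for n, a in enumerate(area_order)}
    return sorted(items, key=lambda i: (rank[i["area_no"]],
                                        not i["has_remark"],
                                        i["stored_at"]))
```

Tuple keys make the precedence explicit: area group first (button B6 only), then remark presence, then storage date/time.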
The storage list screen GM31 has a storage list display button B7, a new addition list display button B8, and a deletion object list display button B9.
The storage list display button B7 is a software button for switching the screen displayed by the application UI320 to the storage list screen GM31.
The new additional list display button B8 is a software button for switching the screen displayed by the application UI320 to the new additional list screen GM 32.
The deletion object list display button B9 is a software button for switching the screen displayed by the application UI320 to the deletion object list screen GM 33.
Fig. 19 is a diagram showing an example of the new additional list screen GM32.
The new additional list screen GM32 has a newly added food information display area HA4. The newly added food information display area HA4 displays a list of 1st food information J21 for foods newly stored in the refrigerator compartment 11. A newly stored food is a food stored within a predetermined period (for example, three days) before the current date and time. The newly added food information display area HA4 displays the 1st food information J21 of the newly stored foods in order from the newest storage date/time indicated by the storage date/time information 4117.
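Selecting the newly stored foods can be sketched as a filter on the storage date/time followed by a newest-first sort. The three-day window is the example period given above; field and function names are assumptions:

```python
import datetime

def newly_added(items, now, days=3):
    """Foods stored within the last `days` days before `now`, newest
    first, as listed in the newly added food information display area."""
    cutoff = now - datetime.timedelta(days=days)
    recent = [i for i in items if i["stored_at"] >= cutoff]
    return sorted(recent, key=lambda i: i["stored_at"], reverse=True)
```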
The 1st food information J21 displayed in the newly added food information display area HA4 receives a predetermined touch operation on its captured food image FG. On the new additional list screen GM32, a 1st editing image HNG1 is displayed in association with the captured food image FG on which the predetermined touch operation was performed.
The 1st editing image HNG1 has a food deletion button B10, an area correction button B11, and a return button B12.
The food deletion button B10 is a software button that, when touched, sets the food indicated by the captured food image FG associated with the 1st editing image HNG1 as a target for deletion from the refrigerator home screen GM2.
The area correction button B11 is a software button for correcting the divided area BA indicated by the divided area information J3 when touched.
The return button B12 is a software button for stopping the display of the 1st editing image HNG1.
The newly added food information display area HA4 has a food information addition button B13 at the end of the list of 1st food information J21. The food information addition button B13 is a software button for manually adding new 1st food information J21.
An addition flow of the 1st food information J21 will be described with reference to fig. 20.
Fig. 20 is a diagram for explaining the addition flow of the 1st food information J21.
When the food information addition button B13 is touch-operated, the food information addition list screen GM4 is displayed in the application UI320. The food information addition list screen GM4 has a plurality of additional buttons B14 as software buttons. The food information addition list screen GM4 also includes a blank image BG and a food name input area SNA for inputting a food name. When the blank image BG is touched, an image representing the food name input in the food name input area SNA is formed on the blank image BG.
When an additional button B14 is touch-operated on the food information addition list screen GM4, the application UI320 displays the in-refrigerator image RG. The in-refrigerator image RG displayed here is an image in which the divided area number 4118 is attached to each divided area BA, and each divided area BA can be selected by a touch operation. When a divided area BA is selected in the in-refrigerator image RG, the following 1st food information J21 is added in the application UI320.
That is, the added 1st food information J21 includes an image of the food corresponding to the touched additional button B14, storage date/time information 4117 indicating the date and time of the addition as the storage date and time, and divided area information J3 indicating the selected divided area BA. The image of the food included in the added 1st food information J21 may be an icon. In the present embodiment, the images superimposed on the in-refrigerator image RG of the refrigerator home screen GM2 have been assumed to be captured food images FG, but when 1st food information J21 is added in this way, its icon is displayed in the in-refrigerator image RG together with the captured food images FG.
In addition, the addition flow of the 1st food information J21 may be a flow in which the user P inputs the storage date and time. In this case, the added 1st food information J21 includes storage date/time information 4117 indicating the storage date and time input by the user P.
Returning to the explanation of fig. 19, the new additional list screen GM32 has a storage list display button B7, a new additional list display button B8, and a deletion object list display button B9.
Fig. 21 is a diagram showing an example of the deletion object list screen GM33.
The deletion object list screen GM33 has a deletion-target food information display area HA5. The deletion-target food information display area HA5 displays a list of 2nd food information J22 for the foods to be deleted. The foods to be deleted are foods set as deletion targets from the refrigerator home screen GM2 and foods judged to have been taken out of the refrigerator compartment 11. The 2nd food information J22 includes a captured food image FG, deletion date/time information 4132, and divided area information J3. The deletion object list screen GM33 displays the 2nd food information J22 in order from the newest deletion date and time.
The 2nd food information J22 displayed in the deletion-target food information display area HA5 receives a predetermined touch operation on its captured food image FG. On the deletion object list screen GM33, a 2nd editing image HNG2 is displayed in association with the captured food image FG on which the predetermined touch operation was performed.
The 2nd editing image HNG2 has a deletion cancel button B15 and a return button B16.
The deletion cancel button B15 is a software button for restoring the captured food image FG associated with the 2nd editing image HNG2 as an image displayed on the refrigerator home screen GM2.
The return button B16 is a software button for stopping the display of the 2nd editing image HNG2.
[1-2-4-2. Operations of the food management system]
Next, the operation of the food management system 1000 when a screen related to the refrigerator compartment 11 is displayed will be described.
Fig. 22 is a flowchart showing the operation of the food management system 1000. In fig. 22, a flowchart FD shows the operation of the terminal device 3, and a flowchart FE shows the operation of the food management server 4.
The application execution unit 301 of the terminal device 3 determines whether or not to request the refrigerating compartment food management record RR from the food management server 4 (step SD1).
For example, when the application UI320 receives a touch operation instructing display of the refrigerator home screen GM2, the application execution unit 301 determines yes in step SD1.
Likewise, when a touch operation instructing an update of the display is performed on the refrigerator home screen GM2, the application execution unit 301 determines yes in step SD1.
When it is determined that the refrigerating compartment food management record RR is to be requested (yes in step SD1), the application execution unit 301 transmits 1st record request information requesting the refrigerating compartment food management record RR to the food management server 4 (step SD2). The 1st record request information transmitted in step SD2 includes the account ID4111 stored in the terminal storage unit 310.
As shown in the flowchart FE, the server communication control unit 401 of the food management server 4 receives the 1st record request information through the server communication unit 41 (step SE1).
Next, the information processing unit 402 determines, from the refrigerating compartment food management database 411, the refrigerating compartment food management record RR including the account ID4111 included in the 1st record request information received by the server communication control unit 401 (step SE2).
Then, the server communication control unit 401 transmits the refrigerating compartment food management record RR determined by the information processing unit 402 to the terminal device 3 (step SE3). After the transmission, the information processing unit 402 updates the in/out presence information 4115 of that refrigerating compartment food management record RR.
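Steps SE1 to SE3 can be sketched as a lookup by account ID followed by a response. Treating the post-response update of the in/out presence information 4115 as a reset is an assumption about what "updates" means here; all names are illustrative:

```python
def handle_first_record_request(database, request):
    """Sketch of steps SE1-SE3: find the refrigerating compartment food
    management record RR whose account ID matches the 1st record request
    information, return a snapshot of it, then update (here: clear) the
    in/out presence information 4115 — an assumed interpretation."""
    rr = next(r for r in database
              if r["account_id"] == request["account_id"])
    response = dict(rr)            # snapshot sent to the terminal device 3
    rr["in_out_presence"] = False  # assumed reset after responding
    return response
```

The shallow copy means the terminal still sees the in/out presence flag that was set when the request arrived, while the stored record is updated afterwards.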
Referring to the flowchart FD, when the application execution unit 301 of the terminal device 3 receives the refrigerating compartment food management record RR from the food management server 4 (step SD3), it generates the screens related to the refrigerator compartment 11 based on the refrigerating compartment food management record RR (step SD4).
Step SD4 is described in detail.
The application execution unit 301 generates the refrigerator home screen GM2 in which the in-refrigerator image RG represented by the in-refrigerator image data 4114 included in the refrigerating compartment food management record RR is displayed in the food image display area HA2.
For each sub-record, the application execution unit 301 generates the refrigerator home screen GM2 so that the captured food image FG represented by the food image data 4116 is superimposed on the divided area BA indicated by the divided area number 4118.
Further, when the remark input presence information 4119 indicates that a remark is present, the application execution unit 301 generates, for each such sub-record, the refrigerator home screen GM2 in which a remark input icon MIC is displayed in association with the captured food image FG.
Further, step SD4 will be described in detail.
The application execution unit 301 generates 1st food information J21 for each food image data 4116. The 1st food information J21 includes the captured food image FG represented by the food image data 4116, the storage date/time information 4117 associated with the food image data 4116, and divided area information J3 indicating in color the divided area BA indicated by the divided area number 4118. The application execution unit 301 then generates the storage list screen GM31 for displaying, in a list, the 1st food information J21 associated with sub-records whose deletion information 4131 does not indicate a deletion target. Further, the application execution unit 301 generates the new additional list screen GM32 for displaying, in a list, the 1st food information J21, among the 1st food information J21 displayed on the storage list screen GM31, whose storage date/time indicated by the storage date/time information 4117 is recent.
Further, step SD4 will be described in detail.
The application execution unit 301 generates 2nd food information J22 for each food image data 4116. The 2nd food information J22 includes the captured food image FG represented by the food image data 4116, the deletion date/time information 4132 corresponding to the food image data 4116, and divided area information J3 indicating in color the divided area BA indicated by the divided area number 4118. The application execution unit 301 then generates the deletion object list screen GM33 for displaying the generated 2nd food information J22 in a list.
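The screen generation of step SD4 effectively partitions the sub-records by their deletion information 4131: sub-records without a deletion target feed the storage list (1st food information J21), and the rest feed the deletion object list (2nd food information J22). A sketch with the same assumed field names as earlier (not from the patent):

```python
def build_screen_lists(sub_records):
    """Split sub-records into 1st food information J21 entries (storage
    list screen GM31) and 2nd food information J22 entries (deletion
    object list screen GM33)."""
    stored, deleted = [], []
    for sub in sub_records:
        if sub["delete_flag"] is None:   # deletion information 4131 blank
            stored.append({"food_image": sub["food_image"],
                           "stored_at": sub["stored_at"],
                           "area_no": sub["area_no"]})
        else:                            # marked as a deletion target
            deleted.append({"food_image": sub["food_image"],
                            "deleted_at": sub["deleted_at"],
                            "area_no": sub["area_no"]})
    return stored, deleted
```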
Returning to the description of the flowchart FD, the application execution unit 301 displays the screen related to the refrigerator compartment 11 corresponding to the touch operation (step SD5). When the in/out presence information 4115 of the refrigerating compartment food management record RR received in step SD3 indicates that food has been put in or taken out, the application execution unit 301, on first displaying the refrigerator home screen GM2 in step SD5, displays a pop-up on the refrigerator home screen GM2 indicating that there has been a new storage or removal.
Next, the operation of the food management system 1000 when operations are performed on the screens related to the refrigerator compartment 11 will be described.
When the remark saving button B1 is touched, the application execution unit 301 transmits, to the food management server 4, information for updating the sub-record related to the captured food image FG that is the operation target. This information includes remark input presence information 4119 indicating that a remark is present and remark information 4130 indicating the input remark. The food management server 4 updates the sub-record based on the received information.
When the remark deletion button B2 is touched, the application execution unit 301 transmits, to the food management server 4, information for updating the sub-record related to the captured food image FG that is the operation target. This information includes remark input presence information 4119 indicating that no remark is input. The food management server 4 updates the sub-record based on the received information.
When the food deletion button B3 or B10 is touched, the application execution unit 301 transmits, to the food management server 4, information for updating the sub-record related to the captured food image FG that is the operation target. This information includes deletion information 4131 indicating a deletion target. The food management server 4 updates the sub-record based on the received information.
When the divided area BA is corrected by a touch operation on the area correction button B11, the application execution unit 301 transmits, to the food management server 4, information for updating the sub-record related to the captured food image FG that is the operation target. This information includes the divided area number 4118 of the corrected divided area BA. The food management server 4 updates the sub-record based on the received information.
When 1st food information J21 is added by the addition flow shown in fig. 20, the application execution unit 301 transmits, to the food management server 4, information for adding a sub-record related to the added 1st food information J21. The food management server 4 adds the sub-record based on the received information.
When the deletion cancel button B15 is touched, the application execution unit 301 transmits, to the food management server 4, information for updating the sub-record related to the captured food image FG that is the operation target. This information includes deletion information 4131 that has been cleared to blank. The food management server 4 updates the sub-record based on the received information.
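Each of the button operations above sends a small sub-record update to the food management server 4. The payloads can be sketched as follows; this is a hypothetical wire format with assumed keys, as the patent does not specify one:

```python
def remark_save_payload(remark):
    """Remark saving button B1: set remark presence (4119) and text (4130)."""
    return {"has_remark": True, "remark": remark}

def remark_delete_payload():
    """Remark deletion button B2: clear the remark presence flag (4119)."""
    return {"has_remark": False}

def food_delete_payload():
    """Food deletion button B3/B10: mark the sub-record as a deletion
    target (deletion information 4131)."""
    return {"delete_flag": "delete"}

def area_correction_payload(area_no):
    """Area correction button B11: send the corrected divided area
    number 4118."""
    return {"area_no": area_no}

def cancel_delete_payload():
    """Deletion cancel button B15: blank the deletion information 4131 so
    the image is displayed on the refrigerator home screen GM2 again."""
    return {"delete_flag": None}
```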
[1-2-4-3. Screens related to the drawer-type storage compartment]
Next, screens related to the drawer-type storage compartments will be described. In the explanation using fig. 23 to 25, the vegetable room 15 is taken as an example of a drawer-type storage compartment. In this explanation, the database shown in fig. 10 is a database that holds various information related to the foods in the vegetable room 15.
The screens related to the vegetable room 15 include a vegetable room home screen GM5 and a food image list screen GM6.
Fig. 23 is a diagram showing an example of the vegetable room home screen GM5.
The vegetable room home screen GM5 has a box image display area HA6.
The cassette image display area HA6 is an area in which the cassette image CG is displayed. The cassette image display area HA6 displays an upper cassette image JCG superimposed on the lower cassette image GCG. The default cassette image display area HA6 is formed by overlapping the upper cassette image JCG with the entire image area of the lower cassette image GCG, and only the upper cassette image JCG is displayed.
The box image display area HA6 receives scroll operations in the up-down direction of the drawing, and changes the display range of the upper box image JCG according to the scroll amount and scroll direction of the received operation. For example, when the scroll direction is upward, the box image display area HA6 moves the upper box image JCG upward. As the upper box image JCG moves upward, the portion of the lower box image GCG that it had covered is displayed.
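The layered-scroll behaviour described above can be modelled by clamping the scroll offset and splitting the display area between the two box images. The sketch below is a simplified, row-based model; the function name and the row granularity are assumptions, not part of the actual screen implementation.

```python
def visible_rows(total_rows, scroll_offset):
    """Return (upper, lower): how many rows of the display area still show the
    upper box image JCG and how many now reveal the lower box image GCG.

    The upper box image initially covers the whole area; scrolling it upward
    by `scroll_offset` rows uncovers that many rows of the lower box image.
    """
    offset = max(0, min(scroll_offset, total_rows))  # clamp to the valid range
    upper = total_rows - offset   # rows still covered by the upper box image
    lower = offset                # rows of the lower box image now revealed
    return upper, lower
```

With no scrolling only the upper image is visible; scrolling past the image height reveals the lower image entirely.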
Each box image CG displayed in the box image display area HA6 receives input of remarks. The flow of inputting a remark is the same as in the case of the refrigerator compartment 11, except that the image operated by the user P is the box image CG instead of the captured food image FG. As shown in fig. 23, the remark input icon MIC associated with the upper box image JCG moves following the movement of the upper box image JCG.
The vegetable room home screen GM5 has a list display button B17. The list display button B17 is a software button capable of touch operation.
When a touch operation is performed on the list display button B17, a food image list screen GM6 is displayed in the application UI 320.
In addition, the vegetable room home screen GM5 may have a switch button for switching the box image CG displayed in the box image display area HA6 to the upper box image JCG or the lower box image GCG. In this configuration, the application execution unit 301 switches the displayed cassette image CG to the upper cassette image JCG or the lower cassette image GCG in response to the operation of the switch button.
Fig. 24 is a diagram showing an example of the food image list screen GM6.
The food image list screen GM6 has update date and time information 4121. The food image list screen GM6 of fig. 24 has update date and time information 4121 indicating the update date and time of "12/1 10:05".
The food image list screen GM6 displays the captured food images FG for each case. The captured food image FG displayed in association with the item of the upper case JCA is an image cut out from the upper box image JCG. The captured food image FG displayed in association with the item of the lower case GCA is an image cut out from the lower box image GCG.
[1-2-4-5. Operation of the food management system]
Next, an operation of the food management system 1000 in a case where a screen related to the vegetable room 15 is displayed will be described.
Fig. 25 is a flowchart showing the operation of the food management system 1000. In fig. 25, a flowchart FF shows the operation of the terminal device 3, and a flowchart FJ shows the operation of the food management server 4.
The application execution unit 301 of the terminal device 3 determines whether or not the drawer food management record HR is requested from the food management server 4 (step SF 1).
For example, when the application UI320 receives a touch operation to instruct the display of the vegetable room home screen GM5, the application execution unit 301 determines yes in step SF 1.
For example, also when the application UI320 receives a touch operation instructing an update of the display on the vegetable room home screen GM5, the application execution unit 301 determines yes in step SF 1.
When it is determined that the drawer food management record HR is requested (yes in step SF 1), the application execution unit 301 transmits the 2nd record request information, which requests the drawer food management record HR, to the food management server 4 (step SF 2). The 2nd record request information transmitted in step SF2 includes the account ID4111 stored in the terminal storage section 310.
Referring to flowchart FJ, the server communication control unit 401 of the food management server 4 receives the 2nd record request information (step SJ 1).
Next, the information processing section 402 determines, from the drawer food management database 412, the drawer food management record HR that includes the account ID4111 contained in the 2nd record request information received by the server communication control section 401 (step SJ 2).
Then, the server communication control unit 401 transmits the drawer food management record HR determined by the information processing unit 402 to the terminal device 3 (step SJ 3).
Returning to flowchart FF, when the application execution unit 301 of the terminal device 3 receives the drawer food management record HR from the food management server 4 (step SF 3), it generates a screen related to the vegetable room 15 (step SF 4).
Step SF4 is described in detail.
The application execution unit 301 generates the vegetable room home screen GM5 having a box image display area HA6 in which the upper box image JCG, indicated by the upper box image data 4123 included in the drawer food management record HR, is superimposed on the lower box image GCG, indicated by the lower box image data 4125 included in the same record.
Further, when the remark input presence/absence information 4119 indicates the presence of a remark, the application execution unit 301 generates the vegetable room home screen GM5 in which the remark input icon MIC is displayed in association with the corresponding box image CG.
The description of step SF4 is continued.
The application execution unit 301 generates a food image list screen GM6 in which the captured food images FG indicated by the food image data 4116 are arranged for each case CA.
Returning to the explanation of the flowchart FF, the application execution unit 301 displays a screen related to the vegetable room 15 according to the touch operation (step SF 5).
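The exchange of steps SF1 to SF5 and SJ1 to SJ3 can be sketched as a simple request/response pair: the terminal sends the account ID, the server looks up the matching drawer food management record, and the terminal composes the screen from the record's box image data. All class, function, and field names below are illustrative assumptions, not the actual implementation.

```python
class FoodManagementServerStub:
    """Stand-in for the server side of steps SJ1-SJ3 (names are illustrative)."""

    def __init__(self, drawer_food_database):
        self.database = drawer_food_database  # list of drawer food management records

    def handle_record_request(self, request):
        # SJ1: receive the 2nd record request information.
        # SJ2: determine the record whose account ID matches the request.
        # SJ3: return that record to the terminal device.
        for record in self.database:
            if record["account_id"] == request["account_id"]:
                return record
        return None

def fetch_vegetable_room_screen(server, account_id):
    """Terminal side of steps SF2-SF4: request the record, then build the
    home screen by layering the upper box image over the lower box image."""
    record = server.handle_record_request({"account_id": account_id})  # SF2/SF3
    return {  # SF4: lower box image with the upper box image superimposed on it
        "screen": "GM5",
        "layers": [record["lower_box_image"], record["upper_box_image"]],
    }
```

In this model the screen is simply the ordered pair of image layers recovered from the record.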
Next, an operation of the food management system 1000 when the operation is performed on the screen related to the vegetable room 15 will be described.
When the comment holding button B1 is touched, the application execution unit 301 transmits information for updating the remark input presence/absence information 4119 and the remark information 4130 to the food management server 4. The information includes information indicating whether the operation target is the upper box image JCG or the lower box image GCG, remark input presence/absence information 4119 indicating the presence of a remark, and remark information 4130 indicating the input remark. The food management server 4 updates the drawer food management record HR appropriately based on the received information.
When the comment delete button B2 is touched, the application execution unit 301 transmits information for updating the remark input presence/absence information 4119 and the remark information 4130 to the food management server 4. The information includes information indicating whether the deletion target of the remark is the upper box image JCG or the lower box image GCG, and remark input presence/absence information 4119 indicating the absence of a remark. The food management server 4 updates the drawer food management record HR appropriately based on the received information.
The vegetable compartment 15 is exemplified as the drawer-type storage compartment described with reference to fig. 23 to 25. However, the screen display shown in fig. 23 to 25 and the operation of the food management system 1000 may be performed in the same manner with respect to the freezing chamber 14. In this case, the database shown in fig. 10 represents a database holding various information related to food items of the freezing chamber 14.
[1-3. Effect ]
As described above, the camera unit 2 includes: a main body 201 provided on an upper surface 10B of the refrigerator 1; and an imaging unit 202 including a camera 213 extending from the main body 201 to the front of the refrigerator 1 and photographing the lower side from the front upper side of the refrigerator 1. The main body 201 includes a position regulating member 205 that abuts against the front end edge 10A of the refrigerator 1 and regulates the position of the imaging member 202 in the front-rear direction of the refrigerator 1.
Thus, since the position of the imaging member 202 in the front-rear direction of the refrigerator 1 is regulated by the position regulating member 205, the camera 213 is fixed at a position from which it can photograph from the front upper side of the refrigerator 1. Therefore, food entering and exiting the refrigerator 1 can be captured in the imaging result of the camera 213 with high accuracy; that is, the food entering and exiting the refrigerator 1 can be photographed with high accuracy.
The position regulating member 205 is a position regulating piece that extends downward from the bottom surface 206 of the main body 201 in the set state.
Thereby, the position of the camera unit 2 can be restricted by the gap formed between the main casing 10 and the door 11C. Therefore, since the camera unit 2 can be provided to the conventional refrigerator 1, food items entering and exiting the refrigerator 1 can be photographed with high accuracy even in the conventional refrigerator 1.
The position regulating member 205 is formed shorter than the distance to the gasket provided between the refrigerator 1 and its door 11C.
Thereby, the position of the camera unit 2 can be restricted by the gap formed between the main casing 10 and the door 11C without impeding the cooling effect of the refrigerating chamber 11. Therefore, food entering and exiting the refrigerator 1 can be photographed with high accuracy without impeding the cooling effect of the refrigerator compartment 11.
The position regulating member 205 extends in the width direction of the main body 201.
This ensures a large contact area with the main casing 10, and thus stabilizes the installation posture of the camera unit 2. Therefore, the food entering and exiting the refrigerator 1 can be photographed with higher accuracy.
The position regulating member 205 has a mark 207 for adjusting the position of the camera unit 2 in the width direction of the refrigerator 1.
By this, the user P can easily set the camera unit 2 in the left-right direction of the refrigerator 1 by setting the mark 207 as a sign. Therefore, the user P can easily install the camera unit 2 so that the user P can take a picture of the food in and out of the refrigerator 1 with high accuracy.
The position regulating member 205 has a mark 207 at the center in the width direction of the main body 201.
Accordingly, by setting the mark 207 as a sign, the user P can easily set the camera unit 2 in the center of the refrigerator 1 in the left-right direction. Therefore, the user P can easily install the camera unit 2 so that the user P can take a picture of the food in and out of the refrigerator 1 with higher accuracy.
Further, as described above, the camera unit 2 includes: a camera 213 that captures moving images of the refrigerator 1 from the front upper side of the refrigerator 1; an imaging control unit 223 that causes the camera 213 to start imaging when the human sensor 212, which detects whether or not a person is present in the vicinity of the refrigerator 1, detects that a person is present; and a video recording control unit 224 that starts video recording of the imaging result of the camera 213 at a predetermined timing after the imaging control unit 223 causes the camera 213 to start imaging.
In addition, in the control method of the camera unit 2, imaging by the camera 213 is started when the human sensor 212, which detects whether or not a person is present in the vicinity of the refrigerator 1, detects that a person is present, and video recording of the imaging result of the camera 213 is started at a predetermined timing after the imaging is started.
According to the camera unit 2 and the control method of the camera unit 2, by starting the image capturing by the camera 213 when the human sensor 212 detects that a person is present in the vicinity of the refrigerator 1, the possibility of including the image capturing result of the camera 213 with food taken in and out of the refrigerator 1 before the door 11C or drawer 16A of the refrigerator 1 is opened can be improved. Therefore, the food entering and exiting the refrigerator 1 can be photographed with high accuracy.
After the imaging control unit 223 causes the camera 213 to start imaging, the video recording control unit 224 starts video recording of the imaging result of the camera 213 when the door 11C or drawer 16A of the refrigerator 1 is opened.
Thus, since the video recording is started at a timing when the possibility of starting the entrance/exit of the food is high, unnecessary video recording can be suppressed in a situation where the food does not enter/exit the refrigerator 1. Further, the video recording control unit 224 can suppress an increase in the data capacity of the video recording file 231, and thus can prevent an increase in the processing target of the food detection unit 225, and an increase in the processing time and processing load of the food detection unit 225.
After starting the video recording, the video recording control unit 224 stops the video recording when the door 11C or drawer 16A of the refrigerator 1, having been opened, is closed.
Thus, since the video recording ends at a timing when the entrance and exit of food is likely to have finished, unnecessary video recording can be suppressed in a situation where food does not enter or exit the refrigerator 1. Further, the video recording control unit 224 can suppress an increase in the data capacity of the video file 231, and can thus prevent an increase in the processing target of the food detection unit 225 and in the processing time and processing load of the food detection unit 225.
When stopping the video recording, the video recording control unit 224 generates a result of the video recording from the time when the door 11C or drawer 16A of the refrigerator 1 is opened to the time when it is closed as one video file 231.
Thus, the result of the shooting from the opening to the closing of the door 11C or drawer 16A of the refrigerator 1 is generated as one video file 231, so that it is possible to quickly generate one video file 231 while suppressing an increase in the data capacity of one video file 231. Further, since the increase in the data capacity of one video file 231 can be suppressed and one video file 231 can be quickly created, the food detection unit 225 can quickly start processing related to the video file 231 after food has entered and exited the refrigerator 1.
After the video recording control unit 224 stops the video recording, the imaging control unit 223 causes the camera 213 to end imaging when the human sensor 212 detects that no person is present around the refrigerator 1.
In this way, since the camera 213 ends imaging when the human sensor 212 detects that no person is present in the vicinity of the refrigerator 1, the camera 213 can be prevented from imaging when no food is entering or exiting the refrigerator 1. Therefore, food entering and exiting the refrigerator 1 can be photographed with high accuracy while an increase in power consumption is prevented.
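The imaging and recording sequence described in this section — imaging starts when a person is detected, video recording starts when the door or drawer is opened, recording stops and one video file is produced when it is closed, and imaging ends when the person leaves — can be modelled as a small state machine. The sketch below is illustrative only; the class and event names are assumptions, not part of the actual camera unit 2.

```python
class CameraUnitController:
    """Sketch of the imaging/recording sequence (event names are assumed)."""

    def __init__(self):
        self.imaging = False      # camera 213 imaging state (imaging control unit 223)
        self.recording = False    # recording state (video recording control unit 224)
        self.video_files = []     # one video file 231 per open-to-close interval
        self._frames = []

    def on_event(self, event):
        if event == "person_detected" and not self.imaging:
            self.imaging = True           # start imaging before the door opens
        elif event == "door_opened" and self.imaging:
            self.recording = True         # start recording: food may enter/exit
            self._frames = []
        elif event == "door_closed" and self.recording:
            self.video_files.append(list(self._frames))  # emit one video file
            self.recording = False
        elif event == "person_absent" and not self.recording:
            self.imaging = False          # end imaging to save power

    def on_frame(self, frame):
        if self.recording:                # frames outside recording are discarded
            self._frames.append(frame)
```

Note that frames captured between person detection and door opening are not recorded, which corresponds to suppressing unnecessary video recording.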
The camera unit 2 includes a food detection unit 225 that detects food items that come in and go out of the refrigerator 1 based on the video result of the video recording control unit 224.
Accordingly, the food can be detected based on the imaging result obtained by imaging the food in and out of the refrigerator 1 with high accuracy, and therefore the food in and out of the refrigerator 1 can be detected with high accuracy.
As described above, the food management application 311 causes the terminal processor 300 of the terminal device 3, which includes the touch panel 32, to function as the application execution unit 301. The application execution unit 301 displays the in-refrigerator image RG via the touch panel 32 and, within the displayed in-refrigerator image RG, displays the captured food image FG in the divided area BA corresponding to the storage area of the food stored in the refrigerator compartment 11.
Further, the terminal device 3 includes: a touch panel 32; and an application execution unit 301 that displays the in-refrigerator image RG on the touch panel 32 and, within the displayed in-refrigerator image RG, displays the captured food image FG in the divided area BA corresponding to the storage area of the food stored in the refrigerator compartment 11.
The control method of the terminal device 3 displays the in-refrigerator image RG and, within the displayed in-refrigerator image RG, displays the captured food image FG in the divided area BA corresponding to the storage area of the food stored in the refrigerator compartment 11.
According to the food management application 311, the terminal device 3, and the control method of the terminal device 3, the captured food image FG is displayed in the divided area BA corresponding to the storage area of the food, within the image showing the inside of the refrigerator compartment. Therefore, the user P can intuitively grasp what food is stored in the refrigerator compartment 11.
The captured food image FG is a captured image of the refrigerator compartment camera 203.
Thus, the user P is provided with a captured image of the food actually stored in the refrigerator 1, and can therefore grasp even more intuitively what food is stored in the refrigerator compartment 11.
The in-refrigerator image RG is an image based on the model of the refrigerator 1 used by the user P of the terminal device 3.
Accordingly, since the user P is provided with the in-refrigerator image RG corresponding to the refrigerator compartment 11 of the refrigerator 1 that is actually used, the user P can more intuitively grasp what food is stored in the refrigerator compartment 11.
The in-refrigerator image RG is divided into a plurality of divided areas BA. The number of divided areas BA included in the in-refrigerator image RG is based on the model of the refrigerator 1 used by the user P of the terminal device 3.
This allows the user P to grasp what food is contained in the storage area of the refrigerator compartment 11 of the refrigerator 1 in actual use. Therefore, the user P can more intuitively grasp what food is contained in the refrigerator compartment 11.
When a plurality of captured food images FG are displayed in one divided area BA, the application execution unit 301 displays the plurality of captured food images FG in parallel.
In this way, since a plurality of captured food images FG can be displayed visually and easily for each divided area BA, the user P can intuitively and easily grasp what food is stored in the refrigerator compartment 11.
An upper limit number of captured food images FG that can be displayed simultaneously is set for one divided area BA.
This can prevent the visibility from being lowered when the number of captured food images FG displayed in one divided area BA is large. Therefore, the user P can intuitively and easily grasp what food is contained in the refrigerator compartment 11.
When the number of captured food images FG to be displayed in one divided area BA exceeds the upper limit number, the application execution unit 301 can display the captured food images FG in a switchable manner within the upper limit number.
This can prevent the visibility from being lowered when the number of captured food images FG displayed in one divided area BA is large, and the user P can visually confirm the captured food images FG that cannot be displayed completely.
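The upper-limit display described above amounts to paginating each divided area's captured food images so that at most the upper limit number is shown at a time, with the remaining images reachable by switching pages. The function name and list-based model below are assumptions for illustration.

```python
def pages_for_area(images, upper_limit):
    """Split the captured food images FG of one divided area BA into pages of
    at most `upper_limit` images, so the pages can be displayed switchably."""
    return [images[i:i + upper_limit] for i in range(0, len(images), upper_limit)]
```

Each page respects the per-area upper limit, and an area whose image count exceeds the limit simply gains extra pages.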
The application execution unit 301 receives a comment input for each captured food image FG from the user P of the terminal device 3, and displays a comment input icon MIC indicating that the comment was received in association with the captured food image FG.
This enables the user P to grasp which food the remark is input to. Therefore, the user P can intuitively grasp what food is stored in the refrigerator compartment 11, and can grasp which food has been input with remarks.
The food management application 311 is an application program that can be installed in the terminal device 3.
Thus, by installing the food management application 311, the terminal device 3 that does not have a function of displaying the image RG in the refrigerator compartment and capturing the food image FG can be made to be the terminal device 3 that can display these images.
As described above, the food management application 311 causes the terminal processor 300 of the terminal device 3 including the touch panel 32 to function as the application execution unit 301 that displays the captured image of the drawer camera 204 that captures the drawer 16A of the refrigerator 1 having the upper case JCA and the lower case GCA. The application execution unit 301 can display the upper-layer box image JCG and the lower-layer box image GCG, and receive input of remarks from the user P of the terminal apparatus 3 for each of the upper-layer box image JCG and the lower-layer box image GCG. The application execution unit 301 displays a remark input icon MIC indicating that the remark is received in association with the upper-layer cassette image JCG when the remark is received from the user P in the upper-layer cassette image JCG, and displays a remark input icon MIC in association with the lower-layer cassette image GCG when the remark is received from the user P in the lower-layer cassette image GCG.
Further, the terminal device 3 includes a touch panel 32 and an application execution unit 301 that displays a captured image of the drawer camera 204. The application execution unit 301 can display the upper-layer box image JCG and the lower-layer box image GCG, and receive input of remarks from the user P of the terminal apparatus 3 for each of the upper-layer box image JCG and the lower-layer box image GCG. The application execution unit 301 displays a remark input icon MIC indicating that the remark is received in association with the upper-layer cassette image JCG when the remark is received from the user P in the upper-layer cassette image JCG, and displays a remark input icon MIC in association with the lower-layer cassette image GCG when the remark is received from the user P in the lower-layer cassette image GCG.
The control method of the terminal device 3 is capable of displaying the upper-layer box image JCG and the lower-layer box image GCG, receiving a remark input from the user P of the terminal device 3 for each of the upper-layer box image JCG and the lower-layer box image GCG, displaying a remark input icon MIC indicating that the remark input was received in association with the upper-layer box image JCG when the remark input was received from the user P for the upper-layer box image JCG, and displaying a remark input icon MIC in association with the lower-layer box image GCG when the remark input was received from the user P for the lower-layer box image GCG.
Thereby, the food management application 311, the terminal device 3, and the control method of the terminal device 3 can display the photographed image of the case CA that the drawer 16A has, and the user P can input notes to the displayed photographed image. Therefore, the user P can intuitively grasp the food stored in the drawer 16A of the refrigerator 1, and the convenience of the user P can be improved.
When the application execution unit 301 receives from the user P an operation to move the upper cassette image JCG while the upper cassette image JCG is displayed superimposed on the lower cassette image GCG, it moves the upper cassette image JCG so that the lower cassette image GCG is displayed.
This allows the lower cassette image GCG to be displayed as if the actual case CA were being moved. Therefore, the user P can grasp the food stored in the drawer 16A of the refrigerator 1 more intuitively, and the convenience of the user P can be improved.
When the remark input icon MIC is displayed in association with the upper-layer box image JCG, the application executing unit 301 moves the remark input icon MIC together with the upper-layer box image JCG.
Thus, even when the user P moves the upper-layer cassette image JCG, the association between the upper-layer cassette image JCG and the comment input icon MIC can be maintained. Therefore, even when the user P moves the upper box image JCG, the user P can appropriately grasp which image the remark input icon MIC is related to, and the convenience of the user P can be improved.
The application execution unit 301 displays a switching button for switching the displayed captured image to the upper-layer cassette image JCG or the lower-layer cassette image GCG, and switches the displayed captured image to the upper-layer cassette image JCG or the lower-layer cassette image GCG according to the operation of the switching button.
Thus, the displayed cassette image CG can be easily switched by the operation of the switching button, so that the convenience of the user P can be improved.
The upper case image JCG is a captured image when the upper case JCA is pulled out by a predetermined threshold or more from the refrigerator 1. The lower case image GCG is a captured image when the lower case GCA is pulled out by a predetermined threshold value or more from the refrigerator 1.
Thus, since a captured image in which as much of the food as possible is visible can be used as the box image CG, the user P can intuitively grasp the food stored in the drawer 16A of the refrigerator 1, and the convenience of the user P can be improved.
The food management application 311 is an application program that can be installed in the terminal device 3.
Thus, by installing the food management application 311, the terminal device 3 having no function of displaying the box image CG and no function of receiving the input of remarks to the displayed box image CG can be made the terminal device 3 having these functions.
(embodiment 2)
Next, embodiment 2 will be described. In the description of embodiment 2, the same reference numerals are given to the same components as those of the respective parts of the food management system 1000 of embodiment 1, and detailed description thereof is omitted as appropriate.
[2-1. Structure ]
Fig. 26 is a block diagram showing the configuration of terminal device 3 and food management server 4 in embodiment 2.
As can be seen from comparing fig. 26 and 7, the server processor 400 in embodiment 2 functions as a food recognition unit 403 in addition to the server communication control unit 401 and the information processing unit 402 by executing a control program stored in the server storage unit 410.
The food recognition unit 403 performs food recognition processing based on the captured food image FG indicated by the food image data 4116 included in each of the 1 st update information and the 2 nd update information.
The food recognition processing is processing for recognizing food from the captured food image FG and acquiring an icon representing the same type of food as the recognized food from an icon database. The icon database stores icons indicating foods for each food. Hereinafter, this icon will be referred to as a food icon. The icon database may be stored in the food management server 4 or may be stored in an external device with which the food management server 4 can communicate.
In the present embodiment, a food icon corresponds to an example of the food image of the present invention.
For example, the food recognition unit 403 recognizes the food as follows.
The food recognition unit 403 determines, from the images stored in the food image database, the image having the highest degree of coincidence with the captured food image FG indicated by the food image data 4116 included in the update information. The food image database is a database storing image data for each food. The degree of coincidence is determined based on feature amounts such as shape and color. Then, the food recognition unit 403 recognizes the food corresponding to the determined image as the food represented by the captured food image FG indicated by the food image data 4116 included in the update information.
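The highest-coincidence matching described above can be illustrated with a toy example in which each food in the database is represented by a small feature vector (e.g. shape and color components) and the degree of coincidence is modelled as negative squared distance. The feature representation, function name, and sample data are all assumptions; the actual feature amounts and matching method are not specified in this description.

```python
def recognize_food(captured_features, food_image_database):
    """Return the database food whose feature vector has the highest degree of
    coincidence with the captured food image's features.

    Coincidence is modelled here as negative squared distance (an assumption);
    the real system derives it from feature amounts such as shape and color.
    """
    def coincidence(a, b):
        return -sum((x - y) ** 2 for x, y in zip(a, b))

    best_name, _ = max(food_image_database.items(),
                       key=lambda item: coincidence(captured_features, item[1]))
    return best_name

# Toy food image database: per-food feature vectors (values are made up).
food_db = {"apple": [0.9, 0.1, 0.8], "carrot": [0.2, 0.9, 0.3]}
```

A captured image whose features lie closest to the "apple" entry would thus be recognized as an apple.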
The food recognition unit 403 can also recognize the food by AI (artificial intelligence). For example, machine learning processing is performed to learn feature amounts, such as the color and shape of foods, from captured food images FG for training, and a learned model is constructed in advance. The learned model is stored in the server storage unit 410. The food recognition unit 403 recognizes the food indicated by the captured food image FG by using the food image data 4116 included in the update information as input data and referring to the learned model. Here, the learned model may be stored in a device other than the food management server 4 connected to the global network GN, and that device may perform the recognition. In this configuration, the food recognition unit 403 transmits the food image data 4116 included in the update information to the device storing the learned model, and obtains the determination result of the food from that device. The food recognition unit 403 then recognizes the food indicated by the obtained determination result as the food represented by the captured food image FG indicated by the food image data 4116 included in the update information.
When the food recognition unit 403 recognizes a food, it acquires the food icon of the recognized food from the icon database. In embodiment 2, the icon data of the food icon acquired by the food recognition unit 403 is stored appropriately instead of, or together with, the food image data 4116 included in the refrigerating compartment food management record RR. Likewise, in embodiment 2, the icon data of the acquired food icon is stored appropriately instead of, or together with, the food image data 4116 included in the drawer food management record HR.
Thus, in embodiment 2, a food icon can be displayed in the application UI320 instead of the captured food image FG.
[2-2. Effect, etc. ]
According to embodiment 2, the same effects as those of embodiment 1 are obtained.
(other embodiments)
As described above, the above embodiments have been described as examples of the technique disclosed in the present application. However, the technique of the present application is not limited thereto, and can also be applied to embodiments in which modifications, substitutions, additions, omissions, and the like are made. The components described in the above embodiments may also be combined to form new embodiments.
Accordingly, other embodiments are exemplified below.
In the above embodiment, the refrigerator 1 is exemplified as the storage of the present invention. However, the storage of the present invention is not limited to the refrigerator 1, and may be a warehouse installed in an office or the like, for example. In the above embodiment, the food is exemplified as the article of the present invention, but the article of the present invention is not limited to the food, and may be other articles such as books depending on the storage.
In the above embodiment, the position regulating member 205 has the mark 207 at the center in the left-right direction of the main body 201. However, the position of the mark 207 is not limited to the center in the left-right direction of the main body 201; it may be at a position other than the center. In this case, by recording the position of each mark appropriately on the center position recording paper CY, the camera unit 2 whose position has been adjusted with reference to the mark 207 can appropriately perform the various detections.
In the above embodiments, the in-compartment image of the present invention is exemplified by the refrigerator interior image RG, which shows the refrigerating compartment 11 from the front with the left door 11A and the right door 11B open. However, the in-compartment image of the present invention is not limited to the refrigerator interior image RG; it may be a front image of a refrigerating compartment whose single-hinged door is open, or a front image of a compartment of the refrigerator 1 that combines a refrigerating section and a freezing section, taken with its door open.
The imaging range of the drawer camera 204 may be set to cover any one drawer-type storage compartment or a combination of several, according to which drawer-type storage compartment screen is displayed in the application UI 320.
The food management system 1000 may manage other elements, such as the remaining amount and the expiration date, in addition to the storage date and time.
The main body 201 and the photographing part 202 of the camera unit 2 may be constructed separately.
The camera unit 2 may be configured to transmit the box image CG to the food management server 4 even when the drawer is not pulled out by the predetermined threshold or more. In this configuration, the food management server 4 stores the box image CG captured when the drawer is not pulled out by the threshold or more, and supplies it to the terminal device 3. The application execution unit 301 then displays the upper-layer box image JCG and the lower-layer box image GCG in the application UI 320 so that they can be switched between.
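The transmission condition described above can be sketched as a single predicate. This is an illustrative sketch only: the function name, parameters, and the idea of a flag selecting the variant are assumptions, not elements of the patent, which merely contrasts transmitting only beyond the pull-out threshold with always transmitting.

```python
def should_transmit(pull_out_amount: float, threshold: float,
                    always_transmit: bool = False) -> bool:
    """Decide whether the camera unit sends the box image CG.

    In the base embodiment the image is sent only when the drawer is
    pulled out by the predetermined threshold or more; in the variant
    above (always_transmit=True) it is sent regardless, and the food
    management server 4 keeps the image for later display in the
    application UI 320. All names here are illustrative.
    """
    return always_transmit or pull_out_amount >= threshold
```

With the variant enabled, every captured box image reaches the server, which is what allows the upper-layer and lower-layer box images to be switched in the UI at any time.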
In the above embodiments, the case where the user P installs the camera unit 2 on the refrigerator 1 is exemplified, but the camera unit 2 may instead be shipped already installed on the refrigerator 1.
The server processor 400, rather than the camera unit processor 220, may function as the food detection unit 225. In this configuration, the camera unit 2 transmits the video file 231 to the food management server 4.
In the above embodiments, the human sensor 212 is exemplified as the sensor of the present invention. However, the sensor of the present invention is not limited to the human sensor 212 and may be, for example, an illuminance sensor, or an electrostatic sensor provided at a grip portion of the refrigerator 1. When the electrostatic sensor is used, the refrigerator 1 and the camera unit 2 are configured to communicate, and the refrigerator 1 transmits the detection value of the electrostatic sensor to the camera unit 2.
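The interchangeable sensors above all serve the same purpose: triggering the camera unit from a detection value. A minimal sketch, with entirely hypothetical threshold values and names (the patent names the sensor types but gives no thresholds):

```python
def trigger_detected(sensor_type: str, value: float) -> bool:
    """Interpret a detection value as 'user present / door grasped'.

    The thresholds are illustrative placeholders; the patent states
    only that a human sensor, an illuminance sensor, or an
    electrostatic sensor at the grip portion may serve as the trigger.
    """
    thresholds = {
        "human": 0.5,           # presence detection level
        "illuminance": 50.0,    # brightness rise in front of the refrigerator
        "electrostatic": 10.0,  # capacitance change at the grip portion
    }
    return value >= thresholds[sensor_type]
```

For the electrostatic case, the value would arrive at the camera unit 2 over the communication link from the refrigerator 1 described above.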
The terminal device 3 may store the refrigerating compartment food management record RR and the drawer food management record HR, and the food management application 311 may manage the various information related to the food detected by the camera unit 2. In this configuration, the application execution unit 301 updates the content of each record stored in the terminal device 3 as appropriate, in the same way as the food management server 4 described above. The food management system 1000 then need not include the food management server 4, which simplifies the system configuration. In this configuration, the application execution unit 301 may also function as the food recognition unit 403.
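The server-less variant amounts to the terminal keeping the records itself and updating them through the same operations the server would perform. The class below is a toy illustration of that idea; its structure and names are assumptions, not taken from the patent.

```python
class TerminalFoodStore:
    """Terminal-side storage of the management records RR and HR.

    Illustrative only: in this variant the terminal device 3 holds the
    records, so the food management server 4 is unnecessary and the
    application execution unit 301 performs the updates directly.
    """
    def __init__(self) -> None:
        self.records: dict[str, list[dict]] = {"RR": [], "HR": []}

    def update(self, kind: str, entry: dict) -> None:
        # The application execution unit updates the record content
        # in the same way the food management server otherwise would.
        self.records[kind].append(entry)
```

Because the update interface is the same either way, the rest of the system need not change when the records move from the server to the terminal.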
For example, the types of chambers formed in the main casing 10 of the refrigerator 1 are not limited to the refrigerating chamber 11, the ice making chamber 12, the fresh-freezing chamber 13, the freezing chamber 14, and the vegetable chamber 15; other types of chambers may also be formed. The number of doors provided at the front opening of the refrigerating chamber 11 may be one.
For example, the drawers 14A and 15A are not limited to two boxes and may include more. In that case, a given box corresponds to the 1st box of the present invention, and the box on the layer below it corresponds to the 2nd box of the present invention.
For example, the camera unit processor 220, the terminal processor 300, and the server processor 400 may be constituted by a single processor or may be constituted by a plurality of processors.
The units shown in FIGS. 6, 7, and 26 are examples, and their specific implementation is not particularly limited. That is, hardware corresponding to each unit need not be installed independently; the functions of the units may of course be realized by one processor executing a program. Some functions realized by software in the above embodiments may be realized by hardware, and some functions realized by hardware may be realized by software. The specific details of the other parts of the camera unit 2, the terminal device 3, and the food management server 4 may be changed arbitrarily without departing from the gist of the present invention.
For example, the processing steps of the operations shown in FIGS. 13, 14, 22, and 25 are divided according to the main processing contents in order to make the operations of the parts of the food management system 1000 easy to understand; the present invention is not limited by how the processing units are divided or named.
The above-described embodiments are embodiments for illustrating the technology of the present invention, and therefore, various modifications, substitutions, additions, omissions, and the like can be made within the scope of the claims and their equivalents.
Industrial applicability
As described above, the imaging device and the method for controlling the imaging device according to the present invention can be used to image articles that enter and leave a storage.
Description of the reference numerals
1 refrigerator (storage)
2 camera unit (imaging device)
3. Terminal device
4. Food management server
10. Main box body
10A front edge
10B upper surface
11. Refrigerating chamber
11A left door
11B right door
11C door
12. Ice making chamber (drawer type storage room)
12A, 13A, 14A, 15A, 16A drawers
13. Fresh-frozen chamber (drawer type storage chamber)
14. Freezing chamber (drawer type storage room)
15. Vegetable room (drawer type storage room)
32. Touch panel (display)
201. Main body
202. Photographing part
203. Refrigerating compartment camera
204. Drawer camera
205. Position regulating member
206. Bottom surface
207. Marking
212. Human sensor (sensor)
213. Video camera
223. Image pickup control unit
224. Video recording control unit
225. Food detection unit (article detection unit)
231. Video file
300. Terminal processor (computer)
301. Application execution unit (display control unit)
311. Food management application (program, application program)
1000. Food management system
BA dividing region (area)
FG, FG1 captured food image (food image)
GCA lower-layer box (2nd box)
GCG lower-layer box image (2nd box image)
JCA upper-layer box (1st box)
JCG upper-layer box image (1st box image)
MIC remark input icon (remark input image)
P user
RG refrigerator interior image (in-compartment image).

Claims (6)

1. A photographing apparatus, comprising:
a main body provided on an upper surface of a storage; and
a photographing part extending from the main body toward the front of the storage and including a camera that photographs downward from above the front of the storage,
wherein the main body includes a position regulating member that abuts against a front end edge of the storage and regulates a position of the photographing part in a front-rear direction of the storage.
2. The photographing apparatus according to claim 1, wherein:
the position regulating member is a position regulating piece that extends downward from a bottom surface of the main body in the installed state.
3. The photographing apparatus according to claim 2, wherein:
the storage is a refrigerator provided with a plurality of storage boxes, and
the position regulating member is formed to be shorter than the distance to a gasket provided between the refrigerator and a door of the refrigerator.
4. The photographing apparatus according to claim 2 or 3, wherein:
the position regulating member extends in a width direction of the main body.
5. The photographing apparatus according to claim 4, wherein:
the position regulating member has a mark for adjusting a position of the photographing apparatus in a width direction of the storage.
6. The photographing apparatus according to claim 5, wherein:
the position regulating member has the mark at a center in a width direction of the main body.
CN202180094495.7A 2021-02-25 2021-11-17 Image pickup apparatus Pending CN116888419A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2021-029014 2021-02-25
JP2021029014A JP2022130061A (en) 2021-02-25 2021-02-25 Imaging device
PCT/JP2021/042223 WO2022180952A1 (en) 2021-02-25 2021-11-17 Imaging device

Publications (1)

Publication Number Publication Date
CN116888419A true CN116888419A (en) 2023-10-13

Family

ID=83047950

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180094495.7A Pending CN116888419A (en) 2021-02-25 2021-11-17 Image pickup apparatus

Country Status (3)

Country Link
JP (2) JP2022130061A (en)
CN (1) CN116888419A (en)
WO (1) WO2022180952A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2024058086A (en) * 2022-10-14 2024-04-25 パナソニックIpマネジメント株式会社 Article management system and refrigerator

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002158897A (en) * 2000-11-17 2002-05-31 Sharp Corp Video camera
JP2003042626A (en) * 2001-07-26 2003-02-13 Toshiba Corp Refrigerator
JP2004183987A (en) * 2002-12-04 2004-07-02 Hitachi Home & Life Solutions Inc Refrigerator
JP6792013B2 (en) * 2013-03-12 2020-11-25 東芝ライフスタイル株式会社 refrigerator
JP7297563B2 (en) * 2019-07-03 2023-06-26 シャープ株式会社 refrigerator

Also Published As

Publication number Publication date
JP2024016290A (en) 2024-02-06
JP2022130061A (en) 2022-09-06
WO2022180952A1 (en) 2022-09-01

Similar Documents

Publication Publication Date Title
EP3001269B1 (en) Work management system and work management method
CN116057337A (en) Food detection device
CN116888419A (en) Image pickup apparatus
CN116888415A (en) Image capturing apparatus and control method for image capturing apparatus
CN116888416A (en) Program, terminal device, and control method for terminal device
CN116888417A (en) Program, terminal device, and control method for terminal device
CN109631483B (en) Refrigerator with a door
CN116057336A (en) Food management system
CN115997093A (en) Food management system and refrigerator
WO2023095536A1 (en) Detection device, control method for detection device, and program
WO2023095533A1 (en) Detection system, program, and storage/retrieval management method
TWI759875B (en) Object warehouse management system
CN104252830B (en) Partitioned backlight mode control method and device
WO2023095526A1 (en) Program, terminal device,and recipe display system
CN206302407U (en) A kind of Kato and terminal
JP2003267525A (en) Storage device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination