WO2016031246A1 - Nutrient quantity calculating device and refrigerator provided with same - Google Patents

Nutrient quantity calculating device and refrigerator provided with same

Info

Publication number
WO2016031246A1
Authority
WO
WIPO (PCT)
Prior art keywords
food
nutrient amount
nutrient
food material
unit
Prior art date
Application number
PCT/JP2015/004290
Other languages
French (fr)
Japanese (ja)
Inventor
平石 智一
豊嶋 昌志
浩史 佐藤
英蓮 安
綾子 齊藤
秀子 蓮池
一歩 宮沢
一生 永利
昌哉 弦巻
小川 誠
陽平 佐藤
雅則 久保田
Original Assignee
ハイアールアジア株式会社
Priority date
Filing date
Publication date
Application filed by ハイアールアジア株式会社
Priority to JP2015543170A (patent JP6577365B2)
Priority to CN201580045903.4A (patent CN107077709B)
Publication of WO2016031246A1

Links

Images

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/60: ICT specially adapted for therapies or health-improving plans relating to nutrition control, e.g. diets
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01G: WEIGHING
    • G01G19/00: Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups
    • G01G19/40: Weighing apparatus or methods adapted for special purposes, with provisions for indicating, recording, or computing price or other quantities dependent on the weight

Definitions

  • the present invention relates to a nutrient amount calculation device that calculates the amount of nutrients of ingredients used by a user for cooking, and a refrigerator equipped with the nutrient amount calculation device.
  • Calculating the amount of ingredients and food nutrients consumed by the user is important for health management and physical condition maintenance. Since the amount of nutrients in a dish is determined by the type and amount of ingredients used in cooking, it is possible to calculate the amount of nutrients in a dish based on information on ingredients. Specifically, the total nutrient amount of the food can be calculated by measuring the weight of the food and multiplying the weight by the amount of nutrient per unit amount of the food. However, it is complicated to perform this calculation work every time the user prepares a dish.
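  • As a brief illustration of the calculation described above (this formula and the example numbers are only a restatement, not part of the original text): writing the measured weight of ingredient i as w_i and its nutrient content per unit weight as n_i, the total nutrient amount of a dish is N = Σ_i w_i × n_i. For example, 150 g of an ingredient containing 37 kcal per 100 g contributes 150 × 37 / 100 ≈ 55.5 kcal.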
  • Patent Document 1 listed below describes a calorie calculation device that can automatically calculate the calories of food ingredients.
  • The calorie calculation device 1 described there includes a measurement unit 20, a weight detection unit 30, and a control unit 70. The measurement unit 20 measures the moisture contained in the analysis object S, and the weight detection unit 30 measures the weight of the analysis object S. The control unit 70 calculates the calories of the analysis object S using the measurement results of both units. This makes it possible to measure the calories of the analysis object S, which is, for example, a food.
  • However, the device of Patent Document 1 has room for improvement in terms of simplifying the calorie calculation procedure. Specifically, in order to calculate calories, the user must operate a separately provided operation unit to start the measurement unit. In addition, ordinary cooking uses a plurality of ingredients, yet such a calorie calculation device requires this operation to be performed for every ingredient.
  • Furthermore, in the device of Patent Document 1, the calories of the analysis object S are calculated based on its moisture ratio.
  • With an analysis method using infrared rays or the like, it is difficult to analyze the internal state of the object, and therefore difficult to identify its type. Cases in which the calories of the analysis object S cannot be analyzed appropriately are therefore anticipated.
  • Originally, total calories are calculated from the weights of proteins, carbohydrates, and other components contained in a foodstuff.
  • If the amounts of other nutrients, such as vitamins and inorganic substances (minerals), could also be estimated, convenience for the user could be improved further.
  • An object of the present invention is to provide a nutrient amount calculation device for estimating the amount of nutrients of ingredients used by a user for cooking with a simple operation, and a refrigerator including the nutrient amount calculation device.
  • The nutrient amount calculation device of the present invention includes food photographing means for obtaining food image data by photographing a foodstuff before cooking, food weighing means for obtaining food weight data by weighing the foodstuff, food type estimation means for estimating the type of the foodstuff based on the food image data, and nutrient amount calculation means for calculating the amount of nutrients contained in the foodstuff based on the food type and the food weight data.
  • The food photographing means photographs the foodstuff when the variation in the food weight data measured by the food weighing means becomes less than a certain value.
  • In the nutrient amount calculation device of the present invention, the food weighing means obtains the food weight data by weighing the foodstuff at regular intervals, and the food photographing means photographs the foodstuff when the standard deviation of the food weight data measured over the most recent plurality of measurements is less than a certain value.
  • In the nutrient amount calculation device of the present invention, the food photographing means photographs the foodstuff when the food weight data differs from that at the previous photographing, so that foodstuffs placed one after another on the food weighing means are photographed each time a foodstuff is added.
  • the nutrient amount calculation apparatus of the present invention is characterized in that the food photographing means photographs the food after a certain period of time after the weight of the food placed on the food measuring means is determined.
  • In the nutrient amount calculation device of the present invention, the food type estimation means calculates an image feature value from the food image data and selects, from a food list in which image feature values and food types are listed in association with each other, the foodstuff whose image feature value is closest, and the nutrient amount calculation means calculates the nutrient amount of the foodstuff by multiplying the nutrient amount per unit quantity of the selected foodstuff by the food weight data.
  • the nutrient amount calculation apparatus of the present invention is characterized in that the type of food and the image feature amount selected by the food type estimation means are added to the food list.
  • In the nutrient amount calculation device of the present invention, the food photographing means photographs the foodstuffs placed one after another on the food weighing means each time a foodstuff is added, so that a plurality of food image data is obtained, and the food type estimation means identifies the image portion of the newly added foodstuff by taking the difference between the latest food image data and the food image data photographed immediately before, and calculates the image feature value from that image portion.
  • the nutrient amount calculation apparatus of the present invention is characterized in that the food material type estimation means and the nutrient amount calculation means are realized as a function of a mobile terminal.
  • the refrigerator of the present invention is characterized by including the nutrient amount calculation device described above.
  • According to the nutrient amount calculation device of the present invention, the device includes food photographing means for obtaining food image data by photographing a foodstuff before cooking, food weighing means for obtaining food weight data by weighing the foodstuff, food type estimation means for estimating the type of the foodstuff based on the food image data, and nutrient amount calculation means for calculating the amount of nutrients contained in the foodstuff based on the food type and the food weight data.
  • The food photographing means photographs the foodstuff when the variation in the food weight data measured by the food weighing means becomes less than a certain value. By photographing the foodstuff after the fluctuation of the measured weight has fallen below a certain value, the foodstuff placed on the food weighing means can be photographed more clearly, and the accuracy of estimation using the food image data is improved. Furthermore, since the food photographing means photographs the foodstuff based on the output of the food weighing means, the foodstuff can be photographed without the user performing any special operation for photographing.
  • According to the nutrient amount calculation device of the present invention, the food weighing means obtains the food weight data by weighing the foodstuff at regular intervals, and the food photographing means photographs the foodstuff when the standard deviation of the food weight data measured over the most recent plurality of measurements is less than a certain value. Therefore, the foodstuff can be photographed in a more stable state using the food photographing means.
  • According to the nutrient amount calculation device of the present invention, the food photographing means photographs the foodstuff when the food weight data differs from that at the previous photographing, so that foodstuffs placed one after another on the food weighing means are photographed each time a foodstuff is added. Therefore, image data of the foodstuffs can be photographed in sequence without the user performing any special operation for photographing.
  • the nutrient amount calculation apparatus of the present invention is characterized in that the food photographing means photographs the food after a certain period of time after the weight of the food placed on the food measuring means is determined. Accordingly, it is possible to prevent a user's hand operating the food material from being erroneously reflected in the food material image data.
  • According to the nutrient amount calculation device of the present invention, the food type estimation means calculates an image feature value from the food image data and selects, from a food list in which image feature values and food types are listed in association with each other, the foodstuff whose image feature value is closest, and the nutrient amount calculation means calculates the nutrient amount of the foodstuff by multiplying the nutrient amount per unit quantity of the selected foodstuff by the food weight data. Accordingly, since the type of foodstuff is specified using an image feature value calculated from, for example, the color of the image, the type of foodstuff can be specified easily without the user inputting its name.
  • The nutrient amount calculation device of the present invention is characterized in that the type of foodstuff selected by the food type estimation means and its image feature value are added to the food list. Accordingly, since the number of entries in the food list available for subsequent searches increases, the accuracy of estimating the food type from image feature values thereafter can be improved.
  • According to the nutrient amount calculation device of the present invention, the food photographing means photographs the foodstuffs placed one after another on the food weighing means each time a foodstuff is added, so that a plurality of food image data is obtained, and the food type estimation means identifies the image portion of the newly added foodstuff by taking the difference between the latest food image data and the food image data photographed immediately before, and calculates the image feature value from that image portion. Accordingly, since the image feature value of a foodstuff is calculated using image data of only the portion in which that foodstuff appears, the image feature value is calculated more accurately and the accuracy of estimating the types of successively added foodstuffs is improved.
  • the nutrient amount calculation apparatus of the present invention is characterized in that the food material type estimation means and the nutrient amount calculation means are realized as a function of a mobile terminal. Therefore, by connecting to a telephone communication line via a portable terminal and using the information stored in the server, it is possible to improve the accuracy of estimating the kind of food.
  • The refrigerator of the present invention is characterized by including the nutrient amount calculation device described above. Therefore, by giving the refrigerator, in which foodstuffs are stored, the function of calculating the nutrient amount of a foodstuff, convenience for the user of the refrigerator can be improved.
  • FIG. (A) is a block diagram showing the structure of a nutrient amount calculation device, and
  • (B) is a block diagram showing the data structure used.
  • FIG. (A) is a perspective view showing the structure of the nutrient amount calculation device according to the second embodiment of the present invention, and
  • (B) is a perspective view showing a refrigerator.
  • FIG. 1 illustrates the configuration of the nutrient amount calculation apparatus 10.
  • The nutrient amount calculation device 10 includes a measuring instrument 12 that weighs and photographs the foodstuff 16 before cooking, and a portable terminal 24 that estimates the type of the foodstuff and calculates the nutrient amount based on information input from the measuring instrument 12.
  • the main function of the nutrient amount calculation device 10 is to calculate the amount of nutrients contained in the food material 16 placed on the measuring instrument 12, and to present the calculation result to the user. Therefore, the user can easily know the nutrient amount of the food 16 by placing the food 16 before cooking on the measuring instrument 12.
  • Nutrients here include, for example, calories, minerals such as salt, vitamins, proteins, carbohydrates, fats, and the like.
  • the nutrient amount calculation device 10 of this embodiment may include a server 28 in addition to the measuring device 12 and the portable terminal 24 described above.
  • the portable terminal 24 or the measuring instrument 12 is connected to the server 28 via a communication network such as the Internet.
  • the food image data obtained by photographing the food 16, the food ID indicating the type (food name) of the food 16, and the food weight data indicating the weight of the food 16 are transmitted from the portable terminal 24 to the server 28.
  • The server 28 accumulates and analyzes these data transmitted from the portable terminals 24 of a large number of users and feeds analysis information based on the results back to the portable terminal 24, which makes it possible to improve the accuracy of estimating the type of the foodstuff 16 from the food image data obtained by photographing it.
  • the measuring instrument 12 includes a weighing unit 14 (foodstuff weighing unit), an imaging unit 18 (foodstuff photographing unit) and an illumination unit 20 disposed above the weighing unit 14, and a control unit 22 connected to the weighing unit 14.
  • the weighing unit 14 is a so-called electronic balance, and transmits an electric signal indicating the weight of the food 16 placed on the upper surface thereof to the control unit 22.
  • a color different from the color of a general foodstuff is adopted as the color of the upper surface of the measuring unit 14.
  • Since the nutrient amount of the foodstuff 16 is calculated based on the food weight data measured by the weighing unit 14, it can be calculated more accurately than when it is calculated from the food image data alone.
  • the imaging unit 18 is composed of an imaging device such as a CCD, for example, and is disposed above the weighing unit 14.
  • the imaging unit 18 obtains food image data by photographing the weighing unit 14 on which the food 16 is placed from above.
  • the obtained food image data is transmitted to the control unit 22.
  • When the imaging unit 18 images the foodstuff 16, the relative positions of the imaging unit 18 and the weighing unit 14 are fixed. Accordingly, the foodstuff 16 placed on the upper surface of the weighing unit 14 can be photographed stably.
  • the timing at which the food image data is acquired by the imaging unit 18 is determined based on the output of the weighing unit 14, and such matters will be described later with reference to FIG. 2 and the like.
  • the illumination unit 20 includes, for example, an LED, and is disposed above the measuring unit 14 and in the vicinity of the imaging unit 18.
  • The illumination unit 20 has the function of emitting light toward the foodstuff 16 when the imaging unit 18 photographs it. Since the relative position of the illumination unit 20 with respect to the imaging unit 18 and the weighing unit 14 is fixed, the photographing conditions for the foodstuff 16 are kept uniform, which improves the accuracy when the type of the foodstuff 16 is estimated using the obtained food image data.
  • The control unit 22 has a predetermined control program installed in it; it receives the above-described food image data and food weight data and controls the operations of the imaging unit 18 and the illumination unit 20.
  • the control unit 22 also has a function of communicating with the mobile terminal 24 arranged in the vicinity of the measuring instrument 12.
  • the control unit 22 and the portable terminal 24 may be wired or wirelessly connected. For wireless connection, for example, Wi-Fi standard data communication can be employed.
  • the mobile terminal 24 is, for example, a smartphone owned by the user, and an application for controlling the measuring instrument 12 is installed.
  • the portable terminal 24 includes a display unit 26 that is a touch panel, for example, and can display the food image data captured by the imaging unit 18 on the display unit 26. Further, by operating the display unit 26, the type of the food material 16 can be specified as will be described later.
  • the portable terminal 24 stores programs that are nutrient amount calculation means and food material type estimation means in advance.
  • the user operates the portable terminal 24 separate from the measuring device 12 to estimate the nutrient amount of the food material 16, but the functions of both may be integrated. That is, an operation unit such as a touch panel may be provided in the measuring instrument 12, and the user may select the food 16 by operating the operation unit.
  • Steps S11 to S13 and steps S16 to S21 described below are performed by the measuring instrument 12 described above.
  • Step S14 to Step S15 and Step S22 to Step S36 are performed by the portable terminal 24.
  • Initial weight data, which is information on the weight measured by the weighing unit 14, is acquired, and the weighing unit 14 is photographed by the imaging unit 18 (step S13).
  • The photographed initial food image data is stored in the control unit 22 as an initial value.
  • When the weighing unit 14 is imaged by the imaging unit 18, the weighing unit 14 may be irradiated by the illumination unit 20 in order to capture a clear image of its upper surface. In that case, to keep the photographing conditions uniform, irradiation by the illumination unit 20 is also performed when the food image data of the foodstuff 16 is photographed in the later steps.
  • the measuring instrument 12 and the mobile terminal 24 are connected using a wireless connection based on the Wi-Fi standard (step S14). Then, the application acquires the initial food weight data and the initial food image data obtained in the above steps from the measuring instrument 12 (step S15).
  • Next, the foodstuff 16 to be cooked is placed on the upper surface of the weighing unit 14 of the measuring instrument 12 (step S16).
  • The food weight data measured by the weighing unit 14 is sequentially transmitted to the control unit 22, but immediately after the foodstuff 16 is placed on the weighing unit 14, the value of the food weight data is not yet stable. Therefore, in this embodiment, the process waits until the value of the food weight data settles to a constant value (step S17). Furthermore, in order to prevent the user's hand handling the foodstuff from being captured in the food image data and reducing the accuracy of estimating the foodstuff 16, photographing is delayed for a further predetermined time after the food weight data measured by the weighing unit 14 has stabilized in step S17 (step S18).
  • After that, the food weight data of the foodstuff 16 measured by the weighing unit 14 at that time is recorded in the control unit 22, and food image data is obtained by photographing the foodstuff 16 from above with the imaging unit 18 (steps S19 and S20). The obtained food weight data and food image data are then transmitted from the measuring instrument 12 to the portable terminal 24 (step S21).
  • the portable terminal 24 calculates the type of the food 16 and the amount of nutrients based on the transmitted food image data and food weight data.
  • First, the difference from the value of the food weight data received the previous time is taken (step S22). That is, when the first foodstuff 16 is weighed, the value for the state in which no foodstuff is placed, i.e. 0 g, is subtracted from the food weight data obtained when the first foodstuff 16 is placed.
  • When the second foodstuff 16 is weighed, the food weight data obtained when the first foodstuff 16 was placed is subtracted from the food weight data obtained when the second foodstuff 16 is placed.
  • FIG. 4 schematically shows a situation of image processing or the like when three food ingredients A, B, and C are sequentially weighed and photographed.
  • The weight X g first measured when foodstuff A is placed on the weighing unit 14 is used as-is as the food weight data of foodstuff A in the subsequent processing.
  • When foodstuff B is then added, a total of Y g is measured, but (Y - X) g is used as the food weight data of foodstuff B.
  • When foodstuff C is further added, a total of Z g is measured, but (Z - Y) g is used as the food weight data of foodstuff C.
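  • A minimal sketch of this bookkeeping, assuming the scale reports the cumulative totals X, Y, Z g described above (function and variable names are illustrative, not taken from the patent):

```python
# Sketch: recover per-ingredient weights from cumulative scale readings.
# readings[i] is the total weight on the scale after the (i+1)-th ingredient is placed.
def per_ingredient_weights(readings: list[float]) -> list[float]:
    weights = []
    previous = 0.0                        # 0 g before the first ingredient
    for total in readings:
        weights.append(total - previous)  # contribution of the newly added item only
        previous = total
    return weights

# Ingredients A, B and C placed one after another: X = 120 g, Y = 200 g, Z = 230 g.
print(per_ingredient_weights([120.0, 200.0, 230.0]))  # -> [120.0, 80.0, 30.0]
```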
  • the updated image portion is used as an image of the newly added food material 16 by taking the difference from the previously captured food image data (steps S23 and S24).
  • Specifically, food image data 30 in which foodstuff A is placed on the weighing unit 14 is acquired, and image data in which the image portion of foodstuff A is separated is generated from the difference between it and the food image data (not shown) in which no foodstuff is placed.
  • When foodstuff B is then placed and food image data 36 is acquired, food image data 38 in which the image portion of foodstuff B is separated is generated by taking the difference between food image data 30 and food image data 36.
  • When foodstuff C is further placed and food image data 42 is acquired, food image data 44 in which the image portion of foodstuff C is separated is generated by taking the difference between food image data 36 and food image data 42.
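  • The differencing step could be sketched as follows with NumPy; the fixed threshold and the background-blanking details are assumptions, since the patent does not specify how the changed region is segmented:

```python
import numpy as np

# Sketch: isolate the newly added ingredient by differencing two photos taken
# from the fixed camera above the scale. Pixels that changed between the
# previous and the latest image are treated as belonging to the new ingredient.
def new_ingredient_mask(previous_rgb: np.ndarray,
                        latest_rgb: np.ndarray,
                        threshold: int = 30) -> np.ndarray:
    diff = np.abs(latest_rgb.astype(np.int16) - previous_rgb.astype(np.int16))
    return diff.max(axis=-1) > threshold      # boolean mask of changed pixels

def extract_new_region(previous_rgb: np.ndarray,
                       latest_rgb: np.ndarray) -> np.ndarray:
    mask = new_ingredient_mask(previous_rgb, latest_rgb)
    region = latest_rgb.copy()
    region[~mask] = 0                         # blank out the unchanged background
    return region
```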
  • Next, an image feature value is calculated (step S25). Specifically, an image feature value that quantifies the features of the image portion generated as described above is calculated based on the color and roughness of that portion. Then, entries whose image feature values are close are searched for in the "food list selected in the past" and the pre-learning data (step S26). In the "food list selected in the past", the food ID indicating the type of foodstuff 16 selected in the past and its image feature value are listed in association with each other.
  • the pre-learning data is input in advance when the application of the mobile terminal 24 is shipped, and is a list in which the food ID and the image feature amount are associated with each other.
  • the “food list selected in the past” and the pre-learning data may be collectively referred to as a food list.
  • the image feature amount calculated in step S25 and the food material having the similar image feature amount are searched from the food material list (step S26).
  • In step S27, a plurality of candidate foodstuffs whose image feature values are close to that of the foodstuff being searched for are selected from the food list and displayed on the display unit 26 of the portable terminal 24.
  • At this time, the foodstuff names may be displayed, or images representing the foodstuffs may be displayed.
  • It would also be possible to automatically adopt the foodstuff with the closest image feature value in the food list as the searched-for foodstuff; however, by having the user make the final selection from a plurality of candidates, the accuracy of estimating the foodstuff 16 is improved.
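  • One possible realization of the feature calculation and candidate search described above is sketched below; the patent only states that color and roughness are quantified, so the specific features, the Euclidean distance, and the three-candidate cut-off are assumptions:

```python
import numpy as np

# Sketch: a simple image feature (mean colour plus a roughness proxy) and a
# nearest-neighbour lookup against the "food list selected in the past".
def image_feature(region_rgb: np.ndarray) -> np.ndarray:
    pixels = region_rgb.reshape(-1, 3).astype(np.float32)
    pixels = pixels[pixels.sum(axis=1) > 0]   # ignore blanked-out background pixels
    mean_colour = pixels.mean(axis=0)         # average R, G, B
    roughness = pixels.std(axis=0).mean()     # crude texture measure
    return np.append(mean_colour, roughness)

def candidate_foods(feature: np.ndarray,
                    food_list: dict[str, np.ndarray],
                    n_candidates: int = 3) -> list[str]:
    """Return the food IDs whose stored feature values are closest to `feature`."""
    distances = {food_id: float(np.linalg.norm(feature - stored))
                 for food_id, stored in food_list.items()}
    return sorted(distances, key=distances.get)[:n_candidates]
```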
  • Step S28 If the search target food is present in the food displayed on the display unit 26 (YES in step S28), the user selects the correct food by operating the display unit 26, which is a touch panel, etc. ( Step S29). At this time, as shown in the lower part of FIG. 4, the food image data 34, 40, 46 indicating the selected food may be displayed on the display unit 26.
  • the food material ID corresponding to the selected food material 16 is associated with the image feature amount of the food material 16 and added to the “food material list selected in the past” (step S30).
  • the current search result can be used in the next candidate search for ingredients, so that the accuracy of the candidate search can be improved. That is, the “food list selected in the past” can be learned according to the lifestyle of the user.
  • On the other hand, if the correct foodstuff is not among the candidates displayed in step S28 (NO in step S28), the user enters the name of the foodstuff 16 as text on the portable terminal 24, and foodstuffs whose names match the entered name are retrieved from the food master (step S31).
  • the food master is a list in which the food ID is associated with the nutrient amount of the food. Search results based on the input text are displayed on the display unit 26 of the portable terminal 24 (step S32).
  • The user selects the correct foodstuff from the displayed results (step S33), and the food ID indicating the selected foodstuff 16 and its image feature value are recorded in association with each other in the same manner as in step S30 described above (step S34).
  • the nutrient amount of the food 16 is calculated using the type and weight of the food 16 specified in the above step (step S35). Specifically, the amount of nutrient contained in the ingredient 16 is calculated by multiplying the amount of nutrient per unit quantity associated with the ingredient ID contained in the aforementioned ingredient master by the weight of the ingredient 16. As described above, the nutrient amount in this embodiment includes minerals such as calories and salt, vitamins, proteins, carbohydrates, fats and the like. Further, the accumulated nutrient amount is displayed on the display unit 26 of the portable terminal 24 (step S36). Thus, the user can know the amount of all nutrients contained in the food material 16 placed on the measuring instrument 12.
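  • A minimal sketch of the accumulation in steps S35 and S36, assuming a food master keyed by food ID with nutrient amounts tabulated per 100 g (the IDs and values shown are placeholders, not data from the patent):

```python
# Sketch: accumulate nutrients over the ingredients identified so far.
FOOD_MASTER = {  # hypothetical food master: food ID -> per-100 g nutrient amounts
    "F001": {"name": "onion",   "per_100g": {"energy_kcal": 37.0, "vitamin_c_mg": 8.0}},
    "F002": {"name": "chicken", "per_100g": {"energy_kcal": 200.0, "vitamin_c_mg": 3.0}},
}

def accumulate(selections: list[tuple[str, float]]) -> dict[str, float]:
    """selections: (food_id, weight_g) pairs confirmed by the user."""
    totals: dict[str, float] = {}
    for food_id, weight_g in selections:
        for nutrient, per_100g in FOOD_MASTER[food_id]["per_100g"].items():
            totals[nutrient] = totals.get(nutrient, 0.0) + per_100g * weight_g / 100.0
    return totals

print(accumulate([("F001", 50.0), ("F002", 120.0)]))
# -> {'energy_kcal': 258.5, 'vitamin_c_mg': 7.6}
```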
  • The processing from step S16 to step S36 described above is performed for each foodstuff 16 prepared by the user (step S37).
  • The processing from step S16 to step S21, in which the foodstuff 16 is photographed after being placed on the weighing unit 14, is described in detail below.
  • the control unit 22 of the measuring instrument 12 acquires food weight data indicating the weight of the food 16 from the weighing unit 14 (step S50).
  • The food weight data is acquired continuously at regular intervals, for example every 0.1 seconds.
  • By doing so, the food image data is acquired in a stable state, and the accuracy of estimating the food type using the food image data can be improved.
  • the food material weight data which is analog data, is converted into digital data (step S51).
  • Next, the average value w(t) of the five most recent food weight data is obtained (step S52). For example, at the fifth measurement, the average of the food weight data from the first to the fifth measurement is calculated. Further, the standard deviation σ(t) of the five most recent food weight data is calculated using those five values (step S53).
  • If the standard deviation σ(t) calculated in the above step is smaller than a predetermined value (for example, 0.5) (YES in step S54), it is judged that the values of the food weight data measured over the last five times have converged. That is, the control unit 22 can determine that the time variation of the food weight data is sufficiently small, the posture of the foodstuff 16 is stable, and the food image data can be captured clearly.
  • Next, in step S55, it is determined whether the time variation of the food weight data has been in an unstable state (for example, a change of more than 3 g) and whether the current food weight data differs from the food weight data at the previous photographing.
  • This makes it possible to prevent the imaging unit 18 from taking a picture when the value of the food weight data changes only slightly (for example, by 3 g or less) in a situation where no foodstuff 16 has been added, such as due to a drift phenomenon during long-term operation of the measuring instrument 12.
  • In other words, the foodstuff 16 is photographed only when a foodstuff 16 is newly placed on the weighing unit 14.
  • If both conditions are satisfied, the measured food weight data is finalized (step S57). The control unit 22 then obtains the food image data by photographing the foodstuff 16 with the imaging unit 18 after a predetermined time (for example, 0.5 seconds) has elapsed, in order to allow the user's hand handling the foodstuff 16 to be withdrawn from the field of view of the imaging unit 18. It is also determined that the weighing unit 14 has reached a stable state. On the other hand, if the food weight data has not been in an unstable state, or if it is the same as at the previous photographing (NO in step S55), the process returns to step S51 without finalizing the food weight data.
  • If the standard deviation is not smaller than the predetermined value (NO in step S54), it is checked whether the absolute value of the difference between the food weight data measured at the t-th time (for example, the fifth time) and the food weight data measured at the (t-1)-th time (for example, the fourth time) is equal to or greater than a predetermined value (for example, 3 grams) (step S56). If it is (YES in step S56), a new foodstuff 16 may have been placed on the weighing unit 14, so it is judged that the time variation of the food weight data has shifted to an unstable state, and the process returns to step S51. If it is less than the predetermined value (NO in step S56), it is judged that the time variation of the food weight data remains in a stable state, and the process likewise returns to step S51.
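  • The trigger logic of steps S50 to S57 could be sketched roughly as below; the 0.1 s sampling interval, the window of five readings, the σ < 0.5 and 3 g thresholds and the 0.5 s delay are the example values given above, while the callback functions and the simplified handling of the stable/unstable transition are assumptions:

```python
import statistics
import time

WINDOW = 5            # number of most recent readings considered
SIGMA_MAX = 0.5       # weight is considered settled below this standard deviation
STEP_MIN_G = 3.0      # a change smaller than this is treated as drift, not a new item
SETTLE_DELAY_S = 0.5  # wait for the user's hand to leave the camera's field of view

def watch_scale(read_weight, take_photo, interval_s=0.1):
    """Photograph each newly placed ingredient once its weight has settled."""
    readings: list[float] = []
    last_photo_weight = 0.0
    unstable = False                        # True while an item is being placed
    while True:
        readings.append(read_weight())      # sample the scale (step S50)
        readings = readings[-WINDOW:]
        if len(readings) == WINDOW and statistics.pstdev(readings) < SIGMA_MAX:
            current = readings[-1]          # settled reading (steps S54, S55)
            if unstable and abs(current - last_photo_weight) >= STEP_MIN_G:
                time.sleep(SETTLE_DELAY_S)  # let the hand retract (step S57)
                take_photo(current)
                last_photo_weight = current
                unstable = False
        else:
            unstable = True                 # weight is still changing (step S56)
        time.sleep(interval_s)
```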
  • As described above, since the imaging unit 18 photographs the foodstuff 16 based on the output of the weighing unit 14 when the foodstuff 16 is placed on it, the food image data can be acquired easily without the user performing any special operation for photographing.
  • Moreover, since the imaging unit 18 photographs the foodstuff 16 after the food weight data has stabilized, the foodstuff 16 can be photographed while it rests stationary on the upper surface of the weighing unit 14. Therefore, clear food image data can be obtained and the accuracy of estimating the type of the foodstuff 16 can be improved.
  • the food image data is acquired by the imaging unit 18 based on the output of the measuring unit 14, and the nutrient amount of the food 16 is calculated on the mobile terminal 24 side using the food image data.
  • The food image data in this embodiment is a still image. Without a function for determining when to capture a still image based on the output of the weighing unit 14, it would be necessary to analyze a moving image in order to calculate the nutrient amount of the foodstuff, and analyzing a moving image requires a far larger amount of information processing than a still image. In this embodiment, therefore, the amount of information processing on the portable terminal 24 side can be reduced remarkably, and power consumption can also be reduced.
  • the food image data is image data obtained by photographing the food 16 that is stationary on the upper surface of the measuring unit 14, it is the most accurate image data for estimating the type of the food 16. Therefore, even if it compares with the case where the amount of nutrients of a foodstuff is calculated by analyzing a moving picture, the accuracy which estimates the kind of foodstuff 16 is not inferior.
  • In other words, food image data well suited to the analysis of the foodstuff 16 is captured using the output of the weighing unit 14 as a trigger, so the nutrient amount of the foodstuff 16 can be calculated accurately while keeping the amount of data to be analyzed small.
  • In the above description, the nutrient amount is calculated while foodstuffs 16 are being added, but a foodstuff 16 may also be removed partway through.
  • In that case, the control unit 22 can detect that a foodstuff 16 has been removed because the food weight data measured by the weighing unit 14 decreases.
  • the nutrient amount calculation device 10 described above may be incorporated as a function of the refrigerator. By adding a function of calculating the nutrient amount of the food 16 to the refrigerator in which a large number of food 16 is stored, the added value of the refrigerator can be improved.
  • This embodiment is particularly effective when the server 28 calculates the nutrient amount of the food material 16.
  • the food image data is a still image and has a much smaller amount of information than a moving image.
  • In addition, the food image data is captured using the output of the weighing unit 14 as a trigger and is therefore well suited to the analysis. Accordingly, the nutrient amount of the foodstuff 16 can be calculated with high accuracy while reducing the amount of information transmitted to the server 28 over the communication line.
  • In the second embodiment described below, the nutrient amount calculation device 110 can calculate the amount of nutrients simply by photographing and weighing the cooked dish.
  • FIG. 5A is a block diagram illustrating a schematic configuration of the nutrient amount calculation apparatus 110
  • FIG. 5B is a block diagram illustrating nutrient amount calculation data 124 that is referred to in order to calculate the nutrient amount.
  • The nutrient amount calculation device 110 of this embodiment includes food photographing means 112, food weighing means 114, dish photographing means 116, dish weighing means 118, nutrient amount calculation means 120, and storage means 122.
  • The general function of the nutrient amount calculation device 110 is to calculate, in a simple manner, the nutrient amount of a cooked dish from the weight data and image data of the foodstuffs 140 used for cooking or of the dish itself.
  • the food material photographing means 112 is a means for photographing the food material 140 to be cooked in color. Specifically, the food material photographing means 112 is constituted by a photographing element such as a CCD. The food imaging unit 112 generates the food image data 126 by shooting the food 140, and the food image data 126 is transmitted to the nutrient amount calculation unit 120. Here, the food photographing unit 112 may photograph one of the ingredients 140 used for cooking one by one, or may photograph a plurality of the same kind of materials.
  • the food material measuring means 114 is a means for measuring the weight of the food material 140 to be cooked.
  • the food material weighing unit 114 measures the food material 140 to obtain food material weight data 128 indicating the weight of the food material 140, and the food material weight data 128 is transmitted to the nutrient amount calculating unit 120.
  • the food measuring unit 114 may measure the foods 140 one by one, or may measure a plurality of the same kind of foods 140 simultaneously.
  • the dish photographing unit 116 is a unit that photographs a dish made by cooking the above-described food material 140.
  • the dish image data 130 obtained by photographing the dish by the dish photographing unit 116 is transmitted to the nutrient amount calculating unit 120.
  • the dish weighing means 118 is a means for weighing cooked dishes.
  • the dish weight data 132 obtained by the dish weighing unit 118 weighing the dish is transmitted to the nutrient amount calculating unit 120.
  • The food photographing means 112 and the dish photographing means 116 described above may be provided separately, or a single photographing means may serve as both. Likewise, the food weighing means 114 and the dish weighing means 118 may be provided separately, or a single weighing means may serve as both.
  • the nutrient amount calculation means 120 calculates the amount of nutrients of the cooked food from each data transmitted from each means described above.
  • a CPU is employed as the nutrient amount calculation means 120.
  • the nutrient amount calculation means 120 may also calculate the amount of nutrients such as salt as will be described later in addition to the amount of nutrients in the dish.
  • Specifically, the type of each foodstuff 140 is estimated from the food image data 126, and the nutrient amount per unit quantity of the estimated foodstuff 140 is multiplied by the food weight data 128 to calculate the nutrient amount of each foodstuff 140. The nutrient amount of the dish to be cooked is then calculated by adding up the nutrient amounts of all the foodstuffs 140 used for cooking.
  • Alternatively, the type of dish is estimated from the dish image data 130, and the nutrient amount of the dish is calculated by multiplying the nutrient amount per unit quantity of the estimated dish by the dish weight data 132.
  • the storage means 122 is means for storing each image data, each weight data, etc. obtained by each means described above. Specifically, a hard disk or a semiconductor storage device is employed as the storage unit 122.
  • the nutrient amount calculation apparatus 110 does not necessarily include the storage unit 122, and a server connected via a network or the like may be used as the storage unit, and the above-described data may be stored in the server.
  • the nutrient amount calculation data 124 includes ingredient image data 126, ingredient weight data 128, dish image data 130, dish weight data 132, ingredient database 125, dish database 127, ingredient nutrient quantity database 129, and dish nutrient quantity database 131. .
  • the food image data 126 is image data of a still image obtained by photographing with the above-described food photographing means 112.
  • the food weight data 128 is data indicating the weight of the food 140 obtained by weighing with the food weighing means 114.
  • the dish image data 130 is image data of a still image obtained by photographing a dish made by cooking the ingredients 140 with the dish photographing unit 116.
  • the dish weight data 132 is data indicating the weight of the dish measured by weighing the prepared dish with the dish weighing unit 118.
  • The food database 125 is a database composed of data associating features extracted from the food image data 126 with the types of the foodstuffs 140. For example, it stores the color and surface roughness extracted from the food image data 126 in association with the type of the foodstuff 140.
  • the dish database is a database that associates the dish image data 130 with the kind of dish, and includes, for example, data that associates the color of the surface of the dish with the kind of dish.
  • the ingredient nutrient amount database 129 is a database that associates the ingredient 140 with the nutrient amount per unit quantity of the ingredient 140.
  • the cooking nutrient amount database 131 is a database that associates a dish with the nutrient amount per unit amount of the dish. As these data, data publicly disclosed by public institutions may be used, or data accumulated and revised by the user using the nutrient amount calculation device 110 of this embodiment may be used. Both data may be used in combination.
  • the ingredient database 125 and the dish database 127 may be integrated into one database.
  • the food nutrient amount database 129 and the cooking nutrient amount database 131 may be integrated into a single database.
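  • A rough sketch of how the nutrient amount calculation data 124 could be organized in memory; the field names and types are assumptions introduced for illustration, not taken from the patent:

```python
from dataclasses import dataclass, field

# Sketch of the data referred to as "nutrient amount calculation data 124".
@dataclass
class NutrientCalculationData:
    ingredient_images: dict[str, bytes] = field(default_factory=dict)      # 126
    ingredient_weights_g: dict[str, float] = field(default_factory=dict)   # 128
    dish_image: bytes | None = None                                        # 130
    dish_weight_g: float | None = None                                     # 132
    # 125: image features (colour, roughness, ...) keyed by ingredient type
    ingredient_database: dict[str, tuple[float, ...]] = field(default_factory=dict)
    # 127: image features keyed by dish type
    dish_database: dict[str, tuple[float, ...]] = field(default_factory=dict)
    # 129: ingredient type -> nutrients per unit quantity
    ingredient_nutrients: dict[str, dict[str, float]] = field(default_factory=dict)
    # 131: dish type -> nutrients per unit quantity
    dish_nutrients: dict[str, dict[str, float]] = field(default_factory=dict)
```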
  • FIG. 6A is a perspective view showing a specific configuration of the nutrient amount calculation apparatus 110
  • FIG. 6B is a perspective view showing a refrigerator 142 in which the nutrient quantity calculation apparatus 110 is incorporated.
  • The nutrient amount calculation device 110 includes a pedestal 134 on which the foodstuff 140 is placed, a movable support 136 whose lower end is attached to the pedestal 134, and an imaging unit 138 provided on the support 136.
  • the pedestal 134 is a plate-like member having a flat surface on which the food material 140 is placed, and has a built-in measuring module for measuring the food material 140. Further, the nutrient amount calculation means 120 that receives various data from the measurement module and the imaging unit 138 and calculates the nutrient amount may be incorporated in the base 134.
  • the support portion 136 is a rod-like member disposed near the end of the pedestal 134, and the lower end thereof is rotatably connected to the pedestal 134. Further, since a concave region corresponding to the shape of the support portion 136 is provided on the upper surface of the pedestal 134, the tilted support portion 136 is accommodated in the concave region.
  • an imaging part 138 made of, for example, a CCD is provided near the upper end of the support part 136.
  • the imaging unit 138 is installed at a position where the food 140 placed on the upper surface of the pedestal 134 is photographed in a state where the support unit 136 is raised.
  • the food image data 126 obtained by photographing by the imaging unit 138 is transmitted to a processing unit built in the pedestal 134.
  • The nutrient amount calculation data 124 described with reference to FIG. 5B may be recorded on a recording medium such as a hard disk built into the nutrient amount calculation device 110, or on a recording device arranged externally. Alternatively, part of the nutrient amount calculation data 124 may be recorded in the built-in recording device and another part in an external recording device. Here, all or part of the nutrient amount calculation data 124 is recorded in the server 156.
  • the nutrient amount calculation device 110 and the server 156 are connected via a communication network 158 such as the Internet network. A method of using the nutrient amount calculation apparatus 110 described above will be described later with reference to FIG.
  • FIG. 6B illustrates a refrigerator 142 as one application example of the nutrient amount calculation device 110 described above.
  • the refrigerator 142 includes a plurality of storages such as a refrigerator and a freezer, and the front opening of each storage is closed by doors 144, 146, 148, and 150 so as to be opened and closed.
  • the door 144 rotates and opens in the left-right direction with either left or right end as a fulcrum, and the doors 146, 148, 150 open and close in the front-rear direction.
  • the small door 152 is a door in which a part of the door 146 is rotatable in the front-rear direction, and rotates and opens with the lower end as a fulcrum.
  • the upper surface of the small door 152 in the opened state functions as the pedestal 134 shown in FIG.
  • An imaging unit 154 that captures the food 140 is disposed near the opening of the door 146.
  • The small door 152 is opened, and the foodstuff 140 or the dish is placed on the upper surface of the opened small door 152. The foodstuff 140 or the dish is then weighed by a weighing module built into the small door 152 and photographed by the imaging unit 154. In this way, the foodstuff 140 or the dish is weighed and photographed.
  • Since the foodstuffs 140 used for cooking are generally stored in the refrigerator, a foodstuff 140 taken out of storage can be weighed and photographed immediately, which further simplifies the measurement and photographing.
  • FIG. 7 shows a nutrient amount calculation method at the first stage of cooking a specific dish
  • FIG. 8 shows a nutrient amount calculation method for the second and subsequent times of cooking the specific dish.
  • each data described below is appropriately stored in the nutrient amount calculation device 110, the server 156, and the like shown in FIG.
  • the nutrient amount calculation device 110 shown in FIG. 6A is activated (step S111). Specifically, the support 136 is raised with respect to the pedestal 134 and the start switch is turned on.
  • FIG. 6A illustrates a case where the fish before cooking is employed as the food material 140.
  • only one food 140 is placed on the upper surface of the pedestal 134, but a plurality of the same kind of foods 140 may be placed on the upper surface of the pedestal 134. Thereby, it becomes possible to measure and photograph the food 140 easily.
  • the food material 140 placed on the upper surface of the pedestal 134 is photographed by the imaging unit 138 (step S113). Specifically, the food 140 is imaged by the imaging unit 138 in a state where light is irradiated to the food 140 by a light emitting unit (not shown) as necessary. As a result, food image data 126 obtained by photographing the food 140 is obtained.
  • the acquired food image data 126 is stored in a storage means such as a hard disk.
  • the food material 140 to be photographed may be an unprocessed product or a processed product. For example, if the food material 140 is a banana, the state before the skin is peeled off or the state after the skin is peeled off may be used.
  • If the foodstuff is photographed in the state before the skin is peeled, the foodstuff 140 is estimated from that image and the nutrient amount is calculated excluding the portion corresponding to the skin.
  • If the foodstuff is photographed after the skin has been peeled, the foodstuff 140 is estimated using the food image data 126 and the nutrient amount is calculated on the assumption that the whole of it is used as the foodstuff 140.
  • the type of food is estimated from the food image data 126 (step S114).
  • Various methods are conceivable as a method for estimating the food material 140 from the food image data 126.
  • a method for estimating the food material 140 by focusing on the color and roughness of the surface of the food material 140 will be described.
  • a portion where foodstuff 140 is photographed is extracted from the above-described food image data 126, and data relating to the color and surface roughness of this portion is extracted.
  • In the food database 125, the color, surface roughness, and the like of the surface of each foodstuff are tabulated for each type of foodstuff. The color and surface roughness of the foodstuff 140 extracted from the food image data 126 are therefore compared with the color and surface roughness recorded for each foodstuff in the food database 125, and the foodstuff whose values are closest is taken as the "estimated foodstuff".
  • step S115 the user determines whether or not the food material estimated in the previous step is correct. Specifically, the estimated food image and name are displayed on a display device or the like associated with the nutrient amount calculation device 110. And if a user judges that the estimated foodstuff is correct, it will transfer to the next step by performing the operation to that effect (YES of step S115). On the other hand, if the displayed food is not correct (NO in step S115), other foods with similar colors and roughness are displayed (presented) to the user (step S116). As a result, if the newly displayed food is correct, the process proceeds to the next step (YES in step S117), and if it is not correct, another food is estimated and displayed (NO in step S117).
  • steps S115 and S117 for making this determination may be performed by a user operating a switch or a touch panel provided in the nutrient amount calculation apparatus 110 itself.
  • the estimated food image or name may be displayed on a portable information terminal such as a smartphone in which a specific application is installed, and the user may operate it.
  • When the type of the foodstuff 140 is specified in step S115 or step S117, the combination of the food image data 126 and the foodstuff 140 in these steps is stored in association. That is, the food database 125 shown in FIG. 5B is revised. From the next time, this combination is used in step S114, which improves the accuracy of the estimation.
  • The nutrient amount per unit quantity is determined by the type of the foodstuff 140, so by specifying the type of the foodstuff 140 placed on the nutrient amount calculation device 110 through the above steps, it becomes possible to calculate its nutrient amount.
  • the type data specifying the type of the food material 140 is stored and used in a later step.
  • In steps S115 to S117, if the estimated foodstuff 140 is not correct, another foodstuff 140 is presented; however, if the estimation is wrong more than a predetermined number of times (for example, five times or more), the user may input the foodstuff 140 manually. This simplifies the step in which the user selects the foodstuff 140.
  • step S118 the food material 140 placed on the upper surface of the pedestal 134 of the nutrient amount calculation device 110 is weighed (step S118). Specifically, the weight of the food 140 is measured by a measuring module built in the pedestal 134. The food weight data 128 obtained by this measurement is stored in a storage device provided in the nutrient amount calculation device 110.
  • Next, calculation means such as a CPU provided in the nutrient amount calculation device 110 uses the above-described estimation result of the type of foodstuff 140 and the food weight data 128 to calculate the total nutrient amount of the dish to be cooked (the first estimated dish nutrient amount) (step S119).
  • For each foodstuff, the nutrient amount per unit quantity is recorded as data in the food nutrient amount database 129 (FIG. 5B). Therefore, for each foodstuff 140, the food weight data 128 is multiplied by the nutrient amount per unit quantity, and these products are added up to calculate the total nutrient amount of the dish to be cooked (the first estimated dish nutrient amount).
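  • Building on the per-ingredient accumulation sketched earlier, the first estimated dish nutrient amount can also be converted into a per-unit record of the finished dish for reuse in later sessions; normalizing per 100 g of the weighed dish is an assumption consistent with steps S119, S121 and S160, not something stated explicitly in the text:

```python
# Sketch: after the first cooking session, register the dish's nutrient content
# per unit weight so later sessions can skip ingredient-level measurement.
# `ingredient_totals` is the accumulated result of step S119 (see earlier sketch).
def dish_nutrients_per_100g(ingredient_totals: dict[str, float],
                            dish_weight_g: float) -> dict[str, float]:
    return {nutrient: amount / dish_weight_g * 100.0
            for nutrient, amount in ingredient_totals.items()}

# Example: a curry totalling 900 kcal that weighs 1500 g -> 60 kcal per 100 g.
print(dish_nutrients_per_100g({"energy_kcal": 900.0}, 1500.0))
```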
  • step S112 to step S119 described above are performed for each food 140.
  • operations from step S112 to step S117 are performed for each material such as carrot, onion, meat, and potato.
  • In this way, the type of each foodstuff 140 is specified, and the weighing and the accumulation of nutrient amounts are also performed for each.
  • the accumulated food 140 and nutrient amount are notified to the user, for example, by being displayed on a display provided in the nutrient amount calculation device 110.
  • the calculated ingredients 140 and nutrient amounts may be accumulated for each nutrient amount, stored in association with the dish, and used as a nutrient amount database in a later step.
  • a dish is prepared from the above-described ingredients 140.
  • the curry is cooked by frying or boiling the above-mentioned ingredients such as carrots.
  • Next, the dish made by cooking is placed on the pedestal 134 shown in FIG. 6A and weighed (step S121).
  • When the dish is placed on the pedestal 134 together with a container such as the pot used for cooking, the weight of the container, stored in advance, is subtracted from the total weight, so that only the dish can be weighed.
  • dish weight data 132 of the cooked dish is obtained.
  • the dish image data 130 is obtained by photographing the dish placed on the pedestal 134 using the imaging unit 138.
  • a step of selecting a specific cooking method may be performed between step S120 and step S121 described above.
  • the user inputs a cooking method such as boiling, frying, steaming, frying, etc. to the nutrient amount calculation device 110 via an input means such as a touch panel.
  • the amount of nutrients in the dish differs depending on the cooking method. For example, when fried, the oil used for cooking is added to the dish, so the amount of nutrients in the cooked dish is higher than when steamed.
  • In such a case, the accuracy of the calculated nutrient amount can be improved by recalculating the nutrient amount of the dish in consideration of the amount of oil used.
  • the dish image data 130 is obtained by photographing the dish placed on the upper surface of the pedestal 134 with the imaging unit 138. Here, the top surface of the dish is taken.
  • step S123 the cooked food is estimated based on the dish image data 130 obtained in the previous step.
  • This estimation method may be the same as step S113 described above.
  • Specifically, the portion in which the dish appears is extracted from the above-described dish image data 130, and data relating to the color and surface roughness of this portion is extracted.
  • In the dish database 127 shown in FIG. 5B, the color, surface roughness, and the like of the top surface of each dish are tabulated for each type of dish. The color and surface roughness of the dish extracted from the dish image data 130 are therefore compared with those recorded in the dish database 127, and the dish whose values are closest is taken as the "estimated dish".
  • In step S124, the user determines whether the dish estimated in the previous step is correct. Specifically, the image and name of the estimated dish are displayed on a display device or the like attached to the nutrient amount calculation device 110. When the user determines that the estimated dish is correct, the calculation of the nutrient amount is completed by performing an operation to that effect (step S127). Note that by specifying and weighing the type of cooked dish, the type of dish (for example, curry) estimated from the dish image data 130 is associated with its nutrient amount per unit quantity. Data indicating this association is registered in the dish nutrient amount database 131 shown in FIG. 5B and is used for the next cooking.
  • step S124 if the displayed dish is not correct (NO in step S124), other dishes having similar colors and roughness are displayed (presented) to the user (step S125). As a result, if the estimated dish is correct, the process proceeds to the next step (YES in step S126), and if it is not correct, another dish is estimated and displayed (NO in step S126).
  • In steps S124 to S126 described above, if the estimated dish is not correct, other dishes are presented; if the estimation is wrong more than a predetermined number of times (for example, five times or more), the user may input the type of dish manually. This simplifies the step in which the user makes a selection.
  • the combination of the dish image data 130 and the dish in these steps is assumed to be correct and stored as the dish database 127. Specifically, for each dish, the type of dish and the nutrient amount per unit amount are stored in association with each other. Then, from the next dish, this combination is used in step S123, so that the accuracy of the estimation is improved.
  • the type of the food 140 can be specified by the above-described image analysis without the user inputting the type of the food 140. Convenience has been improved. Similarly, since the type of dish is also specified by the method shown in steps S122 to S125, the input operation is not necessary and convenience is improved.
  • the cooking method shown here is the same as the method described with reference to FIG. 7, except that the food 140 is not photographed and weighed before cooking.
  • step S151 cooking is performed using the ingredients 140 (step S152).
  • the types and ratios of the ingredients 140 used for cooking are similar, and the imaging and measurement of the ingredients 140 can be omitted for simplicity. Even when the same dish is cooked, if the type and ratio of the ingredients 140 to be used are different, the ingredients 140 may be photographed and measured.
  • the user may input the type of cooking (sauté, boil), etc., to the nutrient amount calculation device 110.
  • This makes it possible for the nutrient amount calculation device 110 to calculate the nutrient amount more accurately by taking the seasoning used for cooking into account.
  • Next, the dish is placed on the upper surface of the pedestal 134 of the nutrient amount calculation device 110 shown in FIG. 6(A), and its weight is measured to obtain the dish weight data 132 (step S153). The dish image data 130 is then obtained by photographing the dish from above with the imaging unit 138 (step S154).
  • The type of cooked dish is then estimated based on the dish image data 130, and if this estimation is incorrect, it is corrected (steps S155, S156, S157, S158).
  • The specific method of each step is the same as steps S123, S124, S125, and S126 described with reference to FIG.
  • The nutrient amount of the cooked dish is calculated based on the data regarding the type of dish and the dish weight data 132 obtained in the above steps (step S159). In this step, a nutrient amount database is prepared in which each dish is associated with its nutrient amount per unit amount, and the total nutrient amount is calculated by multiplying the weight of the weighed dish by the nutrient amount per unit amount of the corresponding type of dish.
  • For example, the nutrient amount per unit amount of the curry cooked by the target user is recorded in advance by the nutrient amount calculation method shown in FIG.
  • This information is recorded for each type of dish in the dish nutrient amount database 131 shown in FIG. Therefore, by multiplying the nutrient amount per unit amount of the target dish recorded in the dish nutrient amount database 131 by the dish weight data 132 weighed in step S153 described above, the total nutrient amount (the second estimated cooking nutrient amount) is calculated (step S160).
  • In other words, the nutrient amount per unit amount of the dish is estimated using the information on the ingredients 140 photographed during the first cooking shown in FIG. It is therefore possible to calculate the nutrient amount simply by photographing and weighing the dish after cooking.
  • As in step S118 shown in FIG. 7, the nutrient amount may be calculated in consideration of the seasoning used and the loss rate of the ingredients. This makes it possible to calculate the nutrient amount more accurately.
  • The above-described embodiments can be implemented in combination with each other.
  • For example, the weighing method and the photographing method for the food material 16 described in the first embodiment can be applied to the second embodiment.
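As a rough illustration of the cooking-method adjustment mentioned in the list above (adding the cooking oil to the nutrients already computed from the raw ingredients), the following sketch assumes made-up per-method oil amounts and nutrient figures; none of these values come from the patent.

    # Illustrative sketch: adjust a dish's nutrient totals for the cooking method.
    # The oil amounts and nutrient figures below are assumed example values.
    OIL_PER_METHOD_G = {"steam": 0.0, "boil": 0.0, "stir-fry": 10.0, "deep-fry": 25.0}
    OIL_NUTRIENTS_PER_G = {"calories_kcal": 9.0, "fat_g": 1.0}  # rough values for cooking oil

    def adjust_for_cooking_method(nutrients, method):
        """Return nutrient totals with the oil implied by the cooking method added."""
        oil_g = OIL_PER_METHOD_G.get(method, 0.0)
        adjusted = dict(nutrients)
        for name, per_gram in OIL_NUTRIENTS_PER_G.items():
            adjusted[name] = adjusted.get(name, 0.0) + per_gram * oil_g
        return adjusted

    raw_totals = {"calories_kcal": 320.0, "fat_g": 4.0, "protein_g": 18.0}
    print(adjust_for_cooking_method(raw_totals, "deep-fry"))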
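The "estimated dish" lookup against the dish database 127 can likewise be sketched as a nearest-match search over the recorded color and surface-roughness values. The database contents, feature scaling, and distance measure below are illustrative assumptions, not details taken from the patent.

    import math

    # Hypothetical dish database 127: dish name -> (color value, surface roughness).
    DISH_DATABASE_127 = {
        "curry":        (25.0, 0.40),
        "beef stew":    (18.0, 0.35),
        "fried rice":   (45.0, 0.60),
        "potato salad": (60.0, 0.55),
    }

    def estimate_dish(color, roughness):
        """Return the dish whose recorded color/roughness is closest to the measured values."""
        def distance(item):
            db_color, db_rough = item[1]
            # Roughness is scaled so both features contribute comparably (arbitrary choice).
            return math.hypot(color - db_color, (roughness - db_rough) * 100.0)
        return min(DISH_DATABASE_127.items(), key=distance)[0]

    print(estimate_dish(color=23.0, roughness=0.42))  # closest entry: "curry"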

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Nutrition Science (AREA)
  • Epidemiology (AREA)
  • Mathematical Physics (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • Cold Air Circulating Systems And Constructional Details In Refrigerators (AREA)
  • General Preparation And Processing Of Foods (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention provides a nutrient quantity calculating device which uses a simple operation to calculate the quantity of a nutrient in a foodstuff, and a refrigerator equipped with the same. The nutrient quantity calculating device 10 is provided with a measuring instrument 12 which weighs and captures an image of a foodstuff 16, and a mobile terminal 24 which, for example, estimates the type of the foodstuff on the basis of information input from the measuring instrument 12. The measuring instrument 12 is primarily provided with: a weighing unit 14; an image-capturing unit 18 and an illuminating unit 20 disposed above the weighing unit 14; and a control unit 22 connected to the weighing unit 14. By means of such a configuration, the nutrient quantity calculating device 10 is capable of calculating the quantity of a nutrient contained in the foodstuff 16 placed on the measuring instrument 12, and presenting the calculation result to a user.

Description

栄養素量算出装置およびそれを備えた冷蔵庫Nutrient amount calculation device and refrigerator equipped with the same
 本発明は、使用者が料理に使用する食材の栄養素の量を算出する栄養素量算出装置およびそれを備えた冷蔵庫に関する。 The present invention relates to a nutrient amount calculation device that calculates the amount of nutrients of ingredients used by a user for cooking, and a refrigerator equipped with the nutrient amount calculation device.
 使用者が摂取する食材や料理の栄養素量を算出することは、健康管理や体調維持のために重要である。料理の栄養素量は、料理に用いられる食材の種類および量により決定されることから、食材の情報を基に料理の栄養素量を算出することは可能である。具体的には、食材の重量を計量し、その重量に食材の単位量当たりの栄養素量を乗算することで、その食材の総栄養素量が算出できる。しかしながら、使用者が料理を作る度にこの算出作業を行うことは煩雑であった。 Calculating the amount of ingredients and food nutrients consumed by the user is important for health management and physical condition maintenance. Since the amount of nutrients in a dish is determined by the type and amount of ingredients used in cooking, it is possible to calculate the amount of nutrients in a dish based on information on ingredients. Specifically, the total nutrient amount of the food can be calculated by measuring the weight of the food and multiplying the weight by the amount of nutrient per unit amount of the food. However, it is complicated to perform this calculation work every time the user prepares a dish.
 下記特許文献1に食材のカロリーを自動的に算出できるカロリー算出装置が記載されている。具体的には、図1および〔0011〕-〔0021〕等を参照して、ここでのカロリー算出装置1は、測定部20、重量検出部30、および制御部70を有して構成されている。また、測定部20は分析対象物Sに含まれる水分を測定し、重量検出部30は、分析対象物Sの重量を測定している。制御部70は、測定部の測定結果および重量検出部の測定結果を用いて分析対象物Sのカロリーを算出している。これにより、例えば食品である分析対象物Sのカロリーを計測できる効果が得られている。 Patent Document 1 listed below describes a calorie calculation device that can automatically calculate the calories of food ingredients. Specifically, referring to FIG. 1 and [0011]-[0021] and the like, the calorie calculation device 1 here includes a measurement unit 20, a weight detection unit 30, and a control unit 70. Yes. Further, the measurement unit 20 measures moisture contained in the analysis object S, and the weight detection unit 30 measures the weight of the analysis object S. The control unit 70 calculates the calorie of the analysis object S using the measurement result of the measurement unit and the measurement result of the weight detection unit. Thereby, the effect which can measure the calorie of analysis subject S which is food, for example is acquired.
JP 2014-126559 A
However, the invention described in Patent Document 1 leaves room for improvement in terms of simplifying the calorie calculation procedure. Specifically, in order to operate the measurement unit and calculate calories, the user must operate a separately provided operation unit. In addition, although normal cooking uses a plurality of ingredients, this calorie calculation device requires a separate operation to calculate the calories of each ingredient.
Furthermore, the calorie calculation device described in Patent Document 1 calculates the calories of the analysis object S based on its moisture ratio. However, while an analysis method using infrared rays or the like can analyze the surface state of the analysis object, it is difficult to analyze its internal state, and therefore difficult to identify its type. Consequently, cases are expected in which the calories of the analysis object S cannot be analyzed appropriately.
Moreover, in Patent Document 1, the total calories are calculated from the weights of the proteins, carbohydrates, and the like contained in the food. If, in addition to these, the amounts of nutrients such as vitamins and minerals could be estimated, convenience for the user would be further improved.
An object of the present invention is to provide a nutrient amount calculation device that estimates, with a simple operation, the amount of nutrients in the ingredients a user cooks with, and a refrigerator equipped with such a device.
The nutrient amount calculation device of the present invention comprises: food photographing means for photographing food before cooking to obtain food image data; food weighing means for weighing the food to obtain food weight data; food type estimation means for estimating the type of the food based on the food image data; and nutrient amount calculation means for calculating the amount of nutrients contained in the food based on the type of the food and the food weight data. The food photographing means photographs the food when the variation in the food weight data measured by the food weighing means falls below a certain level.
Further, in the nutrient amount calculation device of the present invention, the food weighing means acquires the food weight data by weighing the food at regular intervals, and the food photographing means photographs the food when the standard deviation of the food weight data over the most recent plurality of measurements is below a certain value.
Further, in the nutrient amount calculation device of the present invention, the food photographing means photographs the food when the food weight data differs from that at the previous photographing, so that the foods sequentially placed on the food weighing means are photographed each time a food is placed.
Further, in the nutrient amount calculation device of the present invention, the food photographing means photographs the food a certain period after the weight of the food placed on the food weighing means has been determined.
Further, in the nutrient amount calculation device of the present invention, the food type estimation means calculates an image feature amount from the food image data and selects, from a food list in which image feature amounts and food types are associated, the food whose image feature amount is closest, and the nutrient amount calculation means calculates the nutrient amount of the food by multiplying the nutrient amount per unit amount of the selected food by the food weight data.
Further, in the nutrient amount calculation device of the present invention, the type of the food and the image feature amount selected by the food type estimation means are added to the food list.
Further, in the nutrient amount calculation device of the present invention, the food photographing means photographs the foods sequentially placed on the food weighing means each time a food is placed, thereby acquiring a plurality of food image data, and the food type estimation means identifies the image portion of the newly added food by taking the difference between the latest food image data and the previously photographed food image data, and calculates the image feature amount from that image portion.
Further, in the nutrient amount calculation device of the present invention, the food type estimation means and the nutrient amount calculation means are realized as functions of a mobile terminal.
Further, the refrigerator of the present invention is characterized by including the nutrient amount calculation device described above.
The nutrient amount calculation device of the present invention comprises: food photographing means for photographing food before cooking to obtain food image data; food weighing means for weighing the food to obtain food weight data; food type estimation means for estimating the type of the food based on the food image data; and nutrient amount calculation means for calculating the amount of nutrients contained in the food based on the type of the food and the food weight data, wherein the food photographing means photographs the food when the variation in the food weight data measured by the food weighing means falls below a certain level. Photographing the food after the variation in the measured weight has become small makes it possible to photograph the food placed on the food weighing means more clearly, which improves the accuracy of estimation using the food image data. Furthermore, since the food photographing means photographs the food based on the output of the food weighing means, the food can be photographed without the user performing any special operation for photographing.
Further, the food weighing means acquires the food weight data by weighing the food at regular intervals, and the food photographing means photographs the food when the standard deviation of the food weight data over the most recent plurality of measurements is below a certain value. The food can therefore be photographed in a more settled state.
Further, the food photographing means photographs the food when the food weight data differs from that at the previous photographing, so that the foods sequentially placed on the food weighing means are photographed each time a food is placed. Image data of the foods can therefore be captured one after another without any special operation by the user.
Further, the food photographing means photographs the food a certain period after the weight of the food placed on the food weighing means has been determined. This prevents the hand of the user handling the food from accidentally appearing in the food image data.
Further, the food type estimation means calculates an image feature amount from the food image data and selects, from a food list in which image feature amounts and food types are associated, the food whose image feature amount is closest, and the nutrient amount calculation means calculates the nutrient amount of the food by multiplying the nutrient amount per unit amount of the selected food by the food weight data. Since the type of food is identified from an image feature amount calculated from the color and other characteristics of the image, the type of food can be identified easily without the user entering its name.
Further, the type of the food and the image feature amount selected by the food type estimation means are added to the food list. Since the number of entries in the food list used in subsequent searches increases, the accuracy of estimating the food type from image feature amounts improves over time.
Further, the food photographing means photographs the foods sequentially placed on the food weighing means each time a food is placed, thereby acquiring a plurality of food image data, and the food type estimation means identifies the image portion of the newly added food by taking the difference between the latest food image data and the previously photographed food image data, and calculates the image feature amount from that image portion. Since the image feature amount is calculated only from the portion in which the food appears, the feature amount is calculated more accurately and the accuracy of estimating the types of successively added foods improves.
Further, the food type estimation means and the nutrient amount calculation means are realized as functions of a mobile terminal. By connecting to a communication network via the mobile terminal and using information accumulated on a server, the accuracy of estimating the type of food can be improved.
Further, the refrigerator of the present invention includes the nutrient amount calculation device described above. Giving the refrigerator in which foods are stored the function of calculating the nutrient amounts of the foods improves convenience for the user of the refrigerator.
  • FIG. 1 is a schematic diagram showing a nutrient amount calculation device according to a first embodiment of the present invention.
  • FIG. 2 is a flowchart showing a method of calculating a nutrient amount using the nutrient amount calculation device according to the first embodiment.
  • FIG. 3 is a flowchart showing a method of calculating a nutrient amount using the nutrient amount calculation device according to the first embodiment.
  • FIG. 4 is a diagram showing, within the nutrient amount calculation method using the device according to the first embodiment, a method of estimating the type of food from image data.
  • FIG. 5 shows a nutrient amount calculation device according to a second embodiment of the present invention; (A) is a block diagram showing the configuration of the device, and (B) is a block diagram showing the data configuration used.
  • FIG. 6(A) is a perspective view showing the configuration of the nutrient amount calculation device according to the second embodiment, and (B) is a perspective view showing a refrigerator.
  • FIG. 7 is a flowchart showing a method of calculating the nutrient amount of a dish with the nutrient amount calculation device according to the second embodiment.
  • FIG. 8 is a flowchart showing a method of calculating the nutrient amount of a dish with the nutrient amount calculation device according to the second embodiment.
<First Embodiment>
Hereinafter, the nutrient amount calculation device 10 according to this embodiment will be described.
FIG. 1 shows the configuration of the nutrient amount calculation device 10. The nutrient amount calculation device 10 includes a measuring instrument 12 that weighs and photographs the food material 16 before cooking, and a mobile terminal 24 that estimates the type of the food and calculates its nutrient amount based on information input from the measuring instrument 12. The main function of the nutrient amount calculation device 10 is to calculate the amount of nutrients contained in the food material 16 placed on the measuring instrument 12 and to present the result to the user. The user can therefore easily learn the nutrient amount of the food material 16 simply by placing it on the measuring instrument 12 before cooking. In this embodiment, nutrients include, for example, calories, minerals such as salt, vitamins, proteins, carbohydrates, and fats.
The nutrient amount calculation device 10 of this embodiment may also include a server 28 in addition to the measuring instrument 12 and the mobile terminal 24 described above. In this case, the mobile terminal 24 or the measuring instrument 12 is connected to the server 28 via a communication network such as the Internet. The food image data obtained by photographing the food material 16, the food ID indicating the type (name) of the food material 16, and the food weight data indicating its weight are transmitted from the mobile terminal 24 to the server 28. The server 28 accumulates and analyzes the data transmitted from the mobile terminals 24 of many users and feeds analysis information based on the results back to the mobile terminal 24, which makes it possible to improve the accuracy of estimating the type of the food material 16 from its image data.
The measuring instrument 12 mainly includes a weighing unit 14 (food weighing means), an imaging unit 18 (food photographing means) and an illumination unit 20 arranged above the weighing unit 14, and a control unit 22 connected to the weighing unit 14.
The weighing unit 14 is a so-called electronic balance and transmits an electric signal indicating the weight of the food material 16 placed on its upper surface to the control unit 22. The upper surface of the weighing unit 14 is given a color different from that of typical foods so that the outer edge of the placed food material 16 stands out clearly. In this embodiment, the nutrient amount of the food material 16 is calculated based on the food weight data measured by the weighing unit 14, so the nutrient amount can be calculated more accurately than when it is calculated from the food image data alone.
The imaging unit 18 is composed of an imaging element such as a CCD and is arranged above the weighing unit 14. The imaging unit 18 obtains food image data by photographing, from above, the weighing unit 14 on which the food material 16 is placed, and the obtained food image data is transmitted to the control unit 22. When the imaging unit 18 photographs the food material 16, the relative positions of the imaging unit 18 and the weighing unit 14 are fixed, so the food material 16 placed on the upper surface of the weighing unit 14 can be photographed stably. In this embodiment, the timing at which the imaging unit 18 acquires the food image data is determined based on the output of the weighing unit 14; this is described later with reference to FIG. 2 and subsequent figures.
The illumination unit 20 is composed of, for example, an LED and is arranged above the weighing unit 14 near the imaging unit 18. The illumination unit 20 emits light toward the food material 16 when the imaging unit 18 photographs it. Because the position of the illumination unit 20 relative to the imaging unit 18 and the weighing unit 14 is fixed, the photographing conditions for the food material 16 are kept the same, which improves the accuracy of estimating the type of the food material 16 from the acquired food image data.
The control unit 22 has a predetermined control program installed; it receives the food image data and food weight data described above and controls the operation of the imaging unit 18 and the illumination unit 20. The control unit 22 also has a function of communicating with the mobile terminal 24 arranged near the measuring instrument 12. The control unit 22 and the mobile terminal 24 may be connected by wire or wirelessly; for a wireless connection, Wi-Fi data communication, for example, can be used.
The mobile terminal 24 is, for example, a smartphone owned by the user, on which an application for controlling the measuring instrument 12 is installed. The mobile terminal 24 is provided with a display unit 26 such as a touch panel, on which the food image data photographed by the imaging unit 18 can be displayed. By operating the display unit 26, the type of the food material 16 can also be specified, as described later. Programs serving as the nutrient amount calculation means and the food type estimation means are stored in the mobile terminal 24 in advance.
In this embodiment, the user operates the mobile terminal 24, which is separate from the measuring instrument 12, to estimate the nutrient amount of the food material 16, but the functions of both may be integrated. That is, an operation unit such as a touch panel may be provided on the measuring instrument 12 so that the user can select the food material 16 and perform other operations on it.
Next, a method of calculating the nutrient amount of the food material 16 using the nutrient amount calculation device 10 will be described based on FIG. 2, with reference also to FIG. 1. Steps S11 to S13 and S16 to S21 described below are performed by the measuring instrument 12, while steps S14 to S15 and S22 to S36 are performed by the mobile terminal 24.
First, in order to calculate the nutrient amount of the food material 16, the user turns on the measuring instrument 12 (steps S10 and S11). Next, initial weight data is acquired from the weighing unit 14 and recorded in the control unit 22 as the initial value (0 g) (step S12). After that, the weighing unit 14, with no food material 16 placed on its upper surface, is photographed by the imaging unit 18 (step S13). The photographed initial image data is stored in the control unit 22 as the initial value.
If the illuminance around the measuring instrument 12 is insufficient, the illumination unit 20 may illuminate the weighing unit 14 when the imaging unit 18 photographs it, so that the upper surface of the weighing unit 14 is captured clearly. In that case, to keep the photographing conditions uniform, illumination by the illumination unit 20 is also performed when the food image data of the food material 16 is photographed in the later steps.
Next, after a dedicated application is started on the mobile terminal 24, which is a smartphone, the measuring instrument 12 and the mobile terminal 24 are connected, for example via a Wi-Fi wireless connection (step S14). The application then acquires from the measuring instrument 12 the initial weight data and the initial image data obtained in the above steps (step S15).
After that, the food material 16 to be cooked is placed on the upper surface of the weighing unit 14 of the measuring instrument 12 (step S16). The food weight data measured by the weighing unit 14 is transmitted to the control unit 22 continuously, but immediately after the food material 16 is placed on the weighing unit 14, the measured value is not yet stable. In this embodiment, therefore, the process waits until the value of the food weight data settles (step S17). Furthermore, to prevent the hand of the user handling the food from appearing in the food image data and degrading the accuracy of estimating the food material 16, photographing is delayed for a further fixed time after the food weight data has stabilized in step S17 (step S18).
When this fixed time has elapsed, the food weight data measured by the weighing unit 14 at that moment is recorded in the control unit 22, and the imaging unit 18 photographs the food material 16 from above to obtain the food image data (steps S19 and S20). The acquired food weight data and food image data are then transmitted from the measuring instrument 12 to the mobile terminal 24 (step S21).
In the following steps, the mobile terminal 24 identifies the type of the food material 16 and calculates its nutrient amount based on the transmitted food image data and food weight data.
Specifically, the difference from the previously received food weight data is first calculated (step S22). That is, when the first food material 16 is weighed, the value with no food placed, i.e. 0 g, is subtracted from the food weight data measured with the first food material 16 placed. When the second food material 16 is weighed, the food weight data measured with the first food material 16 placed is subtracted from the food weight data measured with the second food material 16 placed.
This will be described in detail with reference to FIG. 4, which schematically shows the image processing and related steps when three food materials A, B, and C are weighed and photographed in sequence. The weight X g measured when the food material A is first placed on the weighing unit 14 is used as the food weight data of A in the subsequent processing. Next, when the food material B is placed on the weighing unit 14, Y g is measured, and (Y - X) g is used as the food weight data of B. When the food material C is further placed, Z g is measured, and (Z - Y) g is used as the food weight data of C.
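A minimal sketch of this per-ingredient weight calculation: since the ingredients stay on the weighing unit, each ingredient's weight is the difference between the current reading and the previous one. The numbers in the example are arbitrary.

    def per_item_weights(scale_readings_g):
        """Convert cumulative scale readings into individual ingredient weights (step S22)."""
        weights = []
        previous = 0.0  # nothing is on the weighing unit 14 at the start
        for reading in scale_readings_g:
            weights.append(reading - previous)
            previous = reading
        return weights

    # Readings X, Y, Z taken after placing foods A, B, C in turn (illustrative numbers).
    print(per_item_weights([150.0, 230.0, 410.0]))  # -> [150.0, 80.0, 180.0]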
Next, by taking the difference from the previously photographed food image data, the updated image portion is used as the image of the newly added food material 16 (steps S23 and S24).
This will be explained with reference to FIG. 4. First, the food image data 30 with the food material A placed on the weighing unit 14 is acquired, and from its difference with the image data taken with nothing placed (not shown here), food image data 32 in which the image portion of A is isolated is generated. When the food material B is then added and the food image data 36 is acquired, the difference between the food image data 30 and the food image data 36 yields food image data 38 in which the image portion of B is isolated. Likewise, when the food material C is added and the food image data 42 is acquired, the difference between the food image data 36 and the food image data 42 yields food image data 44 in which the image portion of C is isolated.
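The isolation of the newly added ingredient can be sketched as a pixel-wise difference between the latest photograph and the previous one, followed by a threshold. The use of NumPy and the threshold value are illustrative assumptions, not the patent's own implementation.

    import numpy as np

    def isolate_new_food(previous_image, latest_image, threshold=30):
        """Return a boolean mask of pixels that changed, i.e. where the new food appears."""
        # Per-pixel absolute difference on uint8 RGB images, averaged over the color channels.
        diff = np.abs(latest_image.astype(np.int16) - previous_image.astype(np.int16))
        return diff.mean(axis=2) > threshold

    # Example with dummy 4x4 images: the lower-right corner "changes" when a food is added.
    before = np.zeros((4, 4, 3), dtype=np.uint8)
    after = before.copy()
    after[2:, 2:] = 200
    print(isolate_new_food(before, after).astype(int))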
Next, an image feature amount is calculated (step S25). Specifically, based on the color and roughness of the image portion generated as described above, an image feature amount that quantifies the characteristics of that portion is calculated. Then, entries with similar image feature amounts are searched for in the "list of previously selected foods" and the pre-learned data (step S26). In the "list of previously selected foods", food IDs indicating the types of foods selected in the past are listed in association with their image feature amounts. The pre-learned data is provided in advance when the application for the mobile terminal 24 ships and likewise lists food IDs in association with image feature amounts. In the following description, the "list of previously selected foods" and the pre-learned data are sometimes collectively called the food list. Foods whose image feature amounts are close to the one calculated in step S25 are then searched for in this food list (step S26).
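A minimal sketch of steps S25 and S26, assuming the image feature amount is simply the mean color of the isolated region plus a crude roughness value, and that the food list pairs each food ID with such a feature vector; all concrete entries are illustrative.

    import numpy as np

    def image_feature(region):
        """Feature = mean R, G, B of the region plus a crude roughness (std of brightness)."""
        mean_rgb = region.reshape(-1, 3).mean(axis=0)
        roughness = region.mean(axis=2).std()
        return np.append(mean_rgb, roughness)

    # Hypothetical food list: food ID -> previously recorded feature vector.
    FOOD_LIST = {
        "carrot": np.array([210.0, 120.0, 40.0, 12.0]),
        "potato": np.array([200.0, 180.0, 120.0, 20.0]),
        "tomato": np.array([200.0, 50.0, 45.0, 8.0]),
    }

    def top_candidates(feature, k=3):
        """Return the k food IDs whose stored features are closest to the given feature."""
        ranked = sorted(FOOD_LIST, key=lambda fid: float(np.linalg.norm(FOOD_LIST[fid] - feature)))
        return ranked[:k]

    region = np.full((10, 10, 3), (205, 60, 50), dtype=np.uint8)  # a reddish patch
    print(top_candidates(image_feature(region)))  # "tomato" should rank first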
In step S27, the top candidate foods in the food list whose image feature amounts are closest to that of the food being searched for are displayed on the display unit 26 of the mobile terminal 24. When displaying a candidate food on the display unit 26, the name of the food may be displayed, or an image representing the food may be displayed.
In principle, the food with the closest image feature amount in the food list could simply be adopted as the food being searched for. In this embodiment, as described later, several foods with close image feature amounts are selected from the food list and the user chooses the correct one from among them, which improves the accuracy of identifying the food material 16.
If the food being searched for is among the foods displayed on the display unit 26 (YES in step S28), the user selects the correct food, for example by operating the display unit 26, which is a touch panel (step S29). At this time, as shown in the lower part of FIG. 4, the food image data 34, 40, 46 showing the selected foods may be displayed on the display unit 26.
In addition, the food ID corresponding to the selected food material 16 and its image feature amount are associated and added to the "list of previously selected foods" (step S30). The result of the current search can then be used in subsequent candidate searches, which improves their accuracy. In other words, the "list of previously selected foods" can be trained to reflect the user's eating habits.
On the other hand, if the correct food is not among the candidates displayed in step S28 (NO in step S28), the user enters the name of the food material 16 as text on the mobile terminal 24, and a food with the same name is searched for in the food master (step S31). Here, the food master is a list in which food IDs are associated with the nutrient amounts of the corresponding foods. The search results based on the entered text are displayed on the display unit 26 of the mobile terminal 24 (step S32).
The user then selects the correct food from those displayed (step S33). The food ID indicating the selected food material 16 and its image feature amount are associated and recorded, as in step S30 described above (step S34).
Next, the nutrient amount of the food material 16 is calculated using the type of the food material 16 identified in the above steps and its weight (step S35). Specifically, the amount of nutrients contained in the food material 16 is calculated by multiplying the nutrient amount per unit amount associated with the food ID in the food master by the weight of the food material 16. As noted above, the nutrient amounts in this embodiment include calories, minerals such as salt, vitamins, proteins, carbohydrates, fats, and so on. The accumulated nutrient amounts are displayed on the display unit 26 of the mobile terminal 24 (step S36), so the user can see the total amounts of all nutrients contained in the food materials 16 placed on the measuring instrument 12.
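A minimal sketch of steps S35 and S36, assuming a food master keyed by food ID with nutrient values per 100 g; the entries shown are made-up examples, not values from the patent.

    # Hypothetical food master: food ID -> nutrients per 100 g (illustrative values).
    FOOD_MASTER = {
        "carrot": {"calories_kcal": 37.0, "salt_g": 0.1, "vitamin_a_ug": 720.0},
        "potato": {"calories_kcal": 76.0, "salt_g": 0.0, "vitamin_a_ug": 0.0},
    }

    def total_nutrients(weighed_foods):
        """Sum nutrient amounts over (food ID, weight in grams) pairs placed on the instrument."""
        totals = {}
        for food_id, weight_g in weighed_foods:
            for nutrient, per_100g in FOOD_MASTER[food_id].items():
                totals[nutrient] = totals.get(nutrient, 0.0) + per_100g * weight_g / 100.0
        return totals

    print(total_nutrients([("carrot", 150.0), ("potato", 80.0)]))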
Steps S16 to S36 described above are performed for each food material 16 prepared by the user (step S37).
Next, steps S16 to S21, in which the food material 16 is photographed after being placed on the weighing unit 14, will be described in detail with reference to FIG. 3.
First, after the food material 16 is placed on the weighing unit 14, the control unit 22 of the measuring instrument 12 acquires food weight data indicating the weight of the food material 16 from the weighing unit 14 (step S50). This acquisition is performed continuously at regular intervals, for example every 0.1 seconds. In this embodiment, the following steps are applied to the continuously acquired food weight data so that the food image data is captured in a stable state, which improves the accuracy of estimating the type of food from that image data.
Next, to enable the arithmetic processing described below, the food weight data, which is analog data, is converted into digital data (step S51). In the t-th loop, the average value w(t) of the five most recent food weight data values is calculated (step S52). For example, if the food weight data has just been measured for the fifth time, the average of the first through fifth values is calculated. Furthermore, the standard deviation α(t) of these five most recent values is calculated (step S53).
If the standard deviation α(t) calculated in the above step is smaller than a predetermined value (for example, 0.5) (YES in step S54), the food weight data measured over the last five samples is judged to be converging. That is, the control unit 22 can determine that the time variation of the food weight data is sufficiently small, the food material 16 is resting stably, and the food image data can be captured clearly.
Next, in step S55, it is determined whether the food weight data has passed through an unstable state (for example, a variation exceeding 3 g) and whether it differs from the food weight data at the previous photographing. Confirming that the data has passed through an unstable state prevents the imaging unit 18 from taking a photograph when the food weight data changes only slightly (for example, by 3 g or less) due to drift during long-term operation of the measuring instrument 12, even though no food material 16 has been added. Confirming that the food weight data differs from that at the previous photographing ensures that the food material 16 is photographed, as described below, only when a new food material 16 has been placed on the weighing unit 14.
If the food weight data has passed through an unstable state and differs from that at the previous photographing (YES in step S55), the measured food weight data is finalized (step S57). The control unit 22 then has the imaging unit 18 photograph the food material 16 after a predetermined time (for example, 0.5 seconds) has elapsed, so that the hand of the user handling the food material 16 has left the field of view of the imaging unit 18, and the food image data is obtained. The weighing unit 14 is also judged to have shifted to the stable state. On the other hand, if the food weight data has not passed through an unstable state or is the same as at the previous photographing (NO in step S55), the process returns to step S51 without finalizing the food weight data.
If the standard deviation α(t) is 0.5 or more (NO in step S54), it is checked whether the absolute difference between the food weight data measured in the t-th loop (for example, the fifth) and that measured in the (t-1)-th loop (for example, the fourth) exceeds a predetermined value (for example, 3 grams) (step S56). If it does (YES in step S56), a new food material 16 may have been placed on the weighing unit 14, so the food weight data is judged to have entered an unstable state and the process returns to step S51. If it does not (NO in step S56), the food weight data is judged to remain in the stable state and the process returns to step S51.
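The capture-trigger logic of steps S50 to S57 can be sketched as a small state machine over the stream of weight samples: the last five readings must have a small standard deviation, the signal must have passed through an unstable phase, and the settled weight must differ from the weight at the previous photograph. The thresholds mirror the examples given in the text; everything else is an illustrative implementation choice.

    from statistics import mean, pstdev

    class CaptureTrigger:
        """Decide, sample by sample, when the imaging unit should photograph the food."""

        def __init__(self, std_threshold=0.5, jump_threshold_g=3.0, window=5):
            self.std_threshold = std_threshold        # max std dev of recent samples (step S54)
            self.jump_threshold_g = jump_threshold_g  # change treated as "unstable" (steps S55, S56)
            self.window = window
            self.samples = []
            self.saw_unstable = False        # has the weight passed through an unstable phase?
            self.last_captured_weight = 0.0  # weight at the previous photograph

        def add_sample(self, weight_g):
            """Return True when a photograph should be taken (after the short hand-clearing delay)."""
            if self.samples and abs(weight_g - self.samples[-1]) >= self.jump_threshold_g:
                self.saw_unstable = True     # something was probably added or removed
            self.samples.append(weight_g)
            recent = self.samples[-self.window:]
            if len(recent) < self.window or pstdev(recent) >= self.std_threshold:
                return False                 # not settled yet (NO in step S54)
            settled = mean(recent)
            if not self.saw_unstable or abs(settled - self.last_captured_weight) < self.jump_threshold_g:
                return False                 # drift only, or same load as the last photo (NO in step S55)
            self.last_captured_weight = settled  # finalize the weight (step S57)
            self.saw_unstable = False
            return True

    trigger = CaptureTrigger()
    readings = [0.0, 0.1, 80.0, 152.0, 150.2, 150.1, 150.0, 150.1, 150.0]
    print([trigger.add_sample(w) for w in readings])  # True only once the load has settled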
According to this embodiment, when the food material 16 is placed on the weighing unit 14, the imaging unit 18 photographs it based on the output of the weighing unit 14, so the food image data can be acquired easily without the user performing any special operation for photographing.
Moreover, since the imaging unit 18 photographs the food material 16 when the variation of the food weight data measured by the weighing unit 14 is below a predetermined value, the food material 16 is photographed while it is at rest on the upper surface of the weighing unit 14; clear food image data is thus obtained, which improves the accuracy of estimating the type of food.
Furthermore, in this embodiment, the imaging unit 18 acquires the food image data based on the output of the weighing unit 14, and the mobile terminal 24 calculates the nutrient amount of the food material 16 using that data. The food image data here is a still image. Without a function that decides when to capture a still image based on the output of the weighing unit 14, it would be necessary to analyze a moving image to calculate the nutrient amount of the food, and the amount of information processing required for moving-image analysis is much larger than for still images. This embodiment therefore greatly reduces the information processing load and the power consumption on the mobile terminal 24 side.
In this embodiment, the food image data is an image of the food material 16 at rest on the upper surface of the weighing unit 14, which is the most accurate image data for estimating its type. Therefore, the accuracy of estimating the type of the food material 16 is not inferior even compared with analyzing a moving image to calculate the nutrient amount. In other words, this embodiment uses the output of the weighing unit 14 as the trigger for capturing the food image data best suited to analyzing the food material 16, and calculates the nutrient amount of the food material 16 accurately while keeping the amount of data to be analyzed small.
In the description above, with reference to FIG. 4, the nutrient amount is calculated while food materials 16 are added, but a food material 16 may also be removed partway through. In that case, the control unit 22 can detect that a food material 16 has been removed because the food weight data measured by the weighing unit 14 decreases, and the type of the removed food material 16 can be identified by comparing the food image data photographed by the imaging unit 18 with the immediately preceding image.
Furthermore, the nutrient amount calculation device 10 described above may be incorporated as a function of a refrigerator. Giving a refrigerator, in which many food materials 16 are stored, the function of calculating their nutrient amounts increases the added value of the refrigerator.
This embodiment is particularly effective when the nutrient amount of the food material 16 is calculated on the server 28. As described above, the food image data is a still image and carries far less information than a moving image, yet it is accurate because its capture is triggered by the output of the weighing unit 14. The nutrient amount of the food material 16 can therefore be calculated with high accuracy while reducing the amount of information transmitted to the server 28 over the communication line.
<Second Embodiment>
Hereinafter, a nutrient amount calculation device 110 according to an embodiment of the present invention will be described in detail with reference to the drawings. The nutrient amount calculation device 110 according to this embodiment calculates the nutrient amount from the food image data obtained by photographing the food materials 140 before cooking, and records the type of dish obtained by cooking those food materials 140 in association with the dish image data 130. This makes it possible to calculate the nutrient amount simply by photographing the cooked dish.
FIG. 5(A) is a block diagram showing the schematic configuration of the nutrient amount calculation device 110, and FIG. 5(B) is a block diagram showing the nutrient amount calculation data 124 referred to when calculating the nutrient amount.
Referring to FIG. 5(A), the nutrient amount calculation device 110 of this embodiment includes a food photographing means 112, a food weighing means 114, a dish photographing means 116, a dish weighing means 118, a nutrient amount calculation means 120, and a storage means 122. The general function of the nutrient amount calculation device 110 is to calculate, in a simple manner, the nutrient amount of a dish to be cooked from weight data and image data of the food materials 140 used for the dish or of the dish itself.
The food photographing means 112 is a means for photographing, in color, the food material 140 to be cooked. Specifically, the food photographing means 112 is constituted by an imaging element such as a CCD. The food photographing means 112 generates food image data 126 by photographing the food material 140, and this food image data 126 is transmitted to the nutrient amount calculation means 120. The food photographing means 112 may photograph the food materials 140 used for the dish one at a time, or may photograph a plurality of food materials of the same kind at once.
The food weighing means 114 is a means for weighing the food material 140 to be cooked. By weighing the food material 140, the food weighing means 114 obtains food weight data 128 indicating the weight of the food material 140, and this food weight data 128 is transmitted to the nutrient amount calculation means 120. The food weighing means 114 may weigh the food materials 140 one at a time, or may weigh a plurality of food materials 140 of the same kind simultaneously.
The dish photographing means 116 is a means for photographing a dish made by cooking the food materials 140 described above. The dish image data 130 obtained by photographing the dish with the dish photographing means 116 is transmitted to the nutrient amount calculation means 120.
The dish weighing means 118 is a means for weighing the cooked dish. The dish weight data 132 obtained by weighing the dish with the dish weighing means 118 is transmitted to the nutrient amount calculation means 120.
Here, the food photographing means 112 and the dish photographing means 116 may be provided separately, or a single photographing means may serve as both. Likewise, the food weighing means 114 and the dish weighing means 118 may be provided separately, or a single weighing means may serve as both.
The nutrient amount calculation means 120 calculates the nutrient amount of the dish to be cooked from the data transmitted from each of the means described above. A CPU, for example, is employed as the nutrient amount calculation means 120. In addition to the dish's nutrient amount, the nutrient amount calculation means 120 may also calculate amounts of nutrients such as salt, as described later.
Specifically, when the nutrient amount of a dish is calculated from the food materials 140, the type of each food material 140 is estimated from the food image data 126, and the nutrient amount of each food material 140 is calculated, for example, by multiplying the nutrient amount per unit amount of the estimated food material 140 by the food weight data 128. The nutrient amount of the dish to be cooked is then obtained by adding up the nutrient amounts of all the food materials 140 used for the dish.
When the nutrient amount is calculated from data on the cooked dish, the type of dish is estimated from the dish image data 130, and the nutrient amount of the dish is calculated, for example, by multiplying the nutrient amount per unit amount of the estimated dish by the dish weight data 132.
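Both calculations come down to the same multiply-and-sum arithmetic. The sketch below illustrates it under assumed data shapes; the per-gram tables and their numeric values are placeholders, not values taken from the patent or an official food-composition table.

```python
# Hypothetical per-gram nutrient tables; the numbers are placeholders only.
FOOD_NUTRIENTS_PER_G = {
    "carrot": {"kcal": 0.37, "salt_g": 0.0008},
    "potato": {"kcal": 0.76, "salt_g": 0.0},
}
DISH_NUTRIENTS_PER_G = {
    "curry": {"kcal": 1.19, "salt_g": 0.012},
}

def nutrients_from_ingredients(ingredients):
    """ingredients: list of (estimated type, weight in grams); sum over all items."""
    total = {}
    for food_type, weight_g in ingredients:
        for nutrient, per_g in FOOD_NUTRIENTS_PER_G[food_type].items():
            total[nutrient] = total.get(nutrient, 0.0) + per_g * weight_g
    return total

def nutrients_from_dish(dish_type, dish_weight_g):
    """Per-gram values of the recognized dish scaled by its measured weight."""
    return {n: per_g * dish_weight_g
            for n, per_g in DISH_NUTRIENTS_PER_G[dish_type].items()}
```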
A specific method for calculating the nutrient amount of a dish will be described later with reference to the flowcharts shown in FIG. 7 and subsequent figures.
The storage means 122 is a means for storing the image data, weight data, and other data obtained by the means described above. Specifically, a hard disk or a semiconductor storage device is employed as the storage means 122. The nutrient amount calculation device 110 does not necessarily have to include the storage means 122; a server or the like connected via a network may be used as the storage means, and the data described above may be stored on the server.
Referring to FIG. 5(B), the nutrient amount calculation data 124 used in the nutrient amount calculation described above will now be explained. The nutrient amount calculation data 124 includes food image data 126, food weight data 128, dish image data 130, dish weight data 132, a food database 125, a dish database 127, a food nutrient amount database 129, and a dish nutrient amount database 131.
Each of these data items is explained next. The food image data 126 is still-image data obtained by photographing with the food photographing means 112 described above. The food weight data 128 is data indicating the weight of the food material 140 obtained by weighing with the food weighing means 114. The dish image data 130 is still-image data obtained by photographing, with the dish photographing means 116, a dish made by cooking the food materials 140. The dish weight data 132 is data indicating the weight of the prepared dish measured by weighing it with the dish weighing means 118.
The food database 125 is a database of data that associates features extracted from the food image data 126 with types of food materials 140, for example a database that associates colors and surface roughness extracted from the food image data 126 with food materials 140. The dish database 127 is a database that associates the dish image data 130 with types of dishes, and is composed, for example, of data associating the surface color of a dish and similar features with the type of dish.
The food nutrient amount database 129 is a database that associates each food material 140 with its nutrient amount per unit amount. The dish nutrient amount database 131 is a database that associates each dish with its nutrient amount per unit amount. For these data, data published by public institutions may be used, data accumulated and revised as the user uses the nutrient amount calculation device 110 of this embodiment may be used, or both may be used in combination.
Two or more of the databases described above may be integrated. For example, the food database 125 and the dish database 127 may be integrated into a single database, and the food nutrient amount database 129 and the dish nutrient amount database 131 may likewise be integrated into a single database.
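One possible in-memory shape for these four databases is sketched below; the field names and sample values are illustrative assumptions, and merging databases as just described would simply mean keeping both kinds of entries in a single table.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class FoodEntry:            # food database 125: image features -> food type
    food_type: str
    mean_color: Tuple[int, int, int]   # average RGB of the photographed region
    roughness: float                   # a single surface-texture measure

@dataclass
class DishEntry:            # dish database 127: image features -> dish type
    dish_type: str
    mean_color: Tuple[int, int, int]
    roughness: float

# nutrient databases 129 / 131: type -> nutrient amount per gram (placeholder values)
food_nutrients_per_g = {"carrot": {"kcal": 0.37}}
dish_nutrients_per_g = {"curry": {"kcal": 1.19}}
```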
The configuration of a concrete implementation of the nutrient amount calculation device 110 will be described with reference to FIG. 6. FIG. 6(A) is a perspective view showing a specific configuration of the nutrient amount calculation device 110, and FIG. 6(B) is a perspective view showing a refrigerator 142 in which the nutrient amount calculation device 110 is incorporated.
Referring to FIG. 6(A), the nutrient amount calculation device 110 has a pedestal 134 on which the food material 140 is placed, a movable support portion 136 whose end is fixed to the pedestal 134, and an imaging unit 138 provided on the support portion 136.
The pedestal 134 is a plate-like member having a flat surface on which the food material 140 is placed, and incorporates a weighing module for weighing the food material 140. The nutrient amount calculation means 120, which receives various data from this weighing module and the imaging unit 138 and calculates the nutrient amount, may also be built into the pedestal 134.
The support portion 136 is a rod-like member arranged near an edge of the pedestal 134, and its lower end is connected to the pedestal 134 so as to be rotatable. A recessed region corresponding to the shape of the support portion 136 is provided in the upper surface of the pedestal 134, so the support portion 136 is housed in this recessed region when folded down.
An imaging unit 138 made of, for example, a CCD is provided near the upper end of the support portion 136. With the support portion 136 raised, the imaging unit 138 is positioned so as to photograph the food material 140 placed on the upper surface of the pedestal 134. The food image data 126 obtained by the imaging unit 138 is transmitted to a processing unit built into the pedestal 134.
The nutrient amount calculation data 124 described with reference to FIG. 5(B) may be recorded on a recording medium such as a hard disk built into the nutrient amount calculation device 110, or on a recording device arranged externally. Alternatively, part of the nutrient amount calculation data 124 may be recorded on a recording device built into the nutrient amount calculation device 110 and the remainder on an external recording device. Here, all or part of the nutrient amount calculation data 124 is recorded on the server 156. The nutrient amount calculation device 110 and the server 156 are connected via a communication network 158 such as the Internet. A method of using the nutrient amount calculation device 110 will be described later with reference to FIG. 7 and subsequent figures.
FIG. 6(B) illustrates a refrigerator 142 as one application example of the nutrient amount calculation device 110 described above. The refrigerator 142 includes a plurality of storage compartments such as a refrigerating compartment and a freezing compartment, and the front opening of each compartment is closed by doors 144, 146, 148, and 150 so as to be openable and closable. For example, the door 144 swings open and closed in the left-right direction about either its left or right end, and the doors 146, 148, and 150 open and close in the front-rear direction.
The small door 152 is a door formed by making part of the door 146 rotatable in the front-rear direction, and it swings open and closed about its lower end. The upper surface of the small door 152 in the opened state functions as the pedestal 134 shown in FIG. 6(A). An imaging unit 154 for photographing the food material 140 is arranged near the opening of the door 146.
When the user calculates the nutrient amount of a dish using the refrigerator 142, the user first opens the small door 152 and places the food material 140 or the dish on the upper surface of the opened small door 152. The food material 140 or the dish is then weighed by a weighing module built into the small door 152 and photographed by the imaging unit 154. In this way, the food material 140 or the dish is weighed and photographed.
Since the food materials 140 used for cooking are generally stored in the refrigerator, incorporating the nutrient amount calculation device 110 into the refrigerator 142 allows the food materials 140 to be weighed immediately, making weighing and photographing even easier.
Based on FIG. 7 and FIG. 8, and with reference to the figures described above, a method of calculating the nutrient amount of a dish using the nutrient amount calculation device 110 will now be described. FIG. 7 shows the nutrient amount calculation method used the first time a particular dish is cooked, and FIG. 8 shows the method used the second and subsequent times the same dish is cooked. The data described below are stored as appropriate in the nutrient amount calculation device 110, the server 156, and the like shown in FIG. 6(A).
Referring to FIG. 7, the nutrient amount calculation method used the first time a particular dish is cooked will be described.
First, the nutrient amount calculation device 110 shown in FIG. 6(A) is activated (step S111). Specifically, the support portion 136 is raised with respect to the pedestal 134 and the start switch is turned on.
Then, the food material 140 before cooking is placed on the upper surface of the pedestal 134 (step S112). FIG. 6(A) illustrates a case in which a fish before cooking is used as the food material 140. Here, only one food material 140 is placed on the upper surface of the pedestal 134, but a plurality of food materials 140 of the same kind may be placed on the upper surface of the pedestal 134. This makes it possible to weigh and photograph the food materials 140 conveniently.
Next, the food material 140 placed on the upper surface of the pedestal 134 is photographed by the imaging unit 138 (step S113). Specifically, the imaging unit 138 photographs the food material 140 while, if necessary, a light emitting means (not shown) illuminates it. Food image data 126 of the food material 140 is thereby obtained, and the acquired food image data 126 is stored in a storage means such as a hard disk. The food material 140 to be photographed may be in an unprocessed or a processed state. For example, if the food material 140 is a banana, it may be photographed before or after peeling. If the banana is photographed before peeling, the food material 140 is estimated from this image and the nutrient amount is calculated with the portion corresponding to the peel excluded. If the banana is photographed after peeling, the food material 140 is estimated from the food image data 126 and the nutrient amount is calculated on the assumption that the whole of it is used as the food material 140.
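A simple way to represent the peeled/unpeeled distinction is an edible-fraction table keyed by how the item was recognized; the fraction below is a rough placeholder, and in the patent the distinction comes from the image itself rather than from separate labels.

```python
# Rough edible fractions; e.g. only part of an unpeeled banana is edible.
EDIBLE_FRACTION = {"banana_unpeeled": 0.6, "banana_peeled": 1.0}

def edible_weight(recognized_as, measured_weight_g):
    """Discount the inedible portion (peel, skin) when the item was weighed whole."""
    return measured_weight_g * EDIBLE_FRACTION.get(recognized_as, 1.0)
```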
Next, the type of food material is estimated from the food image data 126 (step S114). Various techniques are conceivable for estimating the food material 140 from the food image data 126; here, as an example, a method of estimating the food material 140 by focusing on the color and roughness of its surface is described.
Specifically, referring to FIG. 6(A), the portion in which the food material 140 appears is extracted from the food image data 126 described above, and data on the color and surface roughness of this portion are extracted. Meanwhile, in the food database 125 shown in FIG. 5(B), the surface color, surface roughness, type, and so on of each food material are tabulated. The color and surface roughness of the food material 140 extracted from the food image data 126 are therefore compared with the color and surface roughness recorded for each food material in the food database 125, and the food material whose values are closest is taken as the "estimated food material".
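A minimal nearest-neighbour sketch of this comparison, assuming the color is summarized as a mean RGB triple and the roughness as a single number; the sample entries and the distance weighting are arbitrary placeholders rather than the patent's actual matching rule.

```python
import math

# Each entry: (food type, mean RGB color, surface-roughness measure); placeholders.
FOOD_ENTRIES = [
    ("carrot", (210, 105, 30), 0.20),
    ("potato", (205, 175, 120), 0.45),
    ("salmon", (250, 128, 114), 0.15),
]

def estimate_food(query_color, query_roughness, entries=FOOD_ENTRIES):
    """Return the food type whose recorded color and roughness are closest to the
    features extracted from the food image (nearest neighbour, weighted distance)."""
    def distance(entry):
        _, color, roughness = entry
        return math.dist(query_color, color) + 100.0 * abs(query_roughness - roughness)
    return min(entries, key=distance)[0]
```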
Next, the user judges whether the food material estimated in the preceding step is correct (step S115). Specifically, an image and the name of the estimated food material are displayed on a display device or the like attached to the nutrient amount calculation device 110. If the user judges that the estimated food material is correct, the user performs an operation to that effect and the process moves to the next step (YES in step S115). If the displayed food material is not correct (NO in step S115), other food materials with similar color and roughness are displayed (presented) to the user (step S116). If the newly displayed food material is correct, the process moves to the next step (YES in step S117); if not, yet another food material is estimated and displayed (NO in step S117). Steps S115 and S117 may be performed by the user operating a switch or touch panel provided on the nutrient amount calculation device 110 itself. Alternatively, the image and name of the estimated food material may be displayed on a portable information terminal such as a smartphone on which a dedicated application is installed, and the user may operate that terminal.
When the type of the food material 140 is identified in step S115 or step S117, the combination of the food image data 126 and the food material 140 used in these steps is treated as correct and stored in association. That is, the food database 125 shown in FIG. 5(B) is revised. From the next time onward, this combination is used in step S114, improving the accuracy of the estimation.
In general, the nutrient amount per unit amount is determined by the type of the food material 140, so identifying the type of the food material 140 placed on the nutrient amount calculation device 110 through the above steps makes the nutrient amount calculation possible. The type data identifying the type of the food material 140 is stored and used in later steps.
In steps S115 to S117 described above, another food material 140 is presented whenever the estimated food material 140 is not correct; however, if the estimation is wrong a predetermined number of times or more (for example, five times or more), the user may instead enter the food material 140 manually. This simplifies the step in which the user selects the food material 140.
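The confirm-or-suggest loop with a manual fallback could look like the sketch below; `ask_user` stands in for the touch panel or smartphone prompt, and the five-suggestion limit mirrors the example given above.

```python
def confirm_food_type(candidates, ask_user, max_suggestions=5):
    """Present estimated foods one at a time; fall back to manual entry once the
    limit is reached. candidates: names ordered from most to least likely;
    ask_user(name) returns True if the user accepts the suggestion."""
    for name in candidates[:max_suggestions]:
        if ask_user(name):
            return name
    return input("Enter the food name manually: ")
```

The confirmed name would then be written back to the food database together with the extracted image features, as described above.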
Next, the food material 140 placed on the upper surface of the pedestal 134 of the nutrient amount calculation device 110 is weighed (step S118). Specifically, the weight of the food material 140 is measured by the weighing module built into the pedestal 134. The food weight data 128 obtained by this measurement is stored in the storage device provided in the nutrient amount calculation device 110.
Next, the calculation means (nutrient amount calculation means 120), such as a CPU provided in the nutrient amount calculation device 110, calculates the total nutrient amount of the dish to be cooked (the first estimated dish nutrient amount) using the estimation result for the type of the food material 140 and the food weight data 128 described above (step S119).
Specifically, the food nutrient amount database 129 (FIG. 5(B)) holds, for each food material, its nutrient amount per unit amount. For each food material 140, the food weight data 128 is therefore multiplied by the nutrient amount per unit amount, and these products are added together to calculate the total nutrient amount of the dish to be cooked (the first estimated dish nutrient amount).
The operations from step S112 to step S119 described above are performed for each food material 140. When cooking curry, for example, the operations from step S112 to step S117 are performed for each ingredient such as carrot, onion, meat, and potato. The type of each food material 140 is thereby identified, and the food materials are weighed and their nutrient amounts accumulated. The accumulated food materials 140 and nutrient amounts are reported to the user, for example by being displayed on a display provided in the nutrient amount calculation device 110. The calculated food materials 140 and nutrient amounts may be totalled for each nutrient, stored in association with the dish, and used as a nutrient amount database in later steps.
In step S120, a dish is cooked from the food materials 140 described above. For example, curry is cooked by frying, boiling, and otherwise preparing the ingredients such as the carrots mentioned above.
Next, the dish made by cooking is placed on the pedestal 134 shown in FIG. 6(A) and weighed (step S121). In this step, the dish is placed on the pedestal 134 together with the container, such as a pot, used for cooking; by storing the weight of the container in advance and subtracting it from the total weight, the weight of the dish alone can be measured. This step yields the dish weight data 132 of the cooked dish. In addition, dish image data 130 is obtained by photographing the dish placed on the pedestal 134 with the imaging unit 138.
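The tare handling reduces to a single subtraction once the container weights are registered; a minimal sketch with invented container names and weights:

```python
# Pre-registered container weights in grams (invented example values).
KNOWN_CONTAINERS_G = {"small_pot": 850.0, "large_pot": 1420.0}

def dish_only_weight(total_weight_g, container_id):
    """Subtract the registered container weight to obtain the dish weight alone."""
    return total_weight_g - KNOWN_CONTAINERS_G[container_id]
```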
A step of selecting the specific cooking method may be performed between step S120 and step S121 described above. Specifically, the user enters a cooking method such as boiling, stir-frying, steaming, or deep-frying into the nutrient amount calculation device 110 via an input means such as a touch panel. In general, even when the same types and amounts of food materials 140 are used, the nutrient amount of the dish differs depending on the cooking method. For example, when the ingredients are stir-fried, the cooking oil is added to the dish, so the nutrient amount of the cooked dish is higher than when the ingredients are steamed. Therefore, when "stir-fry" is entered as the cooking method, for example, recalculating the nutrient amount of the dish while taking the amount of oil used into account improves the accuracy of the calculated nutrient amount.
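One way to picture the recalculation for a cooking method is to add the energy of the oil assumed to be absorbed; the absorption rates below are invented placeholders, not figures from the patent.

```python
# Grams of cooking oil assumed to be added per gram of ingredients for each
# method, and the energy density of oil; the absorption rates are placeholders.
OIL_PER_INGREDIENT_G = {"stir_fry": 0.05, "deep_fry": 0.10, "steam": 0.0, "boil": 0.0}
OIL_KCAL_PER_G = 9.0

def adjust_for_cooking_method(nutrients, method, ingredient_weight_g):
    """Add the energy of the estimated cooking oil to the ingredient-based total."""
    added_oil_g = OIL_PER_INGREDIENT_G.get(method, 0.0) * ingredient_weight_g
    adjusted = dict(nutrients)
    adjusted["kcal"] = adjusted.get("kcal", 0.0) + added_oil_g * OIL_KCAL_PER_G
    return adjusted
```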
In addition, the dish placed on the upper surface of the pedestal 134 is photographed by the imaging unit 138 to obtain the dish image data 130. Here, the top surface of the dish is photographed.
Next, the cooked dish is estimated based on the dish image data 130 obtained in the preceding step (step S123). This estimation may be performed in the same way as the estimation of the food material in step S114 described above.
Specifically, referring to FIG. 6(A), the portion in which the dish appears is extracted from the dish image data 130, and data on the color and surface roughness of this portion are extracted. Meanwhile, in the dish database 127 shown in FIG. 5(B), the top-surface color, surface roughness, type, and so on of each dish are tabulated. The color and surface roughness of the dish extracted from the dish image data 130 are therefore compared with the color and surface roughness recorded in the dish database 127, and the dish whose values are closest is taken as the "estimated dish".
Next, the user judges whether the dish estimated in the preceding step is correct (step S124). Specifically, an image and the name of the estimated dish are displayed on a display device or the like attached to the nutrient amount calculation device 110. If the user judges that the estimated dish is correct, the user performs an operation to that effect and the nutrient amount calculation ends (step S127). By identifying and weighing the cooked dish, the type of dish estimated from the dish image data 130 (for example, curry) is associated with its nutrient amount per unit amount. Data representing this association is registered in the dish nutrient amount database 131 shown in FIG. 5(B) and used for the same dish from the next time onward.
If the displayed dish is not correct (NO in step S124), other dishes with similar color and roughness are displayed (presented) to the user (step S125). If the newly estimated dish is correct, the process moves to the next step (YES in step S126); if not, yet another dish is estimated and displayed (NO in step S126).
In steps S124 to S126 described above, another dish is presented whenever the estimated dish is not correct; however, if the estimation is wrong a predetermined number of times or more (for example, five times or more), the user may instead enter the type of dish manually. This simplifies the selection step for the user.
After the type of dish has been identified, the combination of the dish image data 130 and the dish used in these steps is treated as correct and stored in association in the dish database 127. Specifically, for each dish, the type of dish and its nutrient amount per unit amount are stored in association. From the next cooking onward, this combination is used in step S123, improving the accuracy of the estimation.
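The registration step amounts to dividing the ingredient-based total from the first run by the measured dish weight; a minimal sketch with hypothetical names and example numbers in the comment:

```python
def register_dish(dish_db, dish_type, total_nutrients, dish_weight_g):
    """Store nutrients per gram for this dish so that a later run only needs the
    recognized dish type and the newly measured dish weight."""
    dish_db[dish_type] = {n: amount / dish_weight_g
                          for n, amount in total_nutrients.items()}
    return dish_db

# e.g. register_dish({}, "curry", {"kcal": 930.0, "salt_g": 9.4}, 780.0)
```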
In the nutrient amount calculation method of this embodiment described above, the method shown in steps S113 to S116 identifies the type of the food material 140 by the image analysis described above without the user having to enter it, which improves convenience. Similarly, the type of dish is identified by the method shown in steps S122 to S125, so that entering it is unnecessary and convenience is further improved.
Referring to FIG. 8, the method used the second and subsequent times the user cooks the same dish will now be described. The cooking procedure shown here is the same as the method described with reference to FIG. 7, except that the food materials 140 are not photographed and weighed before cooking.
First, after the nutrient amount calculation device 110 shown in FIG. 6(A) is activated (step S151), cooking is performed using the food materials 140 (step S152). Here it is assumed that, when the same user cooks, the types and proportions of the food materials 140 used are similar, so photographing and weighing the food materials 140 are omitted for simplicity. Even when the same dish is cooked, however, the food materials 140 may be photographed and weighed if the types or proportions of the food materials 140 used are different.
After the dish has been cooked, the user may, as described above, enter the type of cooking (stir-frying, boiling, and so on) into the nutrient amount calculation device 110. Taking the seasonings and the like used in cooking into account in this way makes it possible to calculate the nutrient amount more accurately.
After cooking is finished, the dish is placed on the upper surface of the pedestal 134 of the nutrient amount calculation device 110 shown in FIG. 6(A) and its weight is measured to obtain the dish weight data 132 (step S153). The dish is then photographed from above with the imaging unit 138 to obtain the dish image data 130 (step S154).
Next, the type of the cooked dish is estimated based on the dish image data 130, and the estimate is corrected if it is wrong (steps S155, S156, S157, and S158). The specific procedure for each of these steps is the same as in steps S123, S124, S125, and S126 described with reference to FIG. 7.
Next, the nutrient amount of the cooked dish is calculated based on the data on the type of dish and the dish weight data 132 obtained in the above steps (step S159). In this step, a nutrient amount database in which each dish is associated with its nutrient amount per unit amount is prepared, and the total nutrient amount is calculated by multiplying the measured weight of the dish by the nutrient amount per unit amount of the corresponding type of dish.
Specifically, the nutrient amount per unit amount of the curry cooked when the target user cooks curry, for example, has been recorded in advance by the nutrient amount calculation method shown in FIG. 7. This information is recorded for each type of dish in the dish nutrient amount database 131 shown in FIG. 5(B). The total nutrient amount of the dish (the second estimated dish nutrient amount) is therefore calculated by multiplying the nutrient amount per unit amount of the target dish recorded in the dish nutrient amount database 131 by the dish weight data 132 measured in step S153 described above (step S160).
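The second-run calculation is then a single lookup and multiplication against that registered table; a small sketch with placeholder values in the usage comment:

```python
def nutrients_second_run(dish_type, dish_weight_g, dish_nutrients_per_g):
    """Scale the per-gram values registered on the first run by the newly
    measured dish weight."""
    return {n: per_g * dish_weight_g
            for n, per_g in dish_nutrients_per_g[dish_type].items()}

# e.g. nutrients_second_run("curry", 780.0, {"curry": {"kcal": 1.19, "salt_g": 0.012}})
```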
According to the nutrient amount calculation method described above, the nutrient amount per unit amount of the dish to be cooked is estimated using the information on the food materials 140 photographed during the first cooking shown in FIG. 7. The nutrient amount can therefore be calculated simply by photographing and weighing the dish after cooking.
The embodiment described above can be modified, for example, as follows.
In step S118 shown in FIG. 7, the nutrient amount may be calculated taking into account the seasonings used and the loss rate of the ingredients. This makes it possible to calculate the nutrient amount more accurately.
The embodiments described above can be combined with one another. For example, the weighing method and photographing method for the food material 16 described in the first embodiment can also be applied to the second embodiment.
DESCRIPTION OF SYMBOLS
10 Nutrient amount calculation device
12 Measuring device
14 Weighing unit
16, A, B, C Food material
18 Imaging unit
20 Illumination unit
22 Control unit
24 Mobile terminal
26 Display unit
28 Server
30 Food image data
32 Food image data
34 Food image data
36 Food image data
38 Food image data
40 Food image data
42 Food image data
44 Food image data
46 Food image data
110 Nutrient amount calculation device
112 Food photographing means
114 Food weighing means
116 Dish photographing means
118 Dish weighing means
120 Nutrient amount calculation means
122 Storage means
124 Nutrient amount calculation data
125 Food database
126 Food image data
127 Dish database
128 Food weight data
129 Food nutrient amount database
130 Dish image data
131 Dish nutrient amount database
132 Dish weight data
134 Pedestal
136 Support portion
138 Imaging unit
140 Food material
142 Refrigerator
144 Door
146 Door
148 Door
150 Door
152 Small door
154 Imaging unit
156 Server
158 Communication network

Claims (9)

1. A nutrient amount calculation device comprising:
a food photographing means for photographing a food material before cooking to obtain food image data;
a food weighing means for weighing the food material to obtain food weight data;
a food type estimation means for estimating a type of the food material based on the food image data; and
a nutrient amount calculation means for calculating an amount of nutrients contained in the food material based on the type of the food material and the food weight data,
wherein the food photographing means photographs the food material when variation in the food weight data measured by the food weighing means becomes less than a predetermined level.
2. The nutrient amount calculation device according to claim 1, wherein the food weighing means acquires the food weight data by weighing the food material at regular intervals, and the food photographing means photographs the food material when the standard deviation of the food weight data over a plurality of immediately preceding measurements is less than a predetermined value.
3. The nutrient amount calculation device according to claim 1 or 2, wherein the food photographing means photographs the food material when the food weight data differs from that at the time of the previous photographing, thereby photographing each food material placed in sequence on the food weighing means every time a food material is placed.
4. The nutrient amount calculation device according to any one of claims 1 to 3, wherein the food photographing means photographs the food material a predetermined period after the weight of the food material placed on the food weighing means has settled.
5. The nutrient amount calculation device according to any one of claims 1 to 4, wherein the food type estimation means calculates an image feature amount from the food image data and selects, from a food list in which image feature amounts are associated with types of food materials, the food material whose image feature amount is closest, and the nutrient amount calculation means calculates the nutrient amount of the food material by multiplying the nutrient amount per unit amount of the selected food material by the food weight data.
6. The nutrient amount calculation device according to claim 5, wherein the type of the food material and the image feature amount selected by the food type estimation means are added to the food list.
7. The nutrient amount calculation device according to any one of claims 1 to 6, wherein the food photographing means acquires a plurality of pieces of food image data by photographing each food material placed in sequence on the food weighing means every time a food material is placed, and the food type estimation means identifies the image portion of a newly added food material by taking the difference between the latest food image data and the previously photographed food image data, and calculates the image feature amount from that image portion.
8. The nutrient amount calculation device according to any one of claims 1 to 7, wherein the food type estimation means and the nutrient amount calculation means are realized as functions of a mobile terminal.
9. A refrigerator comprising the nutrient amount calculation device according to any one of claims 1 to 8.
PCT/JP2015/004290 2014-08-26 2015-08-26 Nutrient quantity calculating device and refrigerator provided with same WO2016031246A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2015543170A JP6577365B2 (en) 2014-08-26 2015-08-26 Nutrient amount calculation device and refrigerator equipped with the same
CN201580045903.4A CN107077709B (en) 2014-08-26 2015-08-26 Nutrient amount calculating device and refrigerator having the same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-171111 2014-08-26
JP2014171111 2014-08-26

Publications (1)

Publication Number Publication Date
WO2016031246A1 true WO2016031246A1 (en) 2016-03-03

Family

ID=55399159

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/004290 WO2016031246A1 (en) 2014-08-26 2015-08-26 Nutrient quantity calculating device and refrigerator provided with same

Country Status (3)

Country Link
JP (1) JP6577365B2 (en)
CN (1) CN107077709B (en)
WO (1) WO2016031246A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107682450A (en) * 2017-10-27 2018-02-09 上海京颐科技股份有限公司 The monitoring method and device, storage medium, terminal of food intake
CN109725117A (en) * 2017-10-31 2019-05-07 青岛海尔智能技术研发有限公司 The method and device that foodstuff calories detect in refrigerator
CN108332504B (en) * 2017-12-08 2020-11-06 青岛海尔智能技术研发有限公司 Food heat detection method and refrigerator
KR20220037631A (en) * 2020-09-18 2022-03-25 삼성전자주식회사 Image display apparutus and controlling method thereof


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140149937A1 (en) * 2012-11-26 2014-05-29 University Of Birmingham Visual meal creator
CN103888549B (en) * 2014-04-19 2017-04-19 顾坚敏 Cloud and intelligent terminal based nutrition and life management system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0692338A (en) * 1992-09-03 1994-04-05 Ishida Co Ltd Commodity information processor
JP2002318060A (en) * 2001-04-20 2002-10-31 Hitachi Ltd Refrigerator with foodstuff controlling function
WO2009031190A1 (en) * 2007-09-03 2009-03-12 Shimadzu Corporation Electronic balance
WO2010070645A1 (en) * 2008-12-17 2010-06-24 Omer Einav Method and system for monitoring eating habits
JP2012112855A (en) * 2010-11-26 2012-06-14 Akira Yamada Portable scale capable of managing dietary intake weight, and intake weight data management system using digital photograph, cellular phone and it

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017187285A (en) * 2016-04-01 2017-10-12 東芝テック株式会社 Weighing system, and sales data processing device
WO2017190678A1 (en) * 2016-05-06 2017-11-09 Yuen Cheuk Ho An apparatus for receiving a container
WO2018008686A1 (en) * 2016-07-06 2018-01-11 株式会社EggStellar Management system for managing nutritional component in meal
CN109631484A (en) * 2017-10-06 2019-04-16 松下知识产权经营株式会社 Freezer
JP2019101736A (en) * 2017-12-01 2019-06-24 トヨタホーム株式会社 Cooking content discrimination system and intake content discrimination system
JP7064853B2 (en) 2017-12-01 2022-05-11 トヨタホーム株式会社 Cooking content discrimination system and intake content discrimination system
JP2019168134A (en) * 2018-03-22 2019-10-03 三菱電機株式会社 Refrigerator system
JP7040193B2 (en) 2018-03-22 2022-03-23 三菱電機株式会社 Refrigerator system
JP2020166508A (en) * 2019-03-29 2020-10-08 株式会社日立ソリューションズ・クリエイト Food material management support system and food material management support method
JP7211677B2 (en) 2019-03-29 2023-01-24 株式会社日立ソリューションズ・クリエイト Food material management support system and food material management support method
CN111503990A (en) * 2020-04-10 2020-08-07 海信集团有限公司 Refrigerator and food material identification method
JP7386433B2 (en) 2020-06-22 2023-11-27 パナソニックIpマネジメント株式会社 program

Also Published As

Publication number Publication date
CN107077709A (en) 2017-08-18
JPWO2016031246A1 (en) 2017-06-08
JP6577365B2 (en) 2019-09-18
CN107077709B (en) 2021-06-25


Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2015543170

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15835548

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15835548

Country of ref document: EP

Kind code of ref document: A1