WO2012111209A1 - Management device, management system, and display method - Google Patents

Management device, management system, and display method

Info

Publication number
WO2012111209A1
WO2012111209A1 (PCT/JP2011/076928)
Authority
WO
WIPO (PCT)
Prior art keywords
meal
information
meal information
weight
unit
Prior art date
Application number
PCT/JP2011/076928
Other languages
French (fr)
Japanese (ja)
Inventor
秀武 大島
Original Assignee
オムロンヘルスケア株式会社 (Omron Healthcare Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オムロンヘルスケア株式会社 (Omron Healthcare Co., Ltd.)
Publication of WO2012111209A1 publication Critical patent/WO2012111209A1/en

Links

Images

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/60 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to nutrition control, e.g. diets

Definitions

  • the present invention relates to a management device and a management system for managing meal contents, and a display method in the management system.
  • Patent Document 1 discloses a system in which a server acquires meal contents as image data, analyzes the image data, and presents improvement advice and the like.
  • In such a system, the nutritional amount and nutritional balance based on the image data are either input by an advisor, such as a registered dietitian, who has viewed the image, or calculated automatically by the server once the advisor determines and inputs the dishes and quantities; in either case, a judgment by the advisor is required.
  • The present invention has been made in view of such problems, and its object is to provide a management device, a management system, and a display method capable of performing meal management suited to individual users without requiring complicated processing.
  • The management device is a management device for managing meal information, and includes an input unit for inputting the user's meal information and weight together with the date and time, a processing unit for processing the meal information and weight, and an output unit for outputting the information processed by the processing unit.
  • The processing unit executes a process of associating the meal information of a unit period with the user's weight change during that unit period and storing them in a storage device, and a process of outputting, according to the user's weight change during the unit period, meal contents based on the meal information stored in association with that weight change.
  • Preferably, the unit period is one day. In the process of outputting meal contents, the processing unit classifies meal information associated with a one-day weight change larger than a threshold value as first meal information and meal information associated with a one-day weight change smaller than the threshold value as second meal information, and outputs the meal contents based on each of the first and second meal information together with their classification.
  • the processing unit calculates the threshold value using the weight change of the user during a predetermined period.
  • the meal information includes a captured image obtained by capturing the meal, and the processing unit executes a process of outputting an image based on the captured image as the content of the meal.
  • According to another aspect, the management device is a management device for managing meal information and includes a first input unit for inputting the user's meal information and weight together with the date and time, a second input unit for inputting meal contents, a processing unit for processing the meal information and weight, and an output unit for outputting the information processed by the processing unit.
  • The processing unit executes a process of associating the meal information of a unit period with the user's weight change during that unit period and storing them in the storage device, and a process of outputting, as an estimated value of weight change, the user's weight change during the unit period that is stored in association with the meal information corresponding to the input meal contents.
  • The management system is a management system for managing meal information and includes a first input device for inputting meal contents together with the date and time, a second input device for inputting weight together with the date and time, a management device for processing the meal information and weight, and a display device.
  • The management device executes a process of associating the meal information of a unit period with the user's weight change during that unit period and storing them in a storage device, and a process of outputting to the display device, according to the user's weight change during the unit period, display data for displaying meal contents based on the meal information stored in association with that weight change.
  • The display method is a method of displaying meal information in a management system for managing meal information that includes a management device and a display device, and includes a step of inputting the user's meal information and weight together with the date and time, a step of classifying, based on the user's weight change during a unit period, the meal information of the unit period into first meal information whose weight change is larger than a threshold value and second meal information whose weight change is smaller than the threshold value, and a step of displaying the meal information of the unit period together with its classification as first meal information or second meal information.
  • According to the present invention, meals that tend to cause weight gain and meals that do not can be presented to an individual user without requiring complicated processing, so that meal management suited to each user can be performed.
  • FIG. 1 is a diagram showing a specific example of the configuration of a meal information management system (hereinafter referred to as a management system) 1 according to the present embodiment.
  • The management system 1 includes a management device 100, a weight scale 200 connected to the management device 100, a camera 300, an input device 400, and a display device 500 connected to the management device 100.
  • The management system according to the first embodiment described later is referred to as management system 1A, that according to the second embodiment as management system 1B, and that according to the third embodiment as management system 1C; these are representatively referred to as management system 1. The system configuration is common to the management systems 1A to 1C.
  • the management device 100 may be a device having a communication function, and may be configured by, for example, a general personal computer.
  • the weight scale 200, the camera 300, the input device 400, and the display device 500 all have a communication function capable of communicating with the management device 100.
  • these devices and the management device 100 may be directly connected to each other through a dedicated line or a wireless line, or may be communicated via the Internet.
  • these devices communicate with the management device 100 via the Internet. Therefore, the communication function of these devices corresponds to the function of connecting to the Internet and communicating with the management device 100 using the access information of the management device 100 on the Internet.
  • the input device 400 may be any device as long as it has a function for mainly accepting input of text data and transmitting it to the management device 100.
  • a mobile phone or a general personal computer can be cited.
  • an email having the text data as a body may be transmitted from the input device 400 to the management device 100.
  • the scale 200 has a function of measuring the body weight and transmitting the measured value to the management apparatus 100.
  • the camera 300 has a function for capturing an image and transmitting the captured image to the management apparatus 100.
  • an electronic mail in which information to be transmitted is attached data may be transmitted from these apparatuses to the management apparatus 100.
  • The display device 500 may be any device having a function of receiving display data expressed in, for example, HTML (HyperText Markup Language) from the management device 100 and executing display processing based on that display data. As an example, a mobile phone or a general personal computer can be cited.
  • the weight scale 200, the camera 300, the input device 400, and the display device 500 are different devices, that is, constituted by separate devices.
  • However, the camera 300 and the input device 400 may be an integrated device, or the display unit of a camera-equipped input device 400 may be used as the display device 500.
  • While the management system 1 of the present embodiment includes a weight scale 200 having a communication function, the input device 400 may instead receive input of the weight measurement value and transmit that information to the management device 100.
  • the management device 100 is a device different from any of the weight scale 200, the camera 300, the input device 400, and the display device 500.
  • the management apparatus 100 may be included in any of these apparatuses.
  • <Device configuration> FIGS. 2 to 5 are diagrams showing specific examples of the device configurations of the weight scale 200, the camera 300, the input device 400, and the management device 100, respectively.
  • Referring to FIG. 2, the weight scale 200 includes a CPU (Central Processing Unit) 20 for overall control, a memory 21 for storing programs executed by the CPU 20, measurement values, and the like, an operation button 22 for receiving user operation input, a measuring unit 23 for measuring body weight, and a communication unit 24 that stores the Internet access destination of the management device 100 in advance and communicates with the management device 100 via the Internet.
  • the CPU 20 causes the measurement unit 23 to execute processing for measuring body weight in accordance with an operation signal based on an operation instructing measurement start from the operation button 22.
  • the measured value is temporarily stored in a predetermined area of the memory 21.
  • Further, in response to an operation signal based on an operation instructing transmission from the operation button 22, the CPU 20 reads the measurement value from that area of the memory 21 and causes the communication unit 24 to transmit it to the management device 100.
  • Referring to FIG. 3, the camera 300 includes a CPU 30 for overall control, a memory 31 for storing programs executed by the CPU 30 and captured images, operation buttons 32 for receiving user operation input, an image capturing unit 33 for capturing images, and a communication unit 34 that stores the Internet access destination of the management apparatus 100 in advance and communicates with the management apparatus 100 via the Internet.
  • The CPU 30 causes the image capturing unit 33 to execute a capturing process in response to an operation signal based on an operation instructing the start of capturing from the operation buttons 32.
  • The captured image is temporarily stored in a predetermined area of the memory 31. Further, in response to an operation signal based on an operation instructing transmission of a captured image from the operation buttons 32, the CPU 30 reads the captured image from that area of the memory 31 and causes the communication unit 34 to transmit it to the management apparatus 100.
  • Referring to FIG. 4, the input device 400 includes a CPU 40 for overall control, a memory 41 for storing programs executed by the CPU 40, input information, and the like, operation buttons 42 including alphabet and numeric keys for receiving user operation input, and a communication unit 44 that stores the Internet access destination of the management apparatus 100 in advance and communicates with the management apparatus 100 via the Internet.
  • In accordance with an operation signal based on an information input operation from the operation buttons 42, the CPU 40 specifies the information obtained by converting the operation signal as input information and stores it in a predetermined area of the memory 41. Further, in response to an operation signal based on an operation instructing transmission of the input information from the operation buttons 42, the CPU 40 reads the input information from that area of the memory 41 and causes the communication unit 44 to transmit it to the management apparatus 100.
  • Referring to FIG. 5, the management device 100 includes a CPU 10 for overall control, a memory 11 for storing programs executed by the CPU 10, and a communication unit 14 for communicating with the other devices via the Internet.
  • the memory 11 stores a first database 111 described later.
  • FIG. 6 is a diagram for explaining a flow of management of meal information in the management system 1A.
  • The user measures weight in the morning and evening using the weight scale 200, and the measured values are transmitted to the management apparatus 100. That is, when the user instructs the weight scale 200 to measure in a time zone preliminarily designated as "morning" (for example, 6:00 am to 9:00 am) (step S11-1), the weight measurement is performed (step S11-2), and the result is transmitted to the management apparatus 100 together with the measurement date and time (step S11-3).
  • the weight measured in the above time zone is also referred to as “morning weight”.
  • Similarly, when the user instructs the weight scale 200 to measure in a time zone previously defined as "evening" (for example, 8:00 pm to 11:00 pm) (step S12-1), the weight measurement is performed (step S12-2), and the result is transmitted to the management apparatus 100 together with the measurement date and time (step S12-3).
  • the weight measured in the above time zone is also referred to as “evening weight”.
  • The user photographs the contents of each meal to be eaten with the camera 300 during the period between the morning and evening weight measurements, and transmits the image data to the management apparatus 100. That is, when the user instructs the camera 300 to take a picture during a time period preliminarily defined as "breakfast" (for example, from 6 am to 10 am) (step S21-1), the breakfast is photographed (step S21-2), and the photographed image is transmitted to the management apparatus 100 together with the photographing date and time (step S21-3).
  • Similarly, lunch is photographed and transmitted when the user instructs the camera 300 during a time zone pre-defined as "lunch" (for example, from 10 am to 2 pm), and dinner during a time zone pre-defined as "dinner" (for example, from 5 pm to 9 pm).
  • In addition, the user may input a comment or the like on the meal using the input device 400 and transmit it to the management device 100.
  • the management apparatus 100 calculates the weight difference between the morning weight and the evening weight for each measurement date from the measured weight value transmitted in association with the measurement date and time from the scale 200 (step S31).
  • the captured image from the camera 300 is stored as meal information in association with the shooting date and time. If there is input information such as a comment input from the input device 400 in association with the captured image, the input information is also stored in the first database 111 of the memory 11 as meal information.
  • Based on the weight difference, the management apparatus 100 classifies the meal information into meals likely to cause weight gain and meals unlikely to cause weight gain (step S32).
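As an illustration of steps S31 and S32, the following is a minimal sketch, not the patented implementation; the field names, data layout, and the "B"/"G" labels (borrowed from the first display example) are assumptions for illustration.

```python
from collections import defaultdict

def daily_weight_difference(measurements):
    """measurements: list of (datetime, 'morning'|'evening', weight_kg).
    Returns {date: evening weight - morning weight} for days with both values."""
    by_date = defaultdict(dict)
    for ts, slot, weight in measurements:
        by_date[ts.date()][slot] = weight
    return {
        day: slots["evening"] - slots["morning"]
        for day, slots in by_date.items()
        if "morning" in slots and "evening" in slots
    }

def classify_meals(weight_diffs, threshold):
    """Label each day's meals: 'B' = likely to cause weight gain, 'G' = unlikely."""
    return {day: "B" if diff > threshold else "G" for day, diff in weight_diffs.items()}
```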
  • To view the results, the user performs a display instruction operation on the display device 500; a specific instruction method includes, for example, accessing a preset Web site and performing a request operation. In response, the display device 500 requests display data from the management device 100 (step S41-2).
  • Upon receiving the request, the management apparatus 100 generates display data that presents photographed images of meal information classified as meals likely to cause weight gain and photographed images of meal information classified as meals unlikely to cause weight gain (step S33-1), and transmits the display data to the requesting display device 500 (step S33-2).
  • If the meal information includes user information or the like registered in advance, user authentication may be performed using the information included in the request from the display device 500 and that user information, and the display data may be transmitted only when the authentication succeeds.
  • The display device 500 executes display processing based on the display data and performs screen display (step S42). On the screen, photographed images of meals classified as likely to cause weight gain and photographed images of meals classified as unlikely to cause weight gain are displayed.
  • As described above, the weight scale 200, the camera 300, and the input device 400 each have a function of acquiring information (a measurement value, a captured image, or input information) in accordance with a user operation and a function of transmitting that information to the management apparatus 100. These functions are realized mainly by the CPU of each device reading and executing the program stored in its memory.
  • The display device 500 has a function of transmitting a predetermined request to the management device 100 in accordance with a display operation from the user, a function of receiving the display data transmitted from the management device 100 in response to the request, and a function of executing processing for screen display based on that display data. These functions are realized mainly by a CPU (not shown) included in the display device 500 reading and executing a program stored in a memory.
  • FIG. 7 is a block diagram illustrating a specific example of a functional configuration of the management apparatus 100 included in the management system 1A.
  • Each function shown in FIG. 7 is mainly realized by the CPU 10 when the CPU 10 reads and executes a program stored in the memory 11.
  • at least a part may be configured by hardware such as an electric circuit.
  • Referring to FIG. 7, the CPU 10 of the management apparatus 100 includes an image input unit 101 for receiving input of a photographed image from the camera 300 via the communication unit 14, a text input unit 102 for receiving input of text information (input information) from the input device 400 via the communication unit 14, a storage unit 103 for storing the photographed image, associated with the photographing date and time (and with the input information, if any), as meal information in the first database 111 held in the memory 11, a measurement value input unit 104 for receiving input of a measurement value from the scale 200 via the communication unit 14, a calculation unit 105 for identifying the morning weight and the evening weight of each measurement date from the information specifying the measurement date and time associated with the measurement value and calculating the weight difference between them for each measurement date, a classification unit 106 for classifying, based on the weight difference, the meal information of each photographing date corresponding to a measurement date as a meal likely or unlikely to cause weight gain, a request unit 107 for receiving a display request from the display device 500, a reading unit 108 for reading meal information from the first database 111 in response to the request, and a generation unit 109 for generating display data based on the read meal information.
  • The classification unit 106 classifies each meal as likely or unlikely to cause weight gain based on the weight difference between the morning weight and the evening weight on the measurement date corresponding to the photographing date. Information specifying the classification, together with the calculated weight difference, is further included in the meal information and stored in the first database 111.
  • As a specific classification method, for example, the classification unit 106 may store a threshold value in advance, compare the weight difference with the threshold value, and classify the meal as likely to cause weight gain when the weight difference exceeds the threshold value and as unlikely to cause weight gain when it is smaller than the threshold value. As another example, the classification unit 106 may calculate the average of the user's weight differences over a predetermined period and classify the meal as likely to cause weight gain when the weight difference exceeds that average value by more than a predetermined amount (for example, 0.2 kg) and as unlikely to cause weight gain when it is smaller than that.
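A minimal sketch of the second classification method, assuming the threshold is derived from the user's own recent weight differences; the 0.2 kg margin mirrors the example figure given above, and the function name is illustrative.

```python
from statistics import mean

def personal_threshold(recent_weight_diffs, margin_kg=0.2):
    """Threshold = average daily weight difference over a predetermined period + margin."""
    return mean(recent_weight_diffs) + margin_kg

# Usage: a day is classified as likely to cause weight gain when its
# morning-to-evening weight difference exceeds this personalised threshold.
# threshold = personal_threshold([0.1, 0.4, -0.2, 0.3])
```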
  • In response to a request from the display device 500, the reading unit 108 may, as an example, read out, from the meal information for a predetermined period (number of days) preceding the reception of the request, the meal information classified as meals likely to cause weight gain and the meal information classified as meals unlikely to cause weight gain. Display data generated from the read information and shown on the screen lets the user see the tendency of recent meals.
  • This display example is a first display example.
  • As another example, the reading unit 108 may refer to the weight difference between the morning weight and the evening weight included in each piece of meal information, and read a predetermined number of pieces of meal information in descending order of weight difference as meal information classified as meals likely to cause weight gain, and a predetermined number of pieces in ascending order of weight difference as meal information classified as meals unlikely to cause weight gain. Display data generated from the read information and shown on the screen lets the user see a ranking of the meals most likely to cause weight gain and a ranking of the meals least likely to do so.
  • Alternatively, the reading unit 108 may read the meal information for the predetermined period (number of days) preceding the reception of the request, and the generation unit 109 may extract the meal information according to the weight difference before generating the display data. This display example is referred to as the second display example.
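A minimal sketch of the ranking described for the second display example; the MealRecord fields are illustrative assumptions, not names used in the patent.

```python
from dataclasses import dataclass

@dataclass
class MealRecord:
    date: str            # measurement/photographing date
    image_path: str      # captured image of the meal
    weight_diff: float   # evening weight minus morning weight (kg)

def rankings(records, n=5):
    """Return (likely-to-gain ranking, unlikely-to-gain ranking), each of length n."""
    by_diff = sorted(records, key=lambda r: r.weight_diff, reverse=True)
    return by_diff[:n], list(reversed(by_diff))[:n]
```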
  • FIG. 8 is a flowchart showing a specific example of the flow of operations in the management apparatus 100 included in the management system 1A.
  • the operation shown in the flowchart of FIG. 8 is realized by the CPU 10 reading and executing a program stored in the memory 11.
  • In step S103, upon receiving a measurement value, the CPU 10 determines the measurement date from the measurement date and time information associated with the measurement value, compares the measurement time with the pre-defined time zones to identify whether the value is the morning weight or the evening weight, and stores the measurement value together with that information.
  • Similarly, in step S107, upon receiving a photographed image, the CPU 10 specifies the photographing date from the photographing date and time information associated with the image, compares the photographing time with the pre-defined time zones to identify whether the meal is breakfast, lunch, or dinner, and stores the photographed image as meal information together with that information.
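A minimal sketch of the time-zone comparison used in steps S103 and S107; the hour ranges are the examples given in the text, and the helper name is an assumption.

```python
from datetime import time

MEAL_SLOTS = {
    "breakfast": (time(6, 0), time(10, 0)),
    "lunch": (time(10, 0), time(14, 0)),
    "dinner": (time(17, 0), time(21, 0)),
}
WEIGHT_SLOTS = {
    "morning": (time(6, 0), time(9, 0)),
    "evening": (time(20, 0), time(23, 0)),
}

def slot_of(timestamp, slots):
    """Return the slot name whose window contains the timestamp, or None."""
    for name, (start, end) in slots.items():
        if start <= timestamp.time() < end:
            return name
    return None
```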
  • In step S109, the CPU 10 calculates the weight difference between the morning weight and the evening weight for each measurement date. The process of step S109 may be performed when the received measurement value is identified as the evening weight in step S103, at a predetermined time (for example, midnight), or when a display request described later is received.
  • In step S111, based on the weight difference between the morning and evening weights calculated in step S109, the CPU 10 classifies the meal information associated with the photographing date corresponding to that measurement date as a meal likely or unlikely to cause weight gain.
  • the CPU 10 further stores information specifying the classification and the weight difference together with the meal information.
  • When a display request is received from the display device 500 via the communication unit 14 (YES in step S113), the CPU 10 reads the meal information from the first database 111 in step S115 and generates display data corresponding to the classification in step S117.
  • In step S117, display data such as that described below as the first display example and the second display example is generated.
  • In step S119, the CPU 10 transmits the generated display data to the display device 500.
  • meal information including captured images of daily meals is classified according to the weight difference between the morning and evening weights of the day, and displayed on the display device 500 based on the classification.
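As an illustration of the kind of display data generated in step S117, here is a minimal sketch that builds simple HTML grouping each day's meal image with its classification label, roughly in the spirit of the first display example; the record fields and markup are assumptions, not the patent's actual format.

```python
def build_display_html(records):
    """records: iterable of dicts with 'date', 'image_path', 'label' ('B' or 'G')."""
    rows = [
        f"<div><span>{r['date']}</span>"
        f"<img src='{r['image_path']}' alt='meal'/>"
        f"<strong>{r['label']}</strong></div>"
        for r in sorted(records, key=lambda r: r["date"])
    ]
    return "<html><body>" + "".join(rows) + "</body></html>"
```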
  • FIG. 9 is a diagram illustrating a first specific example of the first display example.
  • In the first specific example, for each day, a meal image whose photographing date falls on that day is displayed as the meal information for that day. In the example of FIG. 9, only one meal is displayed, but images of three meals (or four or more, including snacks) may be displayed on one screen or scrolled and displayed on other screens. Further, the classification result, that is, whether the meal is likely or unlikely to cause weight gain, is displayed. In the example of FIG. 9, the classification result is displayed as "B" for a meal likely to cause weight gain and "G" for a meal unlikely to cause weight gain.
  • FIG. 10 is a diagram showing a second specific example of the first display example.
  • The example of FIG. 10 shows only the captured-image and classification-result portion of a display screen given as the first display example. In this example, images of three meals are displayed side by side on one screen for each measurement date, together with the weight difference between the morning weight and the evening weight for that day. Further, as the classification result, the three meals of each measurement date are displayed surrounded by a frame corresponding to whether the meals are classified as likely or unlikely to cause weight gain; for example, a thick double-line frame is displayed for meals likely to cause weight gain, and a dotted-line frame for meals unlikely to cause weight gain.
  • the measurement value is transmitted to the management apparatus 100 using the weight scale 200, and the captured image is transmitted to the management apparatus 100 using the camera 300.
  • the corresponding meal information may be input as text information representing a menu using the input device 400 and transmitted to the management device 100, for example.
  • input text information may be displayed instead of the captured image.
  • Similarly, the measurement value may be input as text information using the input device 400 and transmitted to the management device 100.
  • FIG. 11 is a diagram illustrating a specific example of the second display example.
  • In the second display example, the meal information for each measurement date within a predetermined period is read out in order of decreasing or increasing weight difference between the morning weight and the evening weight, and is displayed.
  • In the example of FIG. 11, as a "ranking of menus likely to cause weight gain", captured images of the photographing dates corresponding to measurement dates with large weight differences are displayed in descending order of weight difference, and as a "ranking of menus unlikely to cause weight gain", captured images of the photographing dates corresponding to measurement dates with small weight differences are displayed in ascending order of weight difference.
  • By such displays, among the user's own meals, the meal contents of days with a large daily weight difference can be grasped as meals likely to cause weight gain, and the meal contents of days with a small weight difference as meals unlikely to cause weight gain. That is, the user can see at a glance which of his or her meals tend to increase the daily weight difference and which do not, and can therefore easily manage his or her own meals.
  • FIG. 12 is a diagram for explaining a flow of management of meal information in the management system 1B.
  • The measurement operation with the weight scale 200 is the same as the operation in the management system 1A shown in FIG. 6.
  • Compared with the management flow in the management system 1A shown in FIG. 6, in the management system 1B the user not only transmits a captured image of each meal but also inputs a keyword for the meal content using the input device 400 (steps S51-1, S52-1, and S53-1), and the keyword is transmitted to the management apparatus 100.
  • Keywords include, for example, ingredients included in the meal content, menu genres, and the like.
  • the keyword may be directly input by a character button constituting the operation button 42, or may be selected from options prepared in advance.
  • The management apparatus 100 also stores the keywords as meal information together with the captured images. Further, upon receiving the display request (step S41-2) from the display device 500, the management device 100 extracts meal information to substitute for the meal information classified as likely to cause weight gain (step S33-0), and generates display data that uses the classification results and presents a captured image included in the extracted meal information as an alternative menu (step S33-1).
  • The display device 500 executes display processing based on the display data and performs screen display (step S42). As in the management system 1A, photographed images of meals classified as likely to cause weight gain and photographed images of meals classified as unlikely to cause weight gain are displayed on the screen, and in addition the photographed image included in the meal information extracted as a substitute for a meal classified as likely to cause weight gain is displayed.
  • FIG. 13 is a block diagram illustrating a specific example of a functional configuration of the management apparatus 100 included in the management system 1B.
  • Each function shown in FIG. 13 is mainly formed by the CPU 10 when the CPU 10 reads and executes a program stored in the memory 11.
  • at least a part may be configured by hardware such as an electric circuit.
  • In addition to the functional configuration of the management device 100 included in the management system 1A shown in FIG. 7, the CPU 10 of the management device 100 further includes an extraction unit 110 for extracting the meal information to be used as a substitute.
  • the memory 11 stores a second database 112.
  • FIG. 14 is a diagram showing a specific example of information described in the second database 112. As shown in FIG. 14, in the second database 112, information such as materials, categories, and calories is described for each menu and stored in the memory 11 in advance.
  • As a first example, the extraction unit 110 refers to the meal information classified by the classification unit 106 as likely to cause weight gain and extracts the keywords representing the category and materials of the menu included in that meal information.
  • Based on the keywords, the extraction unit 110 refers to the second database 112 and reads the calories of the menu. Then, from the second database 112, the extraction unit 110 extracts, as a substitute menu, another menu that has fewer calories than the menu and that has the same category as the menu or overlapping materials.
  • In this way, a menu with fewer calories than the ingested meal can be extracted as an alternative menu based on a pre-defined database.
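A minimal sketch of this first extraction method; the second database is modelled as a dict keyed by menu name, and the field names and sample entries are assumptions for illustration only.

```python
SECOND_DATABASE = {
    "fried chicken set": {"category": "fried", "materials": {"chicken", "oil"}, "calories": 900},
    "grilled chicken set": {"category": "grilled", "materials": {"chicken"}, "calories": 550},
    "tempura soba": {"category": "noodles", "materials": {"buckwheat", "shrimp", "oil"}, "calories": 700},
}

def alternative_menus(menu_name, db=SECOND_DATABASE):
    """Menus with fewer calories that share the category or some materials."""
    base = db[menu_name]
    return [
        name for name, info in db.items()
        if name != menu_name
        and info["calories"] < base["calories"]
        and (info["category"] == base["category"] or info["materials"] & base["materials"])
    ]
```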
  • As a second example, the extraction unit 110 refers to the meal information classified by the classification unit 106 as likely to cause weight gain and extracts a keyword representing the category or materials of the menu included in that meal information.
  • Based on the keyword, the extraction unit 110 then refers to the first database 111 and extracts, as a substitute menu, a menu from the meal information of another measurement date that is associated with a weight difference smaller than the weight difference of the measurement date corresponding to the photographing date, and that has the same category as the menu or overlapping materials.
  • FIG. 15 is a flowchart showing a specific example of the operation flow in the management apparatus 100 included in the management system 1B. The operation shown in the flowchart of FIG. 15 is also realized by the CPU 10 reading and executing a program stored in the memory 11.
  • step S101 to S115 operations similar to those of the management apparatus 100 included in the management system 1A shown in FIG. 8 are performed from step S101 to S115.
  • The CPU 10 reads the meal information from the first database 111 in step S115 and, in step S116, performs the processing described above to extract an alternative menu for the meal information classified as meals likely to cause weight gain. The CPU 10 then generates display data that also presents the alternative menu and transmits it to the display device 500.
  • an alternative menu is displayed in addition to the screen contents represented by the display data in the management apparatus 100 included in the management system 1A.
  • As a display method, the alternative menu may be displayed on the screens as shown in FIGS. 9 to 11, or a button or the like for instructing display of the alternative menu may be displayed on those screens and the alternative menu may be displayed on a screen that is switched to when the button is pressed.
  • When the alternative menu is extracted from the first database 111 as described above, the alternative menu may be displayed using the captured image included in that meal information.
  • Both the weight change prediction operation and the menu suggestion operation executed in the management system 1C are performed when the management device 100 receives, from the input device 400, the input of a keyword representing a menu, materials, a meal category, or the like, together with a display request from the display device 500.
  • When executing the weight change prediction operation, before ingesting a meal or when transmitting a captured image, the user inputs a menu, materials, a meal category, or the like as text information using the input device 400, and this is transmitted to the management apparatus 100 as input information. Furthermore, a request for prediction of the weight change is transmitted from the display device 500.
  • The management apparatus 100 refers to the meal information stored in the first database 111 that includes the keyword, reads the weight change of the day represented by that meal information, and outputs the result as a predicted value of the weight change. By displaying this information on the display device 500, the user can know the predicted change from the morning weight that would result from ingesting a meal corresponding to the input menu, materials, meal category, or the like.
  • When executing the menu suggestion operation, before taking a meal or transmitting a photographed image, the user decides on materials, a meal category, or the like, transmits them as keywords to the management device 100, and transmits a menu suggestion request from the display device 500.
  • From the meal information including the keyword, the management apparatus 100 extracts meal information whose weight difference indicates a meal unlikely to cause weight gain for the user, and outputs it as a suggested menu.
  • By displaying this information on the display device 500, the user can know menus that match the input materials, meal category, and the like and that were classified as meals unlikely to cause weight gain among his or her previous meals.
  • The functional configuration of the management apparatus 100 included in the management system 1C for performing the above operations is the same as that of the management apparatus 100 included in the management system 1B shown in FIG. 13. That is, the second database 112 is stored in the memory 11, and an extraction unit 110 is included in addition to the functional configuration of the management device 100 of the management system 1A.
  • As a first example, the extraction unit 110 refers to the second database 112 using the keyword from the input device 400 and extracts menus including the keyword. The extraction unit 110 then refers to the first database 111 using those menus, extracts the meal information including the menus stored in the first database 111, and reads the weight difference included in that meal information. Alternatively, the extraction unit 110 refers to the first database 111 using the menus and extracts, from the meal information including the menus stored in the first database 111, the meal information classified as meals unlikely to cause weight gain.
  • As a second example, when the meal information stored in the first database 111 itself includes keywords representing the meal materials or meal category input from the input device 400, the extraction unit 110 refers to the first database 111 using the keyword from the input device 400 and reads the weight difference included in the meal information containing that keyword. Alternatively, the extraction unit 110 refers to the first database 111 using the keyword from the input device 400 and extracts, from the meal information containing that keyword, the meal information classified as meals unlikely to cause weight gain.
  • the management apparatus 100 included in the management system 1C further performs a weight change prediction operation and a menu suggestion operation.
  • FIG. 16 is a flowchart showing a specific example of a flow of a weight change prediction operation in the management apparatus 100 included in the management system 1C.
  • the operation shown in the flowchart of FIG. 16 is also realized by the CPU 10 reading and executing a program stored in the memory 11.
  • the weight change predicting operation shown in the flowchart of FIG. 16 is a specific example of the operation when the CPU 10 realizes the function shown in the first example by the extracting unit 110.
  • When the CPU 10 receives a keyword from the input device 400 via the communication unit 14 and receives a display request from the display device 500 (YES in step S201), in step S203 the CPU 10 refers to the second database 112 using the keyword and extracts menus including the keyword from the second database 112.
  • In step S205, the CPU 10 refers to the first database 111, extracts from it the meal information including the menus extracted in step S203, and reads the weight difference contained in each piece of extracted meal information.
  • In step S209, the CPU 10 generates display data presenting a predicted value of the weight change based on the weight differences that were read, and transmits it to the display device 500 in step S211.
  • In step S209, the CPU 10 performs an operation such as calculating the average of the weight differences and using it as the predicted value, or using the largest of the weight differences as the predicted value.
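A minimal sketch of turning the weight differences of past matching days into a single predicted value, as in step S209; whether the average or the maximum is used is the design choice noted above, and the function name is an assumption.

```python
def predict_weight_change(matching_weight_diffs, use_max=False):
    """matching_weight_diffs: weight differences of past days whose meal info matches the keyword."""
    if not matching_weight_diffs:
        return None  # no matching meal information stored yet
    if use_max:
        return max(matching_weight_diffs)
    return sum(matching_weight_diffs) / len(matching_weight_diffs)
```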
  • FIG. 17 is a flowchart showing a specific example of the menu proposal operation flow in the management apparatus 100 included in the management system 1C.
  • the operation shown in the flowchart of FIG. 17 is also realized by the CPU 10 reading and executing a program stored in the memory 11.
  • the menu suggestion operation shown in the flowchart of FIG. 17 is a specific example of the operation when the CPU 10 implements the function shown in the first example by the extraction unit 110.
  • When the CPU 10 receives a keyword from the input device 400 via the communication unit 14 and receives a display request from the display device 500 (YES in step S301), in step S303 the CPU 10 refers to the second database 112 using the keyword and extracts menus including the keyword from the second database 112.
  • In step S305, the CPU 10 refers to the first database 111 and extracts the meal information including the menus extracted in step S303.
  • If the meal information has been classified as a meal unlikely to cause weight gain by the operation described above (YES in step S307), in step S309 the CPU 10 identifies the menu represented by that meal information as a suggested menu.
  • The CPU 10 makes this determination for each piece of meal information extracted in step S305. When the determination has been made for all the extracted meal information (NO in step S311), in step S313 the CPU 10 generates display data based on the meal information identified as suggested menus in step S309 and transmits it to the display device 500 in step S315.
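A minimal sketch of the suggestion flow of steps S303 to S309, assuming the stored meal records carry the classification label described earlier; the function name and record fields are illustrative assumptions.

```python
def suggested_menus(keyword, menu_db, meal_records):
    """menu_db: {menu_name: {'materials': set, 'category': str, ...}};
    meal_records: iterable of dicts with 'menu' and 'label' ('B' or 'G')."""
    matching_menus = {
        name for name, info in menu_db.items()
        if keyword in name or keyword in info["materials"] or keyword == info["category"]
    }
    # Keep only past meals that match the keyword and were classified
    # as unlikely to cause weight gain ('G').
    return [r for r in meal_records if r["menu"] in matching_menus and r["label"] == "G"]
```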
  • In this way, the user can know the estimated change between the morning and evening weights that would result from ingesting a meal corresponding to the input menu, materials, meal category, or the like. The user can also review the menu of the meal to be taken; for example, the user can choose a meal unlikely to cause weight gain based on the desired materials, meal category, and the like. Therefore, the user's own meal management can be performed more appropriately.
  • a program for causing a computer to execute the operation of the management apparatus 100 can also be provided.
  • Such a program can be recorded on a computer-readable recording medium attached to the computer, such as a flexible disk, a CD-ROM (Compact Disc Read-Only Memory), a ROM (Read Only Memory), a RAM (Random Access Memory), or a memory card, and provided as a program product.
  • the program can be provided by being recorded on a recording medium such as a hard disk built in the computer.
  • a program can also be provided by downloading via a network.
  • The program as described above may be one that calls necessary modules, from among the program modules provided as part of the computer's operating system (OS), in a predetermined arrangement and at a predetermined timing to execute processing.
  • the program itself does not include the module, and the process is executed in cooperation with the OS.
  • a program that does not include such a module can also be included in the program according to the present invention.
  • the program according to the present invention may be provided by being incorporated in a part of another program. Even in this case, the program itself does not include the module included in the other program, and the process is executed in cooperation with the other program. Such a program incorporated in another program can also be included in the program according to the present invention.
  • the provided program product is installed in a program storage unit such as a hard disk and executed.
  • the program product includes the program itself and a recording medium on which the program is recorded.
  • 1, 1A, 1B, 1C management system; 10, 20, 30, 40 CPU; 11, 21, 31, 41 memory; 14, 24, 34, 44 communication unit; 22, 32, 42 operation buttons; 23 measurement unit; 33 image capturing unit; 100 management device; 101 image input unit; 102 text input unit; 103 storage unit; 104 measurement value input unit; 105 calculation unit; 106 classification unit; 107 request unit; 108 reading unit; 109 generation unit; 110 extraction unit; 111 first database; 112 second database; 200 weight scale; 300 camera; 400 input device; 500 display device.

Landscapes

  • Health & Medical Sciences (AREA)
  • Nutrition Science (AREA)
  • Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

A management device for managing meal information contains an input unit for inputting the meal information and weight of a user along with the time and date (S11-3, S12-3), a processing unit for processing the meal information and the weight, and an output unit for outputting the information processed by means of the processing unit (S33-2). The processing unit executes a process (S31, S32) for storing the meal information during a unit period in association with the change in the weight of the user during said unit period to a storing device, and a process (S33-1) for outputting, in accordance with the change in the weight of the user during the unit period, a meal content based on the meal information stored in the storing device in association with the change in weight.

Description

Management device, management system, and display method
The present invention relates to a management device and a management system for managing meal contents, and to a display method in the management system.
As a conventional meal management system, for example, Japanese Patent Laid-Open No. 2003-67497 (Patent Document 1) discloses a system in which a server acquires meal contents as image data, analyzes the image data, and presents improvement advice and the like.
Patent Document 1: JP 2003-67497 A
However, in a conventional meal management system such as that disclosed in Patent Document 1, the nutritional amount and nutritional balance based on the image data are either input by an advisor, such as a registered dietitian, who has viewed the image, or calculated automatically by the server once the advisor determines and inputs the dishes and quantities; in either case, a judgment by the advisor is required.
In addition, if the judgment by the advisor were made unnecessary, the server would need to perform complicated processing to analyze the image data, identify the dishes and quantities, and calculate the nutritional amount, nutritional balance, and the like based on that analysis.
Further, with a conventional meal management system such as that disclosed in Patent Document 1, although general meal guidance can be obtained, it is not guidance tailored to the individual user, so the user cannot always perform meal management suited to himself or herself.
The present invention has been made in view of such problems, and its object is to provide a management device, a management system, and a display method capable of performing meal management suited to individual users without requiring complicated processing.
In order to achieve the above object, according to one aspect of the present invention, a management device for managing meal information includes an input unit for inputting the user's meal information and weight together with the date and time, a processing unit for processing the meal information and weight, and an output unit for outputting the information processed by the processing unit. The processing unit executes a process of associating the meal information of a unit period with the user's weight change during that unit period and storing them in a storage device, and a process of outputting, according to the user's weight change during the unit period, meal contents based on the meal information stored in association with that weight change.
Preferably, the unit period is one day, and in the process of outputting meal contents the processing unit classifies meal information associated with a one-day weight change larger than a threshold value as first meal information and meal information associated with a one-day weight change smaller than the threshold value as second meal information, and outputs the meal contents based on each of the first and second meal information together with their classification.
More preferably, the processing unit calculates the threshold value using the user's weight changes over a predetermined period.
Preferably, the meal information includes a captured image obtained by photographing the meal, and the processing unit executes a process of outputting an image based on the captured image as the meal contents.
According to another aspect of the present invention, a management device for managing meal information includes a first input unit for inputting the user's meal information and weight together with the date and time, a second input unit for inputting meal contents, a processing unit for processing the meal information and weight, and an output unit for outputting the information processed by the processing unit. The processing unit executes a process of associating the meal information of a unit period with the user's weight change during that unit period and storing them in a storage device, and a process of outputting, as an estimated value of weight change, the user's weight change during the unit period stored in association with the meal information corresponding to the input meal contents.
According to still another aspect of the present invention, a management system for managing meal information includes a first input device for inputting meal contents together with the date and time, a second input device for inputting weight together with the date and time, a management device for processing the meal information and weight, and a display device. The management device executes a process of associating the meal information of a unit period with the user's weight change during that unit period and storing them in a storage device, and a process of outputting to the display device, according to the user's weight change during the unit period, display data for displaying meal contents based on the meal information stored in association with that weight change.
According to still another aspect of the present invention, a method of displaying meal information in a management system for managing meal information that includes a management device and a display device includes a step of inputting the user's meal information and weight together with the date and time, a step of classifying, based on the user's weight change during a unit period, the meal information of the unit period into first meal information whose weight change is larger than a threshold value and second meal information whose weight change is smaller than the threshold value, and a step of displaying the meal information of the unit period together with its classification as first meal information or second meal information.
According to the present invention, meals that tend to cause weight gain and meals that do not can be presented to an individual user without requiring complicated processing, so that meal management suited to each user can be performed.
FIG. 1 is a diagram showing a specific example of the configuration of the meal information management system according to the embodiments.
FIG. 2 is a diagram showing a specific example of the device configuration of the weight scale included in the meal information management system according to the embodiments.
FIG. 3 is a diagram showing a specific example of the device configuration of the camera included in the meal information management system according to the embodiments.
FIG. 4 is a diagram showing a specific example of the device configuration of the input device included in the meal information management system according to the embodiments.
FIG. 5 is a diagram showing a specific example of the device configuration of the management device included in the meal information management system according to the embodiments.
FIG. 6 is a diagram for explaining the flow of management of meal information in the meal information management system according to the first embodiment.
FIG. 7 is a block diagram showing a specific example of the functional configuration of the management device included in the meal information management system according to the first embodiment.
FIG. 8 is a flowchart showing a specific example of the flow of operations in the management device included in the meal information management system according to the first embodiment.
FIG. 9 is a diagram showing a first specific example of the first display example.
FIG. 10 is a diagram showing a second specific example of the first display example.
FIG. 11 is a diagram showing a specific example of the second display example.
FIG. 12 is a diagram for explaining the flow of management of meal information in the meal information management system according to the second embodiment.
FIG. 13 is a block diagram showing a specific example of the functional configuration of the management device included in the meal information management system according to the second embodiment.
FIG. 14 is a diagram showing a specific example of information described in the second database.
FIG. 15 is a flowchart showing a specific example of the flow of operations in the management device included in the meal information management system according to the second embodiment.
FIG. 16 is a flowchart showing a specific example of the flow of the weight change prediction operation in the management device included in the meal information management system according to the third embodiment.
FIG. 17 is a flowchart showing a specific example of the flow of the menu suggestion operation in the management device included in the meal information management system according to the third embodiment.
Embodiments of the present invention will be described below with reference to the drawings. In the following description, the same parts and components are denoted by the same reference numerals, and their names and functions are also the same.
<System configuration>
FIG. 1 is a diagram showing a specific example of the configuration of a meal information management system (hereinafter referred to as a management system) 1 according to the present embodiment.
Referring to FIG. 1, the management system 1 includes a management device 100, and a weight scale 200, a camera 300, an input device 400, and a display device 500 connected to the management device 100. The management system according to the first embodiment described later is referred to as management system 1A, the one according to the second embodiment as management system 1B, and the one according to the third embodiment as management system 1C; these are collectively referred to as the management system 1. The system configuration is common to the management systems 1A to 1C.
The management device 100 may be any device having a communication function and may be configured, for example, by a general personal computer.
The weight scale 200, the camera 300, the input device 400, and the display device 500 each have a communication function for communicating with the management device 100. They may communicate with the management device 100 directly over a dedicated line or a wireless link, or via the Internet. In the present embodiment, as an example, these devices communicate with the management device 100 via the Internet. Accordingly, the communication function of each device is the function of connecting to the Internet and communicating with the management device 100 using access information for the management device 100 on the Internet.
The input device 400 may be any device that mainly has the function of accepting input of text data and transmitting it to the management device 100. Examples include a mobile phone and a general personal computer. As one way of transmitting text data from the input device 400 to the management device 100 via the Internet, an e-mail containing the text data as its body may be sent from the input device 400 to the management device 100.
The weight scale 200 has the function of measuring body weight and transmitting the measured value to the management device 100. The camera 300 has the function of capturing an image and transmitting the captured image to the management device 100. As one way of transmitting information from these devices to the management device 100 via the Internet, an e-mail with the information to be transmitted attached as data may be sent from these devices to the management device 100.
The display device 500 may be any device that has the function of receiving display data expressed, for example, in HTML (Hyper Text Markup Language) from the management device 100 and executing display processing based on the display data. Examples include a mobile phone and a general personal computer.
In the example of FIG. 1, the weight scale 200, the camera 300, the input device 400, and the display device 500 are shown as separate devices. However, when a mobile phone with a camera function is used as the input device 400, for example, the camera 300 and the input device 400 may be a single integrated device, and the display unit of the input device 400 serving as the camera may be used as the display device 500.
In the example of FIG. 1, the management system 1 includes the weight scale 200 having a communication function. However, instead of using the weight scale 200, the input device 400 may accept input of the measured body weight and transmit that information to the management device 100.
Furthermore, in the example of FIG. 1, the management device 100 is a device distinct from the weight scale 200, the camera 300, the input device 400, and the display device 500. However, the management device 100 may be included in any one of these devices.
<Device configuration>
FIGS. 2 to 5 are diagrams showing specific examples of the device configurations of the weight scale 200, the camera 300, the input device 400, and the management device 100, respectively.
Referring to FIG. 2, the weight scale 200 includes a CPU (Central Processing Unit) 20 for performing overall control, a memory 21 for storing programs executed by the CPU 20, measured values, and the like, an operation button 22 for accepting user operation inputs, a measurement unit 23 for measuring body weight, and a communication unit 24 that stores in advance the access destination of the management device 100 on the Internet and communicates with the management device 100 via the Internet.
The CPU 20 causes the measurement unit 23 to execute processing for measuring body weight in response to an operation signal based on an operation of the operation button 22 instructing the start of measurement. The measured value is temporarily stored in a predetermined area of the memory 21. In response to an operation signal based on an operation of the operation button 22 instructing transmission of the measured value, the CPU 20 reads the measured value from that area of the memory 21 and causes the communication unit 24 to execute processing for transmitting the measured value to the management device 100.
Referring to FIG. 3, the camera 300 includes a CPU 30 for performing overall control, a memory 31 for storing programs executed by the CPU 30, captured images, and the like, an operation button 32 for accepting user operation inputs, an imaging unit 33 for capturing images, and a communication unit 34 that stores in advance the access destination of the management device 100 on the Internet and communicates with the management device 100 via the Internet.
The CPU 30 causes the imaging unit 33 to execute imaging processing in response to an operation signal based on an operation of the operation button 32 instructing the start of imaging. The captured image is temporarily stored in a predetermined area of the memory 31. In response to an operation signal based on an operation of the operation button 32 instructing transmission of the captured image, the CPU 30 reads the captured image from that area of the memory 31 and causes the communication unit 34 to execute processing for transmitting the captured image to the management device 100.
Referring to FIG. 4, the input device 400 includes a CPU 40 for performing overall control, a memory 41 for storing programs executed by the CPU 40, input information, and the like, an operation button 42, including alphabet keys and numeric keys, for accepting user operation inputs, and a communication unit 44 that stores in advance the access destination of the management device 100 on the Internet and communicates with the management device 100 via the Internet.
The CPU 40 identifies, as input information, the information obtained by converting an operation signal based on an information input operation of the operation button 42, and stores it in a predetermined area of the memory 41. In response to an operation signal based on an operation of the operation button 42 instructing transmission of the input information, the CPU 40 reads the input information from that area of the memory 41 and causes the communication unit 44 to execute processing for transmitting the input information to the management device 100.
Referring to FIG. 5, the management device 100 includes a CPU 10 for performing overall control, a memory 11 for storing programs executed by the CPU 10 and the like, and a communication unit 14 for communicating with other devices via the Internet. The memory 11 stores a first database 111, which will be described later.
[First Embodiment]
<Overview of operation>
The management of meal information in the management system 1A according to the first embodiment will be described. FIG. 6 is a diagram for explaining the flow of the management of meal information in the management system 1A.
Referring to FIG. 6, in the management system 1A the user measures his or her body weight in the morning and in the evening using the weight scale 200 and transmits the measured values to the management device 100. That is, when the user instructs the weight scale 200 to measure during a time zone defined in advance as "morning" (for example, 6:00 a.m. to 9:00 a.m.) (step S11-1), the body weight is measured (step S11-2), and the result is transmitted to the management device 100 together with the measurement date and time (step S11-3). In the following description, the weight measured during this time zone is also referred to as the "morning weight".
Similarly, when the user instructs the weight scale 200 to measure during a time zone defined in advance as "evening" (for example, 8:00 p.m. to 11:00 p.m.) (step S12-1), the body weight is measured (step S12-2), and the result is transmitted to the management device 100 together with the measurement date and time (step S12-3). In the following description, the weight measured during this time zone is also referred to as the "evening weight".
The user also photographs, with the camera 300, the contents of each meal taken during the period between the morning and evening weight measurements, and transmits the image data to the management device 100. That is, when the user instructs the camera 300 to photograph during a time zone defined in advance as "breakfast" (for example, 6:00 a.m. to 10:00 a.m.) (step S21-1), the breakfast is photographed (step S21-2), and the captured image is transmitted to the management device 100 together with the shooting date and time (step S21-3). Similarly, when the user instructs the camera 300 to photograph during a time zone defined in advance as "lunch" (for example, 10:00 a.m. to 2:00 p.m.) and a time zone defined in advance as "dinner" (for example, 5:00 p.m. to 9:00 p.m.) (steps S22-1 and S23-1), the lunch and the dinner are photographed (steps S22-2 and S23-2), and the captured images are transmitted to the management device 100 together with the shooting dates and times (steps S22-3 and S23-3).
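As a minimal illustration of how received timestamps could be mapped to the labels "morning weight", "evening weight", "breakfast", "lunch", and "dinner", the following Python sketch uses a table of hour windows. The window boundaries, dictionary layout, and function name are illustrative assumptions based on the example time zones above, not part of the original disclosure.

```python
from datetime import datetime
from typing import Dict, Optional, Tuple

# Example hour windows taken from the time zones mentioned above (assumed values).
WEIGHT_WINDOWS: Dict[str, Tuple[int, int]] = {
    "morning_weight": (6, 9),    # 6:00 a.m. - 9:00 a.m.
    "evening_weight": (20, 23),  # 8:00 p.m. - 11:00 p.m.
}
MEAL_WINDOWS: Dict[str, Tuple[int, int]] = {
    "breakfast": (6, 10),   # 6:00 a.m. - 10:00 a.m.
    "lunch": (10, 14),      # 10:00 a.m. - 2:00 p.m.
    "dinner": (17, 21),     # 5:00 p.m. - 9:00 p.m.
}

def label_timestamp(ts: datetime, windows: Dict[str, Tuple[int, int]]) -> Optional[str]:
    """Return the label whose hour window contains ts, or None if no window matches."""
    for label, (start, end) in windows.items():
        if start <= ts.hour < end:
            return label
    return None

# A measurement sent at 7:30 a.m. is treated as the morning weight,
# and a photograph taken at 12:15 p.m. as lunch.
print(label_timestamp(datetime(2011, 2, 14, 7, 30), WEIGHT_WINDOWS))  # morning_weight
print(label_timestamp(datetime(2011, 2, 14, 12, 15), MEAL_WINDOWS))   # lunch
```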
At that time, if the user has any comments, he or she may enter them using the input device 400 and transmit them to the management device 100.
From the measured weight values transmitted from the weight scale 200 in association with the measurement dates and times, the management device 100 calculates, for each measurement day, the weight difference between the morning weight and the evening weight (step S31).
The management device 100 also stores the captured images from the camera 300 as meal information in association with the shooting dates and times. If there is input information, such as a comment entered from the input device 400 in association with a captured image, that input information is also stored as part of the meal information in the first database 111 of the memory 11. When storing the meal information in the first database 111 (or after storing it), the management device 100 classifies the meal information, based on the weight difference, into "fattening" meals (meals that tend to cause weight gain) and "non-fattening" meals (meals that do not).
When the user wants to know the contents of the meals classified as fattening meals or non-fattening meals, the user instructs the display device 500 to display that information (step S41-1). A specific way of giving this instruction is, for example, to access a preset web site and perform an operation to request the information. In response to the instruction, the display device 500 requests display data from the management device 100 (step S41-2).
In response to the request, the management device 100 generates display data by incorporating the captured images of the meal information classified as fattening meals and the captured images of the meal information classified as non-fattening meals into a display data format prepared in advance (step S33-1), and transmits the display data to the display device 500 that issued the request. If the meal information includes user information or the like in advance, user authentication may be performed using that user information and information included in the request from the display device 500, and the display data may be transmitted only when the authentication succeeds.
The display device 500 executes display processing based on the display data, and a screen is displayed (step S42). On that screen, the captured images of the meals classified as fattening meals and the captured images of the meals classified as non-fattening meals are displayed.
<Functional configuration>
The functional configuration of each device for performing the above operations will be described.
The weight scale 200, the camera 300, and the input device 400 each have the function of acquiring information (a measured value, a captured image, or input information) in response to an operation input from the user and the function of transmitting that information to the management device 100 whose address is stored in advance. These functions are realized mainly by the CPU included in each device reading and executing a program stored in the memory.
The display device 500 has the function of transmitting a predefined request to the management device 100 in response to a display operation by the user, and the function of receiving the display data transmitted from the management device 100 in response to that request and executing processing for displaying a screen based on the display data. These functions are realized mainly by a CPU (not shown) included in the display device 500 reading and executing a program stored in a memory.
FIG. 7 is a block diagram showing a specific example of the functional configuration of the management device 100 included in the management system 1A. Each function shown in FIG. 7 is realized mainly by the CPU 10 reading and executing a program stored in the memory 11. However, at least part of the functions may be configured by hardware such as electric circuits.
Referring to FIG. 7, the CPU 10 of the management device 100 includes an image input unit 101 for accepting input of captured images from the camera 300 via the communication unit 14, a text input unit 102 for accepting input of the above-mentioned input information, which is text information, from the input device 400 via the communication unit 14, a storage unit 103 for storing a captured image in the first database 111 in the memory 11 as meal information in association with the shooting date and time (and with the input information, if any), a measured value input unit 104 for accepting input of measured values from the weight scale 200 via the communication unit 14, a calculation unit 105 for identifying the morning weight and the evening weight of each measurement day based on the information specifying the measurement date and time associated with each measured value and calculating the weight difference between the morning weight and the evening weight for each measurement day, a classification unit 106 for classifying, for each shooting date corresponding to a measurement day, the meal information as a fattening meal or a non-fattening meal based on the weight difference, a request unit 107 for accepting a request for display data from the display device 500 via the communication unit 14, a reading unit 108 for reading meal information from the first database 111 in accordance with the request, and a generation unit 109 for generating display data based on the meal information.
For each piece of meal information stored in the first database 111, the classification unit 106 classifies the meal as a fattening meal or a non-fattening meal based on the weight difference between the morning weight and the evening weight of the measurement day corresponding to the shooting date, and stores information specifying this classification in the first database 111 as a further part of the meal information. The calculated weight difference is also included in the meal information and stored in the first database 111.
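One possible in-memory representation of such a first-database entry (image, date, optional keywords, weight difference, and classification) is sketched below. All field names are hypothetical; the patent does not prescribe a concrete record layout.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MealRecord:
    """One entry of the first database 111 (field names are illustrative)."""
    day: str                 # measurement/shooting date as an ISO string, e.g. "2011-02-14"
    meal_type: str           # "breakfast", "lunch", or "dinner"
    image_path: str          # captured image of the meal
    keywords: List[str] = field(default_factory=list)   # comments, ingredients, category
    weight_diff: Optional[float] = None   # evening weight minus morning weight [kg]
    label: Optional[str] = None           # "fattening" or "non-fattening"
```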
As one example of the classification method used by the classification unit 106, the classification unit 106 stores a threshold value in advance, compares the weight difference with the threshold value, and classifies the meal as a fattening meal when the weight difference is larger than the threshold value and as a non-fattening meal when it is smaller than the threshold value. As another example, the average of the weight differences over a predefined period is calculated, and the meal is classified as a fattening meal when the weight difference exceeds that average by a predetermined amount (for example, 0.2 kg) or more, and as a non-fattening meal when it falls below the average by that predetermined amount or more.
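The two classification rules just described can be sketched as follows. The 0.5 kg threshold and the handling of a difference that falls between the two average-based bounds are assumptions for illustration; the text only gives the 0.2 kg margin as an example.

```python
from statistics import mean
from typing import List

def classify_by_threshold(weight_diff: float, threshold: float = 0.5) -> str:
    """Fixed-threshold rule: a larger morning-to-evening difference means a fattening meal."""
    return "fattening" if weight_diff > threshold else "non-fattening"

def classify_by_average(weight_diff: float, recent_diffs: List[float],
                        margin: float = 0.2) -> str:
    """Average-based rule: compare against the mean difference of a predefined period."""
    avg = mean(recent_diffs)
    if weight_diff >= avg + margin:
        return "fattening"
    if weight_diff <= avg - margin:
        return "non-fattening"
    return "neutral"  # in-between case; not spelled out in the text
```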
In response to a request from the display device 500, the reading unit 108 may, as one example, read, from the meal information for a predefined period (number of days) before the request was received, the meal information classified as fattening meals and the meal information classified as non-fattening meals. Display data is generated based on the read information and displayed on a screen, so that the user can see the tendency of his or her recent meals. This display is referred to as the first display example.
As another example of reading by the reading unit 108, the weight difference between the morning weight and the evening weight included in each piece of meal information may be referred to, and a predetermined number of pieces of meal information classified as fattening meals may be read in descending order of weight difference, while a predetermined number of pieces of meal information classified as non-fattening meals are read in ascending order of weight difference. Display data is generated based on the read information and displayed on a screen, so that the user can see a ranking of the most fattening meals and a ranking of the least fattening meals. Alternatively, the reading unit 108 may read the meal information for a predefined period (number of days) before the request was received, and the generation unit 109 may extract meal information according to the weight difference and then generate the display data. This display is referred to as the second display example.
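The two read-out strategies of the reading unit 108 can be sketched as below, assuming records shaped like the MealRecord above (day as an ISO date string, weight_diff, label); the period length and ranking size are illustrative.

```python
from datetime import date, timedelta
from typing import List, Tuple

def recent_by_label(records: List, days: int, today: date) -> Tuple[List, List]:
    """First display example: classified records from the most recent `days` days."""
    cutoff = (today - timedelta(days=days)).isoformat()
    window = [r for r in records if r.day >= cutoff]
    return ([r for r in window if r.label == "fattening"],
            [r for r in window if r.label == "non-fattening"])

def rankings(records: List, top_n: int = 3) -> Tuple[List, List]:
    """Second display example: fattening / non-fattening rankings by weight difference."""
    with_diff = [r for r in records if r.weight_diff is not None]
    by_gain = sorted(with_diff, key=lambda r: r.weight_diff, reverse=True)
    return by_gain[:top_n], list(reversed(by_gain))[:top_n]
```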
<Operation flow>
FIG. 8 is a flowchart showing a specific example of the flow of operations in the management device 100 included in the management system 1A. The operations shown in the flowchart of FIG. 8 are realized by the CPU 10 reading and executing a program stored in the memory 11.
Referring to FIG. 8, when the CPU 10 receives a measured value from the weight scale 200 via the communication unit 14 (YES in step S101), in step S103 the CPU 10 identifies the measurement day from the measurement date and time information associated with the measured value, identifies whether the value is the morning weight or the evening weight by comparing the measurement time with the predefined time zones, and stores the measured value together with this information.
When a captured image is received from the camera 300 via the communication unit 14 (YES in step S105), in step S107 the CPU 10 identifies the shooting day from the shooting date and time information associated with the captured image, identifies whether the image represents breakfast, lunch, or dinner by comparing the shooting time with the predefined time zones, and stores the captured image as meal information together with this information.
In step S109, the CPU 10 calculates the weight difference between the morning weight and the evening weight for each measurement day. The processing of step S109 may be performed at the timing when the measured value received in step S103 is identified as the evening weight, at a predefined time (for example, midnight), or at the timing when a display request, described later, is received.
Then, in step S111, the CPU 10 classifies the meal information associated with the shooting day corresponding to the measurement day as a fattening meal or a non-fattening meal based on the weight difference between the morning and evening weights calculated in step S109. The CPU 10 further stores the information specifying the classification and the weight difference together with the meal information.
When a display request is received from the display device 500 via the communication unit 14 (YES in step S113), in step S115 the CPU 10 reads meal information from the first database 111 and, in step S117, generates display data corresponding to the classification. In step S117, display data such as that described above as the first display example or the second display example is generated, as one example. Then, in step S119, the CPU 10 transmits the generated display data to the display device 500.
The above processing is repeated in the management device 100. As a result, the meal information including the captured images of daily meals is classified according to the weight difference between the morning and evening weights of each day and is displayed on the display device 500 based on that classification.
<First display example>
FIG. 9 is a diagram showing a first specific example of the first display example.
Referring to FIG. 9, in the first specific example of the first display example, the measured morning and evening weights of each day are displayed together with, as the meal information for that day, images of the meals whose shooting date is that day. In the example of FIG. 9, only one meal is displayed, but images of three meals (or four or more meals including snacks) may be displayed on one screen or on other screens reached by scrolling. In addition, the classification result, that is, whether each meal is a fattening meal or a non-fattening meal, is displayed. In the example of FIG. 9, the classification result is displayed as "B" for a fattening meal and "G" for a non-fattening meal.
FIG. 10 is a diagram showing a second specific example of the first display example. The example of FIG. 10 shows, as the first display example, only the captured-image and classification-result portions of the display screen.
Referring to FIG. 10, in the second specific example of the first display example, the images of three meals (or four or more meals including snacks) for each measurement day are displayed side by side on one screen, and the weight difference between the morning weight and the evening weight of that day is also displayed. As the classification result, the three meals of the measurement day are displayed surrounded by a frame corresponding to whether they were classified as fattening meals or non-fattening meals. In the example of FIG. 10, fattening meals are surrounded by a thick (double) line and non-fattening meals by a dotted line.
In the above example, the measured values are transmitted to the management device 100 using the weight scale 200, and the captured images are transmitted to the management device 100 using the camera 300. However, cases where these devices are not available are also conceivable. If there is no camera 300, for example, the information on the meal may be entered using the input device 400 as text information representing the menu, for example, and transmitted to the management device 100. In this case, as shown in FIG. 10, the entered text information may be displayed instead of a captured image. If there is no weight scale 200, or if the weight scale has no communication function, the measured value may be entered as text information using the input device 400 and transmitted to the management device 100.
<Second display example>
FIG. 11 is a diagram showing a specific example of the second display example.
Referring to FIG. 11, in the second display example, the meal information of the measurement days in a predetermined period is read out in descending and ascending order of the weight difference between the morning weight and the evening weight of each day, and is displayed in that order. In the example of FIG. 11, the captured images of the shooting days corresponding to measurement days with large weight differences are displayed in descending order of weight difference as a "fattening menu ranking", and the captured images of the shooting days corresponding to measurement days with small weight differences are displayed in ascending order of weight difference as a "non-fattening menu ranking".
<Advantages of the first embodiment>
By executing the above operations in the management system 1A, the user can grasp the meal contents of days on which the daily weight difference was large as fattening meals, and the meal contents of days on which the weight difference was not so large as non-fattening meals. The user can also grasp which meal contents tend to increase the daily weight difference and which do not. Furthermore, since the meal contents are displayed as captured images, the user can see at a glance, even for the same menu, how the meal was actually eaten (for example, whether something was poured over it) and how much was eaten. Therefore, the user can easily manage his or her own meals.
[Second Embodiment]
<Overview of operation>
The management of meal information in the management system 1B according to the second embodiment will be described. FIG. 12 is a diagram for explaining the flow of the management of meal information in the management system 1B. In FIG. 12, the measurement operation with the weight scale 200 is the same as in the management system 1A shown in FIG. 6 and is therefore not shown.
Referring to FIG. 12, compared with the management flow in the management system 1A shown in FIG. 6, in the management system 1B the user transmits the captured image of each meal and also enters keywords describing the meal contents using the input device 400 (steps S51-1, S52-1, and S53-1) and transmits those keywords to the management device 100.
Keywords include, for example, the ingredients contained in the meal and the genre (category) of the menu. The keywords may be entered directly using the character buttons of the operation button 42, or may be selected from options prepared in advance.
The management device 100 stores the keywords, together with the captured images, as part of the meal information.
Furthermore, upon receiving a display request from the display device 500 (step S41-2), when generating display data using the classification into fattening and non-fattening meals, the management device 100 extracts meal information that can substitute for the meal information classified as fattening meals (step S33-0), and causes the captured image included in that meal information to be displayed as an alternative menu in the display data (step S33-1).
The display device 500 executes display processing based on the display data, and a screen is displayed (step S42). On that screen, as in the management system 1A, the captured images of the meals classified as fattening meals and of the meals classified as non-fattening meals are displayed, and in addition, the captured image included in the meal information extracted as a substitute for a meal classified as a fattening meal is displayed.
<Functional configuration>
The functional configuration of each device for performing the above operations will be described. Of the devices included in the management system 1B, the weight scale 200, the camera 300, the input device 400, and the display device 500 have the same functional configurations as the corresponding devices in the management system 1A.
FIG. 13 is a block diagram showing a specific example of the functional configuration of the management device 100 included in the management system 1B. Each function shown in FIG. 13 is realized mainly by the CPU 10 reading and executing a program stored in the memory 11. However, at least part of the functions may be configured by hardware such as electric circuits.
Referring to FIG. 13, in addition to the functional configuration of the management device 100 included in the management system 1A shown in FIG. 7, the CPU 10 of the management device 100 further includes an extraction unit 110 for extracting substitute meal information. In addition to the first database 111, a second database 112 is stored in the memory 11.
FIG. 14 is a diagram showing a specific example of the information stored in the second database 112. As shown in FIG. 14, the second database 112 contains information such as the ingredients, category, and calories of each menu, and is stored in the memory 11 in advance.
As a first example, the extraction unit 110 refers to the meal information classified by the classification unit 106 as a fattening meal and extracts the keywords included in that meal information that represent the category and ingredients of the menu. Based on those keywords, the extraction unit 110 refers to the second database 112 and reads out the calories of the menu. The extraction unit 110 then extracts from the second database 112, as a substitute menu, another menu that has fewer calories than that menu and that has the same category as that menu or shares ingredients with it.
By extracting in this way using the second database 112, a menu with fewer calories than the meal that was actually eaten can be extracted as an alternative menu based on a predefined database.
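A sketch of this first extraction method follows, assuming each second-database row holds a menu name, a set of ingredients, a category, and a calorie value (as in FIG. 14); the dictionary keys and the example rows are hypothetical.

```python
from typing import Dict, List

def alternatives_from_menu_db(menu_name: str, menu_db: List[Dict]) -> List[Dict]:
    """Extract menus with fewer calories that share the category or an ingredient."""
    target = next((m for m in menu_db if m["menu"] == menu_name), None)
    if target is None:
        return []
    return [m for m in menu_db
            if m["menu"] != menu_name
            and m["calories"] < target["calories"]
            and (m["category"] == target["category"]
                 or m["ingredients"] & target["ingredients"])]

# Example rows (values invented for illustration only).
menu_db = [
    {"menu": "tonkatsu set", "ingredients": {"pork", "rice"}, "category": "Japanese", "calories": 900},
    {"menu": "grilled fish set", "ingredients": {"fish", "rice"}, "category": "Japanese", "calories": 550},
]
print(alternatives_from_menu_db("tonkatsu set", menu_db))  # -> the grilled fish set
```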
As a second example, the extraction unit 110 refers to the meal information classified by the classification unit 106 as a fattening meal and extracts the keywords included in that meal information that represent the category and ingredients of the menu. Based on those keywords, the extraction unit 110 refers to the first database 111 and extracts from it, as a substitute menu, another menu that has the same category as that menu or shares ingredients with it, from among the meal information of other measurement days associated with a weight difference smaller than the weight difference of the measurement day corresponding to the shooting date of that meal information.
By extracting in this way using the first database 111, a menu that was classified, for that particular user, as less fattening than the menu judged to be a fattening meal can be extracted as an alternative menu.
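A sketch of this second extraction method, assuming records like the MealRecord above in which the keywords field carries the category and ingredient keywords; the overlap test is an illustrative reading of the text, not the claimed algorithm.

```python
from typing import List

def alternatives_from_history(fattening_record, records: List) -> List:
    """Extract the user's own past meals whose day had a smaller weight difference
    and whose keywords (category or ingredients) overlap with the fattening meal."""
    base = set(fattening_record.keywords)
    return [r for r in records
            if r is not fattening_record
            and r.weight_diff is not None
            and fattening_record.weight_diff is not None
            and r.weight_diff < fattening_record.weight_diff
            and base & set(r.keywords)]
```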
<Operation flow>
FIG. 15 is a flowchart showing a specific example of the flow of operations in the management device 100 included in the management system 1B. The operations shown in the flowchart of FIG. 15 are also realized by the CPU 10 reading and executing a program stored in the memory 11.
In the management device 100 included in the management system 1B, the same operations as in the management device 100 included in the management system 1A shown in FIG. 8 are performed from steps S101 to S115.
In the management device 100 included in the management system 1B, when a display request is received from the display device 500 (YES in step S113), in step S115 the CPU 10 reads meal information from the first database 111 and, in step S116, extracts an alternative menu for the meal information classified as fattening meals using a method such as those described above. The CPU 10 then generates display data based on this information (step S117'). The subsequent operations are the same as those in the management device 100 included in the management system 1A.
An example of the display data generated in step S117' displays an alternative menu in addition to the screen contents represented by the display data in the management device 100 included in the management system 1A. This may be done by further including the alternative menu in a screen such as those shown in FIGS. 9 to 11, or by further displaying, on such a screen, a button or the like for instructing display of the alternative menu and displaying the alternative menu on a screen that is switched to when the button is pressed.
When the alternative menu is extracted from the first database 111 as described above, the alternative menu may be displayed using the captured image included in the corresponding meal information.
<Advantages of the second embodiment>
By executing the above operations in the management system 1B, the user can grasp his or her meal contents in relation to the daily weight difference, as in the management system 1A, and can additionally learn of a better menu for any menu classified as a fattening meal. Therefore, the user can manage his or her own meals even more easily.
[Third Embodiment]
<Overview of operation>
In the management systems 1A and 1B, daily meal information is classified into fattening meals and non-fattening meals based on the daily change in the user's weight and is accumulated in the first database 111. In the management system 1C according to the third embodiment, a weight change prediction operation and a menu proposal operation are performed in addition to the management of meal information performed in the management system 1A or 1B.
Both the weight change prediction operation and the menu proposal operation executed in the management system 1C are executed when the management device 100 receives, from the input device 400, input of a keyword representing a menu, an ingredient, a meal category, or the like, together with a display request from the display device 500.
That is, to have the weight change prediction operation executed, the user enters a menu, ingredient, meal category, or the like as text information using the input device 400 before eating a meal or when transmitting a captured image, and transmits it to the management device 100 as input information. In addition, a request for a prediction of the weight transition is transmitted from the display device 500.
The management device 100 refers to the meal information accumulated in the first database 111 that contains the keyword, and reads out the weight changes of the days represented by that meal information. The result is then displayed as a predicted value of the weight transition. By viewing this information on the display device 500, the user can learn the predicted change from the morning weight that would result from eating a meal corresponding to the entered menu, ingredient, meal category, or the like.
To have the menu proposal operation executed, the user decides on ingredients, a meal category, or the like before eating a meal or when transmitting a captured image, transmits them as keywords to the management device 100 using the input device 400, and further transmits a menu presentation request from the display device 500.
The management device 100 extracts, from the meal information containing the keyword, the meal information whose weight difference is one regarded as a non-fattening meal for that user, and displays it as a proposed menu. By viewing this information on the display device 500, the user can learn of menus that match the entered ingredients, meal category, or the like and that were classified as non-fattening meals in his or her own past meals.
<Functional configuration>
The functional configuration of the management device 100 included in the management system 1C for performing the above operations is the same as the functional configuration of the management device 100 included in the management system 1B shown in FIG. 13. That is, the second database 112 is stored in the memory 11, and the extraction unit 110 is included in addition to the configuration of the management device 100 included in the management system 1A.
As a first example, the extraction unit 110 refers to the second database 112 using the keyword from the input device 400 and extracts the menus containing the keyword. The extraction unit 110 then refers to the first database 111 using those menus, extracts the meal information stored in the first database 111 that contains those menus, and reads out the weight differences included in that meal information. Alternatively, the extraction unit 110 refers to the first database 111 using those menus and extracts, from the meal information stored in the first database 111 that contains those menus, the meal information classified as non-fattening meals.
As a second example, when the meal information stored in the first database 111 contains keywords, entered from the input device 400, representing the ingredients, meal category, or the like of the meal, the extraction unit 110 refers to the first database 111 using the keyword from the input device 400 and reads out the weight differences included in the meal information stored in the first database 111 that contains the keyword. Alternatively, the extraction unit 110 refers to the first database 111 using the keyword from the input device 400 and extracts, from the meal information stored in the first database 111 that contains the keyword, the meal information classified as non-fattening meals.
<Operation flow>
As a prerequisite, the management device 100 included in the management system 1C performs the operations shown in FIGS. 8 and 15 to classify the meal information, which includes captured images representing the daily meal contents, into fattening meals and non-fattening meals based on the user's daily weight difference, and stores it in the first database 111.
The management device 100 included in the management system 1C further performs the weight change prediction operation and the menu proposal operation.
FIG. 16 is a flowchart showing a specific example of the flow of the weight change prediction operation in the management device 100 included in the management system 1C. The operations shown in the flowchart of FIG. 16 are also realized by the CPU 10 reading and executing a program stored in the memory 11. The weight change prediction operation shown in the flowchart of FIG. 16 is a specific example of the operation in the case where the CPU 10 realizes, by means of the extraction unit 110, the functions described in the first example above.
Referring to FIG. 16, when the CPU 10 receives a keyword from the input device 400 via the communication unit 14 and receives a display request from the display device 500 (YES in step S201), in step S203 the CPU 10 refers to the second database 112 using the keyword and extracts from the second database 112 the menus containing the keyword.
In step S205, the CPU 10 refers to the first database 111, extracts from it the meal information containing the menus extracted in step S203, and reads out the weight differences included in the extracted meal information.
The CPU 10 reads out the weight difference included in each piece of meal information extracted in step S205. When this has been done for all of the meal information extracted in step S205 (NO in step S207), in step S209 the CPU 10 generates display data for displaying a predicted value of the weight change based on the weight differences read in step S205, and transmits it to the display device 500 in step S211.
When multiple pieces of meal information are extracted in step S205 and the weight differences included in them differ, in step S209 the CPU 10 additionally performs an operation such as calculating the average of those weight differences and using the average as the predicted value, or using the largest of the weight differences as the predicted value.
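Steps S203 to S209 can be sketched as follows. How a menu extracted from the second database is matched against a stored meal record (here, through the record's keywords) and the keys of the menu rows are illustrative assumptions rather than part of the disclosed flow.

```python
from statistics import mean
from typing import Dict, List, Optional

def predict_weight_change(keyword: str, menu_db: List[Dict], records: List) -> Optional[float]:
    """Collect weight differences of past days whose meals match a menu containing
    the keyword, and return their average (the text also allows using the maximum)."""
    menus = {m["menu"] for m in menu_db
             if keyword in m["menu"]
             or keyword in m.get("ingredients", set())
             or keyword == m.get("category")}
    diffs = [r.weight_diff for r in records
             if r.weight_diff is not None and menus & set(r.keywords)]
    return mean(diffs) if diffs else None
```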
 図17は、管理システム1Cに含まれる管理装置100でのメニュー提案動作の流れの具体例を表わしたフローチャートである。図17のフローチャートに表わされた動作もまた、CPU10がメモリ11に記憶されるプログラムを読み出して実行することで実現される。図17のフローチャートに表わされたメニュー提案動作は、CPU10が抽出部110によって上記第1の例で示された機能を実現する場合の動作の具体例である。 FIG. 17 is a flowchart showing a specific example of the menu proposal operation flow in the management apparatus 100 included in the management system 1C. The operation shown in the flowchart of FIG. 17 is also realized by the CPU 10 reading and executing a program stored in the memory 11. The menu suggestion operation shown in the flowchart of FIG. 17 is a specific example of the operation when the CPU 10 implements the function shown in the first example by the extraction unit 110.
 図17を参照して、CPU10は、通信部14を介して入力装置400からキーワードを受信し、表示装置500から表示の要求を受信すると(ステップS301でYES)、ステップS303で当該キーワードを用いて第2データベース112を参照し、第2データベース112から当該キーワードを含むメニューを抽出する。 Referring to FIG. 17, when CPU 10 receives a keyword from input device 400 via communication unit 14 and receives a display request from display device 500 (YES in step S301), CPU 10 uses the keyword in step S303. With reference to the second database 112, a menu including the keyword is extracted from the second database 112.
 In step S305, the CPU 10 refers to the first database 111 and extracts from the first database 111 the meal information that contains a menu extracted in step S303. When that meal information has been classified by the above-described operation as "a meal that is unlikely to cause weight gain" (YES in step S307), in step S309 the CPU 10 identifies the menu represented by that meal information as a proposed menu.
 For each piece of meal information extracted in step S305, the CPU 10 determines whether or not that meal information has been classified as "a meal that is unlikely to cause weight gain". When this determination has been made for all of the meal information extracted in step S305 (NO in step S311), in step S313 the CPU 10 generates display data based on the meal information identified as a proposed menu in step S309, and in step S315 transmits the display data to the display device 500.
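 The menu proposal flow of FIG. 17 can be sketched in the same way. Again, the record fields and the "hard_to_gain_weight" label are assumptions made only for this sketch; the specification states merely that the meal information has previously been classified as "a meal that is unlikely to cause weight gain" by the operation described earlier.

```python
def propose_menus(keyword, first_database, second_database,
                  label="hard_to_gain_weight"):
    """Sketch of the FIG. 17 flow: suggest previously recorded menus that match
    the keyword and were classified as unlikely to cause weight gain."""
    # Step S303: extract menus containing the keyword from the second database.
    menus = [m for m in second_database if keyword in m]

    suggestions = []
    # Steps S305 to S311: examine each matching meal record and keep it only if
    # it carries the "unlikely to cause weight gain" classification.
    for rec in first_database:
        matches_keyword = any(menu in rec["menu"] for menu in menus)
        if matches_keyword and rec.get("classification") == label:
            suggestions.append(rec["menu"])  # step S309: mark as a proposed menu

    # Steps S313 and S315: the caller would turn this list into display data
    # and transmit it to the display device 500; here the list is simply returned.
    return suggestions
```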
 <Advantages of the third embodiment>
 By performing the above operations in the management system 1C, the user can obtain, from an entered menu, ingredient, meal category, or the like, an estimate of the change in morning and evening body weight that would result from eating that meal. The user can also learn which menus match the entered menu, ingredient, meal category, or the like and were classified as "meals that are unlikely to cause weight gain" among the user's own previous meals.
 In particular, by obtaining this information before eating a meal, that is, by entering a keyword related to the meal that is about to be eaten, the user can reconsider the menu of that meal. For example, the user can choose "a meal that is unlikely to cause weight gain" based on the desired ingredients, meal category, and so on. The user can therefore manage his or her own meals more appropriately.
 [Other embodiments]
 A program for causing a computer to execute the operations of the management device 100 can also be provided. Such a program can be recorded on a non-transitory computer-readable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read-Only Memory), a ROM (Read-Only Memory), a RAM (Random Access Memory), or a memory card associated with the computer, and provided as a program product. Alternatively, the program can be provided by being recorded on a recording medium such as a hard disk built into the computer, or by being downloaded via a network.
 The program described above may be one that calls necessary modules, among the program modules provided as part of the operating system (OS) of the computer, in a predetermined arrangement and at predetermined timing, and causes the processing to be executed. In that case, the program itself does not include those modules, and the processing is executed in cooperation with the OS. Such a program that does not include the modules can also be included in the program according to the present invention.
 The program according to the present invention may also be provided incorporated as a part of another program. In that case as well, the program itself does not include the modules included in the other program, and the processing is executed in cooperation with the other program. Such a program incorporated in another program can also be included in the program according to the present invention.
 The provided program product is installed in a program storage unit such as a hard disk and executed. The program product includes the program itself and a recording medium on which the program is recorded.
 The embodiments disclosed herein should be considered in all respects as illustrative and not restrictive. The scope of the present invention is defined by the claims rather than by the description above, and is intended to include all modifications within the meaning and scope equivalent to the claims.
 1, 1A, 1B, 1C management system; 10, 20, 30, 40 CPU; 11, 21, 31, 41 memory; 14, 24, 34, 44 communication unit; 22, 32 operation button; 23 measurement unit; 33 imaging unit; 42 operation button; 100 management device; 101 image input unit; 102 text input unit; 103 storage unit; 104 measured-value input unit; 105 calculation unit; 106 classification unit; 107 request unit; 108 readout unit; 109 generation unit; 110 extraction unit; 111 first database; 112 second database; 200 weight scale; 300 camera; 400 input device; 500 display device.

Claims (7)

  1.  A management device for managing meal information, comprising:
      an input unit for inputting meal information and a body weight of a user together with a date and time;
      a processing unit for processing the meal information and the body weight; and
      an output unit for outputting information processed by the processing unit,
      wherein the processing unit executes:
       a process of storing, in a storage device, the meal information of a unit period in association with a weight change of the user during the unit period; and
       a process of outputting, in accordance with the weight change of the user during the unit period, meal contents based on the meal information stored in the storage device in association with that weight change.
  2.  The management device according to claim 1, wherein the unit period is one day, and
      in the process of outputting the meal contents, the processing unit classifies meal information associated with a one-day weight change of the user larger than a threshold value as first meal information and meal information associated with a one-day weight change of the user smaller than the threshold value as second meal information, and outputs meal contents based on each of the first meal information and the second meal information together with the classification into the first meal information and the second meal information.
  3.  The management device according to claim 2, wherein the processing unit calculates the threshold value using the weight change of the user over a predetermined period.
  4.  The management device according to claim 1, wherein the meal information includes a photographed image obtained by photographing the meal, and
      the processing unit executes a process of outputting an image based on the photographed image as the meal contents.
  5.  A management device for managing meal information, comprising:
      a first input unit for inputting meal information and a body weight of a user together with a date and time;
      a second input unit for inputting meal contents;
      a processing unit for processing the meal information and the body weight; and
      an output unit for outputting information processed by the processing unit,
      wherein the processing unit executes:
       a process of storing, in a storage device, the meal information of a unit period in association with a weight change of the user during the unit period; and
       a process of outputting, as an estimated value of a weight change, the weight change of the user during the unit period stored in the storage device in association with the meal information corresponding to the input meal contents.
  6.  A management system for managing meal information, comprising:
      a first input device for inputting meal contents together with a date and time;
      a second input device for inputting a body weight together with a date and time;
      a management device for processing the meal information and the body weight; and
      a display device,
      wherein the management device executes:
       a process of storing, in a storage device, the meal information of a unit period in association with a weight change of the user during the unit period; and
       a process of outputting to the display device, in accordance with the weight change of the user during the unit period, display data for displaying meal contents based on the meal information stored in the storage device in association with that weight change.
  7.  A method of displaying meal information in a management system for managing the meal information, the management system including a management device and a display device, the method comprising the steps of:
      inputting meal information and a body weight of a user together with a date and time;
      classifying, based on a weight change of the user during a unit period, the meal information of the unit period into first meal information for which the weight change is larger than a threshold value and second meal information for which the weight change is smaller than the threshold value; and
      displaying the meal information of the unit period together with the classification into the first meal information or the second meal information.
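 For reference, the threshold-based classification recited in claims 2, 3, and 7 can be sketched as follows. This is an illustration only: the use of the mean daily weight change over the period as the threshold is one possible reading of claim 3, not the only one, and the field names are invented for this example.

```python
def classify_daily_meals(daily_records, threshold=None):
    """Classify one-day meal information by the day's weight change.

    daily_records: list of dicts with "date", "meal_info", and "weight_change" keys.
    threshold:     boundary between the first and second meal information; if None,
                   the mean weight change over the period is used, which is one
                   possible reading of claim 3.
    """
    if not daily_records:
        return {"first": [], "second": []}

    if threshold is None:
        threshold = sum(r["weight_change"] for r in daily_records) / len(daily_records)

    first_meal_info = []   # days whose weight change exceeded the threshold
    second_meal_info = []  # days whose weight change did not exceed the threshold
    for r in daily_records:
        if r["weight_change"] > threshold:
            first_meal_info.append(r["meal_info"])
        else:
            second_meal_info.append(r["meal_info"])
    return {"first": first_meal_info, "second": second_meal_info}
```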
PCT/JP2011/076928 2011-02-16 2011-11-22 Management device, management system, and display method WO2012111209A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011030777A JP2012168857A (en) 2011-02-16 2011-02-16 Management device, management system and display method in management system
JP2011-030777 2011-02-16

Publications (1)

Publication Number Publication Date
WO2012111209A1 true WO2012111209A1 (en) 2012-08-23

Family

ID=46672159

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/076928 WO2012111209A1 (en) 2011-02-16 2011-11-22 Management device, management system, and display method

Country Status (2)

Country Link
JP (1) JP2012168857A (en)
WO (1) WO2012111209A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6671156B2 (en) * 2015-11-26 2020-03-25 共同印刷株式会社 System, server and method
JP6639216B2 (en) * 2015-12-15 2020-02-05 共同印刷株式会社 System, server and method
JP7007101B2 (en) * 2017-04-19 2022-01-24 Nttテクノクロス株式会社 Meal content proposal device, meal content proposal system and meal content proposal method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006296481A (en) * 2005-04-15 2006-11-02 Matsushita Electric Works Ltd Bathroom vanity unit with weight measuring function
JP2008304421A (en) * 2007-06-11 2008-12-18 Omron Healthcare Co Ltd Bathroom scale
JP2010277476A (en) * 2009-05-29 2010-12-09 Kyoto Univ Health guidance system

Also Published As

Publication number Publication date
JP2012168857A (en) 2012-09-06

Similar Documents

Publication Publication Date Title
US11430571B2 (en) Wellness aggregator
US10841476B2 (en) Wearable unit for selectively withholding actions based on recognized gestures
JP5527423B2 (en) Image processing system, image processing method, and storage medium storing image processing program
US20180144831A1 (en) Real-time or just-in-time online assistance for individuals to help them in achieving personalized health goals
CN109599161A (en) Body movement and body-building monitor
KR102330878B1 (en) Information processing device, information processing method, and information processing system
EP4053767A1 (en) Method for displaying and selecting data
König et al. Characteristics of smartphone-based dietary assessment tools: A systematic review
JP4972527B2 (en) Movie display system, movie display method, and computer program
WO2012111209A1 (en) Management device, management system, and display method
KR20080021513A (en) System and method for managing a dietetic therapy using the network
CN112988789A (en) Medical data query method, device and terminal
JP2013029877A (en) Data management system, data management method and program
JP2001299767A (en) Allergy disease information processing system, allergy disease information processing method and computer readable recording medium storing program for making computer perform such method
JP2001318991A (en) Nutrition control system using information system
CN116959733A (en) Medical data analysis method, device, equipment and storage medium
US20200234187A1 (en) Information processing apparatus, information processing method, and program
CN111295716A (en) Health management support device, method, and program
CN106062807A (en) Systems and methods for delivering task-oriented content
KR102214792B1 (en) Health status and eating habit check system and method of providing customized health information thereof
JP2005275606A (en) Mobile communication terminal, healthcare apparatus, health counseling apparatus, calorie browse terminal, and healthcare support system and method
JP7069997B2 (en) Health information management server and health information management system
JP6931959B1 (en) Recipe search support device, recipe search support method, and recipe search support program
JP7041332B2 (en) Diet management server and diet management server control method and diet management program
JP6325241B2 (en) Medical information system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 11858938; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 11858938; Country of ref document: EP; Kind code of ref document: A1