WO2012111209A1 - Management device, management system, and display method - Google Patents

Management device, management system, and display method

Info

Publication number
WO2012111209A1
Authority
WO
WIPO (PCT)
Prior art keywords
meal
information
meal information
weight
unit
Prior art date
Application number
PCT/JP2011/076928
Other languages
English (en)
Japanese (ja)
Inventor
秀武 大島 (Hidetake Oshima)
Original Assignee
オムロンヘルスケア株式会社 (Omron Healthcare Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Omron Healthcare Co., Ltd. (オムロンヘルスケア株式会社)
Publication of WO2012111209A1

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/60: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to nutrition control, e.g. diets

Definitions

  • the present invention relates to a management device and a management system for managing meal contents, and a display method in the management system.
  • Patent Document 1 discloses a system in which a server acquires meal contents as image data, analyzes the image data, and presents improvement advice and the like.
  • in that system, however, the nutritional amount and nutritional balance based on the image data are input by an advisor, a registered dietitian who has viewed the image. The server automatically calculates the nutritional amount and nutritional balance only after the advisor determines and inputs the food items and their quantities, so such a determination by the advisor is required every time.
  • the present invention has been made in view of such problems, and an object thereof is to provide a management device, a management system, and a display method capable of performing meal management suited to individual users without requiring complicated processing.
  • the management device is a management device for managing meal information, and includes an input unit for inputting the user's meal information and weight together with the date and time, a processing unit for processing the meal information and weight, and an output unit for outputting information processed by the processing unit.
  • the processing unit executes a process of storing the meal information of a unit period in the storage device in association with the change in the user's weight during that unit period, and a process of outputting meal contents based on the stored meal information.
  • the unit period is one day
  • in the process of outputting the meal contents, the processing unit classifies meal information associated with a one-day weight change of the user larger than a threshold value into first meal information, classifies meal information associated with a one-day weight change smaller than the threshold value into second meal information, and outputs the meal contents based on each of the first meal information and the second meal information together with their classification.
  • the processing unit calculates the threshold value using the weight change of the user during a predetermined period.
  • the meal information includes a captured image obtained by capturing the meal, and the processing unit executes a process of outputting an image based on the captured image as the content of the meal.
  • the management device is a management device for managing meal information, and includes a first input unit for inputting the user's meal information and weight together with the date and time, and a second input unit for receiving input of meal contents.
  • the processing unit executes a process of storing the meal information of a unit period in the storage device in association with the user's weight change during that unit period, and a process of outputting, for stored meal information matching the input meal contents, the associated weight change of the user during the unit period as an estimated value of the weight change.
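The estimation step just described can be sketched as a simple lookup: find the stored meal information whose contents best match the newly input meal, and return the weight change recorded with it. This is a minimal illustration only, not the patented implementation; the keyword-overlap matching rule and the data layout are assumptions.

```python
def estimate_weight_change(stored_meals, input_keywords):
    """Return the unit-period weight change stored with the meal whose
    keywords best overlap the input meal contents (assumed matching rule)."""
    best, best_overlap = None, 0
    for meal in stored_meals:
        overlap = len(set(meal["keywords"]) & set(input_keywords))
        if overlap > best_overlap:
            best, best_overlap = meal, overlap
    return best["weight_change"] if best else None
```

With such a rule, entering tonight's planned menu would return the weight change observed the last time a similar meal was eaten, which is the "estimated value" the description refers to.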
  • the management system is a management system for managing meal information, and includes a first input device for inputting meal contents together with the date and time, a second input device for inputting weight together with the date and time, a management device for processing the meal information and weight, and a display device.
  • the management device executes a process of storing the meal information of a unit period in the storage device in association with the change in the user's weight during that unit period, and a process of outputting, to the display device, display data for displaying meal contents based on the stored meal information.
  • the display method is a display method of meal information in a management system for managing meal information that includes a management device and a display device, and includes a step of inputting meal information and weight together with the date and time, a step of classifying the meal information of a unit period, based on the user's weight change during that unit period, into first meal information for which the weight change is larger than a threshold value and second meal information for which the weight change is smaller than the threshold value, and a step of displaying the meal information of the unit period together with its classification as first meal information or second meal information.
  • according to the present invention, meals that tend to cause weight gain and meals that do not can be presented for an individual user without requiring complicated processing. Therefore, meal management suited to each user can be performed.
  • FIG. 1 is a diagram showing a specific example of the configuration of a meal information management system (hereinafter referred to as a management system) 1 according to the present embodiment.
  • the management system 1 includes a management device 100, and a weight scale 200, a camera 300, an input device 400, and a display device 500 connected to the management device 100.
  • the management system 1A according to the first embodiment, the management system 1B according to the second embodiment, and the management system 1C according to the third embodiment, all described later, are representatively referred to as the management system 1. The system configuration is common to the management systems 1A to 1C.
  • the management device 100 may be a device having a communication function, and may be configured by, for example, a general personal computer.
  • the weight scale 200, the camera 300, the input device 400, and the display device 500 all have a communication function capable of communicating with the management device 100.
  • these devices and the management device 100 may be directly connected through a dedicated line or a wireless line, or may communicate via the Internet.
  • these devices communicate with the management device 100 via the Internet. Therefore, the communication function of these devices corresponds to the function of connecting to the Internet and communicating with the management device 100 using the access information of the management device 100 on the Internet.
  • the input device 400 may be any device as long as it has a function for mainly accepting input of text data and transmitting it to the management device 100.
  • a mobile phone or a general personal computer can be cited.
  • an email having the text data as a body may be transmitted from the input device 400 to the management device 100.
  • the scale 200 has a function of measuring the body weight and transmitting the measured value to the management apparatus 100.
  • the camera 300 has a function for capturing an image and transmitting the captured image to the management apparatus 100.
  • alternatively, an e-mail with the information to be transmitted as attached data may be sent from these devices to the management device 100.
  • the display device 500 may be any device having a function for receiving display data expressed in, for example, HTML (Hyper Text Markup Language) from the management device 100 and executing display processing based on that display data. As an example, a mobile phone or a general personal computer can be cited.
  • the weight scale 200, the camera 300, the input device 400, and the display device 500 are constituted by separate devices.
  • the camera 300 and the input device 400 may be an integrated device, and the display unit of such a camera-equipped input device 400 may be used as the display device 500.
  • the management system 1 includes a scale 200 having a communication function.
  • alternatively, the input device 400 may receive input of a weight measurement value and transmit it to the management device 100.
  • the management device 100 is a device different from any of the weight scale 200, the camera 300, the input device 400, and the display device 500.
  • the management apparatus 100 may be included in any of these apparatuses.
  • <Device configuration> FIGS. 2 to 5 are diagrams showing specific examples of the device configurations of the weight scale 200, the camera 300, the input device 400, and the management device 100, respectively.
  • the weight scale 200 includes a CPU (Central Processing Unit) 20 for performing overall control, a memory 21 for storing programs executed by the CPU 20, measurement values, and the like, an operation button 22 for receiving user operation inputs, a measuring unit 23 for measuring body weight, and a communication unit 24 that stores in advance the access destination of the management device 100 on the Internet and communicates with the management device 100 via the Internet.
  • the CPU 20 causes the measurement unit 23 to execute processing for measuring body weight in accordance with an operation signal based on an operation instructing measurement start from the operation button 22.
  • the measured value is temporarily stored in a predetermined area of the memory 21.
  • in response to an operation signal based on an operation instructing transmission of the measurement value from the operation button 22, the CPU 20 reads the measurement value from the area of the memory 21 and causes the communication unit 24 to transmit it to the management device 100.
  • the camera 300 includes a CPU 30 for performing overall control, a memory 31 for storing programs executed by the CPU 30 and captured images, operation buttons 32 for receiving user operation inputs, a photographing unit 33 for capturing images, and a communication unit 34 that stores in advance the access destination of the management device 100 on the Internet and communicates with the management device 100 via the Internet.
  • the CPU 30 causes the photographing unit 33 to execute a photographing process in response to an operation signal based on an operation instructing the start of photographing from the operation buttons 32.
  • the captured image is temporarily stored in a predetermined area of the memory 31. Further, in response to an operation signal based on an operation instructing transmission of a captured image from the operation buttons 32, the CPU 30 reads the captured image from the area of the memory 31 and causes the communication unit 34 to transmit it to the management device 100.
  • the input device 400 includes a CPU 40 for performing overall control, a memory 41 for storing programs executed by the CPU 40, input information, and the like, operation buttons 42 including alphabet and numeric keys for receiving user operation inputs, and a communication unit 44 that stores in advance the access destination of the management device 100 on the Internet and communicates with the management device 100 via the Internet.
  • in accordance with an operation signal based on an operation for inputting information from the operation buttons 42, the CPU 40 specifies the information obtained by converting the operation signal as input information and stores it in a predetermined area of the memory 41. Further, in response to an operation signal based on an operation instructing transmission of the input information, the CPU 40 reads the input information from the area of the memory 41 and causes the communication unit 44 to transmit it to the management device 100.
  • the management device 100 includes a CPU 10 for performing overall control, a memory 11 for storing programs executed by the CPU 10, and a communication unit 14 for communicating with other devices via the Internet.
  • the memory 11 stores a first database 111 described later.
  • FIG. 6 is a diagram for explaining a flow of management of meal information in the management system 1A.
  • the user measures his or her weight in the morning and in the evening using the weight scale 200, and the measured values are transmitted to the management device 100. That is, when the user instructs the weight scale 200 to measure in a time zone designated in advance as “morning” (for example, 6:00 am to 9:00 am) (step S11-1), the weight is measured (step S11-2) and the result is transmitted to the management device 100 together with the measurement date and time (step S11-3).
  • the weight measured in the above time zone is also referred to as “morning weight”.
  • similarly, when the user instructs the weight scale 200 to measure in a time zone defined in advance as “evening” (for example, 8:00 pm to 11:00 pm) (step S12-1), the weight is measured (step S12-2) and the result is transmitted to the management device 100 together with the measurement date and time (step S12-3).
  • the weight measured in the above time zone is also referred to as “evening weight”.
  • the user photographs the contents of each meal with the camera 300 during the period between the morning and evening weight measurements, and the image data is transmitted to the management device 100. That is, when the user instructs the camera 300 to photograph during a time zone defined in advance as “breakfast” (for example, 6 am to 10 am) (step S21-1), breakfast is photographed (step S21-2), and the photographed image is transmitted to the management device 100 together with the photographing date and time (step S21-3).
  • similarly, lunch is photographed during a time zone defined in advance as “lunch” (for example, 10 am to 2 pm), and dinner during a time zone defined in advance as “dinner” (for example, 5 pm to 9 pm), and each photographed image is transmitted to the management device 100.
  • the user may also input a comment or the like using the input device 400 and transmit it to the management device 100.
  • the management device 100 calculates, for each measurement date, the weight difference between the morning weight and the evening weight from the measured values transmitted from the weight scale 200 in association with the measurement date and time (step S31).
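The per-day calculation in step S31 can be sketched as follows. This is an illustrative sketch only: the function name, the data layout, and the exact hour ranges (taken from the example time zones above) are assumptions, not the patented implementation.

```python
from datetime import datetime

MORNING = range(6, 9)    # example "morning" time zone: 6:00 am to 9:00 am
EVENING = range(20, 23)  # example "evening" time zone: 8:00 pm to 11:00 pm

def daily_weight_difference(measurements):
    """measurements: iterable of (datetime, weight_kg) pairs.
    Returns {date: evening_weight - morning_weight} for each date
    that has both a morning and an evening measurement."""
    per_day = {}
    for ts, weight in measurements:
        if ts.hour in MORNING:
            per_day.setdefault(ts.date(), {})["morning"] = weight
        elif ts.hour in EVENING:
            per_day.setdefault(ts.date(), {})["evening"] = weight
    return {d: w["evening"] - w["morning"]
            for d, w in per_day.items()
            if "morning" in w and "evening" in w}
```

Days missing either measurement are simply skipped here; the description does not say how the system handles such days.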
  • the captured image from the camera 300 is stored as meal information in association with the shooting date and time. If there is input information such as a comment input from the input device 400 in association with the captured image, the input information is also stored in the first database 111 of the memory 11 as meal information.
  • the management device 100 classifies the meal information into “meals that tend to cause weight gain” and “meals that do not” based on the weight difference.
  • a specific instruction method includes, for example, accessing a preset WEB site and performing an operation for requesting.
  • the display device 500 requests display data from the management device 100 (step S41-2).
  • the management device 100 generates display data in which photographed images of meal information classified as “meals that tend to cause weight gain” and photographed images of meal information classified as “meals that do not” are arranged (step S33-1), and transmits the display data to the requesting display device 500.
  • if the meal information includes user information or the like registered in advance, user authentication may be performed using information included in the request from the display device 500 and that user information, and the display data may be transmitted when the authentication succeeds.
  • the display device 500 performs screen display by executing display processing based on the display data (step S42). On the screen, photographed images of meals classified as “meals that tend to cause weight gain” and photographed images of meals classified as “meals that do not” are displayed.
  • the weight scale 200, the camera 300, and the input device 400 all have a function for acquiring information (a measurement value, a captured image, or input information) in accordance with an operation input from the user, and a function for transmitting that information to the management device 100. These functions are realized mainly by the CPU included in each device reading and executing the program stored in its memory.
  • the display device 500 has a function for transmitting a predetermined request to the management device 100 in accordance with a display operation from the user, a function for receiving display data transmitted from the management device 100 in response to the request, and a function for executing processing for performing screen display based on the display data. These functions are realized mainly by a CPU (not shown) included in the display device 500 reading and executing a program stored in a memory.
  • FIG. 7 is a block diagram illustrating a specific example of a functional configuration of the management apparatus 100 included in the management system 1A.
  • Each function shown in FIG. 7 is mainly realized by the CPU 10 when the CPU 10 reads and executes a program stored in the memory 11.
  • at least a part may be configured by hardware such as an electric circuit.
  • the CPU 10 of the management device 100 includes an image input unit 101 for receiving input of a photographed image from the camera 300 via the communication unit 14, a text input unit 102 for receiving input of text information from the input device 400 via the communication unit 14, a storage unit 103 for storing the photographed image, associated with the photographing date and time (and with the input information, if any), as meal information in the first database 111 of the memory 11, a measurement value input unit 104 for receiving input of a measurement value from the weight scale 200 via the communication unit 14, a calculation unit 105 for specifying the morning weight and the evening weight of each measurement date from the measurement date and time associated with each measurement value and calculating the weight difference between them, a classification unit 106 for classifying, based on the weight difference, the meal information of each photographing date corresponding to a measurement date as a meal that tends to cause weight gain or a meal that does not, a request unit 107 for receiving a request from the display device 500, a reading unit 108 for reading meal information from the first database 111 in response to the request, and a generation unit 109 for generating display data based on the read meal information.
  • the classification unit 106 classifies each meal as one that tends to cause weight gain or one that does not, based on the weight difference between the morning weight and the evening weight of the measurement date corresponding to the photographing date. Information specifying the classification, and the calculated weight difference, are further included in the meal information and stored in the first database 111.
  • as one classification method, the classification unit 106 stores a threshold value in advance, compares the weight difference with the threshold value, and classifies the meal as one that tends to cause weight gain when the weight difference exceeds the threshold value, and as one that does not when the weight difference is smaller than the threshold value. As another method, the classification unit 106 may calculate the average value of the weight differences over a predetermined period and classify the meal as one that tends to cause weight gain when the weight difference exceeds that average value by a predetermined amount (for example, 0.2 kg) or more, and as one that does not when it is smaller.
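The two classification rules just described can be sketched as follows. The function names, labels, and default margin are illustrative assumptions; only the comparison logic comes from the description.

```python
def classify_fixed(weight_diff, threshold):
    """Fixed-threshold rule: a morning-to-evening weight difference
    larger than the threshold marks a weight-gain-prone meal."""
    return "gain-prone" if weight_diff > threshold else "gain-resistant"

def classify_relative(weight_diff, recent_diffs, margin=0.2):
    """Average-based rule: the threshold is the mean weight difference
    over a recent period plus a predetermined margin (e.g. 0.2 kg)."""
    threshold = sum(recent_diffs) / len(recent_diffs) + margin
    return classify_fixed(weight_diff, threshold)
```

The second rule adapts the threshold to each user's own recent weight behavior, which is how the classification can be "suited to individual users" without any nutritional analysis.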
  • in response to a request from the display device 500, the reading unit 108 may, as an example, read, from the meal information for a predetermined period (number of days) preceding reception of the request, the meal information classified as meals that tend to cause weight gain and the meal information classified as meals that do not. Display data is generated based on the read information and displayed on the screen, so that the tendency of recent meals can be seen. This display example is referred to as a first display example.
  • alternatively, referring to the weight difference between the morning weight and the evening weight included in the meal information, a predetermined number of pieces of meal information may be read in descending order of weight difference as meal information classified as meals that tend to cause weight gain, and a predetermined number of pieces may be read in ascending order of weight difference as meal information classified as meals that do not. Display data is generated based on the read information and displayed on the screen, so that a ranking of the most weight-gain-prone meals and a ranking of the least weight-gain-prone meals can be seen.
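The ranking readout just described amounts to sorting days by weight difference and taking the two ends of the list. A minimal sketch, with an assumed data layout (date mapped to a weight difference and that day's meal images):

```python
def meal_rankings(daily_meals, n=3):
    """daily_meals: {date: (weight_diff, meal_images)}.
    Returns the n days with the largest weight differences and the
    n days with the smallest, each as a list of (date, entry) pairs."""
    by_diff = sorted(daily_meals.items(), key=lambda kv: kv[1][0])
    return by_diff[::-1][:n], by_diff[:n]
```

The first list would feed the "prone to weight gain" ranking and the second the "resistant to weight gain" ranking on the display screen.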
  • alternatively, the reading unit 108 may read the meal information for the predetermined period (number of days) preceding reception of the request, and the generation unit 109 may extract meal information according to the weight difference and then generate the display data.
  • This display example is referred to as a second display example.
  • FIG. 8 is a flowchart showing a specific example of the flow of operations in the management apparatus 100 included in the management system 1A.
  • the operation shown in the flowchart of FIG. 8 is realized by the CPU 10 reading and executing a program stored in the memory 11.
  • in step S103, the CPU 10 specifies the measurement date from the measurement date and time information associated with the measurement value, compares the measurement time with the time zones specified in advance to determine whether the measured weight is a morning weight or an evening weight, and stores the measurement value together with that information.
  • in step S107, the CPU 10 specifies the photographing date from the photographing date and time information associated with the photographed image, compares the photographing time with the time zones defined in advance to determine whether the image is of breakfast, lunch, or dinner, and stores the photographed image as meal information together with that information.
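The time-zone comparison in step S107 can be sketched as a simple table lookup. The hour ranges below are the example time zones given earlier in the description; the dictionary layout and function name are assumptions.

```python
MEAL_SLOTS = {               # example time zones from the description
    "breakfast": (6, 10),    # 6 am up to 10 am
    "lunch": (10, 14),       # 10 am up to 2 pm
    "dinner": (17, 21),      # 5 pm up to 9 pm
}

def meal_slot(hour):
    """Map a photographing hour to a meal name, or None if the hour
    falls outside every defined time zone."""
    for name, (start, end) in MEAL_SLOTS.items():
        if start <= hour < end:
            return name
    return None
```

The same pattern, with the "morning" and "evening" ranges, serves for step S103's morning/evening decision.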
  • step S109 the CPU 10 calculates the weight difference between the morning weight and the evening weight for each measurement day.
  • the process of step S109 may be performed at the timing when the received measurement value is identified as the evening weight in step S103, or may be performed at a predetermined time (for example, midnight). It may be performed at a timing when a display request described later is received.
  • in step S111, the CPU 10 classifies the meal information associated with the photographing date corresponding to each measurement date, based on the weight difference between the morning and evening weights calculated in step S109, as a meal that tends to cause weight gain or a meal that does not.
  • the CPU 10 further stores information specifying the classification and the weight difference together with the meal information.
  • when a display request is received from the display device 500 via the communication unit 14 (YES in step S113), the CPU 10 reads meal information from the first database 111 in step S115, and generates display data corresponding to the classification in step S117.
  • in step S117, display data such as that described as the first display example and the second display example is generated, as an example.
  • in step S119, the CPU 10 transmits the generated display data to the display device 500.
  • in this way, meal information including captured images of each day's meals is classified according to the weight difference between the morning and evening weights of that day, and displayed on the display device 500 based on the classification.
  • FIG. 9 is a diagram illustrating a first specific example of the first display example.
  • as the meal information for each day, meal images whose photographing date is that day are displayed. In the example of FIG. 9, only one meal is displayed, but images of three meals (or four or more meals including snacks) may be displayed on one screen, or scrolled and displayed on further screens. Further, as the classification result, whether each meal is one that tends to cause weight gain or not is displayed. In the example of FIG. 9, the classification result is displayed as “B” for a meal that tends to cause weight gain and “G” for a meal that does not.
  • FIG. 10 is a diagram showing a second specific example of the first display example.
  • the example of FIG. 10 shows only the captured-image and classification-result portions of a display screen according to the first display example.
  • in the example of FIG. 10, images of three meals are displayed side by side on one screen for each measurement date, together with the weight difference between the morning weight and the evening weight of that day. Further, as the classification result, the three meals of each measurement date are displayed surrounded by a frame corresponding to the classification: a thick double-line frame for meals classified as tending to cause weight gain, and a dotted-line frame for meals classified as not.
  • the measurement value is transmitted to the management apparatus 100 using the weight scale 200, and the captured image is transmitted to the management apparatus 100 using the camera 300.
  • alternatively, the meal information may be input as text information representing a menu using the input device 400, for example, and transmitted to the management device 100.
  • in that case, the input text information may be displayed instead of the captured image.
  • likewise, the measurement value may be input as text information using the input device 400 and transmitted to the management device 100.
  • FIG. 11 is a diagram illustrating a specific example of the second display example.
  • in the second display example, meal information for the measurement dates is read out and displayed in descending or ascending order of the weight difference between the morning weight and the evening weight of each day in a predetermined period.
  • in the example of FIG. 11, captured images of the photographing dates corresponding to measurement dates with large weight differences are displayed in descending order of weight difference as a “menu ranking prone to weight gain”, and captured images of the photographing dates corresponding to measurement dates with small weight differences are displayed in ascending order of weight difference as a “menu ranking resistant to weight gain”.
  • thereby, the user can grasp at a glance which of his or her meals correspond to days with a large daily weight difference, that is, meals that tend to cause weight gain, and which correspond to days with a small weight difference, that is, meals that do not. Therefore, the user can easily perform his or her own meal management.
  • FIG. 12 is a diagram for explaining a flow of management of meal information in the management system 1B.
  • the measurement operation with the weight scale 200 is the same as the operation with the management system 1A shown in FIG.
  • compared with the management flow in the management system 1A shown in FIG. 6, in the management system 1B the user transmits a captured image of each meal and additionally inputs a keyword for the meal content using the input device 400 (steps S51-1, S52-1, and S53-1), and the keyword is transmitted to the management device 100.
  • Keywords include, for example, ingredients included in the meal content, menu genres, and the like.
  • the keyword may be directly input by a character button constituting the operation button 42, or may be selected from options prepared in advance.
  • the management device 100 stores the keywords as meal information together with the captured images. Further, upon receiving a display request from the display device 500 (step S41-2), the management device 100 extracts meal information to substitute for the meal information classified as “meals that tend to cause weight gain” (step S33-0), and generates display data in which a captured image included in the extracted meal information is presented as an alternative menu, together with the classification results (step S33-1).
  • the display device 500 performs screen display by executing display processing based on the display data (step S42). On the screen, as in the management system 1A, photographed images of meals classified as “meals that tend to cause weight gain” and of meals classified as “meals that do not” are displayed, and in addition, the photographed image included in the meal information extracted as a substitute for the meals classified as tending to cause weight gain is displayed.
  • FIG. 13 is a block diagram illustrating a specific example of a functional configuration of the management apparatus 100 included in the management system 1B.
  • Each function shown in FIG. 13 is realized mainly by the CPU 10 reading and executing a program stored in the memory 11.
  • at least a part may be configured by hardware such as an electric circuit.
  • the CPU 10 of the management device 100 further includes, in addition to the functional configuration of the management device 100 included in the management system 1A shown in FIG. 7, an extraction unit 110 for extracting meal information to be used as a substitute.
  • the memory 11 stores a second database 112.
  • FIG. 14 is a diagram showing a specific example of information described in the second database 112. As shown in FIG. 14, in the second database 112, information such as materials, categories, and calories is described for each menu and stored in the memory 11 in advance.
  • the extraction unit 110 refers to the meal information classified as “meals that tend to cause weight gain” by the classification unit 106 and extracts keywords representing the category and materials of the menu included in the meal information.
  • the extraction unit 110 then reads the calories of that menu by referring to the second database 112 based on the keywords, and extracts from the second database 112, as a substitute menu, another menu that has fewer calories than that menu and that has the same category as, or materials overlapping with, that menu.
  • in this way, a menu having fewer calories than the ingested meal can be extracted as an alternative menu based on a predefined database.
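The calorie-based extraction described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the record fields (`menu`, `category`, `materials`, `kcal`) and the sample data are assumptions standing in for the second database 112, which stores materials, category, and calories per menu.

```python
# Minimal sketch of the alternative-menu extraction from a predefined
# menu database (stand-in for the second database 112). All names and
# sample records are illustrative assumptions.

MENU_DB = [
    {"menu": "pork cutlet set", "category": "fried",   "materials": {"pork", "flour", "egg"}, "kcal": 900},
    {"menu": "grilled pork",    "category": "grilled", "materials": {"pork", "salt"},         "kcal": 450},
    {"menu": "fried chicken",   "category": "fried",   "materials": {"chicken", "flour"},     "kcal": 700},
    {"menu": "salad",           "category": "salad",   "materials": {"lettuce", "tomato"},    "kcal": 120},
]

def extract_alternatives(menu_name, db=MENU_DB):
    """Return menus with fewer calories that share the category or a material."""
    target = next(m for m in db if m["menu"] == menu_name)
    return [
        m["menu"]
        for m in db
        if m["menu"] != menu_name
        and m["kcal"] < target["kcal"]
        and (m["category"] == target["category"] or m["materials"] & target["materials"])
    ]
```

For example, `extract_alternatives("pork cutlet set")` returns the lower-calorie menus that match by material ("grilled pork") or by category ("fried chicken"), while the unrelated "salad" is excluded.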
  • alternatively, the extraction unit 110 refers to the meal information classified as "easy to get fat" by the classification unit 106 and extracts a keyword representing the category or materials of the menu included in the meal information.
  • the extraction unit 110 then refers to the first database 111 based on the keyword and identifies meal information of other measurement days associated with a weight difference smaller than that of the measurement day corresponding to the photographing day.
  • from among those, a menu that belongs to the same category as the menu or shares materials with it is extracted from the first database 111 as a substitute menu.
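The history-based strategy above can be sketched in the same style. Again the record fields and sample data are illustrative assumptions standing in for the first database 111, where each meal record carries the weight difference measured on its day.

```python
# Sketch of the second extraction strategy: pick a substitute from the
# user's own past meal records (stand-in for the first database 111)
# whose recorded weight difference was smaller than that of the target
# meal. Field names and sample data are illustrative assumptions.

records = [
    {"date": "2011-01-10", "menu": "pork cutlet set", "category": "fried",   "materials": {"pork"},    "weight_diff": 0.8},
    {"date": "2011-01-11", "menu": "grilled fish",    "category": "grilled", "materials": {"fish"},    "weight_diff": 0.1},
    {"date": "2011-01-12", "menu": "fried chicken",   "category": "fried",   "materials": {"chicken"}, "weight_diff": 0.3},
]

def substitutes_from_history(target, history):
    """Past meals with a smaller weight difference sharing category or materials."""
    return [
        r["menu"]
        for r in history
        if r["menu"] != target["menu"]
        and r["weight_diff"] < target["weight_diff"]
        and (r["category"] == target["category"] or r["materials"] & target["materials"])
    ]
```

Here `substitutes_from_history(records[0], records)` yields only "fried chicken": it shares the "fried" category and had a smaller weight difference, whereas "grilled fish" shares neither category nor material with the target.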
  • FIG. 15 is a flowchart showing a specific example of the operation flow in the management apparatus 100 included in the management system 1B. The operation shown in the flowchart of FIG. 15 is also realized by the CPU 10 reading and executing a program stored in the memory 11.
  • from step S101 to step S115, operations similar to those of the management apparatus 100 included in the management system 1A shown in FIG. 8 are performed.
  • the CPU 10 reads meal information from the first database 111 in step S115, and in step S116 performs the above-described processing to extract an alternative menu for the meal information classified as "easy to get fat".
  • the CPU 10 then generates display data in which the alternative menu is displayed in addition to the screen contents represented by the display data in the management apparatus 100 included in the management system 1A.
  • as a display method, the alternative menu may be displayed on the screens shown in FIG. 9 to FIG. 11, or a button or the like for instructing display of the alternative menu may be displayed on those screens, with the alternative menu displayed on a screen switched to when the button is pressed.
  • when the alternative menu is extracted from the first database 111 as described above, the alternative menu may be displayed using a captured image included in the meal information.
  • both the weight change prediction operation and the menu suggestion operation executed in the management system 1C are performed when the management device 100 receives, from the input device 400, input of a keyword representing a menu, material, meal category, or the like, together with a display request from the display device 500.
  • when executing the weight change prediction operation, the user inputs a menu, material, meal category, or the like as text information using the input device 400 before ingesting a meal or when transmitting a captured image, and it is transmitted to the management apparatus 100 as input information. Furthermore, a request for prediction of the weight transition is transmitted from the display device 500.
  • the management apparatus 100 refers to the meal information including the keyword stored in the first database 111, reads the weight change of the day represented by that meal information, and presents the result as a predicted value of the change in weight. With this information displayed on the display device 500, the user can know the predicted change from the morning weight if a meal corresponding to the input menu, material, meal category, or the like is ingested.
  • when executing the menu suggestion operation, the user determines a material, meal category, or the like before taking a meal or transmitting a photographed image, transmits the contents as keywords to the management device 100, and the display device 500 transmits a menu presentation request.
  • the management apparatus 100 extracts, from the meal information including the keyword, meal information whose weight difference indicates "a meal that is difficult to get fat", and presents it as a suggested menu.
  • by displaying the information on the display device 500, the user can know a menu that matches the input material, meal category, or the like and that was classified as "a meal that is difficult to get fat" among previous meals.
  • the functional configuration of the management apparatus 100 included in the management system 1C for performing the above operations is the same as that of the management apparatus 100 included in the management system 1B shown in FIG. 13. That is, the second database 112 is stored in the memory 11, and the extraction unit 110 is included in addition to the functional configuration of the management device 100 included in the management system 1A.
  • the extraction unit 110 refers to the second database 112 using the keyword from the input device 400 and extracts a menu including the keyword. Furthermore, the extraction unit 110 refers to the first database 111 using the menu, extracts the meal information including the menu stored in the first database 111, and reads the weight difference included in that meal information. Alternatively, the extraction unit 110 refers to the first database 111 using the menu and extracts, from the meal information including the menu stored in the first database 111, the meal information classified as "a meal that is difficult to get fat".
  • when the meal information stored in the first database 111 includes a keyword representing the meal material or meal category input from the input device 400, the extraction unit 110 refers to the first database 111 using the keyword from the input device 400 and reads the weight difference included in the meal information containing the keyword.
  • alternatively, the extraction unit 110 refers to the first database 111 using the keyword from the input device 400 and extracts, from the meal information including the keyword stored in the first database 111, the meal information classified as "a meal that is difficult to get fat".
  • the management apparatus 100 included in the management system 1C further performs a weight change prediction operation and a menu suggestion operation.
  • FIG. 16 is a flowchart showing a specific example of a flow of a weight change prediction operation in the management apparatus 100 included in the management system 1C.
  • the operation shown in the flowchart of FIG. 16 is also realized by the CPU 10 reading and executing a program stored in the memory 11.
  • the weight change predicting operation shown in the flowchart of FIG. 16 is a specific example of the operation when the CPU 10 realizes the function shown in the first example by the extracting unit 110.
  • when the CPU 10 receives a keyword from the input device 400 via the communication unit 14 and receives a display request from the display device 500 (YES in step S201), the CPU 10 refers to the second database 112 using the keyword in step S203 and extracts a menu including the keyword from the second database 112.
  • in step S205, the CPU 10 refers to the first database 111, extracts from it the meal information including the menu extracted in step S203, and reads the weight difference contained in each piece of extracted meal information.
  • in step S209, the CPU 10 generates display data for presenting a prediction of the weight change based on the weight differences thus read, and transmits the data to the display device 500 in step S211.
  • in step S209, for example, the CPU 10 calculates the average value of the weight differences and specifies it as the predicted value, or performs a further operation such as specifying the largest of the weight differences as the predicted value.
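The prediction step S209 can be sketched as a simple reduction over the weight differences read from the matching meal records. The function name and the `conservative` flag are illustrative assumptions; the patent only states that the average, or alternatively the largest weight difference, may be used as the predicted value.

```python
# Sketch of the prediction step S209: reduce the weight differences of
# past meals matching the keyword to a single predicted value, either
# the average or (more conservatively) the maximum observed difference.
# Names are illustrative assumptions, not taken from the patent.

def predict_weight_change(weight_diffs, conservative=False):
    """Predict the day's weight change from past weight differences (kg)."""
    if not weight_diffs:
        return None                                # no matching history
    if conservative:
        return max(weight_diffs)                   # worst observed outcome
    return sum(weight_diffs) / len(weight_diffs)   # average outcome
```

For a history of `[0.2, 0.4, 0.6]` kg this yields 0.4 kg as the average prediction, or 0.6 kg in the conservative variant.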
  • FIG. 17 is a flowchart showing a specific example of the menu proposal operation flow in the management apparatus 100 included in the management system 1C.
  • the operation shown in the flowchart of FIG. 17 is also realized by the CPU 10 reading and executing a program stored in the memory 11.
  • the menu suggestion operation shown in the flowchart of FIG. 17 is a specific example of the operation when the CPU 10 implements the function shown in the first example by the extraction unit 110.
  • when the CPU 10 receives a keyword from the input device 400 via the communication unit 14 and receives a display request from the display device 500 (YES in step S301), the CPU 10 refers to the second database 112 using the keyword in step S303 and extracts a menu including the keyword from the second database 112.
  • in step S305, the CPU 10 refers to the first database 111 and extracts meal information including the menu extracted in step S303 from the first database 111.
  • when a piece of meal information has been classified as "a meal that is difficult to get fat" by the above-described operation (YES in step S307), the CPU 10 identifies the menu represented by that meal information as a suggested menu in step S309.
  • the CPU 10 makes this determination for each piece of meal information extracted in step S305. When the determination has been made for all of the extracted meal information (NO in step S311), in step S313 the CPU 10 generates display data based on the meal information identified as the suggested menu in step S309 and transmits it to the display device 500 in step S315.
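The suggestion loop of FIG. 17 amounts to filtering the user's meal history on two conditions: the record matches the keyword, and it was classified as "a meal that is difficult to get fat". The sketch below illustrates this; the field names (`keywords`, `classification`) and sample records are assumptions, not the patent's data model.

```python
# Sketch of the menu-suggestion flow of FIG. 17: keep past meals that
# match the input keyword AND were classified as "difficult to get fat".
# Field names and sample data are illustrative assumptions.

def suggest_menus(keyword, history):
    """Return past menus matching the keyword that were 'difficult to get fat' meals."""
    return [
        r["menu"]
        for r in history
        if keyword in r.get("keywords", set())
        and r.get("classification") == "difficult to get fat"
    ]

history = [
    {"menu": "grilled chicken salad", "keywords": {"chicken"}, "classification": "difficult to get fat"},
    {"menu": "fried chicken",         "keywords": {"chicken"}, "classification": "easy to get fat"},
]
```

With this history, the keyword "chicken" matches both records, but only "grilled chicken salad" survives the classification filter and is suggested.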
  • in this way, the user can know the estimated change between the morning and evening weight when ingesting a meal corresponding to the input menu, material, meal category, or the like.
  • in addition, the menu of the meal to be taken can be reviewed; for example, it becomes possible to take "a meal that is difficult to get fat" based on the desired material, meal category, and the like. The user can therefore manage his or her own meals more appropriately.
  • a program for causing a computer to execute the operation of the management apparatus 100 can also be provided.
  • Such a program can be recorded on a non-transitory, computer-readable recording medium attached to the computer, such as a flexible disk, a CD-ROM (Compact Disk-Read Only Memory), a ROM (Read Only Memory), a RAM (Random Access Memory), or a memory card, and provided as a program product.
  • the program can be provided by being recorded on a recording medium such as a hard disk built in the computer.
  • a program can also be provided by downloading via a network.
  • the program as described above may be a program module that is provided as part of a computer operating system (OS) and that calls necessary modules in a predetermined arrangement at a predetermined timing to execute processing.
  • in that case, the program itself does not include those modules, and the processing is executed in cooperation with the OS.
  • a program that does not include such a module can also be included in the program according to the present invention.
  • the program according to the present invention may be provided by being incorporated in a part of another program. Even in this case, the program itself does not include the module included in the other program, and the process is executed in cooperation with the other program. Such a program incorporated in another program can also be included in the program according to the present invention.
  • the provided program product is installed in a program storage unit such as a hard disk and executed.
  • the program product includes the program itself and a recording medium on which the program is recorded.
  • 1, 1A, 1B, 1C management system, 10, 20, 30, 40 CPU, 11, 21, 31, 41 memory, 14, 24, 34, 44 communication unit, 22, 32 operation buttons, 23 measurement unit, 33 photographing unit, 42 operation buttons, 100 management device, 101 image input unit, 102 text input unit, 103 storage unit, 104 measurement value input unit, 105 calculation unit, 106 classification unit, 107 request unit, 108 readout unit, 109 generation unit, 110 extraction unit, 111 first database, 112 second database, 200 scale, 300 camera, 400 input device, 500 display device.

Landscapes

  • Health & Medical Sciences (AREA)
  • Nutrition Science (AREA)
  • Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

A management device for managing meal information comprises an input unit for inputting the meal information, the user's weight, and the time and date (S11-3, S12-3), a processing unit for processing the meal information and the weight, and an output unit for presenting the information processed by the processing unit (S33-2). The processing unit executes a process (S31, S32) of storing, in a storage device, the meal information for a unit period in association with the change in the user's weight during that unit period, and a process (S33-1) of presenting a meal composition according to the change in the user's weight during the unit period, based on the meal information stored in the storage device in association with that weight change.
PCT/JP2011/076928 2011-02-16 2011-11-22 Dispositif et système de gestion, et procédé d'affichage WO2012111209A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-030777 2011-02-16
JP2011030777A JP2012168857A (ja) 2011-02-16 2011-02-16 管理装置、管理システム、および管理システムでの表示方法

Publications (1)

Publication Number Publication Date
WO2012111209A1 true WO2012111209A1 (fr) 2012-08-23

Family

ID=46672159

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/076928 WO2012111209A1 (fr) 2011-02-16 2011-11-22 Dispositif et système de gestion, et procédé d'affichage

Country Status (2)

Country Link
JP (1) JP2012168857A (fr)
WO (1) WO2012111209A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6671156B2 (ja) * 2015-11-26 2020-03-25 共同印刷株式会社 システム、サーバ及び方法
JP6639216B2 (ja) * 2015-12-15 2020-02-05 共同印刷株式会社 システム、サーバ及び方法
JP7007101B2 (ja) * 2017-04-19 2022-01-24 Nttテクノクロス株式会社 食事内容提案装置、食事内容提案システム及び食事内容提案方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006296481A (ja) * 2005-04-15 2006-11-02 Matsushita Electric Works Ltd 体重測定機能付洗面ユニット
JP2008304421A (ja) * 2007-06-11 2008-12-18 Omron Healthcare Co Ltd 体重計
JP2010277476A (ja) * 2009-05-29 2010-12-09 Kyoto Univ 保健指導システム


Also Published As

Publication number Publication date
JP2012168857A (ja) 2012-09-06

Similar Documents

Publication Publication Date Title
US11430571B2 (en) Wellness aggregator
US20200357522A1 (en) Wellness aggregator
CN106415559B (zh) 健康数据聚合器
JP5527423B2 (ja) 画像処理システム、画像処理方法、及び画像処理プログラムを記憶した記憶媒体
US20180144831A1 (en) Real-time or just-in-time online assistance for individuals to help them in achieving personalized health goals
CN109599161A (zh) 身体活动和健身监视器
KR102330878B1 (ko) 정보 처리 장치, 정보 처리 방법, 및, 정보 처리 시스템
König et al. Characteristics of smartphone-based dietary assessment tools: A systematic review
JP4972527B2 (ja) 動画表示システム、動画表示方法、およびコンピュータプログラム
JP2014174954A (ja) 行動支援システム、当該システムの端末装置、およびサーバー
KR20170056249A (ko) 심리 치료 제공을 위한 컴퓨터 장치 및 프로그램
WO2012111209A1 (fr) Dispositif et système de gestion, et procédé d'affichage
CN111295716A (zh) 健康管理辅助装置、方法及程序
JP2004283570A (ja) 健康管理システム
CN112988789A (zh) 医学数据查询方法、装置及终端
JP2013029877A (ja) データ管理システム、データ管理方法およびプログラム
JP2001299767A (ja) アレルギー疾患情報処理システム、アレルギー疾患情報処理方法およびその方法をコンピュータに実行させるプログラムを記録したコンピュータ読み取り可能な記録媒体
JP2001318991A (ja) 情報システムを用いた栄養管理システム
CN116959733A (zh) 医疗数据的分析方法、装置、设备及存储介质
CN106062807A (zh) 用于传递任务导向内容的系统和方法
KR20150126415A (ko) 여행 서비스 정보 표시 시스템, 여행 서비스 정보 표시 방법, 여행 서비스 정보 표시 프로그램 및 정보 기록 매체
KR102214792B1 (ko) 식생활습관과 건강상태 자가진단 체크 시스템 및 이를 이용한 맞춤형 건강정보 제공방법
JP2005275606A (ja) 移動体通信端末、健康管理装置、健康カウンセリング装置、カロリー閲覧端末、健康管理支援システム及び方法
JP6931959B1 (ja) レシピ検索支援装置、レシピ検索支援方法、および、レシピ検索支援プログラム
JP7041332B2 (ja) ダイエット管理サーバ及びダイエット管理サーバ制御方法並びにダイエット管理プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11858938

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11858938

Country of ref document: EP

Kind code of ref document: A1