CN116457818A - Food information input method and device


Info

Publication number
CN116457818A
Authority
CN
China
Prior art keywords
food
information
input
food information
information input
Prior art date
Legal status
Pending
Application number
CN202180072552.1A
Other languages
Chinese (zh)
Inventor
金大薰
Current Assignee
Novelli Co ltd
Original Assignee
Novelli Co ltd
Priority date
Filing date
Publication date
Priority claimed from KR1020210142157A external-priority patent/KR20220054223A/en
Application filed by Novelli Co ltd filed Critical Novelli Co ltd
Priority claimed from PCT/KR2021/014989 external-priority patent/WO2022086304A1/en
Publication of CN116457818A publication Critical patent/CN116457818A/en

Landscapes

  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The present invention relates to a food information input method and apparatus, and the food information input method according to an embodiment of the present invention includes the steps of: operating in a photographing mode in which an input guide and an input button moving along the input guide are displayed on a photographing screen; and receiving at least one piece of food information input by using the input guide and the input button when operating in the photographing mode.

Description

Food information input method and device
Technical Field
The present disclosure relates to a technology for providing information input convenience to a user, and more particularly, to a method and apparatus for allowing a user to conveniently input food information.
Background
Recently, interest in health has been increasing, yet more and more people suffer from overweight or obesity. Overweight and obesity are serious problems that lead to various diseases such as diabetes and hypertension.
Therefore, to address overweight or obesity, it is necessary to analyze one's own eating habits. People usually know which foods they like and dislike, but do not keep track of what they actually eat or when they eat it. To analyze eating habits, therefore, it is necessary to identify the foods actually consumed and to analyze individual eating habits based on information about the identified foods.
For example, to record the food actually consumed, a user must input food information using a mobile terminal or another input device. The user either runs a food-related application or uses the terminal's general photographing mode to photograph the food. In the first case, the user must search for the photographed food image within the food-related application before entering the food information, which is inconvenient. In the second case, the user must find the photographed food image in an album, upload it to a food-related application, locate the uploaded image again, and only then input the food information. Because food photographing and food information input are separate, inconvenient operations, the frequency of uploading food information drops sharply after a user has photographed only a few food images.
There is also the case where the user only photographs the food and intends to input the food information later. If too much time has passed since the image was captured, the user may no longer remember the food information to be input, and may therefore record nothing or upload erroneous food information, leading to errors in the eating-habit analysis. Further, even if the user wants to enter varied food information for a photographed food image, each application may force a fixed set of input fields. In that case, the number of pieces of food information is limited, and the user's eating habits cannot be accurately analyzed.
Accordingly, the user's inconvenience increases, and satisfaction with services that analyze individual eating habits through food information input deteriorates.
Disclosure of Invention
[ problem ]
Embodiments of the present disclosure provide a food information input method and apparatus for conveniently inputting food information in addition to photographing on a photographing screen when a user inputs food information.
Another embodiment of the present disclosure provides a food information input method and apparatus for conveniently inputting food information related to recognized food through a User Interface (UI) when food information is input in addition to photographing on a photographing screen.
Still another embodiment of the present disclosure is to provide a food information input method and apparatus for automatically performing photographing if a predetermined condition is satisfied during operation in a photographing mode.
The objects of the present disclosure are not limited to the foregoing, and various other objects are contemplated herein.
[ technical solution ]
In one aspect, a food information input method performed by a food information input apparatus may include: operating in a photographing mode in which an input guide and an input button moving along the input guide are displayed on a photographing screen; and receiving at least one piece of food information input by using the input guide and the input button when operating in the photographing mode.
The step of receiving at least one piece of food information includes: when operating in the photographing mode, in a state in which the user activates an input button displayed on the photographing screen, food information may be received according to a position at which the activated input button is deactivated on the input guide.
In the above method, the food information input method may further include displaying at least one piece of food information corresponding to a position of the input button activated by the user on the input guide.
The step of displaying at least one piece of food information on the photographing screen includes: food information corresponding to a position where the input button can be activated by the user on the input guide is displayed on the photographing screen in a preset size or more.
The step of receiving at least one piece of food information includes: in a state where the user activates an input button displayed on the photographing screen, at least one piece of food information may be selectively received according to a moving direction in which the activated input button is moved to be deactivated on the input guide.
The step of receiving at least one piece of food information includes: in a state where the user activates the input button displayed on the photographing screen, at least one piece of food information may be selectively received according to a moving distance and a moving direction in which the activated input button is moved to be deactivated on the input guide.
The at least one piece of food information may include at least one piece of information selected from: food amount information, food type information, food calorie information, food nutrition information, food menu information, consumer information, intake target information, intake plan information, intake time information, intake satisfaction information, food time information suitable for a preset input purpose, food location information suitable for a preset input purpose, food processor information, food trash discharge information, pre-meal input information, post-meal input information, favorites information, and information about items stored in a wish list.
In the above method, the food information input method may further include displaying at least one selected from the following items related to the received at least one piece of food information: user allergy effect, skin nutrition, growth nutrition, disease effect, aging effect, diet effect, expected intake calories compared to target calories, salinity, conversion value of food, food cost information, food environment information, carbon value information, resource value information, carbon emission, running cost loss cost, food raw material information, pre-meal input button, post-meal input button, favorites input button, and wish list storage button.
In the above method, the food information input method may further include identifying food displayed on the photographing screen, and the received at least one piece of food information may be related to the identified food.
The step of receiving at least one piece of food information includes: the unit of food displayed on the input button may be changed according to the type of the recognized food, and information about the amount of the recognized food may be input based on the changed unit of food.
In the above method, the food information input method may further include displaying an input limit range of the identified food.
In the above method, the food information input method may further include adjusting a minimum value and a maximum value of the displayed input limit range, and the input unit of the food information may be changed according to the adjusted minimum value and maximum value of the input limit range.
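One way to realize the adjustable input limit range described above is to derive the input unit (step size) from the current minimum and maximum of the range — a minimal sketch, assuming a fixed tick count of 100, a detail this disclosure does not fix:

```python
# Sketch: the input unit changes with the adjusted min/max of the input
# limit range. The tick count of 100 is an assumption for illustration.

def input_step(range_min: float, range_max: float, ticks: int = 100) -> float:
    """Value represented by one step of the input button on the guide."""
    if range_max <= range_min:
        raise ValueError("range_max must exceed range_min")
    return (range_max - range_min) / ticks

coarse = input_step(0, 50)   # full range: each step is 0.5 units
fine = input_step(10, 20)    # narrowed range: each step is 0.1 units
```

Narrowing the range thus makes each drag step finer, which is one plausible reading of "the input unit of the food information may be changed according to the adjusted minimum value and maximum value."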
In the above method, the food information input method may further include calculating and displaying intake target information in consideration of characteristics of the user according to the identified food; receiving intake plan information about the identified food, which is input from the user; and displaying a ratio of the input ingestion plan information to the displayed ingestion target information.
In the above method, the food information input method may further include displaying success or failure of the intake target based on comparison of the input intake plan information with the displayed intake target information after the user ingests the identified food.
In the above method, the food information input method may further include analyzing a food image of the identified food to determine whether the image was taken before or after a meal; and recording the food image as the user's pre-meal food comment when the image was taken before a meal, or as the user's post-meal food comment when the image was taken after a meal.
In the above method, the food information input method may further include displaying identifiable food candidate types based on the identified type of food.
In the above method, the food information input method may further include searching for a restaurant providing the identified food based on a location of the food information input device; and correcting and displaying the received food information based on the menu information in the searched restaurant.
In the above method, the food information input method may further include searching for a restaurant providing the identified food based on a location of the food information input device; and ordering food corresponding to the received food information based on the menu information in the searched restaurant.
In the above method, the food information input method may further include providing a preset stimulus for the input food information to the user when the food information is input.
In another aspect, a food information input method performed by a food information input apparatus may include: operating in a photographing mode in which an input button is displayed on a photographing screen; checking whether a preset photographing guide condition is satisfied when operating in a photographing mode; and if the preset photographing guide condition is satisfied, performing a photographing operation.
In the above method, the food information input method may further include identifying food displayed on the photographing screen; and receiving food information about the identified food.
The preset photographing guide condition may be at least one condition selected from the following conditions: the condition that the identified food and the food information input device are parallel to each other, the condition that the food information input device is horizontal, the condition that the food information input device is located vertically above the identified food, and the condition that the food information input device is moving.
In the above method, the food information input method may further include displaying photographing accuracy on a screen based on the relative inclination between the identified food and the food information input device.
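A hedged sketch of how the displayed photographing accuracy might be computed from the relative inclination between the identified food and the device; the linear mapping and the 45-degree cutoff are assumptions, not taken from this disclosure:

```python
# Assumed mapping: 0 degrees of relative tilt (device parallel to the
# food) scores 100; accuracy falls linearly to 0 at 45 degrees or more.

def photographing_accuracy(tilt_degrees: float, max_tilt: float = 45.0) -> int:
    """Map relative tilt to a 0-100 accuracy score for on-screen display."""
    tilt = min(abs(tilt_degrees), max_tilt)
    return round(100 * (1 - tilt / max_tilt))
```

Displaying this score gives the user immediate feedback on how to reorient the device before the photographing guide condition is met.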
In the above method, the food information input method may further include providing a preset stimulus for a photographing operation to the user.
In the above method, the food information input method may further include analyzing a photographed image of the identified food to determine whether the image was taken before or after a meal; and recording the photographed image as the user's pre-meal food comment when the image was taken before a meal, or as the user's post-meal food comment when the image was taken after a meal.
In the above method, the food information input method may further include displaying identifiable food candidate types based on the identified type of food.
In the above method, the food information input method may further include searching for a restaurant providing the identified food based on a location of the food information input device; and correcting and displaying the received food information based on the menu information in the searched restaurant.
In the above method, the food information input method may further include searching for a restaurant providing the identified food based on a location of the food information input device; and ordering food corresponding to the received food information based on the menu information of the searched restaurant.
In another aspect, a food information input apparatus may include: a camera; a display; a memory configured to store one or more programs; and a processor configured to execute one or more stored programs, and the processor is operable in a photographing mode in which the input guide and the input buttons moving along the input guide are displayed on a photographing screen through the display; and at least one piece of food information may be input by the camera using the input guide and the input button when the processor operates in the photographing mode.
In another aspect, a food information input apparatus may include: a camera; a display; a memory configured to store one or more programs; and a processor configured to execute the one or more stored programs. The processor may operate in a photographing mode in which an input button is displayed on a photographing screen through the display; when the processor operates in the photographing mode, it may check whether a preset photographing guide condition is satisfied, and if the preset photographing guide condition is satisfied, photographing may be performed by the camera.
Further, another method and another system for implementing the present disclosure, and a computer-readable recording medium recording a computer program for executing the method may also be provided.
[ beneficial effects ]
According to the embodiments of the present disclosure, a user who needs to input food information can conveniently input it in addition to photographing on a photographing screen.
According to another embodiment of the present disclosure, it is enabled to conveniently input food information related to the recognized food through a User Interface (UI) when food information is input in addition to photographing on a photographing screen.
According to another embodiment of the present disclosure, it is enabled to automatically perform photographing if a predetermined condition is satisfied during operation in a photographing mode.
The effects of the present disclosure are not limited to the foregoing, and other various effects are contemplated herein.
Drawings
Fig. 1 is a diagram showing a configuration of a food information input system according to an embodiment of the present disclosure.
Fig. 2 is a flowchart illustrating a food information input method according to an embodiment of the present disclosure.
Fig. 3 to 6 are diagrams illustrating a food information input operation according to an embodiment of the present disclosure.
Fig. 7 to 9 are diagrams illustrating an operation of inputting food information related to leftover discharge according to an embodiment of the present disclosure.
Fig. 10 and 11 are diagrams illustrating an operation of inputting food information related to food intake information according to an embodiment of the present disclosure.
Fig. 12 is a flowchart illustrating a food information input method according to another embodiment of the present disclosure.
Fig. 13 to 18 are diagrams illustrating a food information input process of fig. 12.
Fig. 19 is a flowchart illustrating a food information input method according to another embodiment of the present disclosure.
Fig. 20 is a diagram illustrating a food information input device according to another embodiment of the present disclosure.
Fig. 21 is a diagram illustrating a photographing operation according to another embodiment of the present disclosure.
Detailed Description
The foregoing and other objects, features and advantages of the present disclosure will be more readily understood from the following description of the preferred embodiments taken in conjunction with the accompanying drawings. The present disclosure may, however, be embodied in different forms and is not limited to the embodiments set forth herein, which are provided only for the purpose of making the present disclosure more thorough, complete, and able to fully convey the spirit of the disclosure to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. In this disclosure, the singular is also intended to include the plural unless the context clearly indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes," "including," "having," and the like, when used in this specification, specify the presence of stated features, but do not preclude the presence or addition of one or more other features. Throughout the specification, like reference numerals refer to like components, and "and/or" includes each and every combination of one or more of the listed components. It will be understood that, although the terms "first," "second," etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element. Thus, a first component may be termed a second component, and a second component may be termed a first component, without departing from the teachings of the present disclosure.
Unless defined otherwise, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. It will be further understood that terms used herein should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Fig. 1 is a diagram showing a configuration of a food information input system according to an embodiment of the present disclosure.
First, referring to fig. 1, a food information input system 10 according to the present disclosure may include a food information input apparatus 100 and an external server 200 in communication with the food information input apparatus 100.
The food information input device 100 may include a communication module 110, a camera 120, a display 130, a memory 140, and a processor 150.
The communication module 110 may include one or more modules enabling communication between the food information input apparatus 100 and a wireless communication system or between the food information input apparatus 100 and the external server 200. Further, the communication module 110 may include one or more modules connecting the food information input device 100 to one or more networks.
The camera 120 may capture an image or video by an operation of a user. Here, the camera 120 may take images or videos of food before and after meals by the user's operation. The camera 120 may include a single camera, multiple cameras, a single image sensor, or multiple image sensors. The camera 120 may be at least one selected from the group consisting of: at least one 2D camera, at least one 3D camera, at least one stereo camera, and at least one image sensor.
The display 130 may form a layer structure with the touch sensor or may be integrated with the touch sensor to implement a touch screen. Such a touch screen may provide an input interface between the food information input apparatus 100 and the user, and may simultaneously provide an output interface between the food information input apparatus 100 and the user.
The memory 140 may store data supporting various functions of the food information input apparatus 100. The memory 140 may store one or more programs driven in the food information input apparatus 100, a plurality of application programs or applications, data for operating the food information input apparatus 100, and commands. At least some of the applications may be downloaded from the external server 200 through wireless communication. Further, at least some of the application programs may exist for basic functions of the food information input apparatus 100. Meanwhile, an application program may be stored in the memory 140 and installed on the food information input apparatus 100 to perform operations (or functions) of the food information input apparatus 100 through the processor 150.
Processor 150 may generally control the overall operation of food information input device 100 and application-related operations. The processor 150 may process signals, data, or information input or output through the above-described components, or drive an application program stored in the memory 140, thereby providing the user with appropriate information or functions.
In an embodiment, the processor 150 may operate in a photographing mode in which an input guide and an input button moving along the input guide are displayed on a photographing screen through the display 130 by executing one or more programs stored in the memory 140, and the processor 150 may receive at least one piece of food information through the camera 120 using the input guide and the input button when the processor operates in the photographing mode.
In an embodiment, the processor 150 may operate in a photographing mode in which an input button is displayed on a photographing screen through the display 130 by executing one or more programs stored in the memory 140, the processor 150 may check whether a preset photographing guide condition is satisfied when the processor operates in the photographing mode, and if the preset photographing guide condition is satisfied, the processor 150 may perform a photographing operation through the camera 120.
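As an illustration of the guide-condition check, the "device is horizontal" condition mentioned among the preset photographing guide conditions could be tested against an accelerometer reading; the gravity-vector test and the 5-degree tolerance below are assumptions, not details fixed by this disclosure:

```python
import math

# Sketch: the device counts as horizontal when gravity lies almost
# entirely along its z axis, i.e. it is flat with the camera pointing
# straight down at the food. Tolerance of 5 degrees is assumed.

def device_is_horizontal(ax: float, ay: float, az: float,
                         tolerance_deg: float = 5.0) -> bool:
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        return False
    tilt = math.degrees(math.acos(min(1.0, abs(az) / g)))
    return tilt <= tolerance_deg

def should_auto_capture(accel, food_detected: bool) -> bool:
    # Photographing fires only when every preset guide condition holds;
    # only two of the conditions are modeled in this sketch.
    return food_detected and device_is_horizontal(*accel)
```

On an actual mobile terminal the accelerometer values would come from the platform sensor API, and the remaining conditions (device parallel to the food, device directly above the food) would be combined the same way.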
Further, the processor 150 may control at least some of the components described with reference to fig. 1 to drive the application programs stored in the memory 140. Further, the processor 150 may operate in combination with at least two or more of the components included in the food information input apparatus 100 in order to drive an application program. In this regard, the processor 150 will be described later with reference to fig. 2 to 20.
Hereinabove, the configuration of the food information input system 10 according to the present disclosure has been described with reference to fig. 1. The number of components of the food information input system 10 may be less than or greater than the number of components shown in fig. 1.
Hereinafter, a food information input method according to an embodiment of the present disclosure will be described with reference to fig. 2 to 11.
Fig. 2 is a flowchart illustrating a food information input method according to an embodiment of the present disclosure.
As shown in fig. 2, the food information input method according to the embodiment of the present disclosure may be performed by at least one processor of the food information input apparatus 100 or by a computer as the food information input apparatus 100.
First, the food information input apparatus 100 may operate in a photographing mode (S110). The food information input apparatus 100 may operate in a photographing mode in which an input guide and an input button moving along the input guide are displayed on a photographing screen of the display 130. The photographing screen may be a photographing screen of a food information input application executed by the food information input apparatus 100 or a photographing screen of a photographing application executed by the food information input apparatus 100, and is not limited to a specific photographing screen.
Further, when the food information input apparatus 100 is operated in the photographing mode, at least one piece of food information is input into the food information input apparatus using the input guide and the input button (S120). Here, the at least one piece of food information may include general food information together with a photographed image or photographed video, information about remaining meals remaining after a meal, or information about food waste. For example, the at least one piece of food information may include at least one piece of information selected from the following together with the photographed image or photographed video: information about food amount, information about food type, information about food calories, information about food nutrition, information about food menus, information about eaters, information about intake targets, information about intake plans, information about intake times, information about intake satisfaction, information about food times suitable for preset input purposes, information about food locations suitable for the preset input purposes, information about food processors, information about food waste discharge, pre-meal input information, post-meal input information, information about favorites, and information about items stored in a wish list.
The detailed operation of step S120 is as follows: when the food information input apparatus 100 is operated in the photographing mode, the user may activate an input button displayed on the photographing screen (S121).
Subsequently, the food information input apparatus 100 may display at least one piece of food information corresponding to a position of the input button on the input guide activated by the user on the display (S122). Further, the food information input apparatus 100 may display at least one piece of food information corresponding to a position of the input button on the input guide activated by the user on the photographing screen in a preset size or more. Here, the operation of displaying at least one piece of food information on the display is not an essential operation, and the display operation may or may not be performed according to a user's setting.
Thereafter, in a state where the user has activated the input button, the food information input apparatus 100 may receive at least one piece of food information according to the position at which the activated input button is deactivated on the input guide (S123). For example, the user may touch the input button to activate it, drag it along the input guide, and then release the touch to deactivate it. The food information input apparatus 100 then receives the food information according to the position on the input guide at which the input button was deactivated. Alternatively, the user may touch the input button to activate it and then release the touch at the touched location without dragging, thereby deactivating the button. The food information input apparatus 100 then receives the food information according to the position at which the input button was deactivated (in other words, the position the user touched and then released).
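The activate/drag/deactivate cycle of steps S121 to S123 can be sketched as a small state holder. Positions are modeled as a 0.0-1.0 fraction along the input guide and the 0-50 default range mirrors the later figure example; both are modeling choices rather than details fixed here:

```python
class InputButton:
    """Sketch of the input button's lifecycle on the photographing screen."""

    def __init__(self):
        self.active = False
        self.position = 0.5     # initial position on the input guide
        self.committed = None   # value received once the button is deactivated

    def touch_down(self):
        # S121: the user activates the input button.
        self.active = True

    def drag_to(self, fraction: float):
        # Dragging moves the activated button along the input guide.
        if self.active:
            self.position = max(0.0, min(1.0, fraction))

    def touch_up(self, range_min: int = 0, range_max: int = 50) -> int:
        # S123: releasing the touch deactivates the button; the value at
        # the release position is what the device receives.
        self.active = False
        self.committed = round(
            range_min + self.position * (range_max - range_min))
        return self.committed
```

Calling `touch_down()`, then `drag_to(0.66)`, then `touch_up()` returns 33; releasing without dragging commits the value at the initial position, matching the two input styles described above.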
Meanwhile, the detailed process of inputting food information is as follows.
In some embodiments, in a state in which the user activates an input button displayed on the photographing screen, the food information input device 100 may selectively receive at least one piece of food information according to a moving direction in which the activated input button is moved to be deactivated on the input guide. For example, the type of food information may be set differently for each specific moving direction. The user may selectively input at least one piece of food information according to the type of food information differently set for each moving direction on the input guide.
In some embodiments, in a state in which the user activates an input button displayed on the photographing screen, the food information input device 100 may selectively receive at least one piece of food information according to a moving distance and a moving direction in which the activated input button is moved to be deactivated on the input guide. For example, the type of food information may be set differently for each specific moving direction. Further, the information value of the food information may be changed for each moving distance. The user may consider the type of food information differently set according to the moving direction on the input guide and adjust the moving distance to allow the information value of the food information to be selectively input.
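The direction- and distance-dependent input described in the two embodiments above could look like the following; the per-direction field assignment and the distance-to-value scale are hypothetical, since the disclosure leaves both to the implementation:

```python
# Hypothetical assignment of food-information types to drag directions.
DIRECTION_TO_FIELD = {
    "right": "food_amount",
    "up": "intake_satisfaction",
    "left": "intake_time",
    "down": "food_type",
}

def interpret_drag(direction: str, distance_px: float,
                   px_per_unit: float = 10.0):
    """The moving direction selects which piece of food information is
    input; the moving distance selects its information value."""
    field = DIRECTION_TO_FIELD[direction]
    return field, distance_px / px_per_unit
```

For example, dragging 250 px to the right would enter a food amount of 25 units, while the same distance upward would enter an intake-satisfaction value instead.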
In some embodiments, in a state in which the user activates the input button displayed on the photographing screen, the food information input device 100 may selectively receive at least one piece of food information according to a position at which the activated input button is moved to be deactivated on the input guide and at least one piece of combined information of the multimodal interface input information. The multimodal interface input information may include at least one of: the user's voice, facial expressions, gestures, behaviors, gaze, and screen touch movements.
Thereafter, the food information input apparatus 100 may display at least one of the following items related to the received at least one piece of food information: user allergy effect, skin nutrition, growth nutrition, disease effect, aging effect, diet effect, expected intake calories compared to target calories, salinity, conversion value of food, food cost information, food environment information, carbon value information, resource value information, carbon emission, running cost loss cost, food raw material information, pre-meal input button, post-meal input button, favorites input button, and wish list storage button.
Fig. 3 to 6 are diagrams illustrating a food information input operation according to an embodiment of the present disclosure.
The food information input apparatus 100 may receive food information through steps 310 to 330.
As shown in fig. 3, in step 310, the food information input apparatus 100 operates in a photographing mode in which an input guide 302 and an input button 301 that moves along the input guide 302 are displayed on a photographing screen. Here, an input range related to food information may be displayed on the input guide 302. For example, the input guide 302 may display an input range from 0 to 50 persons, the serving unit associated with pasta. An initial input value corresponding to an initial position on the input guide 302 (for example, 25 persons, corresponding to 50%) may be displayed on the input button 301, or an input value preset by the user may be displayed on the input button.
In step 320, when the user drags and moves the input button 301 in a state in which the input button 301 is activated, the food information input apparatus 100 may check the position of the input button moved on the input guide 302. Further, the food information input apparatus 100 may display a food information value (for example, 33 persons) corresponding to the position of the input button 301 moved along the input guide 302 in a size equal to or greater than a preset size. Here, the food information value corresponding to the position of the moved input button 301 may be displayed in any area of the display or the input button 301, not limited to a specific display area.
In step 330, in a state in which the user activates the input button 301 through step 320, the food information input apparatus 100 may receive food information according to the position at which the activated input button 301 is deactivated on the input guide 302. For example, in a state where the user activates the input button 301 through step 320, the activated input button 301 may be dragged and moved to the position of 33 persons in the range from 0 to 50 persons, and then the input button 301 may be deactivated at the position of 33 persons. Then, the food information input apparatus 100 may check that the position at which the input button 301 is deactivated by the user on the input guide 302 corresponds to 33 persons, and may receive the food information as 33 persons according to the checked position.
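The slider logic of steps 310 to 330 amounts to a linear mapping from the deactivation position on the guide to a value in the displayed input range. The following sketch assumes a hypothetical pixel geometry (guide start and length); only the 0-to-50-persons range comes from the example above.

```python
# Minimal sketch of steps 310-330: the deactivation position of the input
# button on the guide is mapped linearly onto the displayed input range
# (0 to 50 persons in the pasta example). Pixel geometry is an assumption.

def position_to_value(pos_px: float, guide_start_px: float, guide_len_px: float,
                      range_min: float, range_max: float) -> float:
    """Map the button's pixel position on the guide to a food information value."""
    ratio = (pos_px - guide_start_px) / guide_len_px
    ratio = min(max(ratio, 0.0), 1.0)  # clamp to the ends of the guide
    return range_min + ratio * (range_max - range_min)

# A 500 px guide starting at x=100: deactivating at x=350 (the midpoint)
# gives the initial value of 25 persons (50%).
print(position_to_value(350, 100, 500, 0, 50))  # 25.0
```

Deactivating further along the guide (e.g. at x=430) yields about 33 persons, matching the worked example in the text.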
Meanwhile, as shown in fig. 4, in step 310, the food information input device operates in a photographing mode in which an input guide 302 and an input button 301 moving along the input guide 302 are displayed on a photographing screen. Here, an input range related to food information may be displayed on the input guide 302. For example, the input guide 302 may display an input range from 0 to 50 persons, the serving unit associated with pasta. An initial input value (e.g., 25 persons, corresponding to 50%) corresponding to the initial position on the input guide 302 may be displayed on the input button 301, or an input value preset by the user may be displayed on the input button.
The food information input apparatus 100 may display the input limit range of the recognized food. The input limit range may include a minimum value 304 and a maximum value 305. Further, the food information input apparatus 100 may adjust the minimum value 304 and the maximum value 305 of the displayed input limit range so that the input limit range is increased or decreased. The input unit of the food information may become coarser or finer according to the adjusted minimum value 304 and maximum value 305 of the input limit range.
For example, the food input device 100 may display the minimum value 304 of the input limit range as 0 persons and the maximum value 305 as 50 persons. If a preset adjustment operation of the input limit range (e.g., an operation of pressing the photographing screen) is sensed, the food input device 100 may adjust the minimum value 304 of the input limit range to 15 persons and the maximum value 305 to 35 persons, reducing the input limit range from 50 persons to 20 persons. Since the input limit range is reduced from 50 persons to 20 persons, the input unit of food information becomes finer.
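Why narrowing the limit range makes the input unit finer can be seen from the step size of the slider: for a guide of fixed length, each step covers a smaller amount when [min, max] shrinks. The step count of 100 in this sketch is an assumption; only the 0-50 and 15-35 ranges come from the example.

```python
# Hedged sketch of the input-limit-range adjustment in fig. 4: narrowing
# [min, max] from 0-50 persons to 15-35 persons makes each slider step
# finer for the same guide length. The step count (100) is an assumption.

def step_size(range_min: float, range_max: float, steps: int = 100) -> float:
    """Input unit covered by one slider step for the current limit range."""
    return (range_max - range_min) / steps

before = step_size(0, 50)   # 0.5 person per step
after = step_size(15, 35)   # 0.2 person per step: a finer input unit
print(before, after)
```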
Meanwhile, as shown in fig. 5, the food information input apparatus 100 may correct the input food information through steps 340 to 360.
In step 340, the food information input apparatus 100 may display the food information previously input by the user and correct the previously input food information through the user's tapping operation. For example, the user may import the record corresponding to 25 persons from among the previously input food information values of 20, 15, 25, and 8 persons.
Further, in step 350, if the food information input apparatus 100 displays the food information previously input by the user (e.g., 25 persons), the user may correct the previously input food information to another food information value (e.g., 26 persons).
Thereafter, in step 360, the food information input apparatus 100 may update the previously recorded food information (e.g., 25 persons) to the user-corrected food information (e.g., 26 persons).
Meanwhile, as shown in fig. 6, the food information input apparatus 100 may complete the recording of the food information related to the leftovers input by the user through steps 370 to 380.
In step 370, the food information input apparatus 100 may record the food information input through steps 310 to 330. Alternatively, the food information input apparatus 100 may correct the previously recorded food information, and the corrected food information may be updated and recorded through steps 340 to 360.
In step 380, the food information input apparatus 100 may sort the food information by the date on which the record of the leftovers was stored and display the food information to the user. For example, the food information input apparatus 100 may display food information (e.g., 20, 14, 25, 8, 17 persons, etc.) input on July 26, 2021. The user may select a different day and check the food information entered for that day.
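The per-day review in step 380 can be sketched as grouping stored records by their date key. All record values and timestamps below are illustrative, not from the specification.

```python
# Hypothetical sketch of step 380: stored records are keyed by date so the
# user can pick a day and review the food information entered on that day.
from collections import defaultdict
import datetime

def group_by_date(records):
    """records: iterable of (datetime, value) -> {date: [values]}"""
    by_date = defaultdict(list)
    for when, value in records:
        by_date[when.date()].append(value)
    return dict(by_date)

records = [
    (datetime.datetime(2021, 7, 26, 12, 0), 20),
    (datetime.datetime(2021, 7, 26, 18, 30), 14),
    (datetime.datetime(2021, 7, 27, 12, 5), 25),
]
print(group_by_date(records)[datetime.date(2021, 7, 26)])  # [20, 14]
```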
In this way, by the user's activation and deactivation operations in the photographing mode, photographing information can be input, and at the same time, food information can be input. For example, the activation and deactivation operations may be implemented as touch and touch release operations, drag and drag release operations, press and press release operations, preset double-touch operations (or double-click operations) and double-touch release operations (or double-click release operations), operations of simultaneously pressing or releasing two or more input buttons, or simultaneous touch operations (or simultaneous click operations) and simultaneous touch release operations of two or more areas performed by a user on a photographing screen. The activation and deactivation operation may be any activation and deactivation operation for photographing, and is not limited to a specific activation and deactivation operation. Alternatively, the food information may be photographed and inputted through at least one combination operation of the activation and deactivation operation of the user in the photographing mode and the user multi-modal interface input operation.
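One way to read the paragraph above is as a small gesture state machine: activation starts the interaction, movement updates the food information value, and a single deactivation both fixes that value and triggers the shot. The class below is an illustrative assumption, not the patent's implementation.

```python
# Illustrative sketch (not the patent's implementation) of coupling the
# activation/deactivation gesture to simultaneous photographing and food
# information input: activation starts the gesture, deactivation both
# fixes the food information value and triggers the photograph.

class CaptureGesture:
    def __init__(self):
        self.active = False
        self.value = None
        self.photo_taken = False

    def activate(self, value_at_touch: float):
        self.active = True
        self.value = value_at_touch

    def move(self, value_at_position: float):
        if self.active:
            self.value = value_at_position  # value follows the drag

    def deactivate(self):
        if self.active:
            self.active = False
            self.photo_taken = True  # one release = photo + food info
        return self.value

g = CaptureGesture()
g.activate(25.0)   # touch the input button (initial value)
g.move(33.0)       # drag along the guide
print(g.deactivate(), g.photo_taken)  # 33.0 True
```

Any of the listed gesture pairs (touch/release, drag/drag-release, press/press-release, etc.) could drive the same `activate`/`deactivate` transitions.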
Fig. 7 to 9 are diagrams illustrating an operation of inputting food information related to leftover discharge according to an embodiment of the present disclosure.
The food information input apparatus 100 may receive information about leftovers or food waste to be discharged through steps 410 to 430.
As shown in fig. 7, in step 410, the food information input apparatus 100 operates in a photographing mode in which an input guide 402 and an input button 401 moving along the input guide 402 are displayed on a photographing screen. Here, the input guide 302 shown in fig. 3 is arranged in the horizontal direction, whereas the input guide 402 shown in fig. 7 is arranged in the vertical direction. The direction of the input guide may be adjusted so that the user can intuitively recognize the amount of food, but it is not limited to a specific direction. An input range related to the food waste information may be displayed on the input guide 402. For example, an input range from 0L to 120L (the liter being the unit related to food waste) may be displayed on the input guide 402. An initial input value (e.g., 60L, corresponding to 50%) corresponding to the initial position on the input guide 402 may be displayed on the input button 401, or an input value preset by the user may be displayed on the input button.
In step 420, when the user drags and moves the input button 401 in a state in which the input button 401 is activated, the food information input apparatus 100 may check the position of the input button on the input guide 402. Further, the food information input apparatus 100 may display the food waste information value (for example, 86.4L corresponds to 72%) corresponding to the position of the input button 401 moved along the input guide 402 in a size equal to or greater than a preset size. Here, the food waste information value corresponding to the position of the moved input button 401 may be displayed in any area of the display or the input button 401, not limited to a specific display area.
In step 430, in a state in which the user activates the input button 401 through step 420, the food information input apparatus 100 may receive food information according to the position at which the activated input button 401 is deactivated on the input guide 402. For example, in a state where the user activates the input button 401 through step 420, the activated input button 401 may be dragged and moved to the position of 86.4L in the range of 0L to 120L, and then the input button 401 may be deactivated at the position of 86.4L. Then, the food information input apparatus 100 may check that the position at which the input button 401 is deactivated by the user on the input guide 402 corresponds to 86.4L, and may receive the food waste information as 86.4L according to the checked position.
Meanwhile, as shown in fig. 8, the food information input apparatus 100 may correct the input food information through steps 440 to 460.
In step 440, the food information input apparatus 100 may display the food waste information previously input by the user and correct the previously input food waste information through the user's tapping operation. For example, the user may import the record corresponding to 86.4L from among the previously input values of 50L, 65L, 75L, and 86.4L.
Further, in step 450, if the food information input apparatus 100 displays the food waste information previously input by the user (e.g., 86.4L), the user may correct the previously input food waste information to another value (e.g., 86.5L).
Thereafter, in step 460, the food information input device 100 may update the previously recorded food waste information (e.g., 86.4L) to the food information (e.g., 86.5L) corrected by the user.
Meanwhile, as shown in fig. 9, the food information input apparatus 100 may complete the discharge recording of the food waste information input by the user through steps 470 to 480.
In step 470, the food information input apparatus 100 may record the food waste information input through steps 410 to 430. Alternatively, the food-information input apparatus 100 may correct the previously recorded food-trash information, and may update and record the corrected food-trash information through steps 440 to 460.
In step 480, the food information input apparatus 100 may store the discharge record of the food waste according to the date selected by the user. Further, the food information input apparatus 100 may classify the food waste information according to the date on which the discharge record was stored and display the food waste information to the user. For example, the food information input apparatus 100 may display food waste information (e.g., 50L, 65L, 86.4L, 70L, etc.) input on July 26, 2021. The user may select a different day and check the food waste information input on that day.
Fig. 10 and 11 are diagrams illustrating an operation of inputting food information related to food intake information according to an embodiment of the present disclosure.
The food information input apparatus 100 may receive food information related to intake information when a person plans to ingest food or has ingested food, through steps 510 and 520.
As shown in fig. 10, in step 510, the food information input apparatus 100 operates in a photographing mode in which an input guide 502 for inputting intake information and an input button 501 moving along the input guide 502 are displayed on a photographing screen. Here, an input range related to intake information may be displayed on the input guide 502. For example, at least one of "MANUBI", "RANUBI", "DANUBI", and "NANUBI" may be displayed on the input button 501 as information about an eater who plans to ingest or has ingested the food displayed on the photographing screen. Alternatively, a menu item such as "midnoon" may be displayed on the input button 501 as information on a menu of foods that are displayed on the photographing screen and are scheduled to be ingested or have been ingested.
As shown in fig. 11, in step 520, in a state in which the user activates the input button 501 at an initial position (e.g., the user's "DANUBI") through step 510, the food information input apparatus 100 may receive food information according to the position (e.g., the user's "RANUBI") at which the activated input button 501 is deactivated on the input guide 502. For example, in a state where the user activates the input button 501 at the initial position (i.e., the user's "DANUBI") through step 510, the activated input button 501 may be dragged and moved to the "RANUBI" position and then deactivated at that position. Then, the food information input apparatus 100 may check that the position at which the input button 501 is deactivated by the user on the input guide 502 corresponds to the food consumer "RANUBI", and may receive the information about the food consumer as "RANUBI" according to the checked position. Meanwhile, when the food information input apparatus 100 displays a pre-meal or post-meal food menu, the pre-meal and post-meal menus may have different colors so that the user can distinguish them.
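Unlike the numeric guides of figs. 3 and 7, the guide in figs. 10-11 carries discrete labels, so the deactivation position snaps to the nearest label. The even spacing of the labels along the guide is an assumption for illustration; the eater names come from the example above.

```python
# Sketch, under assumed geometry, of the discrete guide in figs. 10-11:
# the continuous deactivation position snaps to the nearest eater label
# ("MANUBI", "RANUBI", "DANUBI", "NANUBI") laid out evenly on the guide.

EATERS = ["MANUBI", "RANUBI", "DANUBI", "NANUBI"]

def snap_to_eater(ratio: float, labels=EATERS) -> str:
    """ratio in [0, 1] along the guide -> nearest discrete label."""
    ratio = min(max(ratio, 0.0), 1.0)
    index = round(ratio * (len(labels) - 1))
    return labels[index]

print(snap_to_eater(0.3))  # 'RANUBI' (nearest of the four label positions)
```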
Hereinafter, a process of inputting food information according to another embodiment of the present disclosure will be described in detail with reference to fig. 12 to 17. Fig. 12 is a flowchart illustrating a food information input method according to another embodiment of the present disclosure, and fig. 13 to 17 are diagrams illustrating a food information input process of fig. 12.
Referring to fig. 12 to 14, a food information input method according to another embodiment of the present disclosure may be performed by at least one processor of the food information input apparatus 100 or a computer as the food information input apparatus 100.
First, the food information input apparatus 100 may operate in a photographing mode (S210).
The food information input apparatus 100 may operate in a photographing mode according to an operation of a food information input application (not shown), and may activate the camera 120.
Subsequently, the food information input apparatus 100 may determine whether a preset condition is satisfied (S220). For example, the preset conditions may include a condition under which photographing by the camera 120 is performed accurately on the display 130, or a condition under which food is recognized more accurately.
Subsequently, when the preset condition is satisfied, the food information input device 100 may display a UI on the display 130 (S230).
When the food information input apparatus 100 does not perform automatic photographing, it photographs food in response to a photographing input by the user. The food information input method according to another embodiment of the present disclosure allows the user to provide the photographing input to the food information input apparatus 100 and, at the same time, to input food information including the amount of food into the food information input apparatus 100.
For this, the UI displayed on the display 130 may be a UI through which food information including a food amount can be input; referring to fig. 13, it may be, for example, a bar-shaped UI 610, but is not limited thereto.
For example, for photographing by the food information input apparatus 100, the user should touch the display 130; at this time, food information including the amount of food the user intends to ingest may be input through the touched position according to the UI displayed on the display 130. That is, when the user touches the display 130 once, food information including the amount of food scheduled to be ingested can be input while photographing is performed; the user therefore does not need to input the food information separately, eliminating this inconvenience.
Referring to fig. 13, the bar UI 610 displays the food intake ratio on the display 130. However, the UI is not limited thereto, and various UIs may be provided according to the type of food recognized by the camera 120.
For example, a unit of food may be displayed on the UI according to the type of food recognized by the camera 120 (e.g., head or block; when food divided into blocks is recognized by the camera 120, the block may be displayed on the UI as the unit of food).
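A small lookup table suffices to sketch this behavior. The mapping below is entirely a hypothetical assumption for illustration; only "head", "block", the person unit of fig. 3, and the liter unit of fig. 7 appear in the text.

```python
# Hedged sketch: the unit shown on the UI changes with the recognized food
# type (e.g. "block" for block-divided food). This mapping table is an
# assumption for illustration only, not part of the specification.

UNIT_BY_FOOD_TYPE = {
    "tofu": "block",        # food divided into blocks
    "watermelon": "head",
    "pasta": "person",      # serving count, as in fig. 3
    "food_waste": "L",      # liters, as in fig. 7
}

def unit_for(food_type: str) -> str:
    return UNIT_BY_FOOD_TYPE.get(food_type, "serving")  # fallback unit

print(unit_for("tofu"))  # block
```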
In some embodiments, when the user touches the display 130 once, the photographing input and the input of food information including the amount of food scheduled to be ingested may not be simultaneously performed, but a UI may be provided to input the food information including the amount of food scheduled to be ingested separately from the photographing input.
In some embodiments, referring to fig. 14, a consumer candidate area 710 may be provided as a UI displayed on the display 130 for setting a food consumer.
For example, a food consumer may be selected through the consumer candidate area 710; the owner of the food information input device 100 and preset ingestion candidates may be displayed on the display 130, and the most recent food consumer or a frequently set consumer may be displayed as an ingestion candidate on the display 130. Further, the user can set a plurality of food consumers using the UI.
Meanwhile, the UI displayed on the display 130 has an area for direct input and setting, so that the user can select direct input when no suitable item exists among the ingestion candidates.
The food information input apparatus 100 is not limited to the smart phone type shown in fig. 15; any type of mobile apparatus including the camera 120 may be used.
In some embodiments, referring to fig. 16, the food information input apparatus 100 may provide the user with food information about food selected by the user among various types of food identified by the camera 120. For example, the user may perform a selection input on the food information input apparatus 100 by touching some of the food images displayed on the display 130, but a method of performing a selection input on the food information input apparatus 100 by the user is not limited thereto.
Further, when the food information input apparatus 100 is used not for personal use but by an employee for many unspecified persons in a restaurant or the like, the intake candidates may be displayed as intake candidate groups. An intake candidate group based on at least one feature classification may be displayed on the display 130; for example, as a combination of age and gender, intake candidate groups such as teenage women or teenage men may be displayed, but the groups are not limited thereto.
In some embodiments, if food is recognized by the camera 120, a UI may be displayed on the display 130, and at the same time, the food information input apparatus 100 may provide food information about the recognized food to the user, or the food information input apparatus 100 may provide food information about the recognized food to the user, regardless of the UI displayed on the display 130.
More specifically, if food is recognized by the camera 120 even before the user inputs photographing, the food information input apparatus 100 may provide food information about the recognized food to the user. In other words, when food is recognized by the camera 120, the food information input apparatus 100 may provide the user with food information about the food analyzed in real time. The food information may include, for example, food calorie information, food type information, food nutrition information, etc., and the type of the food information is not limited as long as it is information that can be derived based on an image of the identified food. The food information may be analyzed by the food information input apparatus 100 or the external server 200 based on the image of the recognized food.
The food information input apparatus 100 may provide the derived food information to the user through a visual, auditory, or tactile method based on the image of the identified food. For example, referring to fig. 12, an image of the food identified by the camera 120 and corresponding food calorie information may be displayed on the display 130. In addition, the food information input apparatus 100 may provide, through a speaker (not shown), food nutrition information such as "the current food is a high-protein diet and contains a large amount of vitamins, thus facilitating fatigue recovery."
However, the food information input device 100 of the present disclosure is not limited to the smart phone type shown in fig. 12; any mobile device type including the camera 120 may be used. For example, referring to fig. 19, the food information input apparatus 100 according to an embodiment of the present disclosure may not include the display 130.
Thereafter, the food information input apparatus 100 may receive food information input by the user according to the UI (S240).
In some embodiments, the user may photograph the corresponding food consumer using the food information input device 100. For example, after the food information is input, the user may photograph the corresponding food consumer using the food information input apparatus 100; the food information input apparatus 100 may identify the person photographed at the time closest to the time at which the food was photographed as the corresponding food consumer, or may identify a person included in an image having a background similar to that of the photographed food as the corresponding food consumer.
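The first matching heuristic in this paragraph (nearest photographing time) can be sketched as a minimum over timestamp differences. The photo identifiers and timestamps below are hypothetical illustrations.

```python
# Illustrative sketch of the consumer-matching heuristic described above:
# among person photos, pick the one taken closest in time to the food
# photo. Timestamps are assumed to be datetime objects.
import datetime

def nearest_consumer_photo(food_time, person_photos):
    """person_photos: list of (datetime, photo_id) -> photo_id nearest in time."""
    return min(person_photos, key=lambda p: abs(p[0] - food_time))[1]

food_time = datetime.datetime(2021, 7, 26, 12, 0)
photos = [
    (datetime.datetime(2021, 7, 26, 11, 58), "consumer_A"),  # 2 min before
    (datetime.datetime(2021, 7, 26, 12, 30), "consumer_B"),  # 30 min after
]
print(nearest_consumer_photo(food_time, photos))  # consumer_A
```

The background-similarity alternative mentioned in the text would replace the time difference with an image-similarity score, which is beyond this sketch.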
In some embodiments, the food information input apparatus 100 may change a unit of food displayed on any position or input button according to the type of the recognized food, and may receive information about the amount of the recognized food based on the changed unit of food.
In some embodiments, the food information input apparatus 100 may analyze the food image of the identified food to determine whether it was taken before or after a meal, and may record the food image as the user's pre-meal food record when it was taken before a meal, or as the user's post-meal food record when it was taken after a meal.
In some embodiments, the food information input apparatus 100 may display the identifiable food candidate type based on the identified food type.
In some embodiments, the food information input apparatus 100 may search for a restaurant providing the identified food based on the location of the food information input apparatus, and may correct and display the received food information based on menu information in the searched restaurant.
In some embodiments, referring to fig. 17, the food information input apparatus 100 may search for a restaurant providing the identified food based on the position of the food information input apparatus, and may order the food corresponding to the received food information based on menu information of the searched restaurant. At this time, when the user touches the order button 810 displayed on the photographing screen for the searched restaurant menu and the recognized food, the apparatus may place an order for the food with the corresponding restaurant through a payment process or the like.
In some embodiments, in the case of inputting food information, the food information input apparatus 100 may provide a user with a preset stimulus for the input food information.
In some embodiments, the food information input apparatus 100 may calculate and display intake target information in consideration of user characteristics according to the identified food, may receive intake plan information about the identified food input by the user, and may display a ratio of the input intake plan information to the displayed intake target information.
In some embodiments, after the user ingests the identified food, the food information input apparatus 100 may display success or failure of the ingestion target based on a comparison of the input ingestion plan information with the displayed ingestion target information.
As shown in fig. 18, the food information input apparatus 100 may diagnose the eating habits of the user and provide the user's nutritional intake status information for each period (e.g., daily, monthly, yearly, per section, etc.). For example, the nutritional intake status information may include total calories, carbohydrates, proteins, fat, sodium, calcium, iron, vitamin A, thiamine, riboflavin, vitamin C, and the like. The user's nutritional intake status information may include first nutritional intake status information compared with a target segment and second nutritional intake status information compared with a recommended meal allowance. According to the identified food and in consideration of the user's nutritional intake status information, the food information input apparatus 100 may calculate and display the first nutritional intake status information, in which the target segment is displayed, and the second nutritional intake status information, in which the recommended meal allowance is displayed.
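The two comparisons described above reduce to per-nutrient percentage ratios: intake versus the target amount, and intake versus the recommended allowance. All numbers in the sketch are illustrative assumptions, not values from the specification.

```python
# Minimal sketch of the two ratio displays described in the text: intake
# vs. a target amount and intake vs. a recommended meal allowance, per
# nutrient. The example figures are illustrative only.

def intake_ratios(intake: dict, target: dict, recommended: dict):
    """Return {nutrient: (vs_target_%, vs_recommended_%)}."""
    out = {}
    for nutrient, amount in intake.items():
        out[nutrient] = (
            100.0 * amount / target[nutrient],
            100.0 * amount / recommended[nutrient],
        )
    return out

ratios = intake_ratios(
    intake={"protein_g": 45.0},
    target={"protein_g": 60.0},
    recommended={"protein_g": 50.0},
)
print(ratios["protein_g"])  # (75.0, 90.0)
```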
Although figs. 13 to 18 are described sequentially, this is merely to illustrate the technical idea of another embodiment of the present disclosure. Since it is apparent to those skilled in the art to which the present disclosure pertains that various changes may be made without departing from the essential characteristics of the embodiment, figs. 13 to 18 are not limited to a time sequence; for example, the order described in figs. 13 to 18 may be changed, or one or more steps may be performed in parallel.
Hereinafter, a food information input process according to another embodiment of the present disclosure will be described in detail with reference to fig. 19 to 21. Fig. 19 is a flowchart illustrating a food information input method according to another embodiment of the present disclosure, fig. 20 is a diagram illustrating a food information input apparatus according to another embodiment of the present disclosure, and fig. 21 is a diagram illustrating a photographing operation according to another embodiment of the present disclosure.
Referring to fig. 19 to 21, a food information input method according to another embodiment of the present disclosure may be performed by at least one processor of the food information input apparatus 100 or a computer as the food information input apparatus 100.
First, the food information input apparatus 100 may operate in a photographing mode (S310). The food information input apparatus 100 may operate in a photographing mode according to an operation of a food information input application (not shown), and may activate the camera 120.
Subsequently, the food information input apparatus 100 may provide a photographing guide to the user (S320).
In order to accurately analyze food information including an image, the photographing operation should be performed by the camera 120 from a position vertically above the photographing target (i.e., the food), with the camera positioned parallel to the food. When food is not photographed under these conditions, high-accuracy analysis of the acquired food information is difficult.
To this end, the food information input apparatus 100 may analyze whether the food is parallel to the food information input apparatus 100, whether the food information input apparatus 100 or the camera 120 is horizontal, and whether the camera 120 is positioned vertically above the food, and may thereby provide a photographing guide so that the user can reposition the food information input apparatus 100 or the camera 120.
The photographing guide may be provided to the user by a visual, auditory, or tactile method. It may be visually displayed on the display 130, output through a speaker (not shown), or provided through a vibration module (not shown), but the present disclosure is not limited thereto.
Subsequently, the food information input apparatus 100 may determine whether a preset condition is satisfied (S330).
In some embodiments, when the food information input apparatus 100 operates in the photographing mode to activate the camera 120, the food information input apparatus 100 may determine whether a preset condition is satisfied. The preset condition may be at least one of the following conditions: conditions under which a photographing operation is precisely performed in a photographing mode, or conditions under which the food information input device 100 provides food information to a user, or conditions under which food is recognized by the camera 120.
More specifically, if food is recognized by the camera 120 even before the user inputs photographing, the food information input apparatus 100 may provide food information about the recognized food to the user. In other words, when food is recognized by the camera 120, the food information input apparatus 100 may provide the user with food information about the food analyzed in real time. The food information may include, for example, food calorie information, food type information, food nutrition information, etc., and the type of the food information is not limited as long as it is information that can be derived based on an image of the identified food. The food information may be analyzed by the food information input apparatus 100 or the external server 200 based on the image of the recognized food.
The food information input apparatus 100 may provide the photographing guide or the derived food information to the user through a visual, auditory, or tactile method based on the image of the identified food. For example, an image of the food identified by the camera 120 and corresponding food calorie information may be displayed on the display 130. In addition, the food information input apparatus 100 may provide, through a speaker (not shown), food nutrition information such as "the current food is a high-protein diet and contains a large amount of vitamins, thus facilitating fatigue recovery."
However, the food information input device 100 according to another embodiment of the present disclosure is not limited to the smart phone type shown in fig. 15; any mobile device type including the camera 120 may be used. For example, referring to fig. 20, the food information input apparatus 100 according to another embodiment of the present disclosure may not include the display 130.
When the preset condition is satisfied, the food information input device 100 may perform automatic photographing (S340). In other words, when the preset condition is satisfied, the food information input apparatus 100 may perform automatic photographing even if there is no photographing input by the user.
The preset condition may be an optimal condition under which food can be photographed using the camera 120. For example, the preset condition may be a condition that the food information input apparatus 100 or the camera 120 is horizontal, a condition that the movement of the food information input apparatus 100 or the camera 120 is equal to or less than a predetermined level, or a condition that food is recognized from an image captured by the camera 120. Alternatively, the preset condition may be at least one of the following: a condition that the recognized food and the food information input apparatus 100 are parallel to each other, a condition that the food information input apparatus 100 is horizontal, a condition that the food information input apparatus 100 is located vertically above the recognized food, and a condition that the food information input apparatus 100 is moved.
In other words, referring to fig. 21, when food is recognized in a state in which the camera 120 is placed horizontally and is not shaking, the food information input apparatus 100 may perform automatic photographing.
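The preset-condition check described above can be sketched as a simple predicate (an illustrative sketch only; the threshold values are assumptions, not values from the disclosure):

```python
# Hedged sketch of the automatic-photographing condition: the device
# should be roughly level, nearly still, and food should be recognized
# in the preview frame. Thresholds are illustrative assumptions.
TILT_LIMIT_DEG = 5.0   # maximum deviation from horizontal, in degrees
MOTION_LIMIT = 0.2     # maximum motion magnitude, arbitrary sensor units

def should_auto_capture(tilt_deg: float, motion: float, food_detected: bool) -> bool:
    """True when all automatic-photographing conditions are satisfied."""
    is_level = abs(tilt_deg) <= TILT_LIMIT_DEG
    is_still = motion <= MOTION_LIMIT
    return is_level and is_still and food_detected
```

In practice such a check would poll the device's orientation and motion sensors and the recognition result for each preview frame.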
In the food information input process according to another embodiment of the present disclosure, the food information input apparatus 100 may provide photographing guidance to the user so that the user can find an optimal photographing condition, and when the optimal photographing condition is satisfied, the food information input apparatus 100 performs automatic photographing, thereby increasing the user's convenience.
In some embodiments, the food information input apparatus 100 may determine whether the preset condition is satisfied (S330), and when the preset condition is satisfied, may provide a notification so that the user performs the photographing operation instead of automatic photographing being performed. For example, the food information input apparatus 100 may provide the notification to the user using at least one of visual, audible, and tactile methods: a notification to perform photographing may be displayed on the display 130, a light-emitting device (not shown) included in the food information input apparatus 100 may emit light, or the notification may be provided through a speaker (not shown) or a vibration module (not shown), but the present disclosure is not limited thereto.
In some embodiments, the user may photograph the consumer of the corresponding food using the food information input apparatus 100. For example, after the food information input apparatus 100 performs automatic photographing, the user may photograph the consumer of the corresponding food, and the food information input apparatus 100 may identify the person photographed at the time closest to the time at which the food was photographed as the consumer of the corresponding food, or may identify a person included in an image whose background is similar to the background of the photographed food as the consumer of the corresponding food.
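The time-proximity matching described above can be sketched as follows (an illustrative sketch only; the record fields are assumptions, and the background-similarity variant is omitted):

```python
from typing import Optional

# Sketch of matching a consumer photo to a food photo by choosing the
# consumer image whose capture time is closest to the food photo's
# capture time. Timestamps are plain epoch seconds; the "id"/"time"
# field names are illustrative assumptions.
def match_consumer(food_time: float, consumer_photos: list) -> Optional[dict]:
    """consumer_photos: list of {"id": ..., "time": epoch_seconds} records."""
    if not consumer_photos:
        return None
    return min(consumer_photos, key=lambda p: abs(p["time"] - food_time))
```

The background-similarity alternative in the embodiment above would instead compare image features, which is beyond this sketch.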
In some embodiments, after performing photographing, the food information input apparatus 100 may request the user to input the amount of food ingested or a comment on the food ingested, and may adjust the reward provided to the user in consideration of how promptly the user responds to the corresponding request.
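One possible form of the time-dependent reward adjustment above is a stepwise decay (an illustrative sketch only; the base reward, grace period, and decay rate are assumptions, not values from the disclosure):

```python
# Sketch of adjusting the reward for entering the ingested amount or a
# comment, scaled down the longer the user waits after the request.
# All constants are illustrative assumptions.
def adjusted_reward(delay_min: float, base: int = 100) -> int:
    """Full reward within 5 minutes, then 10% less per extra 10 minutes, floored at 0."""
    if delay_min <= 5:
        return base
    decay_steps = (delay_min - 5) // 10 + 1
    return max(0, int(base * (1 - 0.1 * decay_steps)))
```

For example, responding within 5 minutes keeps the full reward, while long delays reduce it toward zero.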
In some embodiments, when automatic photographing is performed or photographing is performed through photographing input of a user, the food information input apparatus 100 may provide food information about food identified through photographing to the user. In other words, when food is photographed by the camera 120, the food information input apparatus 100 may provide the user with food information about the food analyzed in real time. The food information may include, for example, food calorie information, food type information, food nutrition information, and the like. The type of the food information is not limited as long as it is information that can be derived based on the image of the recognized food. The food information may be analyzed by the food information input apparatus 100 or the external server 200 based on the image of the recognized food.
Meanwhile, a process of inputting food information according to another embodiment of the present disclosure will be described in detail.
In the food information input process according to another embodiment of the present disclosure, the food information input apparatus 100 may provide various functions to a user after the food information input apparatus 100 performs photographing.
The methods by which the food information input apparatus 100 provides these various functions to the user may vary, as follows.
First, a selectable region may be provided as a UI on the display 130, and the user's input may be recognized differently according to the position at which the user performs a touch input (or click input) within the corresponding selectable region.
Further, when the user performs a drag input on the display 130 with respect to the photographed image, the food information input apparatus 100 may perform a predetermined function.
Further, when a double touch input (or double click input) is performed, two or more buttons are pressed simultaneously, or a touch input (or click input) is performed on two or more areas, the food information input apparatus 100 may perform a predetermined function.
Finally, when the user directly sets a mode for performing a corresponding function, the food information input apparatus 100 may perform a given function.
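The four input methods above amount to dispatching a gesture event to a bound function. A minimal sketch (the gesture names and bound actions are illustrative assumptions, not the disclosure's exact mapping):

```python
# Sketch of dispatching post-photographing functions by input type.
# Keys are (gesture, detail) pairs; values are action names.
# All entries are illustrative assumptions.
ACTIONS = {
    ("tap", "share_region"): "share_image",
    ("drag", "up"): "save_to_folder",
    ("drag", "left"): "record_pre_meal_comment",
    ("double_tap", None): "show_food_info",
}

def dispatch(gesture: str, detail=None) -> str:
    """Return the action name bound to a gesture, or 'no_op' if unbound."""
    return ACTIONS.get((gesture, detail), "no_op")
```

A mode setting, as in the last method above, could simply swap in a different `ACTIONS` table.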
Meanwhile, functions provided to the user by the food information input apparatus 100 after the food information input apparatus 100 performs photographing are as follows.
First, when the food information input apparatus 100 performs photographing, it may allow the user to set a storage space for the photographed image.
For example, when the food information input apparatus 100 performs photographing, the photographed image may be analyzed, a suitable folder among the existing folders may be recommended for storage, and the user may store the photographed image in the recommended folder by dragging it.
In addition, the user can freely create folders (for example, for foods to eat again later, food at home, or most anticipated foods) and drag the photographed image into a folder to save it.
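The folder recommendation above can be sketched as a simple mapping from an analyzed image label to an existing folder (an illustrative sketch only; the folder names and labels are assumptions):

```python
# Sketch of recommending a storage folder from an analyzed food label.
# Rules and folder names are illustrative assumptions.
FOLDER_RULES = {
    "dessert": "Sweets",
    "noodles": "Noodles",
    "salad": "Healthy meals",
}

def recommend_folder(label: str, default: str = "Uncategorized") -> str:
    """Pick an existing folder matching the analyzed food label, else a default."""
    return FOLDER_RULES.get(label, default)
```

A real implementation would derive the label from image analysis and would also learn from the folders the user has created.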
Further, when the food information input apparatus 100 performs photographing, the user can easily share the photographed image.
For example, when a user drags a photographed image in a specific direction or performs an input for sharing, the image may be shared with other users through SNS or the like. Further, if the food information input apparatus 100 is set to the sharing mode through the mode setting, when food is included in the photographed image, the image can be automatically shared with other users through SNS or the like even without additional input.
Thus, when a user uploads a photographed image to an open community to share food information, others who are deciding what to eat, looking for an image of that food, or simply browsing for entertainment can make use of the image.
Next, pre-meal or post-meal comments may be input by the user to the food information input apparatus 100.
The food information input apparatus 100 may analyze the photographed image to determine whether it shows food before or after a meal. When an image of food is taken before a meal, the user's comment recorded after photographing may be stored as a pre-ingestion comment; when an image of food is taken after a meal, the comment may be stored as a post-ingestion comment.
For example, when an image of food is taken before a meal, the food information input apparatus 100 lets the user record a pre-ingestion comment, such as maximum expectation before ingestion or minimum expectation before ingestion, according to the direction of a drag input performed by the user on the display 130. However, the method of inputting the pre-ingestion comment is not limited thereto.
Further, when an image of food is taken after a meal, the food information input apparatus 100 lets the user record a post-ingestion comment, such as maximum satisfaction after ingestion or minimum satisfaction after ingestion, according to the direction of a drag input performed by the user on the display 130. However, the method of inputting the post-ingestion comment is not limited thereto.
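The comment-recording logic above can be sketched as follows (an illustrative sketch only; the direction-to-label mapping is an assumption):

```python
# Sketch of the comment-recording step: the pre-/post-meal determination
# decides whether a drag direction records an expectation (before
# ingestion) or a satisfaction level (after ingestion). The mapping of
# directions to labels is an illustrative assumption.
def comment_label(is_pre_meal: bool, drag_direction: str) -> str:
    if is_pre_meal:
        return {"up": "max expectation", "down": "min expectation"}.get(
            drag_direction, "pre-meal comment")
    return {"up": "max satisfaction", "down": "min satisfaction"}.get(
        drag_direction, "post-meal comment")
```

The `is_pre_meal` flag would come from the image analysis described above.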
The above-described food information input method according to the embodiments of the present disclosure may be implemented as a program (or application) executed in conjunction with a hardware device and stored in a medium.
The program may include code written in a computer language such as C, C++, Java, or machine language, which a processor (CPU) of a computer can read through a device interface of the computer so that the computer reads the program and performs the method implemented as the program. The code may include functional code defining the functions required to perform the method, and control code related to the execution procedure required for the processor of the computer to perform those functions in a predetermined order. In addition, the code may include additional information necessary for the processor of the computer to perform the functions, or memory-reference code indicating which location (address) in the computer's internal or external memory should be referenced. Furthermore, when the processor of the computer needs to communicate with another computer or server at a remote location to perform a function, the code may include communication-related code specifying how to communicate with the remote computer or server using the communication module of the computer, and which information or media should be transmitted and received during the communication.
A storage medium does not refer to a medium that stores data temporarily (such as a register, cache, or memory), but to a medium that stores data semi-permanently and can be read by a device. Specific examples of storage media include, but are not limited to, ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices. In other words, the program may be stored in various recording media on various servers that the computer can access, or in various recording media on the user's computer. Furthermore, the medium may be distributed over computer systems connected through a network, and the computer-readable code may be stored in a distributed manner.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by hardware, or in a combination of the two. A software module may reside in Random Access Memory (RAM), Read-Only Memory (ROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), flash memory, a hard disk, a removable magnetic disk, a CD-ROM, or any type of computer-readable recording medium known in the art to which this disclosure pertains.
Although embodiments of the present disclosure have been described with reference to the accompanying drawings, it will be understood by those skilled in the art to which the present disclosure pertains that the present disclosure may be embodied in other specific forms without changing the technical idea or essential characteristics of the present disclosure. Accordingly, it should be understood that the above-described embodiments are illustrative in all respects and not restrictive.

Claims (30)

1. A food information input method performed by a food information input apparatus, the food information input method comprising the steps of:
operating in a photographing mode in which an input guide and an input button moving along the input guide are displayed on a photographing screen; and
at least one piece of food information input by using the input guide and the input button is received while operating in the photographing mode.
2. The food information input method of claim 1, wherein,
the step of receiving the at least one piece of food information includes:
when operating in the photographing mode, the food information is received according to a position where the activated input button is deactivated on the input guide in a state where the user activates the input button displayed on the photographing screen.
3. The food information input method of claim 1, further comprising the steps of:
at least one piece of food information corresponding to a position on the input guide where the input button is activated by the user is displayed.
4. The food information input method according to claim 3, wherein,
the step of displaying at least one piece of food information on the photographing screen includes:
the food information corresponding to the position of the input button activated by the user on the input guide is displayed on the photographing screen in a preset size or more.
5. The food information input method of claim 1, wherein,
the step of receiving the at least one piece of food information includes:
in a state where the user activates the input button displayed on the photographing screen, at least one piece of food information is selectively received according to a moving direction in which the activated input button is moved to be deactivated on the input guide.
6. The food information input method of claim 1, wherein,
receiving the at least one piece of food information includes:
in a state where the user activates the input button displayed on the photographing screen, at least one piece of food information is selectively received according to a moving distance and a moving direction in which the activated input button is moved to be deactivated on the input guide.
7. The food information input method of claim 1, wherein the at least one piece of food information includes at least one piece of information selected from the following: food amount information, food type information, food calorie information, food nutrition information, food menu information, consumer information, intake target information, intake plan information, intake time information, intake satisfaction information, food time information suitable for a preset input purpose, food position information suitable for the preset input purpose, food processor information, food trash discharge information, pre-meal input information, post-meal input information, favorites information, and information about items stored in a wish list.
8. The food information input method of claim 1, further comprising the steps of:
displaying at least one selected from the following items related to the received at least one piece of food information: user allergy effect, skin nutrition, growth nutrition, disease effect, aging effect, diet effect, expected intake calories compared to target calories, salinity, conversion value of food, food cost information, food environment information, carbon value information, resource value information, carbon emission, running cost loss cost, food raw material information, pre-meal input button, postprandial input button, favorites input button, and wish list storage button.
9. The food information input method of claim 1, further comprising the steps of:
identifying food displayed on the photographing screen,
wherein the received at least one piece of food information is related to the identified food.
10. The food information input method of claim 9, wherein,
the step of receiving the at least one piece of food information includes:
the unit of food displayed on the input button is changed according to the type of the recognized food, and information about the amount of the recognized food is input based on the changed unit of food.
11. The food information input method of claim 9, further comprising the steps of:
displaying the input limit range of the identified food.
12. The food information input method of claim 11, further comprising the steps of:
the minimum and maximum values of the displayed input limit range are adjusted,
wherein the input unit of the food information is changed according to the minimum value and the maximum value of the adjusted input limit range.
13. The food information input method of claim 9, further comprising the steps of:
calculating and displaying intake target information in consideration of characteristics of the user according to the identified food;
receiving intake schedule information entered by the user regarding the identified food; and
the ratio of the input intake plan information to the displayed intake target information is displayed.
14. The food information input method of claim 13, further comprising the steps of:
after the user ingests the identified food, a success or failure of the ingestion target is displayed based on a comparison of the entered ingestion plan information with the displayed ingestion target information.
15. The food information input method of claim 9, further comprising the steps of:
analyzing the food image of the identified food to determine whether before or after a meal; and
recording the food image as a food comment before a meal by a user when the food image is taken before the meal, or recording the food image as a food comment after a meal by a user when the food image is taken after the meal.
16. The food information input method of claim 9, further comprising the steps of:
displaying a type of food candidate that can be identified based on the identified type of food.
17. The food information input method of claim 9, further comprising the steps of:
searching for a restaurant providing the identified food based on the location of the food information input device; and
the received food information is corrected and displayed based on the menu information in the searched restaurant.
18. The food information input method of claim 9, further comprising the steps of:
searching for a restaurant providing the identified food based on the location of the food information input device; and
based on the menu information in the searched restaurant, food corresponding to the received food information is ordered.
19. The food information input method of claim 9, further comprising the steps of:
when the food information is input, a preset stimulus for the input food information is provided to the user.
20. A food information input method performed by a food information input apparatus, the food information input method comprising the steps of:
operating in a photographing mode in which an input button is displayed on a photographing screen;
checking whether a preset photographing guide condition is satisfied when operating in the photographing mode; and
performing a photographing operation if the preset photographing guide condition is satisfied.
21. The food information input method of claim 20, further comprising the steps of:
identifying food displayed on the photographing screen; and
food information about the identified food is received.
22. The food information input method of claim 21, wherein,
the preset photographing guide condition is at least one condition selected from the following conditions:
the identified food and the food information input device may be parallel to each other, the food information input device may be horizontal, the food information input device may be vertically above the identified food, and the food information input device may be moved.
23. The food information input method of claim 21, further comprising the steps of:
and displaying photographing accuracy on the screen based on a relative inclination between the identified food and the food information input device.
24. The food information input method of claim 21, further comprising the steps of:
the user is provided with preset incentives for shooting operations.
25. The food information input method of claim 21, further comprising the steps of:
analyzing the captured image of the identified food to determine whether it is before or after a meal; and
recording the photographed image as a food comment before a meal by a user when photographing the food image before a meal, or recording the photographed image as a food comment after a meal by a user when photographing the food image after a meal.
26. The food information input method of claim 21, further comprising the steps of:
displaying a type of food candidate that can be identified based on the identified type of food.
27. The food information input method of claim 21, further comprising the steps of:
searching for a restaurant providing the identified food based on the location of the food information input device; and
the received food information is corrected and displayed based on the menu information in the searched restaurant.
28. The food information input method of claim 21, further comprising the steps of:
searching for a restaurant providing the identified food based on the location of the food information input device; and
based on the menu information of the searched restaurant, food corresponding to the received food information is ordered.
29. A food information input device, comprising:
a camera;
a display;
a memory configured to store one or more programs; and
a processor configured to execute the one or more stored programs,
wherein the processor operates in a photographing mode in which an input guide and an input button moving along the input guide are displayed on a photographing screen through the display; and
wherein at least one piece of food information input by using the input guide and the input button is received when the processor operates in the photographing mode.
30. A food information input device, comprising:
a camera;
a display;
a memory configured to store one or more programs; and
a processor configured to execute the one or more stored programs,
wherein the processor operates in a photographing mode in which an input button is displayed on a photographing screen through the display,
wherein when the processor operates in the photographing mode, it is checked whether a preset photographing guide condition is satisfied, and
wherein if the preset photographing guide condition is satisfied, photographing is performed by the camera.
CN202180072552.1A 2020-10-23 2021-10-25 Food information input method and device Pending CN116457818A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2020-0138123 2020-10-23
KR1020210142157A KR20220054223A (en) 2020-10-23 2021-10-22 Method and apparatus for inputting food information
KR10-2021-0142157 2021-10-22
PCT/KR2021/014989 WO2022086304A1 (en) 2020-10-23 2021-10-25 Food information input method and apparatus

Publications (1)

Publication Number Publication Date
CN116457818A 2023-07-18

Family

ID=87128895

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180072552.1A Pending CN116457818A (en) 2020-10-23 2021-10-25 Food information input method and device

Country Status (1)

Country Link
CN (1) CN116457818A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination