CN114694800B - Identity binding method and device, storage medium and diet nutrition tracking system - Google Patents


Info

Publication number
CN114694800B
Authority
CN
China
Prior art keywords
meal, taker, identity, information, taking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210620627.1A
Other languages
Chinese (zh)
Other versions
CN114694800A (en)
Inventor
罗红艳
曾志成
黄进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Hongbozhicheng Technology Co ltd
Original Assignee
Shenzhen Hongbozhicheng Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Hongbozhicheng Technology Co ltd
Priority to CN202210620627.1A
Publication of CN114694800A
Application granted
Publication of CN114694800B
Legal status: Active

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/60: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance, relating to nutrition control, e.g. diets
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31: User authentication
    • G06F21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00: ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60: ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records

Abstract

The application relates to an identity binding method and device, a storage medium, and a diet nutrition tracking system. When several pieces of face information are recognized within a preset time interval, the identity binding method determines that the meal taker occupying the optimal position is the person currently taking a meal, based on the position and movement of each recognized face on the screen. The meal taker therefore does not need a radio-frequency chip and a meal-position card reader for meal-plate and meal-position identification, which reduces the cost of the diet nutrition tracking system, simplifies the meal taker's operation, and gives higher identity recognition accuracy.

Description

Identity binding method and device, storage medium and diet nutrition tracking system
Technical Field
The application relates to the technical field of nutrition science, in particular to an identity binding method and device, a storage medium and a diet nutrition tracking system.
Background
Healthy diet management helps people stay healthy and helps prevent and treat diet-related diseases such as obesity, cardiovascular disease, and diabetes.
Healthy diet management is closely tied to the three meals of each day: an individual's dietary nutrition must be tracked and continuously adjusted, so people need to record every meal, including meal time, food types, weight, and nutritional ingredients. This recording process is time-consuming and labor-intensive, and people who often eat out find it especially difficult to track dietary nutrition consistently. Some catering institutions have therefore begun to provide diet nutrition tracking services for diners through dedicated devices and systems. In the prior art, many diners queue at each meal position, so to accurately identify the diner currently taking a meal, each diner must bind a face to a radio-frequency chip on the meal plate before taking a meal; during the meal, the radio-frequency chip and a meal-position card reader perform meal-plate and meal-position identification. The diner is thus required to participate in the operation during identity recognition, and the cost of the dietary nutrition tracking system is high.
Disclosure of Invention
Technical problem to be solved
In view of the above shortcomings of the prior art, the present application provides an identity binding method and device, a storage medium, and a diet nutrition tracking system, which solve the technical problems that the existing diet nutrition tracking system is complex and costly and requires diners to participate in the operation during identity recognition.
(II) Technical solution
In order to achieve the above purpose, the present application adopts a main technical solution comprising:
in a first aspect, an embodiment of the present application provides an identity binding method, applicable to an identity recognition device using real-time video recognition, where the identity recognition device is bound to a meal position in a diet nutrition tracking system, and the method includes the following steps:
step S11, when the interval of the identification time of the face information of the current meal taker A and the next meal taker B is smaller than a preset time interval, judging whether the meal taker A and the meal taker B are both in an effective meal taking range, if so, executing step S12, otherwise, executing step S15;
step S12, judging whether meal taker A and meal taker B both move toward the middle position of the screen and meal taker A is closer to the middle position of the screen than meal taker B, if so, binding the identity information of meal taker A with the meal position, otherwise executing step S13;
step S13, judging whether the meal taker A and the meal taker B both move away from the middle position of the screen and the meal taker B is closer to the middle position of the screen than the meal taker A, if so, binding the identity information of the meal taker B with the meal position, otherwise, executing step S14;
step S14, judging whether the positional relation between meal taker A and meal taker B satisfies the conditions that meal taker A moves away from the middle position of the screen, meal taker B moves toward the middle position of the screen, and meal taker A is closer to the middle position of the screen than meal taker B; if so, the identity information of meal taker A is bound with the meal position, otherwise the identity information of meal taker B is bound with the meal position;
step S15, binding the identity information of the meal taker located within the effective meal taking range with the meal position;
after binding the identity information of meal taker A or meal taker B with the meal position, the method further comprises:
step S20, comparing the currently recognized face information with the characteristic values in the face database, if the comparison fails, executing step S21, otherwise executing step S22;
step S21, determining whether the feature value of the currently recognized face information is empty, if yes, executing step S211, otherwise executing step S212;
step S211, judging whether cache data of the face information exist, if so, executing step S23, otherwise, executing step S24;
step S212, judging whether the current diner is located at the center position, if so, executing step S23, otherwise, executing step S24;
step S22, judging whether the currently recognized face information is new face information, if so, executing step S221, otherwise, executing step S25;
step S221, judging whether the previously bound meal taker is located at the center position of the screen, if so, executing step S23, otherwise executing step S11, where the previously bound meal taker is the bound meal taker A or the bound meal taker B;
step S23, determining the currently recognized face information as the face information of the previously bound diner;
step S24, starting a new identification process;
and step S25, continuing to keep the identity of the current diner.
Optionally, the step of judging whether meal taker A and meal taker B are both within the effective meal taking range includes:
judging whether the values of F(A) and F(B) are both larger than a preset value, and if so, determining that meal taker A and meal taker B are both within the effective meal taking range, where F is the face width information.
Optionally, before step S11, the method further includes:
and step S10, judging whether more than one piece of face information is recognized in the preset time interval, if so, executing the step S11, and otherwise, binding the identity information of the current meal taker and the meal position.
Optionally, before the step S11, the method further includes:
step S31, judging whether a preemptive meal taking event has occurred at the ith meal taking, if so, determining that X(i) > 0 and executing step S32, otherwise executing step S11, where X(i) is the amount of food taken at the ith meal taking;
step S32, when it is determined that T(i) < T, judging whether the value of X(i+1) is zero, if so, executing step S321, otherwise executing step S33, where T(i) is the time elapsed since the ith meal taking and T is the time limit for recognizing the face information of a registered user;
step S321, judging whether the identity of the current meal taker has been recognized, if so, executing step S322, otherwise continuing to execute step S32;
step S322, uploading the meal taking information of the ith meal taking;
step S33, when T ≤ T(i) < T + T1, judging whether the value of X(i+1) is zero, if so, executing step S331, otherwise executing step S34, where T1 is the additional time allowed for recognizing a stranger compared with a registered user;
step S331, judging whether the identity of the current meal taker has been recognized, if so, executing step S322, otherwise continuing to execute step S32;
step S34, when T(i) ≥ T + T1, judging whether the value of X(i+1) is zero, if so, confirming that the data of the ith meal taking is ownerless data and uploading the ownerless data, otherwise executing step S35;
and step S35, judging whether a person took a meal within time T before the ith meal taking, if so, determining that the current meal taker is the previous meal taker and uploading the meal taking information of the ith meal taking, otherwise determining that the data of the ith meal taking is ownerless data and uploading the meal taking information of the ith meal taking.
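The three timing windows above can be condensed into a small helper; this is an illustrative sketch, with the function name and return labels invented rather than taken from the patent. Here t is the time elapsed since the ith meal taking, T the recognition limit for registered users, and T1 the extra time allowed for strangers.

```python
def recognition_window(t: float, T: float, T1: float) -> str:
    """Classify the elapsed time t into the windows of steps S32-S34."""
    if t < T:
        return "wait_registered"   # step S32: registered-user recognition window
    if t < T + T1:
        return "wait_stranger"     # step S33: extra window for strangers
    return "timeout"               # step S34: data may become ownerless
```

For example, with T = 5 and T1 = 3, an elapsed time of 6 falls in the stranger window, while 9 has timed out.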
Optionally, the step S321 of judging whether the identity of the current meal taker has been recognized includes: judging whether a registered user or a stranger has been recognized; if so, determining that the current meal taker is that registered user or stranger; if not, judging whether a person took a meal within time T before the ith meal taking, and if so, determining that the current meal taker is the previous meal taker; otherwise, determining that the identity of the current meal taker has not been recognized;
the step S331 of judging whether the identity of the current meal taker has been recognized includes: judging whether a registered user or a stranger has been recognized; if so, determining that the current meal taker is that registered user or stranger; otherwise, determining that the identity of the current meal taker has not been recognized.
Optionally, the method further comprises initializing M(i) to a null value and Y = 0, where M(i) is the meal taker code, Y is a status value, and Y = 0 is the initial value;
after step S32 or step S33, if it is determined that the current meal taker is a registered user, uploading the meal taking information of the ith meal taking further includes: obtaining the meal taker code of the current meal taker, writing it into M(i), and uploading X(i), M(i), and Y = 1;
after step S32, if it is determined that the current meal taker is the previous meal taker, uploading the meal taking information of the ith meal taking further includes: obtaining the code M(i-1) of the meal taker of the (i-1)th meal taking, and uploading X(i), M(i-1), and Y = 1;
after the current meal taker is determined to be a stranger, uploading the meal taking information of the ith meal taking further includes: letting M(i) = M(stranger), and uploading X(i), M(stranger), and Y = 2;
after the data of the ith meal taking is determined to be ownerless data, uploading the meal taking information of the ith meal taking further includes: letting M(i) = null, and uploading X(i), M(i), and Y = 3;
where Y = 1 indicates successful recognition, Y = 2 indicates recognition as a stranger, and Y = 3 indicates that no person was recognized.
In a second aspect, embodiments of the present application provide an identification apparatus comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the method steps of the first aspect.
In a third aspect, the present application provides a meal nutrition tracking system, which is characterized in that the meal nutrition tracking system comprises an identification device as described in the second aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium on which a computer program is stored, which, when executed by a processor, implements the steps of the identity binding method as described in the first aspect.
(III) Advantageous effects
The beneficial effects of the present application are:
1. when multiple pieces of face information are recognized within a preset time interval, the meal taker occupying the optimal position can be determined to be the current meal taker based on the position and movement of each recognized face on the screen; meal-plate and meal-position identification by radio-frequency chip and meal-position card reader is not needed, which reduces the cost of the diet nutrition tracking system, simplifies the meal taker's operation, and gives high identity recognition accuracy;
2. judging whether the meal taker is at the center position based on face information recognized in real time determines the current meal taker among several meal takers, which avoids the problem of another meal taker being wrongly bound to the meal position before the bound meal taker has finished taking a meal;
3. when a meal is taken before identity recognition succeeds, the identity of the meal taker is recognized through the face information recognized before and after the meal taking, so the meal taking data can be accurately associated and association errors are effectively prevented;
4. the server receives the identification information sent by the identity recognition device of each meal position and the meal taking information sent by the nutrition price-tag device, obtains and stores the meal taker's meal taking information, and calculates nutrition and health evaluation data and/or formulates meal suggestions for the meal taker according to the meal taker's historical meal taking information and pre-stored health information. The meal taker is not required to perform any operation in the whole process, and on the basis of long-term tracking and analysis of dietary nutrition intake data, personalized meal suggestions can be made that target each meal taker's individual circumstances.
Drawings
Fig. 1 is a schematic flowchart of an identity binding method provided in embodiment 1 of the present application;
fig. 2 is a flowchart illustrating another identity binding method according to embodiment 2 of the present application;
fig. 3 is a flowchart illustrating another identity binding method according to embodiment 3 of the present application;
FIG. 4 is a flowchart of "determine if the identity of the current taker is identified" of step S321 in FIG. 3;
FIG. 5 is a flowchart of "determining whether the identity of the current diner is identified" of step S331 in FIG. 3;
fig. 6 is a schematic diagram of a meal deployment of a dietary nutrition tracking system of some embodiments of the present application;
FIG. 7 is a schematic diagram of a line serving device in some embodiments of FIG. 6;
fig. 8 is a schematic diagram of a network architecture of a dietary nutrition tracking system of some embodiments of the present application.
Detailed Description
For a better understanding of the present application, reference is made to the following detailed description of the present application, which is to be read in connection with the accompanying drawings.
The identity binding method provided by the embodiments of the application is suitable for an identity recognition device using real-time video recognition, where the identity recognition device is bound to a meal position in a diet nutrition tracking system, and the method comprises the following steps:
step S11, when the interval of the recognition time of the face information of the current meal taker A and the next meal taker B is smaller than the preset time interval, judging whether the meal taker A and the meal taker B are both in the effective meal taking range, if so, executing step S12, otherwise, executing step S15;
step S12, judging whether the meal taker A and the meal taker B both move towards the middle position of the screen and the meal taker A is closer to the middle position of the screen than the meal taker B, if so, binding the identity information of the meal taker A with the meal position, otherwise, executing step S13;
step S13, judging whether the meal taker A and the meal taker B both move away from the middle position of the screen and the meal taker B is closer to the middle position of the screen than the meal taker A, if so, binding the identity information of the meal taker B with the meal position, otherwise, executing step S14;
step S14, judging whether the position relation between the meal taker A and the meal taker B meets the condition that the meal taker A moves away from the middle position of the screen, the meal taker B moves towards the middle position of the screen, the meal taker A is closer to the middle position of the screen than the meal taker B, if yes, the identity information of the meal taker A is bound with the meal position, and if not, the identity information of the meal taker B is bound with the meal position;
and step S15, binding the identity information of the meal taker within the effective meal taking range with the meal position.
The identity binding method not only reduces the cost of the diet nutrition tracking system and simplifies the operation of the meal taker, but also has higher identity identification accuracy.
In order to better understand the above technical solutions, exemplary embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Detailed description of the preferred embodiments
Example 1:
referring to fig. 1, some embodiments of the present application provide an identity binding method, which is suitable for an identity recognition device using real-time video recognition, and the identity recognition device is bound with a meal position in a meal nutrition tracking system, and the method includes the following steps:
and S11, when the interval between the recognition time of the face information of the current meal taker A and the recognition time of the face information of the next meal taker B is smaller than the preset time interval, judging whether the meal taker A and the meal taker B are both in the effective meal taking range, if so, executing the step S12, and otherwise, executing the step S15.
In practical application, whether a meal taker is within the effective meal taking range can be judged from the width of the recognized face: the larger the face width, the closer the meal taker is to the camera and the meal taking position; the smaller the face width, the farther the person is from the meal taking position and the less likely he or she is the current meal taker. Accordingly, the step of judging whether meal taker A and meal taker B are both within the effective meal taking range comprises:
judging whether the values of F(A) and F(B) are both larger than a preset value, and if so, determining that meal taker A and meal taker B are both within the effective meal taking range, where F is the recognized face width.
And step S12, judging whether the meal taker A and the meal taker B both move towards the middle position of the screen and the meal taker A is closer to the middle position of the screen than the meal taker B, if so, executing step S121, otherwise, executing step S13.
And step S121, binding the identity information of the diner A with the dining position.
In this embodiment, the meal taker a and the meal taker B take or are identified in sequence, which indicates that the meal taker a takes the preferred position of the meal position meal when both move to the middle of the meal position and the meal taker a is closer to the middle position of the screen before the meal taker B, and thus the current meal taker is determined to be the meal taker a.
And step S13, judging whether meal taker A and meal taker B both move away from the middle position of the screen and meal taker B is closer to the middle position of the screen than meal taker A, if so, executing step S131, otherwise executing step S14.
Step S131, binding the identity information of meal taker B with the meal position.
In practice, when both people move away from the middle position of the screen and meal taker B is closer to the middle position of the screen, meal taker B occupies the preferred position for taking a meal, so meal taker B can be determined to be the current meal taker.
And step S14, judging whether the position relation between the meal taker A and the meal taker B meets the condition that the meal taker A moves away from the middle position of the screen, the meal taker B moves towards the middle position of the screen, and the meal taker A is closer to the middle position of the screen than the meal taker B, if so, executing the step S141, otherwise, executing the step S142.
And step S141, binding the identity information of the diner A with the dining position.
And S142, binding the identity information of the diner B with the dining position.
In practical application, when meal taker A moves away from the screen center line, meal taker B moves toward the screen center line, and meal taker A is closer to the middle position of the screen than meal taker B, meal taker A still occupies the preferred position for taking a meal, so meal taker A can be determined to be the current meal taker. When meal taker A moves away from the screen center line, meal taker B moves toward the screen center line, and meal taker B is closer to the middle position of the screen than meal taker A, meal taker B occupies the preferred position, so the current meal taker can be determined to be meal taker B.
And step S15, binding the identity information of the food taker in the valid food taking range with the meal position.
In one feasible scheme, the positions of meal taker A and meal taker B can be judged from the distance G between each recognized face midline and the screen midline, where G is positive when the face midline is moving toward the screen midline and negative when it is moving away:
when 0 < G(A) < G(B), meal taker A and meal taker B are both moving toward the screen middle position and meal taker A is closer to it than meal taker B;
when G(A) < G(B) < 0, meal taker A and meal taker B are both moving away from the screen middle position and meal taker B is closer to it than meal taker A;
when G(A) < 0 < G(B) and |G(A)| ≤ |G(B)|, meal taker A is moving away from the screen middle position, meal taker B is moving toward it, and meal taker A is closer to it than meal taker B;
when G(A) < 0 < G(B) and |G(B)| ≤ |G(A)|, meal taker A is moving away from the screen middle position, meal taker B is moving toward it, and meal taker B is closer to it than meal taker A.
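The four cases above reduce to a small decision function. This is a sketch under the stated sign convention for G (positive while approaching the screen midline, negative while leaving, |G| the distance to it); the final fall-through follows the "otherwise bind meal taker B" branch of step S14.

```python
def current_meal_taker(g_a: float, g_b: float) -> str:
    """Return 'A' or 'B': whichever taker occupies the preferred position."""
    if 0 < g_a < g_b:          # both approaching, A nearer the midline (S12)
        return "A"
    if g_a < g_b < 0:          # both leaving, B nearer the midline (S13)
        return "B"
    if g_a < 0 < g_b:          # A leaving, B approaching (S14)
        return "A" if abs(g_a) <= abs(g_b) else "B"
    return "B"                 # remaining cases: S14's "otherwise" branch
```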
In one possible implementation, before performing step S11, the method further includes:
and step S10, judging whether more than one piece of face information is recognized within a preset time interval, if so, executing step S11, and otherwise, binding the identity information of the current meal taker with the meal position.
The identity binding method is suitable for an identity recognition device using real-time video recognition, where the identity recognition device is bound to a meal position in a diet nutrition tracking system. When several pieces of face information are recognized within a preset time interval, the meal taker occupying the optimal position can be determined to be the current meal taker based on the position and movement of each recognized face on the screen, and meal-plate and meal-position identification by radio-frequency chip and meal-position card reader is not needed; this reduces the cost of the diet nutrition tracking system, simplifies the meal taker's operation, and gives high identity recognition accuracy.
Example 2:
in practical application, during meal taking it often happens that a bound meal taker temporarily does not face the camera because of actions such as lowering the head or turning around, or that other people approach the bound meal taker. Since the identity recognition device uses a camera for real-time video recognition, it captures faces in real time. To avoid another meal taker being wrongly bound to the meal position before the bound meal taker has finished taking a meal, the present invention provides, on the basis of Embodiment 1, another identity binding method. Referring to fig. 2, after the identity information of meal taker A or meal taker B is bound to the meal position, the method further comprises:
and S20, comparing the face information of the currently identified diner with the characteristic values in the face database, if the comparison fails, executing S21, otherwise executing S22.
And step S21, judging whether the characteristic value of the face information of the currently identified diner is empty, if so, executing step S211, otherwise, executing step S212.
In practical application, if the characteristic value of the face information is null, the meal taker may have left the meal position, or an unexpected situation may have occurred in which the face cannot be recognized.
Step S211, determining whether cached data of the face information exists, if yes, performing step S23, otherwise, performing step S24.
In practical application, the identity recognition device may be configured to cache the recognized face information and to set a time after which cached data is cleared; this time may be set according to the actual situation, for example the time a person normally takes for a meal.
If cached data of the face information exists, the currently recognized face information can be determined to be the face information of the bound meal taker A or meal taker B. If there is no cached data, the previously bound meal taker can be considered to have finished taking the meal and left the meal position, and a new round of recognition and binding can be started.
Step S212, judging whether the current diner is positioned at the center position, if so, executing step S23, otherwise, executing step S24;
in practical application, whether the person taking a meal is located at the center position can be judged according to the distance between the face and the center line of the screen.
And step S22, judging whether the face information of the currently identified diner is new face information, if so, executing step S221, otherwise, executing step S25.
In this step, "new face information" means that the currently recognized face information differs from the face information of the previously bound meal taker A (or meal taker B).
And S221, judging whether the previously bound diner is in the central position, if so, executing S23, otherwise, executing S11.
In this embodiment, the previous bound taker is the bound taker a or the bound taker B in embodiment 1.
Execution of step S11 may re-determine the taker occupying the optimal position.
And step S23, determining the currently recognized face information as the face information of the previously bound diner.
When the comparison fails but the characteristic value of the face information is not null, the meal taker is lowering the head, turning the head, or has the face blocked; if the meal taker is still at the center position, it can be determined that this is still the previously bound person.
Step S24, a new recognition process is started.
And step S25, continuing to keep the identity of the current diner.
In this embodiment, whether a meal taker is at the center position is judged according to the face information recognized in real time, and the current meal taker is determined from among multiple meal takers, which solves the problem of other meal takers being wrongly bound to the meal taking position before the bound meal taker has finished taking the meal.
Example 3:
in practical applications, a meal taker may also take food before identity recognition succeeds. To accurately associate the meal taking data and prevent association errors, the present invention further provides another identity binding method on the basis of the foregoing embodiments. Referring to fig. 3, before step S11 the method further includes the following steps:
step S31, determining whether a preemptive meal taking event occurs when the ith meal taking occurs, if so, determining X(i) > 0 and executing step S32, otherwise executing step S11, where X(i) is the meal taking amount of the ith meal taking.
In practical application, after determining that the preemptive meal taking event occurs, the method may further include: and sending out reminding information to remind a meal taker to brush the face first and then take the meal.
In practical application, whether a preemptive meal taking event occurs at the ith meal taking can be judged as follows:
and judging whether bound meal taker identity information corresponding to the ith meal taking exists or not, if yes, determining that a preemptive meal taking event does not occur, and otherwise, determining that the preemptive meal taking event occurs.
Here, "bound meal taker identity information corresponding to the ith meal taking exists" means that an identity binding is completed at the same time point as the ith meal taking, i.e. the ith identity binding is completed; or, when a meal taker code M is set for the bound identity information each time, that an M(i) corresponding to X(i) exists.
In practical application, each time the weight of dishes at a dining position is reduced, it can be judged that a meal taking occurs.
In step S32, when it is determined that T(i) < T, where T(i) is the time elapsed after the ith meal taking and T is the time limit for recognizing the face information of a registered user, it is judged whether the value of X(i+1) is zero; if so, step S321 is executed, otherwise step S33 is executed.
When T(i) < T, judging whether X(i+1) is equal to 0 amounts to judging whether the (i+1)th meal taking event occurs within the period T after the ith meal taking event.
Step S321, determining whether the identity of the current meal taker is recognized, if so, executing step S322, otherwise, continuing to execute step S32.
Step S322, uploading the meal taking information of the ith meal taking.
Step S33, judging whether the value of X(i+1) is zero when T ≤ T(i) < T + T1; if so, executing step S331, otherwise executing step S34, where T1 is the additional time needed to recognize a stranger compared with recognizing a registered user.
Step S331, determining whether the identity of the current diner is identified, if so, executing step S322, otherwise, continuing to execute step S32.
Step S34, when it is determined that T(i) ≥ T + T1, judging whether the value of X(i+1) is zero; if so, executing step S341, otherwise executing step S35.
Step S341, confirming that the data of the ith meal taking is ownerless data and uploading it.
Step S35, judging whether a meal was taken within the time T before the ith meal taking occurred; if so, executing step S351, otherwise executing step S341.
Step S351, determining that the current meal taker is the previous meal taker and uploading the meal taking information of the ith meal taking.
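One possible reading of the decision flow of steps S32 through S35, collapsed into a single classification function; the concrete values of T and T1, all names, and the exact control flow are assumptions, since the patent leaves them to the implementation:

```python
def classify_ith_taking(t_elapsed, next_taking_occurred, identity_recognized,
                        prior_taking_within_T, T=5.0, T1=3.0):
    """Collapse the outcome of steps S32-S35 for the ith meal taking.

    T is the recognition time limit for registered users and T1 the extra
    time allowed for strangers; 5 s and 3 s are illustrative values only.
    """
    if next_taking_occurred:
        # an (i+1)th taking happened before the ith was attributed: step S35
        if prior_taking_within_T:
            return "attribute to previous meal taker"   # step S351
        return "upload as ownerless"                    # step S341
    if identity_recognized:
        return "upload with recognized identity"        # step S322
    if t_elapsed < T + T1:
        return "keep waiting"                           # steps S32 / S33
    return "upload as ownerless"                        # steps S34 -> S341
```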
In some possible schemes, referring to fig. 4, the step S321 of "determining whether the identity of the current diner is identified" may include the following steps:
s3211, judging whether a registered user or a stranger is identified, if yes, executing a step S3212, otherwise, executing a step S3213.
In step S3212, the current diner is determined to be a registered user or stranger.
Step S3213, judging whether a person takes the meal within T time before the ith meal taking occurs, if yes, executing step S3214; otherwise, step S3215 is executed.
And S3214, determining that the current meal taker is the previous meal taker.
Step S3215, determining that the identity of the current diner is not recognized.
In practical application, the identity of the current meal taker can be recognized according to the face information. If recognizable face information is acquired, it can be compared with the face information of registered users stored in a database; if the comparison succeeds, the current meal taker is a registered user, otherwise the current meal taker is a stranger. If no recognizable face information is acquired, the acquisition is abnormal, for example no face is detected (no person present) or the acquired face information is incomplete (for example, the meal taker is lowering the head, turning the head, or blocking the face).
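The comparison step can be sketched as follows; the similarity measure and threshold here are toy stand-ins for a real face feature comparison model, and all names are illustrative:

```python
def similarity(a, b):
    """Toy similarity for the sketch: fraction of matching feature entries."""
    hits = sum(1 for x, y in zip(a, b) if x == y)
    return hits / max(len(a), 1)

def classify_recognition(face_feature, registered_db, match_threshold=0.6):
    """Return 'registered', 'stranger', or 'abnormal' (no usable face).
    The similarity measure and threshold are stand-ins for a real face
    comparison model."""
    if face_feature is None:
        return "abnormal"   # no face, or head lowered/turned/blocked
    for stored in registered_db.values():
        if similarity(face_feature, stored) >= match_threshold:
            return "registered"
    return "stranger"
```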
In some possible scenarios, referring to fig. 5, the step S331 of determining whether the current identity of the meal taker is recognized includes the following steps:
step S3311, judging whether a registered user or a stranger is identified, if so, executing step S3312; otherwise, step S3313 is executed.
And step S3312, determining the current diner as a registered user or a stranger.
Step S3313, determining that the identity of the current meal taker is not recognized.
In one possible scheme, a meal taker code M(i) and a state value Y can also be set in the meal amount-face ledger data, where Y = 0 is defined as the initial value, Y = 1 indicates successful recognition, Y = 2 indicates recognition as a stranger, and Y = 3 indicates that no person was recognized.
After step S32 or step S33, if the current meal taker is determined to be a registered user, the method further includes: acquiring the code of the current meal taker, writing it into M(i), and uploading X(i), M(i), and Y = 1;
after step S32, if it is determined that the current meal taker is a registered user and is the meal taker of the (i-1)th meal taking, the method further includes: acquiring the code M(i-1) of the meal taker of the (i-1)th meal taking, and uploading X(i), M(i-1), and Y = 1;
after determining that the current meal taker is a stranger, the method further includes: letting M(i) = M(stranger), and uploading X(i), M(stranger), and Y = 2;
after confirming that the data of the ith meal taking is ownerless data, the method further includes: letting M(i) = null, and uploading X(i), M(i), and Y = 3.
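A sketch of how the (X(i), M(i), Y) upload tuples described above might be assembled; the status encoding follows the text, while the function name and tuple shape are assumptions:

```python
def make_ledger_record(x_i, meal_taker_code, status):
    """Assemble the (X(i), M(i), Y) tuple to be uploaded.
    Y codes follow the text: 1 registered, 2 stranger, 3 no person."""
    if status == 1:                        # registered user recognized
        return (x_i, meal_taker_code, 1)
    if status == 2:                        # stranger
        return (x_i, "M(stranger)", 2)
    return (x_i, None, 3)                  # ownerless data
```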
According to the identity binding method of this embodiment, when food is taken before recognition succeeds, the identity of the meal taker is determined from the face information recognized before and after the meal taking, so that the meal taking data can be accurately associated and association errors are effectively prevented.
Example 4:
on the basis of the foregoing embodiments, the present application further provides an identification apparatus, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and when executed by the processor, the computer program implements the method steps according to embodiments 1 to 3.
Example 5:
on the basis of the foregoing embodiment, the present application further provides a meal nutrition tracking method, which is applicable to a server, and includes the following steps:
and S51, receiving meal taking information sent by a certain nutrition price tag device.
In the step, the meal taking information comprises meal position identification of the meal taking position bound by the nutrition price tag device, dish information of the meal taking position, meal taking weight and meal taking time;
and S52, matching first identification information with the same meal position identification as the meal position identification of the meal taking information from the received plurality of identification information sent by the plurality of identification devices.
In this step, each piece of identification information comprises the meal position identifier of the meal taking position bound to the corresponding identity recognition device, the identity recognition data of the meal taker, and the recognition time.
The identity recognition data is the identity recognition data of the meal taker bound to each meal taking position at the time the weight of the dishes at that position decreases.
In practical application, the meal position identifiers of different meal taking positions are different. Through these distinct identifiers, the meal taker and the dishes can be accurately located.
Because the meal position identifier of each meal taking position is unique, meal taking information and first identification information with the same meal position identifier correspond to the same meal taking position.
And S53, searching second identification information with the identification time matched with the meal fetching time of the meal fetching information from the first identification information.
And S54, analyzing and integrating the meal taking information and the second identification information to obtain dining information, wherein the dining information comprises the identity information of the meal taker, the dish information, and the meal taking weight.
And S55, storing the dining information.
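Steps S52 through S54 can be sketched as the following matching routine; the dictionary-shaped records, field names, and the fixed 10-second window are assumptions (a 5 s to 10 s window is mentioned later in the description):

```python
from datetime import datetime, timedelta

def match_dining_info(meal_info, identification_infos, window_seconds=10):
    """Match one piece of meal taking information against identification
    records, as in steps S52-S54."""
    # Step S52: keep only records with the same meal position identifier
    candidates = [r for r in identification_infos
                  if r["position_id"] == meal_info["position_id"]]
    # Step S53: keep records whose recognition time lies within the window
    window = timedelta(seconds=window_seconds)
    in_window = [r for r in candidates
                 if abs(r["recognized_at"] - meal_info["taken_at"]) <= window]
    if not in_window:
        return None
    best = min(in_window,
               key=lambda r: abs(r["recognized_at"] - meal_info["taken_at"]))
    # Step S54: integrate into dining information
    return {"identity": best["identity"],
            "dish": meal_info["dish"],
            "weight": meal_info["weight"]}
```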
Referring to fig. 6-8, in some possible implementations, the meal nutrition tracking system may have multiple ports 10, each port may have multiple meal lines 12, and each meal line 12 may have multiple meal locations 13. Each meal taking position is provided with an electronic scale 131, a nutrition price tag device 132 and an identity recognition device 133 which are bound with the meal taking position 13.
In practical application, a meal position identifier can be set for each meal taking position 13, with a different identifier for each position, and the identifier is then bound to the nutrition price tag device 132 and the identity recognition device 133 of that position, so that the different meal taking positions and their nutrition price tag devices 132 and identity recognition devices 133 can be located.
The server can receive the identification information sent by the identification devices 133 of a plurality of meal taking positions 13 and the meal taking information sent by the nutrition price tag devices 132 of the meal taking positions 13.
When the server receives meal taking information sent by a certain nutrition price tag device, it can first find, according to the meal position identifier, the identification information sent by the identity recognition device of the same meal taking position, and then match the recognition time of the found identification information with the meal taking time in the meal taking information, thereby obtaining the dining information of the meal taker. The dining information is the result of analyzing and integrating the finally matched identification information and meal taking information, and can include the identity information of the meal taker, the meal taking time, the meal position identifier, the dish information, and the meal taking weight, namely: who took which dishes, at what time, and of what weight.
In practical application, the recognition time may be a specific time point, for example the time point when a meal taker enters the recognition area of the identity recognition device of the meal taking position, or the time point when recognition succeeds. When matching the meal taking information reported by the nutrition price tag device of a meal taking position against identification information, the identification information of that position within the nearest 5 s to 10 s can be searched according to the meal taking time, so as to obtain the identity recognition data of the meal taker related to the meal taking information.
In practical application, the identification time may also be a time interval value, the start time of the time interval value is a time point when the meal taker enters the identification area of the identity identification device, and the end time of the time interval value is a time point when the meal taker leaves the identification area.
In practical application, the identity identification data may include data for determining the identity of the diner, and the server may analyze the identity of the diner according to the identity identification data to obtain identity information; the identification data can also directly contain the identification information of the diner, and the server can directly read the identification information.
In some possible scenarios, the dish information may include dish name, major ingredients, nutritional components, and the like.
The server can record dining information of each meal taker, wherein the dining information comprises identity information of the meal taker, meal taking time, dish information and meal taking weight.
The dining information of the diner is recorded through the server, the diner does not need to perform operation, and the long-term tracking of the dietary nutrition intake of the diner is facilitated.
In some possible implementations, after step S52, the method further includes:
feeding the dining information back to the nutrition price tag device.
The dining information can further include a meal position identifier. After obtaining the dining information, the server can feed it back, according to this identifier, to the nutrition price tag device bound to the identifier, and the nutrition price tag device displays it to the meal taker who has just finished taking a meal at that position.
In practical application, before feeding the dining information back to the nutrition price tag device, the server can analyze and calculate the nutritional components, energy, and so on contained in the food taken this time according to the dish information and the meal taking weight, write them into the dining information as meal nutrition intake data, and then send the dining information to the nutrition price tag device for display, so that the meal taker can conveniently learn the nutrition intake of this meal.
In some possible solutions, after step S55 "storing the dining information", the method further includes:
calculating nutrition and health evaluation data and/or making a diet suggestion for the food taker according to the health information and the historical dining information prestored by the food taker;
and transmitting the nutritional health evaluation data and/or the formulated dietary suggestions to a query terminal for the diner to review.
In practical applications, the inquiry terminal may include a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palm computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and a fixed terminal such as a Digital TV, a desktop computer, and the like.
In practical application, the operation of calculating the nutritional health assessment data and/or making the dietary suggestions can be performed once for each meal, each day or each preset interval time of the meal taker according to the requirement or health information of the meal taker.
In some possible solutions, before the "receiving the identification information sent by the plurality of identification devices", the method further includes:
receiving meal position dish information sent by a central console, wherein the meal position dish information comprises dish information and a meal position identifier;
and sending the dish information to a nutrition price tag device bound with the meal position identifier.
According to the meal nutrition tracking method of this embodiment, the server receives the identification information sent by the identity recognition device of each meal taking position and the meal taking information sent by the nutrition price tag device, and obtains and stores the dining information of the meal taker. It can then calculate nutritional health evaluation data and/or make dietary suggestions for the meal taker according to the meal taker's historical dining information and pre-stored health information. The meal taker is not required to perform any operation in the whole process, and personalized dietary suggestions can be made for each meal taker's physical differences on the basis of long-term tracking and analysis of dietary nutrition intake data.
Example 6:
on the basis of the foregoing embodiments, the present application provides a server including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the computer program implements the method steps according to embodiment 5 when executed by the processor.
Example 7:
on the basis of the previous embodiment, the embodiment of the application also provides another meal nutrition tracking method which is suitable for a nutrition price tag device bound with a meal taking position. The method comprises the following steps:
s61, receiving, in real time, weight data reported by the electronic scale bound to the meal taking position;
s62, obtaining the meal taking weight of the current meal taker according to the weight data;
and S63, sending the meal taking information of the meal taker to the server, wherein the meal taking information comprises meal taking time, meal taking weight, meal position identification of the meal position and pre-stored dish information.
In practical application, the dish information may include dish name, main ingredients, nutritional ingredients, and the like.
In some possible scenarios, referring to fig. 6-8, the meal nutrition tracking system may have multiple ports 10, each port 10 may have multiple meal lines 12, and each meal line 12 may have multiple meal locations 13. Each meal taking position is provided with an electronic scale 131, a nutrition price tag device 132 and an identity recognition device 133 which are bound with the meal taking position 13.
In practical application, a meal position mark can be set for each meal taking position 13, the meal position marks of each meal taking position 13 are different, and then the meal position marks are bound with the nutrition price tag device 132 and the identity recognition device 133 of the meal taking position 13, so that the different meal taking positions 13, the nutrition price tag device 132 and the identity recognition device 133 are positioned.
In practical application, the electronic scale 131 may directly report the changed dish weight as weight data each time the weight of the dishes changes. The nutrition price tag device 132 can then process the weight data with a stable weighing algorithm and a weight reduction algorithm, using the dish weight received this time and the stored dish weight received last time, to obtain the meal taking weight, after which it replaces the stored weight with the newly received one.
In practical applications, the electronic scale 131 may also obtain the meal-taking weight by using a stable weighing algorithm and a weight-losing algorithm when the weight of the dish changes each time, and directly report the meal-taking weight as weight data to the nutrition price tag device 132.
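A minimal sketch of the stable weighing and weight reduction steps mentioned above, under the assumption that a median over recent samples serves as the stable reading and that drops below a small noise threshold are ignored; the patent does not specify the algorithms, so this is only one plausible form:

```python
from statistics import median

def stable_weight(samples):
    """Use the median of recent scale samples as the stable reading."""
    return median(samples)

def meal_weight(prev_stable, curr_stable, noise_threshold=2.0):
    """The taken weight is the drop between two stable readings; drops
    below the noise threshold (grams, illustrative) are ignored."""
    drop = prev_stable - curr_stable
    return drop if drop > noise_threshold else 0.0
```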
In some possible solutions, before step S61 "receiving weight data reported by the electronic scale of the meal taking position in real time", the method further includes the following steps:
receiving and storing the dish information sent by the server;
and displaying the dish information on a display screen.
In practical application, the dish information is sent to the server by an administrator through the center console.
In practical application, each nutrition price tag device is provided with a display screen on which the dish information is displayed, so that a meal taker can know whether the dishes at the meal taking position are suitable for them and how much can be eaten.
In some possible solutions, after "sending the meal fetching information of the meal taker to the server" in step S63, the method further includes:
receiving dining information of a diner sent by a server, wherein the dining information can comprise identity information of the diner, meal taking time, meal position identification, dish information and meal taking weight;
and displaying the dining information on a display screen.
In practice, the meal information may also include meal nutrient intake data.
The dining information of the meal taker is displayed on the display screen, so that the meal taker can know the meal taking condition and the nutrition intake of the meal taker at the meal taking position in time.
According to the diet nutrition tracking method of this embodiment, the nutrition price tag device is bound to the meal taking position and can receive the weight data reported by the electronic scale of that position, thereby obtaining the meal taking weight of the meal taker, and it sends the meal position identifier, the meal taking weight, and the dish information to the server. The server, combining the meal position identifier with the identity recognition data reported by the identity recognition device, identifies the meal taker and obtains and records the meal taker's dining information. The meal taker is not required to perform any operation in the whole process, which facilitates long-term tracking of the meal taker's dietary nutrition intake and helps the meal taker maintain a healthy diet.
Example 8:
on the basis of the foregoing embodiments, the present application provides a nutrition price tag device, which is bound to a meal taking place of a meal nutrition tracking system, and comprises a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the method steps as set forth in embodiment 7.
Example 9:
on the basis of the foregoing embodiments, the present application provides a diet nutrition tracking system, which includes the console 20, the server 30 as described above, and at least one meal line 12, referring to fig. 6 to 8.
In some possible scenarios, the server 30 is configured to perform the method steps as described in embodiment 5, which will not be described again here.
In some possible implementations, each meal line 12 includes a plurality of meal taking positions 13, and each meal taking position 13 includes an electronic scale 131, an identity recognition device 133, and a nutrition price tag device 132 as described in embodiment 8.
In practical application, the number of meal lines 12 and meal taking positions 13 can be increased or decreased according to actual needs. For example, the number of meal taking positions 13 on each meal line 12 can be set to 32, or alternatively to 24, and so on.
Optionally, referring to fig. 7, each dining line 12 may further include a dining line management terminal 14 and an inquiry terminal 16:
the meal line management terminal 14 is used for managing the equipment of the meal line 12.
The query terminal 16 is arranged after the last meal taking position and is used to query the server 30 for, and display, the meal taker's daily dietary nutrition intake data, nutritional health evaluation data, and/or dietary suggestions.
In practical application, each meal taking place 13 has its own meal place identifier for distinguishing from other meal taking places, and the nutrition price tag device 132 and the identity recognition device 133 of each meal place 13 are bound with the corresponding meal taking place by setting the meal place identifier.
In some possible implementations, the nutrition price tag device 132 is used to perform the method steps as described in embodiment 7, which will not be described again here.
In some possible solutions, the identification device 133 is used to identify the identification data of the diner of the meal location 13, and send the identification time, the meal location identification and the identification data to the server as identification information.
In practical application, the identity recognition device 133 can directly recognize the identity of the meal taker, obtain the meal taker's identity information (such as a name or customer number), write it into the identity recognition data, and send the data to the server 30; or it can acquire only data for determining the meal taker's identity (such as a photo, a video, a collected fingerprint, or the card number of a swiped card), write the acquired data into the identity recognition data, and send it to the server 30, which then analyzes the identity of the meal taker from the identity recognition data.
In practical applications, the identity recognition device 133 may be a camera, a card reader, a fingerprint recognizer, or the like. When the identity recognition device 133 is a camera, the following configuration may be adopted:
Recognition requirements: distance recognition / centering / good quality; the latest 60 s of data can be stored;
Recognition distance: focal length 2.8 mm, distance 2-3 m;
Low-end configuration: 20-channel polling, 10-20 frames per second, serial processing, i5 CPU with a 1080 Ti graphics card;
High-end configuration: 2080 Ti graphics card, 40 frames per second, 1080P/2M;
Algorithm limit: 50 s.
In practical applications, each identity recognition device 133 can report its identification information to the primary station 41 through the switch 40, and the primary station 41 sends the identification information to the server 30.
In practical applications, the switch 40 and the primary station 41, and the primary station 41 and the server 30 can communicate via TCP/IP protocol.
In some possible embodiments, a dish basin 150 for holding dishes is placed on the electronic scale 131 to monitor the weight change of the dishes and report weight data related to the dishes to the nutrition price tag device.
In practical applications, the electronic scale 131 may be a thermal electronic scale. The electronic scale 131 and the nutrition price tag device 132 are communicated by an RS232 interface.
In some possible embodiments, the console 20 is used to send the dinning information of the current day to the server 30.
In practice, the console 20 and the server 30 may communicate via TCP/IP protocol.
In one feasible scheme, when a meal taker arrives at a meal taking position and enters the recognition range of the camera, the camera recognizes the meal taker's identity to obtain identity recognition data, writes the obtained identity recognition data, the bound meal position identifier, and the recognition time into identification information, and sends the identification information to the server. When the electronic scale detects that the weight of the dish basin changes, it sends the changed weight data to the nutrition price tag device. The nutrition price tag device processes the received weight data, calculates the weight reduction, writes the weight reduction data, the dish information, the meal position identifier, and the meal taking time into meal taking information, and sends it to the server. After receiving the meal taking information, the server determines the meal taking position bound to the nutrition price tag device according to the meal position identifier in the meal taking information, searches the received identification information for the identity recognition data of the meal taker at that position within the nearest 5 s to 10 s, and obtains and stores the dining information of the meal taker, which includes the identity information, meal taking time, dish information, and meal taking weight of the meal taker; the dining information is then sent to the nutrition price tag device for display.
The meal nutrition tracking system of this embodiment accurately locates the meal taker and the dish information. The server receives the identification information sent by the identity recognition device of each meal taking position and the meal taking information sent by the nutrition price tag device, and obtains and stores the dining information of the meal taker; it can calculate nutritional health evaluation data and/or make dietary suggestions for the meal taker according to the meal taker's historical dining information and pre-stored health information. The meal taker does not need to perform any operation in the whole process, and personalized dietary suggestions can be made for each meal taker's physical differences on the basis of long-term tracking and analysis of dietary nutrition intake data.
Example 10:
on the basis of the foregoing embodiments, the present application provides a readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method steps of identity binding as described in embodiments 1 to 3.
In the description of the present application, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise. In the description of the present application, the term "and/or" merely describes an association relationship between associated objects and means that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone.
In this application, unless expressly stated or limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can include, for example, fixed connections, removable connections, or integral parts; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium; either as communication within the two elements or as an interactive relationship of the two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
In this application, unless expressly stated or limited otherwise, a first feature is "on" or "under" a second feature, and the first and second features may be in direct contact, or the first and second features may be in indirect contact via intermediate media. Also, a first feature "on," "above," and "over" a second feature may be directly or obliquely above the second feature, or simply mean that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature may be directly under or obliquely under the second feature, or may simply mean that the first feature is at a lower level than the second feature.
In the description of the present specification, the description of "one embodiment", "some embodiments", "examples", "specific examples" or "some examples", etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
While embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (9)

1. An identity binding method, applicable to an identity recognition device that uses real-time video recognition, the identity recognition device being bound to a meal position in a diet nutrition tracking system, characterized in that the method comprises the following steps:
step S11, when the interval between the recognition times of the face information of a current meal taker A and a next meal taker B is smaller than a preset time interval, judging whether the meal taker A and the meal taker B are both within an effective meal-taking range; if so, executing step S12, otherwise executing step S15;
step S12, judging whether the meal taker A and the meal taker B both move toward the middle position of the screen and the meal taker A is closer to the middle position of the screen than the meal taker B; if so, binding the identity information of the meal taker A with the meal position, otherwise executing step S13;
step S13, judging whether the meal taker A and the meal taker B both move away from the middle position of the screen and the meal taker B is closer to the middle position of the screen than the meal taker A; if so, binding the identity information of the meal taker B with the meal position, otherwise executing step S14;
step S14, judging whether the positional relationship between the meal taker A and the meal taker B satisfies the condition that the meal taker A moves away from the middle position of the screen, the meal taker B moves toward the middle position of the screen, and the meal taker A is closer to the middle position of the screen than the meal taker B; if so, binding the identity information of the meal taker A with the meal position, otherwise binding the identity information of the meal taker B with the meal position;
step S15, binding the identity information of the meal taker located within the effective meal-taking range with the meal position;
after binding the identity information of the meal taker A or the meal taker B with the meal position, the method further comprises:
step S20, comparing the currently recognized face information with the feature values in a face database; if the comparison fails, executing step S21, otherwise executing step S22;
step S21, judging whether the feature value of the currently recognized face information is empty; if so, executing step S211, otherwise executing step S212;
step S211, judging whether cached data of the face information exists; if so, executing step S23, otherwise executing step S24;
step S212, judging whether the current meal taker is located at the center position; if so, executing step S23, otherwise executing step S24;
step S22, judging whether the currently recognized face information is new face information; if so, executing step S221, otherwise executing step S25;
step S221, judging whether the previously bound meal taker is located at the center position of the screen; if so, executing step S23, otherwise executing step S11, wherein the previously bound meal taker is the bound meal taker A or the bound meal taker B;
step S23, determining the currently recognized face information as the face information of the previously bound meal taker;
step S24, starting a new recognition process;
step S25, continuing to maintain the identity of the current meal taker.
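The position-based binding decision of steps S11-S15 (together with the face-width test of claim 2) can be sketched in code. The following Python is an illustrative sketch only: the threshold values, the one-dimensional position model, and every function and field name are assumptions made for illustration, not taken from the patent.

```python
# Illustrative sketch of steps S11-S15 and the claim-2 range test.
# Threshold values, the 1-D position model, and all names are
# assumptions for illustration, not taken from the patent.

PRESET_INTERVAL = 2.0   # preset time interval between recognitions (assumed)
FACE_WIDTH_MIN = 80     # preset face-width threshold for F (assumed)

def in_valid_range(face_width):
    """Claim 2: a meal taker is within the effective meal-taking range
    when the recognized face width F exceeds a preset value."""
    return face_width > FACE_WIDTH_MIN

def moving_toward(track, center):
    """True if the latest movement reduced the distance to screen center."""
    return abs(track[-1] - center) < abs(track[-2] - center)

def choose_taker(a, b, center, dt):
    """Return 'A' or 'B': whose identity is bound to the meal position.

    `a` and `b` carry a `face_width` and a `track` of recent horizontal
    positions; `dt` is the interval between the two recognition times.
    """
    if dt >= PRESET_INTERVAL:
        return 'A'  # outside the window only the current taker is considered
    # S11: are both takers within the effective meal-taking range?
    if not (in_valid_range(a['face_width']) and in_valid_range(b['face_width'])):
        # S15: bind whichever taker is inside the valid range
        return 'A' if in_valid_range(a['face_width']) else 'B'
    a_toward = moving_toward(a['track'], center)
    b_toward = moving_toward(b['track'], center)
    a_closer = abs(a['track'][-1] - center) < abs(b['track'][-1] - center)
    if a_toward and b_toward and a_closer:              # S12
        return 'A'
    if not a_toward and not b_toward and not a_closer:  # S13
        return 'B'
    if not a_toward and b_toward and a_closer:          # S14
        return 'A'
    return 'B'                                          # S14, otherwise
```

For example, when both takers approach the screen center and A is nearer, A is bound (step S12); when both move away and B is nearer, B is bound (step S13).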
2. The identity binding method of claim 1, wherein the judging whether the meal taker A and the meal taker B are both within the effective meal-taking range comprises:
judging whether the values of F(A) and F(B) are both larger than a preset value; if so, determining that the meal taker A and the meal taker B are both within the effective meal-taking range, wherein F is the face width information.
3. The identity binding method of claim 1, wherein before the step S11, the method further comprises:
step S10, judging whether more than one piece of face information is recognized within the preset time interval; if so, executing step S11, otherwise binding the identity information of the current meal taker with the meal position.
4. The identity binding method of claim 3, wherein before the step S11, the method further comprises:
step S31, judging whether a preemptive meal-taking event occurs at the ith meal taking; if so, determining X(i) > 0 and executing step S32, otherwise executing step S11, wherein X(i) is the meal-taking amount of the ith meal taking;
step S32, when T(i) < T, judging whether the value of X(i+1) is zero; if so, executing step S321, otherwise executing step S33, wherein T(i) is the time elapsed after the ith meal taking occurs, and T is the time limit for recognizing the face information of a registered user;
step S321, judging whether the identity of the current meal taker is recognized; if so, executing step S322, otherwise continuing to execute step S32;
step S322, uploading the meal-taking information of the ith meal taking;
step S33, when T ≤ T(i) < T + T1, judging whether the value of X(i+1) is zero; if so, executing step S331, otherwise executing step S34, wherein T1 is the additional time allowed for recognizing a stranger compared with a registered user;
step S331, judging whether the identity of the current meal taker is recognized; if so, executing step S322, otherwise continuing to execute step S32;
step S34, when T(i) ≥ T + T1, judging whether the value of X(i+1) is zero; if so, confirming that the data of the ith meal taking is unowned data and uploading the unowned data, otherwise executing step S35;
step S35, judging whether a person took a meal within time T before the ith meal taking occurred; if so, determining that the current meal taker is the previous meal taker and uploading the meal-taking information of the ith meal taking, otherwise determining that the data of the ith meal taking is unowned data and uploading the meal-taking information of the ith meal taking.
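The timing windows of steps S31-S35 can be sketched as a single decision function. This is a simplified, illustrative sketch: the values of T and T1, the flattening of the polling loop into one call, and all parameter names are assumptions, not part of the claimed method.

```python
# Illustrative sketch of the timing windows in steps S31-S35, folding in
# the claim-5 identity resolution. T, T1, and all names are assumed.

T = 5.0    # time limit for recognizing a registered user (assumed)
T1 = 3.0   # additional time allowed for recognizing a stranger (assumed)

def classify_meal_event(elapsed, next_amount, registered_seen,
                        stranger_seen, prior_meal_within_T):
    """Attribute the ith meal taking.

    elapsed:             T(i), time since the ith meal taking occurred
    next_amount:         X(i+1); zero means no (i+1)th meal taking yet
    registered_seen:     a registered user was recognized in the window
    stranger_seen:       a stranger was recognized in the window
    prior_meal_within_T: someone took a meal within T before the ith one
    Returns 'registered', 'stranger', 'previous', 'unowned', or 'wait'.
    """
    if next_amount != 0:
        # S35: the next meal taking already happened
        return 'previous' if prior_meal_within_T else 'unowned'
    if elapsed < T:                                   # S32 / S321
        if registered_seen:
            return 'registered'
        if stranger_seen:
            return 'stranger'
        if prior_meal_within_T:                       # claim-5 fallback
            return 'previous'
        return 'wait'
    if elapsed < T + T1:                              # S33 / S331
        if registered_seen:
            return 'registered'
        if stranger_seen:
            return 'stranger'
        return 'wait'
    return 'unowned'                                  # S34: limit exceeded
```

A caller would poll this function as time advances, uploading the record as soon as anything other than `'wait'` is returned.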
5. The identity binding method of claim 4, wherein:
the judging in step S321 whether the identity of the current meal taker is recognized comprises: judging whether a registered user or a stranger is recognized; if so, determining that the current meal taker is the registered user or the stranger; if not, judging whether a person took a meal within time T before the ith meal taking, and if so, determining that the current meal taker is the previous meal taker; otherwise, determining that the identity of the current meal taker is not recognized;
the judging in step S331 whether the identity of the current meal taker is recognized comprises: judging whether a registered user or a stranger is recognized; if so, determining that the current meal taker is the registered user or the stranger; otherwise, determining that the identity of the current meal taker is not recognized.
6. The identity binding method of claim 5, further comprising: setting the value of M(i) to a null value and Y = 0, wherein M(i) is a taker code, Y is a state value, and Y = 0 is the initial value;
after step S32 or step S33, if the current meal taker is determined to be a registered user, the uploading of the meal-taking information of the ith meal taking further comprises: obtaining the taker code of the current meal taker, writing it into M(i), and uploading X(i), M(i), and Y = 1;
after step S32, if the current meal taker is determined to be the previous meal taker, the uploading of the meal-taking information of the ith meal taking further comprises: acquiring the taker code M(i-1) of the (i-1)th meal taking, and uploading X(i), M(i-1), and Y = 1;
after the current meal taker is determined to be a stranger, the uploading of the meal-taking information of the ith meal taking further comprises: letting M(i) = M(stranger), and uploading X(i), M(stranger), and Y = 2;
after the data of the ith meal taking is determined to be unowned data, the uploading of the meal-taking information of the ith meal taking further comprises: letting M(i) = null, and uploading X(i), M(i), and Y = 3;
wherein Y = 1 indicates successful recognition, Y = 2 indicates recognition as a stranger, and Y = 3 indicates that no person was recognized.
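The upload record of claim 6 pairs the meal amount X(i), the taker code M(i), and the state value Y. A minimal sketch follows, assuming a dictionary payload and illustrative outcome labels; none of the field names or the stranger-code value are from the patent.

```python
# Illustrative sketch of the claim-6 upload record: X(i) is the meal
# amount, M(i) the taker code, and Y the state value (0 initial,
# 1 recognized, 2 stranger, 3 nobody recognized). Names are assumed.

M_STRANGER = "STRANGER"   # the shared code M(stranger) (assumed value)

def build_upload(amount, outcome, taker_code=None, prev_code=None):
    """Assemble the record uploaded for the ith meal taking."""
    if outcome == 'registered':
        return {'X': amount, 'M': taker_code, 'Y': 1}  # own code, Y = 1
    if outcome == 'previous':
        return {'X': amount, 'M': prev_code, 'Y': 1}   # reuse M(i-1)
    if outcome == 'stranger':
        return {'X': amount, 'M': M_STRANGER, 'Y': 2}
    return {'X': amount, 'M': None, 'Y': 3}            # unowned data
```

Keeping Y separate from M lets the backend distinguish "attributed to a stranger" from "never recognized" even though both records lack a registered taker code.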
7. An identity recognition device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the identity binding method of any one of claims 1 to 6.
8. A diet nutrition tracking system, characterized in that it comprises the identity recognition device according to claim 7.
9. A readable storage medium, having stored thereon a computer program which, when being executed by a processor, carries out the steps of the identity binding method according to any one of claims 1 to 6.
CN202210620627.1A 2022-06-02 2022-06-02 Identity binding method and device, storage medium and diet nutrition tracking system Active CN114694800B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210620627.1A CN114694800B (en) 2022-06-02 2022-06-02 Identity binding method and device, storage medium and diet nutrition tracking system


Publications (2)

Publication Number Publication Date
CN114694800A CN114694800A (en) 2022-07-01
CN114694800B true CN114694800B (en) 2022-09-06

Family

ID=82131055

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210620627.1A Active CN114694800B (en) 2022-06-02 2022-06-02 Identity binding method and device, storage medium and diet nutrition tracking system

Country Status (1)

Country Link
CN (1) CN114694800B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109919672A (en) * 2019-02-28 2019-06-21 深圳前海微众银行股份有限公司 Intelligent method of ordering, system, terminal of ordering, canteen maker and storage medium
CN110287867A (en) * 2019-06-24 2019-09-27 广州织点智能科技有限公司 Unmanned convenience store enters recognition methods, device, equipment and storage medium
CN112017756A (en) * 2020-09-07 2020-12-01 神思电子技术股份有限公司 Dietary nutrition analysis method based on face recognition self-service meal-making system
CN112164171A * 2020-09-28 2021-01-01 黄石钧工智能科技有限公司 On-demand meal-taking system based on face recognition
CN213582413U * 2020-11-26 2021-06-29 重庆电子信息中小企业公共服务有限公司 Canteen automatic-charging dining system based on face recognition
CN114282568A * 2021-04-01 2022-04-05 深圳市粤能环保科技有限公司 Non-perceptible face information registration method for intelligent recycling bin

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1209609A4 (en) * 2000-03-15 2002-08-07 Matsushita Electric Works Ltd Food advising system for diet-restricted person
US20160210621A1 (en) * 2014-12-03 2016-07-21 Sal Khan Verifiable credentials and methods thereof




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant