CN113034312B - Dining information pushing method, device, equipment and storage medium - Google Patents

Dining information pushing method, device, equipment and storage medium Download PDF

Info

Publication number
CN113034312B
Authority
CN
China
Prior art keywords
user
information
meal
dining
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911345190.XA
Other languages
Chinese (zh)
Other versions
CN113034312A (en)
Inventor
张磊
冯劲苗
张晓颖
于晋瑄
冀兰菲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
China Mobile Group Tianjin Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile Group Tianjin Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd, China Mobile Group Tianjin Co Ltd filed Critical China Mobile Communications Group Co Ltd
Priority to CN201911345190.XA
Publication of CN113034312A
Application granted
Publication of CN113034312B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10: Services
    • G06Q50/12: Hotels or restaurants
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20: Movements or behaviour, e.g. gesture recognition

Abstract

The embodiment of the invention discloses a dining information pushing method, device, equipment and storage medium, which are used for solving the problem that dining information cannot be pushed to users in a personalized manner in the prior art. The method comprises the following steps: when a user is detected entering a target area, determining the human body posture features of the user in the target area, the human body posture features including a torso angle and a spatial angle; judging whether the user has a meal taking action according to the human body posture features; if yes, determining the meal taking behavior information of the user, the meal taking behavior information comprising at least one of the meal type corresponding to the meal taking action and the heat of the first meal; and pushing dining information matched with the meal taking behavior information for the user according to the meal taking behavior information. According to the technical scheme, personalized dining information can be pushed to the user in a targeted manner, so that the user can know the dining information matched with his or her meal taking behavior without mental calculation, intelligent pushing of dining information is realized, and the user's dining experience is improved.

Description

Dining information pushing method, device, equipment and storage medium
Technical Field
The present invention relates to the field of data service technologies, and in particular, to a method, an apparatus, a device, and a storage medium for pushing dining information.
Background
At present, face recognition payment terminals adopt advanced face recognition technology for identity verification, analyzing facial features as the basis of identity recognition to provide accurate personnel access permission or consumption records. However, this technique is only applicable to the identity-verification and payment link. People today are generally concerned about their diet and hope that good dietary habits will promote their health, but not everyone spends time planning their own diet. Therefore, providing a method for pushing dining information is a problem to be solved at present.
Disclosure of Invention
The embodiment of the invention provides a dining information pushing method, device, equipment and storage medium, which are used for solving the problem that dining information cannot be pushed to users in a personalized manner in the prior art.
In order to solve the technical problems, the embodiment of the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a method for pushing dining information, including:
when the user is detected to enter a target area, determining the human body posture characteristics of the user in the target area; the human body posture features include torso angles and spatial angles;
judging whether the user has a meal taking action according to the human body posture characteristics;
If yes, determining the dining behavior information of the user; the meal taking behavior information comprises at least one of meal type and first meal heat corresponding to the meal taking behavior;
pushing dining information matched with the dining behavior information for the user according to the dining behavior information.
In a second aspect, an embodiment of the present invention further provides a dining information pushing device, including:
the first determining module is used for determining the human body posture characteristics of the user in the target area when the user is detected to enter the target area; the human body posture features include torso angles and spatial angles;
the judging module is used for judging whether the user has a meal taking action or not according to the human body posture characteristics;
the second determining module is used for determining the meal taking behavior information of the user if yes; the meal taking behavior information comprises at least one of meal type and first meal heat corresponding to the meal taking behavior;
and the pushing module is used for pushing the dining information matched with the dining behavior information for the user according to the dining behavior information.
In a third aspect, an embodiment of the present invention further provides a dining information pushing device, including:
A memory storing computer program instructions;
a processor, when the computer program instructions are executed by the processor, implementing the dining information pushing method according to the first aspect.
In a fourth aspect, an embodiment of the present invention further provides a computer readable storage medium, where the computer readable storage medium includes instructions, which when executed on a computer, cause the computer to perform the dining information pushing method according to the first aspect.
In the embodiment of the invention, when a user is detected entering the target area, the torso angle and spatial angle of the user in the target area are determined, whether the user has a meal taking action is judged according to that torso angle and spatial angle, and when it is determined that the user has a meal taking action, the meal taking behavior information of the user is determined and dining information matched with the meal taking behavior information is pushed for the user. The technical scheme can therefore combine the user's meal taking behavior information to push personalized dining information to the user in a targeted manner, so that the user can know the dining information matched with his or her meal taking behavior without mental calculation; intelligent pushing of dining information is realized and the user's dining experience is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments described in the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a dining information pushing method in an embodiment of the invention.
Fig. 2 is a schematic view of the range of a camera target area in an embodiment of the invention.
FIG. 3 is a schematic representation of 3D skeleton coordinates in one embodiment of the invention.
Fig. 4 is a schematic flow chart of a dining information pushing method according to another embodiment of the present invention.
Fig. 5 is a schematic structural view of a dining information pushing device according to an embodiment of the present invention.
Fig. 6 is a schematic structural view of a dining information pushing device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Fig. 1 is a schematic flow chart of a dining information pushing method in an embodiment of the invention. The method of fig. 1 may include:
s102, when the user is detected to enter the target area, the human body posture characteristics of the user in the target area are determined.
Wherein the human body posture features include torso angle and spatial angle.
Optionally, the human body posture of the user can be detected through a camera, with the target area being the effective monitoring area. When a user walks in the effective monitoring area, the camera can track the human skeleton by collecting gait samples, collect the skeletal joint point information of each frame of image for each walker, obtain the corresponding depth images, and store the obtained skeletal joint point information together with its corresponding depth images to form a gait database, so that the data in the gait database can be called to calculate the user's meal taking behavior information when dining information later needs to be pushed to the user.
For example, when detecting the human body posture of the user, the camera is placed horizontally; as shown in fig. 2, an isosceles triangle with a height of 3.8 meters and a base length of 4.13 meters forms the target area.
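As an illustrative sketch only (the patent publishes no code), the following Python snippet tests whether a detected ground-plane position lies inside such a triangular target area. The 3.8 m height and 4.13 m base come from the example above; the coordinate convention (camera at the apex/origin, looking along the +y axis) is an assumption.

```python
# Minimal sketch: test whether a ground-plane point lies inside the target area.
# Assumption: the camera sits at the triangle's apex (the origin) and the
# 4.13 m base lies 3.8 m in front of it.

HEIGHT = 3.8   # triangle height in meters (camera to base)
BASE = 4.13    # base length in meters

def in_target_area(x: float, y: float) -> bool:
    """Return True if the ground-plane point (x, y) is inside the target area."""
    if not 0.0 <= y <= HEIGHT:
        return False
    # The half-width grows linearly from 0 at the apex to BASE/2 at the base.
    half_width = (BASE / 2.0) * (y / HEIGHT)
    return abs(x) <= half_width

print(in_target_area(0.5, 2.0))   # True: well inside the wedge
print(in_target_area(2.0, 1.0))   # False: outside the camera's field of view
```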
In this embodiment, when a user is detected entering the target area, the user's skeletal joint point information and depth images may simply be collected and stored in the gait database, with the data in the gait database extracted for calculation according to a preset calculation period; alternatively, real-time calculation can be performed directly on the acquired skeletal joint point information and depth images.
S104, judging whether the user has a meal taking action according to the human body posture characteristics.
And S106, if the fact that the user has the meal taking action is determined, determining the meal taking action information of the user.
The meal taking behavior information comprises meal types, first meal heat and the like corresponding to the meal taking behavior.
S108, pushing the dining information matched with the dining behavior information for the user according to the dining behavior information.
In the embodiment of the invention, when a user is detected entering the target area, the torso angle and spatial angle of the user in the target area are determined, whether the user has a meal taking action is judged according to that torso angle and spatial angle, and when it is determined that the user has a meal taking action, the meal taking behavior information of the user is determined and dining information matched with the meal taking behavior information is pushed for the user. The technical scheme can therefore combine the user's meal taking behavior information to push personalized dining information to the user in a targeted manner, so that the user can know the dining information matched with his or her meal taking behavior without mental calculation; intelligent pushing of dining information is realized and the user's dining experience is improved.
In one embodiment, when determining the human body posture characteristics of the user in the target area, the skeletal joint point information of the user can be determined according to the pre-acquired depth image of the user, and the trunk angle and the space angle of the user can be determined according to the skeletal joint point information of the user.
Wherein the torso angle includes a first angle between the arms and the torso, a second angle between the hips and the upper body, and the like. The spatial angle may be the frontal orientation of the user in space.
Optionally, human skeleton data can be captured by applying skeleton tracking, image discrimination, motion recognition and similar techniques to the user through a 3D virtual technology device: color and depth data streams are acquired simultaneously through the device's three cameras, so that grayscale and color image data are used synchronously, the three-dimensional depth information of the human body is acquired to obtain a depth image of the human body, and the skeletal joint point information is then determined.
After determining the depth image and skeletal joint point information of the user, the torso angle and frontal orientation in space of the user may be determined by:
Optionally, according to the collected skeletal joint point information of the user, the coordinates of the user's skeletal joint points can be determined and connected to obtain a 3D skeleton map (as shown in fig. 3; the coordinates of the skeletal joint points are not marked in the figure, only the connected skeleton is shown). The vectors between the skeletal joint points in the 3D skeleton map are determined and drawn as vector lines, and the first angle between the arm and the torso and the second angle between the hip and the upper body are determined through the included angles between the inter-joint vector lines and the gravity vector line.
For example, when a typical user stands upright, the vector lines between the joints of the arm and the vector lines of the joints between the head and the center of the hip joint are all perpendicular to the ground (i.e. parallel to the gravity vector line); when the user performs an action (such as taking a meal or walking), the vector lines between the joints bend and form included angles with the gravity vector line. Therefore, the first angle between the arm and the torso can be determined through the included angle between the arm's inter-joint vector lines and the gravity vector line, and the second angle between the hip and the upper body can be determined through the included angle between the gravity vector line and the vector lines of the joints between the head and the center of the hip joint.
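For illustration (no code appears in the patent), the included angle between an inter-joint vector and the gravity vector can be computed from 3D joint coordinates as below; the joint names and the coordinate convention are assumptions:

```python
import numpy as np

GRAVITY = np.array([0.0, -1.0, 0.0])  # assumed convention: the y axis points up

def segment_angle_to_gravity(joint_a: np.ndarray, joint_b: np.ndarray) -> float:
    """Angle in degrees between the vector joint_a -> joint_b and gravity."""
    v = joint_b - joint_a
    cos_theta = np.dot(v, GRAVITY) / (np.linalg.norm(v) * np.linalg.norm(GRAVITY))
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# An arm hanging straight down is parallel to gravity (angle ~ 0 degrees), so
# the first angle between arm and torso is ~ 0 when standing upright; raising
# the arm to take a meal bends the segment away from gravity and the angle grows.
shoulder = np.array([0.0, 1.4, 2.0])
elbow    = np.array([0.0, 1.1, 2.0])
print(segment_angle_to_gravity(shoulder, elbow))  # ~ 0.0 for a hanging arm
```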
Alternatively, a camera may be utilized to collect a depth image of the user in the target area. Let any pixel in the depth image be x, and let d1(x) be the depth value (gray value) at point x. The set D is a direction set, namely the set of plane octant angles. Kα = (K1, K2) denotes an offset vector starting from the origin at an angle α to the horizontal (the explicit expressions for D and Kα are given as formula images in the original). In formula (1), when α = (2m+1)·π/2, m ∈ Z, K1 = 0 and K2 takes a constant value; when α = 2m·π/2, m ∈ Z, K2 = 0 and K1 takes a constant value; in other cases both K1 and K2 take constant values. In addition, a vector pair θ = (K_U, K_V), U, V ∈ D, is formed from any two offset vectors, 25 pairs in total. The local gradient feature for θ is calculated as fθ(x) = d1(x + K_U) - d1(x + K_V), where fθ(x) reflects the gradient information around pixel x and thus characterizes pixel x. For the same object, the local gradient feature is invariant to spatial position, i.e. the feature value of a point on the object's surface is unchanged when the object translates freely in the scene, so the feature can well distinguish objects with uneven surfaces, and the frontal orientation of the user in space can thus be accurately determined.
In this embodiment, the skeletal joint point information of the user can be determined from the pre-acquired depth image of the user, and the torso angle and spatial angle of the user can be determined from that skeletal joint point information, providing a data basis for subsequently judging whether the user has a meal taking action. Through the dual determination of the torso angle and the spatial angle, the torso angle can be determined accurately for the user's frontal orientation in space, i.e. the influence of the user's body turning while walking on the determination of the posture features is avoided.
In one embodiment, when judging whether the user has a meal taking action according to the human body posture characteristics, if the first angle between the arm and the trunk is larger than or equal to a first preset angle and the second angle between the hip and the upper body is larger than or equal to a second preset angle, determining that the user has the meal taking action.
Optionally, the first preset angle may be preset to 45 degrees and the second preset angle may be preset to 25 degrees according to the actual situation (such as the meal taking actions of most users).
In this embodiment, a meal taking action can be determined from the first angle between the user's arm and torso being greater than or equal to the first preset angle and the second angle between the hip and upper body being greater than or equal to the second preset angle, so the determination is simple, accords with actual meal taking actions, and avoids misjudgment.
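Combining the two thresholds, a minimal sketch of the judgment step; the 45° and 25° defaults come from the example above, and the function name is ours:

```python
FIRST_PRESET_ANGLE = 45.0   # arm-torso threshold in degrees (example default)
SECOND_PRESET_ANGLE = 25.0  # hip-upper-body threshold in degrees (example default)

def has_meal_taking_action(first_angle: float, second_angle: float) -> bool:
    """Judge a meal taking action from the two torso angles."""
    return (first_angle >= FIRST_PRESET_ANGLE
            and second_angle >= SECOND_PRESET_ANGLE)

print(has_meal_taking_action(60.0, 30.0))  # True: arm raised, upper body leaning
print(has_meal_taking_action(60.0, 10.0))  # False: arm raised but torso upright
```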
After determining that the user has the meal taking action, the meal taking action information of the user needs to be determined, so that the dining information matched with the meal taking action information is pushed to the user. The meal taking behavior information comprises meal type, first meal heat and the like corresponding to the meal taking behavior.
In one embodiment, the meal retrieval action information includes a meal type corresponding to the meal retrieval action. The type of meal corresponding to the user's meal taking action may be determined by at least one of:
the first mode is to acquire first image information of the meal corresponding to the meal taking action, and determine the type of the meal corresponding to the meal taking action according to the first image information.
Optionally, the meal types corresponding to the image information of each meal can be stored in advance; after the first image information of the meal corresponding to the meal taking action is obtained, it is matched against the pre-stored meal image information, and the meal type corresponding to the matched image information is determined as the meal type corresponding to the meal taking action.
Optionally, the image recognition technology can be used for recognizing the obtained first image information of the meal corresponding to the meal taking action, and determining the type of the meal corresponding to the meal taking action according to the recognition result.
In the second mode, a meal taking position corresponding to the meal taking action is determined according to the human body posture features, and the meal type corresponding to the meal taking action is determined according to the preset correspondence between meal taking positions and meal types.
Optionally, the meal types corresponding to the meal taking positions can be stored in advance, and when the user takes a meal, the meal taking positions corresponding to the meal taking behaviors of the user are determined through the human body posture characteristics, so that the meal types are determined.
In one embodiment, the meal retrieval action information includes a meal type and a first meal calories corresponding to the meal retrieval action. The first meal heat corresponding to the meal taking action can be determined according to the preset corresponding relation between each meal type and the meal heat.
In this embodiment, information such as the meal type and the heat of the first meal corresponding to the user's meal taking action can be determined, providing a data basis for pushing dining information and facilitating the determination of the dining information to be pushed.
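As a sketch of both preset correspondences, position to meal type and meal type to meal heat, with hypothetical table contents (real deployments would configure these per canteen):

```python
# Hypothetical correspondence tables: serving-window positions, menu, and
# per-serving calories are placeholders, not values from the patent.
POSITION_TO_MEAL = {"window_1": "stewed meat", "window_2": "vegetables", "window_3": "eggs"}
MEAL_TO_CALORIES = {"stewed meat": 366, "vegetables": 100, "eggs": 100}  # kcal per serving

def meal_type_from_position(position: str) -> str:
    return POSITION_TO_MEAL[position]

def first_meal_calories(meal_types: list) -> int:
    """Heat of the first meal: sum of the calories of each taken meal type."""
    return sum(MEAL_TO_CALORIES[m] for m in meal_types)

taken = [meal_type_from_position(p) for p in ("window_1", "window_2")]
print(taken, first_meal_calories(taken))  # ['stewed meat', 'vegetables'] 466
```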
In one embodiment, when pushing dining information matched with the meal taking behavior information for a user, the human body parameter information of the user can first be determined from the pre-acquired depth image of the user; next, the reference caloric information the user needs to ingest every day is determined from the human body parameter information; then, the heat of the second meal to be ingested in the user's preset time period is determined from the heat of the first meal corresponding to the meal taking action and the daily reference caloric information, and dining information for the preset time period is pushed to the user according to the heat of the second meal.
The human body parameter information includes height, weight, gender, etc. The actual width corresponding to the pixel width of each depth image frame can be calculated using the ratio between the depth image data and the actual data, and the accurate actual height of the human body can then be calculated from that actual width; similarly, the accurate actual body width can be calculated in the same way. The actual body weight can be estimated from the actual width and actual height, and the user's gender can be accurately identified from the differing human body structures in the depth image. The preset time period may be a breakfast period, a lunch period, a dinner period, etc., and which period(s) of dining information to push can be preset by the user.
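The patent does not spell out the ratio between depth image data and actual data; one plausible reading is a pinhole-camera relation in which metric extent = pixel extent × depth / focal length. A toy sketch under that assumption (the focal length value is also an assumption):

```python
FOCAL_LENGTH_PX = 525.0  # assumed depth-camera focal length in pixels

def pixel_to_meters(pixel_extent: float, depth_m: float) -> float:
    """Convert a pixel extent observed at a given depth to meters (pinhole model)."""
    return pixel_extent * depth_m / FOCAL_LENGTH_PX

# A person spanning 280 pixels tall at 3 m depth:
print(round(pixel_to_meters(280, 3.0), 2))  # ~1.6 m actual height
```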
Optionally, according to the human body parameter information, the reference caloric information the user needs to ingest every day can be determined using the following formulas for daily caloric needs:
Women: 655 + (9.6 × weight in kg) + (1.8 × height in cm) - (4.7 × age in years);
Men: 66 + (13.7 × weight in kg) + (5 × height in cm) - (6.8 × age in years).
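These are the classic Harris-Benedict basal-metabolic-rate equations; a direct transcription as a sketch (the function name is ours):

```python
def daily_reference_calories(sex: str, weight_kg: float, height_cm: float,
                             age_years: float) -> float:
    """Daily caloric requirement per the formulas above (Harris-Benedict, kcal)."""
    if sex == "female":
        return 655 + 9.6 * weight_kg + 1.8 * height_cm - 4.7 * age_years
    return 66 + 13.7 * weight_kg + 5.0 * height_cm - 6.8 * age_years

print(daily_reference_calories("female", 50, 160, 25))  # 1305.5
```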
In this embodiment, the human body parameter information of the user can be determined from the pre-acquired depth image of the user, from which the reference caloric information the user needs to ingest every day is determined; the heat of the second meal to be ingested in the user's preset time period is then determined from the heat of the first meal and the reference caloric information, and dining information for the preset time period is pushed to the user according to the heat of the second meal. Personalized dining information is thus pushed to the user, rather than dining information calculated by other means (such as big data), making the pushed dining information a more useful reference.
In one embodiment, the user's dining preference information can be determined from the meal taking behavior information, and dining information for the preset time period can then be pushed to the user according to both the dining preference information and the determined heat of the second meal to be ingested in that period.
The dining preference information of the user can be determined according to the type of the meal corresponding to the meal taking behavior of the user.
For example, according to the meal type corresponding to a user's meal taking action, it can be determined that the meal taken by the user comprises stewed meat, vegetables and eggs, so the user's dining preference can be determined as stewed meat, vegetables and eggs; dining information for the preset time period is then pushed to the user according to this dining preference information and the determined heat of the second meal to be ingested in that period.
In this embodiment, the user's dining preference information can be determined from the meal taking behavior information, so that dining information for the preset time period is pushed to the user according to both the dining preference information and the determined heat of the second meal to be ingested in that period; the pushed dining information thus better matches the user's preferences and improves the dining experience.
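A hedged sketch of this step under one plausible reading: the second meal heat is the period's share of the daily budget minus what was already taken, and preferred dishes are then fitted into that budget. The period shares, menu, and greedy selection are assumptions for illustration:

```python
# Assumed shares of the daily caloric budget per period (midpoints of the
# ranges quoted in the worked example later in this description).
PERIOD_SHARE = {"breakfast": 0.275, "lunch": 0.35, "dinner": 0.35}

def second_meal_calories(reference_daily: float, first_meal: float, period: str) -> float:
    """Heat of the second meal: the period's budget minus what was already taken."""
    return max(0.0, reference_daily * PERIOD_SHARE[period] - first_meal)

def recommend(menu: dict, budget: float, preferences: set) -> list:
    """Greedily pick preferred dishes whose calories fit the remaining budget."""
    picked, total = [], 0
    for dish, kcal in menu.items():
        if dish in preferences and total + kcal <= budget:
            picked.append(dish)
            total += kcal
    return picked

menu = {"stewed meat": 366, "rice": 100, "tomato-egg soup": 118}
budget = second_meal_calories(1305.5, 0.0, "lunch")   # ~457 kcal for lunch
print(recommend(menu, budget, {"stewed meat", "rice"}))
# ['stewed meat']: adding rice (100 kcal) would exceed the remaining budget
```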
Fig. 4 is a schematic flow chart of a dining information pushing method according to another embodiment of the present invention.
The method of fig. 4 may include:
s401, acquiring depth images and bone joint point information of users in a target area.
The three-dimensional depth information of the human body can be acquired through the three cameras of a 3D virtual technology device to obtain the depth images of the human body. The target area may be the effective monitoring area of the 3D virtual technology device.
S402, determining a first angle between the arms and the trunk of the user, a second angle between the hip and the upper body and the front orientation of the user in the space according to the acquired depth image and skeleton joint point information of the user.
According to the acquired skeletal joint point information of the user, the coordinates of the user's skeletal joint points can be determined and connected to obtain a 3D skeleton map. The vectors between the skeletal joint points in the 3D skeleton map are determined and drawn as vector lines, and the first angle between the arm and the torso and the second angle between the hip and the upper body are determined through the included angles between the inter-joint vector lines and the gravity vector line.
When a user stands upright, the vector lines between the joints of the arm and the vector lines of the joints between the head and the center of the hip joint are perpendicular to the ground (i.e. parallel to the gravity vector line); when the user performs an action (such as taking a meal or walking), the vector lines between the joints bend and form included angles with the gravity vector line. Therefore, the first angle between the arm and the torso can be determined through the included angle between the arm's inter-joint vector lines and the gravity vector line, and the second angle between the hip and the upper body can be determined through the included angle between the gravity vector line and the vector lines of the joints between the head and the center of the hip joint.
The camera can be used to collect the depth image of the user in the target area. Let any pixel in the depth image be x, and let d1(x) be the depth value (gray value) at point x. The set D is a direction set, namely the set of plane octant angles. Kα = (K1, K2) denotes an offset vector starting from the origin at an angle α to the horizontal (the explicit expressions for D and Kα are given as formula images in the original). In formula (1), when α = (2m+1)·π/2, m ∈ Z, K1 = 0 and K2 takes a constant value; when α = 2m·π/2, m ∈ Z, K2 = 0 and K1 takes a constant value; in other cases both K1 and K2 take constant values. In addition, a vector pair θ = (K_U, K_V), U, V ∈ D, is formed from any two offset vectors, 25 pairs in total. The local gradient feature for θ is calculated as fθ(x) = d1(x + K_U) - d1(x + K_V), where fθ(x) reflects the gradient information around pixel x and thus characterizes pixel x. For the same object, the local gradient feature is invariant to spatial position, i.e. the feature value of a point on the object's surface is unchanged when the object translates freely in the scene, so the feature can well distinguish objects with uneven surfaces, and the frontal orientation of the user in space can thus be accurately determined.
S403, judging whether a first angle between the arm and the trunk of the user is larger than or equal to a first preset angle and a second angle between the hip and the upper body is larger than or equal to a second preset angle. If yes, then S404 is performed.
The first preset angle is 45 degrees and the second preset angle is 25 degrees according to actual conditions (such as meal taking actions of most users).
S404, determining that the user has a meal taking action.
S405, determining the type of the meal and the heat quantity of the first meal corresponding to the meal taking action of the user.
The meal types corresponding to the image information of each meal can be stored in advance; the first image information of the meal corresponding to the meal taking action is acquired and matched against the pre-stored meal image information, and the meal type corresponding to the matched image information is determined as the meal type corresponding to the meal taking action. The heat of the first meal corresponding to the meal taking action can be determined according to the preset correspondence between each meal type and meal heat.
S406, determining human body parameter information of the user according to the depth image.
The human body parameter information includes height, weight, gender, etc. The actual width corresponding to the pixel width of each depth image frame can be calculated using the ratio between the depth image data and the actual data, and the accurate actual height of the human body can then be calculated from that actual width; similarly, the accurate actual body width can be calculated in the same way. The actual body weight can be estimated from the actual width and actual height, and the user's gender can be accurately identified from the differing human body structures in the depth image.
S407, determining the reference caloric information required to be ingested by the user every day according to the human body parameter information.
According to the human body parameter information, the reference caloric information the user needs to ingest every day can be determined using the following formulas for daily caloric needs:
Women: 655 + (9.6 × weight in kg) + (1.8 × height in cm) - (4.7 × age in years);
Men: 66 + (13.7 × weight in kg) + (5 × height in cm) - (6.8 × age in years).
For example, for a 25-year-old female, the user's height and weight are estimated from the depth image corresponding to her skeletal joint point information; assuming a height of 160 cm and a weight of 50 kg, her daily caloric requirement is 655 + (9.6 × 50) + (1.8 × 160) - (4.7 × 25) = 1305.5 kcal.
S408, determining the heat of the second meal required to be ingested in the preset period of the user according to the heat of the first meal corresponding to the meal taking action of the user and the reference heat information required to be ingested by the user every day.
The preset time period may be a breakfast period, a lunch period, a dinner period, etc., and which period(s) of dining information to push can be preset by the user.
S409, pushing the restaurant information in the preset time period for the user according to the heat of the second meal.
Optionally, the user's dining preference information can be determined according to the meal type corresponding to the meal taking action, and dining information for the preset time period can then be pushed to the user according to the dining preference information and the determined heat of the second meal to be ingested in that period.
In addition, if the female enters the target area in the breakfast time period, a reasonable meal suggestion can be recommended to her according to the dining information uploaded that day and the intake proportions of the three meals. For example, a daily intake of 1504 kcal may be recommended as follows:
Breakfast intake accounts for about 25%-30% of the day's total calories. Recommended breakfast: 1 cup of yogurt, a slice of bread and 2 eggs. Calories: 180 + 170 + 100 = 450 kcal.
Lunch intake accounts for about 30%-40% of the day's calories. Recommended lunch: rice (100 g), stewed pork chop with potato and matsutake, and white-scalded cabbage hearts (100 g). Calories: 100 + 366 + 100 = 566 kcal.
Dinner intake accounts for about 30%-40% of the day's total calories. Recommended dinner: 10 dumplings and one tomato-egg soup. Calories: 370 + 118 = 488 kcal.
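The plan above can be sanity-checked against the quoted shares; a small sketch using only the numbers from the example:

```python
# Check the worked plan against the quoted per-meal shares of the daily total.
plan = {"breakfast": 450, "lunch": 566, "dinner": 488}  # kcal, from the example
total = sum(plan.values())                               # 1504 kcal

for meal, kcal in plan.items():
    print(f"{meal}: {kcal} kcal, {kcal / total:.0%} of the day")
# breakfast: 450 kcal, 30% of the day   (within the 25%-30% range)
# lunch:     566 kcal, 38% of the day   (within the 30%-40% range)
# dinner:    488 kcal, 32% of the day   (within the 30%-40% range)
```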
In the embodiment of the invention, when a user is detected entering the target area, the torso angle and spatial angle of the user in the target area are determined, whether the user has a meal taking action is judged according to that torso angle and spatial angle, and when it is determined that the user has a meal taking action, the meal taking behavior information of the user is determined and dining information matched with the meal taking behavior information is pushed for the user. The technical scheme can therefore combine the user's meal taking behavior information to push personalized dining information to the user in a targeted manner, so that the user can know the dining information matched with his or her meal taking behavior without mental calculation; intelligent pushing of dining information is realized and the user's dining experience is improved.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
Fig. 5 is a schematic structural view of a dining information pushing device according to an embodiment of the present invention. Referring to fig. 5, a dining information pushing device may include:
a first determining module 510, configured to determine a human body posture feature of a user in a target area when it is detected that the user enters the target area; the human body posture features include torso angles and spatial angles;
the judging module 520 is configured to judge whether the user has a meal taking action according to the posture characteristics of the human body;
a second determining module 530, configured to determine the meal taking behavior information of the user if it is determined that the meal taking behavior exists in the user; the meal taking behavior information comprises at least one item of meal type corresponding to the meal taking behavior and heat of the first meal;
the pushing module 540 is configured to push, for the user, dining information matching the dining behavior information according to the dining behavior information.
In one embodiment, the first determination module 510 includes:
the first determining unit is used for determining skeletal joint point information of the user according to the pre-acquired depth image of the user;
the second determining unit is used for determining the trunk angle and the space angle of the user according to the skeletal joint point information of the user; the torso angle includes at least one of a first angle between the arms and the torso, and a second angle between the hips and the upper body.
In one embodiment, the determining module 520 includes:
and the third determining unit is used for determining that the user has a meal taking action if the first angle is larger than or equal to the first preset angle and the second angle is larger than or equal to the second preset angle.
In one embodiment, the meal retrieval action information includes meal types corresponding to the meal retrieval action; the second determination module 530 includes:
the fourth determining unit is used for obtaining first image information of the meal corresponding to the meal taking action and determining the type of the meal corresponding to the meal taking action according to the first image information; and/or the number of the groups of groups,
a fifth determining unit, configured to determine a meal taking position corresponding to the meal taking behavior according to the human body posture feature; and determining the meal type corresponding to the meal taking action according to the corresponding relation between the preset meal taking position and the meal type.
In one embodiment, the meal retrieval action information further includes a first meal heat; the second determination module 530 further includes:
and the sixth determining unit is used for determining the first meal heat corresponding to the meal taking action according to the preset corresponding relation between each meal type and the meal heat.
In one embodiment, the pushing module 540 includes:
a seventh determining unit, configured to determine human body parameter information of the user according to a depth image of the user acquired in advance; the human body parameter information comprises at least one of height, weight and gender;
An eighth determining unit for determining reference caloric information required to be ingested by the user every day according to the human body parameter information;
a ninth determining unit, configured to determine, according to the first meal heat and the reference heat information, a second meal heat that needs to be ingested in a preset period of time by the user;
and the pushing unit is used for pushing the restaurant information in a preset period for the user according to the heat of the second meal.
In one embodiment, the pushing module 540 further includes:
a tenth determining unit, configured to determine dining preference information of the user according to the dining behavior information;
pushing the restaurant information for the user in a preset time period according to the heat of the second meal, including:
and pushing the restaurant information in a preset period for the user according to the restaurant preference information of the user and the heat of the second meal.
The dining information pushing device provided by the embodiment of the invention can realize each process realized by the dining information pushing method in the embodiment of the method, and in order to avoid repetition, the description is omitted.
In the embodiment of the invention, when a user is detected entering the target area, the torso angle and spatial angle of the user in the target area are determined, whether the user has a meal taking action is judged according to that torso angle and spatial angle, and when it is determined that the user has a meal taking action, the meal taking behavior information of the user is determined and dining information matched with the meal taking behavior information is pushed for the user. The device can therefore combine the user's meal taking behavior information to push personalized dining information to the user in a targeted manner, so that the user can know the dining information matched with his or her meal taking behavior without mental calculation; intelligent pushing of dining information is realized and the user's dining experience is improved.
Fig. 6 is a schematic structural view of a dining information pushing device according to an embodiment of the present invention.
The dining information pushing device 600 includes, but is not limited to: radio frequency unit 601, network module 602, audio output unit 603, input unit 604, sensor 605, display unit 606, user input unit 607, interface unit 608, memory 609, processor 610, and power supply 611. It will be appreciated by those skilled in the art that the configuration shown in fig. 6 does not constitute a limitation of the dining information pushing device, which may include more or fewer components than shown, combine certain components, or arrange components differently. In the embodiment of the invention, the dining information pushing device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein the processor 610 is configured to: determine the human body posture features of a user in a target area when the user is detected entering the target area, the human body posture features including a torso angle and a spatial angle; judge whether the user has a meal taking action according to the human body posture features; if yes, determine the meal taking behavior information of the user, the meal taking behavior information comprising at least one of the meal type corresponding to the meal taking action and the heat of the first meal; and push dining information matched with the meal taking behavior information for the user according to the meal taking behavior information.
In the embodiment of the invention, when a user is detected entering the target area, the torso angle and spatial angle of the user in the target area are determined, whether the user has a meal taking action is judged according to that torso angle and spatial angle, and when it is determined that the user has a meal taking action, the meal taking behavior information of the user is determined and dining information matched with the meal taking behavior information is pushed for the user. The device can therefore combine the user's meal taking behavior information to push personalized dining information to the user in a targeted manner, so that the user can know the dining information matched with his or her meal taking behavior without mental calculation; intelligent pushing of dining information is realized and the user's dining experience is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 601 may be used to receive and send information or signals during a call, specifically, receive downlink data from a base station, and then process the downlink data with the processor 610; and, the uplink data is transmitted to the base station. Typically, the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 601 may also communicate with networks and other devices through a wireless communication system.
The dining information pushing device provides wireless broadband internet access to the user through the network module 602, such as helping the user send and receive e-mail, browse web pages, access streaming media, and the like.
The audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602 or stored in the memory 609 into an audio signal and output as sound. Also, the audio output unit 603 may also provide audio output (e.g., a call signal receiving sound, a message receiving sound, etc.) related to a specific function performed by the dining information pushing apparatus 600. The audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
The input unit 604 is used for receiving audio or video signals. The input unit 604 may include a graphics processor (Graphics Processing Unit, GPU) 6041 and a microphone 6042, the graphics processor 6041 processing image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 606. The image frames processed by the graphics processor 6041 may be stored in the memory 609 (or other storage medium) or transmitted via the radio frequency unit 601 or the network module 602. Microphone 6042 may receive sound and can process such sound into audio data. The processed audio data may be converted into a format output that can be transmitted to the mobile communication base station via the radio frequency unit 601 in the case of a telephone call mode.
The dining information pushing device 600 further comprises at least one sensor 605, such as a light sensor, a motion sensor and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor, wherein the ambient light sensor can adjust the brightness of the display panel 6061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 6061 and/or the backlight when the dining information pushing device 600 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the acceleration in all directions (generally three axes), and can detect the gravity and the direction when the accelerometer sensor is stationary, and can be used for identifying the gesture (such as horizontal and vertical screen switching, related games and magnetometer gesture calibration) of the catering information pushing equipment, vibration identification related functions (such as pedometer and knocking) and the like; the sensor 605 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described herein.
The display unit 606 is used to display information input by a user or information provided to the user. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 607 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function controls of the dining information pushing device. Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072. Touch panel 6071, also referred to as a touch screen, may collect touch operations thereon or thereabout by a user (e.g., operations of the user on touch panel 6071 or thereabout using any suitable object or accessory such as a finger, stylus, or the like). The touch panel 6071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch azimuth of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device and converts it into touch point coordinates, which are then sent to the processor 610, and receives and executes commands sent from the processor 610. In addition, the touch panel 6071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 607 may include other input devices 6072 in addition to the touch panel 6071. Specifically, other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a track ball, a mouse, and a joystick, which are not described herein.
Further, the touch panel 6071 may be overlaid on the display panel 6061, and when the touch panel 6071 detects a touch operation thereon or thereabout, the touch operation is transmitted to the processor 610 to determine a type of a touch event, and then the processor 610 provides a corresponding visual output on the display panel 6061 according to the type of the touch event. Although in fig. 6, the touch panel 6071 and the display panel 6061 are two independent components to implement the input and output functions of the dining information pushing apparatus, in some embodiments, the touch panel 6071 and the display panel 6061 may be integrated to implement the input and output functions of the dining information pushing apparatus, which is not limited herein.
The interface unit 608 is an interface for connecting an external device with the dining information pushing apparatus 600. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 608 may be used to receive input (e.g., data information, power, etc.) from an external device and to transmit the received input to one or more elements within the dining information pushing apparatus 600 or may be used to transmit data between the dining information pushing apparatus 600 and an external device.
The memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a storage program area that may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and a storage data area; the storage data area may store data (such as audio data, phonebook, etc.) created according to the use of the handset, etc. In addition, the memory 609 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 610 is a control center of the dining information pushing device, and connects various parts of the entire dining information pushing device by various interfaces and lines, and executes various functions and processes data of the dining information pushing device by running or executing software programs and/or modules stored in the memory 609 and calling data stored in the memory 609, thereby performing overall monitoring of the dining information pushing device. The processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 610.
The dining information pushing device 600 may further include a power source 611 (such as a battery) for powering the respective components, and preferably, the power source 611 may be logically connected to the processor 610 through a power management system, so as to implement functions of managing charging, discharging, and power consumption management through the power management system.
In addition, the dining information pushing device 600 includes some functional modules, which are not shown, and are not described herein.
Preferably, the embodiment of the present invention further provides a dining information pushing device, which includes a processor 610, a memory 609, and a computer program stored in the memory 609 and capable of running on the processor 610, where the computer program when executed by the processor 610 implements each process of the foregoing dining information pushing method embodiment, and the same technical effects can be achieved, so that repetition is avoided, and no further description is provided herein.
The embodiment of the invention also provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor, realizes the processes of the foregoing embodiments of the dining information pushing method, and can achieve the same technical effects, so that repetition is avoided, and no further description is provided herein. Wherein the computer readable storage medium is selected from Read-Only Memory (ROM), random access Memory (Random Access Memory, RAM), magnetic disk or optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method according to the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those having ordinary skill in the art without departing from the spirit of the present invention and the scope of the claims, which are to be protected by the present invention.

Claims (9)

1. A dining information pushing method, characterized by comprising the following steps:
when a user is detected to enter a target area, determining skeletal joint point information of the user according to a pre-acquired depth image of the user, and determining, according to the skeletal joint point information, a first angle between an arm and a torso of the user and a second angle between a hip-to-upper-body line of the user and a frontal orientation of the user in space;
judging, according to the first angle and the second angle, whether the user has performed a meal taking action;
if so, determining meal taking behavior information of the user, the meal taking behavior information comprising at least one of a meal type corresponding to the meal taking action and first meal calories;
pushing, for the user according to the meal taking behavior information, dining information matched with the meal taking behavior information.
2. The method according to claim 1, wherein the judging, according to the first angle and the second angle, whether the user has performed a meal taking action comprises:
determining that the user has performed a meal taking action if the first angle is greater than or equal to a first preset angle and the second angle is greater than or equal to a second preset angle.
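As an illustration of the threshold test in claim 2, the following is a minimal Python sketch. The joint names, the frontal-orientation vector, and the two threshold values are assumptions made for the example; the claim only requires that the two angles be compared against preset angles.

```python
import numpy as np

# Hypothetical thresholds: the claim fixes no values, only the comparison.
FIRST_PRESET_ANGLE = 60.0   # arm-to-torso angle, degrees (assumed)
SECOND_PRESET_ANGLE = 15.0  # upper-body lean vs. frontal orientation, degrees (assumed)

def angle_between(v1, v2):
    """Angle in degrees between two 3-D vectors."""
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def has_meal_taking_action(joints):
    """joints: dict mapping assumed joint names to 3-D numpy coordinates
    extracted from the depth image's skeletal joint point information."""
    arm = joints["wrist"] - joints["shoulder"]          # arm direction
    torso = joints["neck"] - joints["spine_base"]       # torso direction
    upper_body = joints["neck"] - joints["hip_center"]  # hip-to-upper-body line
    frontal = joints["frontal_normal"]                  # frontal orientation in space

    first_angle = angle_between(arm, torso)
    second_angle = angle_between(upper_body, frontal)
    return first_angle >= FIRST_PRESET_ANGLE and second_angle >= SECOND_PRESET_ANGLE
```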
3. The method of claim 2, wherein the meal taking behavior information includes the meal type corresponding to the meal taking action;
the determining meal taking behavior information of the user comprises:
acquiring first image information of the meal corresponding to the meal taking action, and determining the meal type corresponding to the meal taking action according to the first image information; and/or,
determining a meal taking position corresponding to the meal taking action according to the human body posture characteristics, and determining the meal type corresponding to the meal taking action according to a preset correspondence between meal taking positions and meal types.
4. The method of claim 3, wherein the meal taking behavior information further includes the first meal calories;
the determining meal taking behavior information of the user further comprises:
determining the first meal calories corresponding to the meal taking action according to a preset correspondence between each meal type and meal calories.
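Claims 3 and 4 resolve the meal type and the first meal calories through preset correspondences. Below is a minimal sketch; the counter positions, meal types, and calorie values are hypothetical, since the claims require only that such mappings exist.

```python
# Hypothetical preset correspondences (illustrative entries only).
POSITION_TO_MEAL = {
    "counter_1": "rice",
    "counter_2": "braised pork",
    "counter_3": "steamed vegetables",
}
MEAL_TO_CALORIES = {   # kcal per serving, illustrative values
    "rice": 230,
    "braised pork": 480,
    "steamed vegetables": 90,
}

def meal_taking_behavior_info(meal_taking_position):
    """Resolve the meal type (claim 3) and first meal calories (claim 4)
    from the meal taking position."""
    meal_type = POSITION_TO_MEAL.get(meal_taking_position)
    first_meal_calories = MEAL_TO_CALORIES.get(meal_type)
    return {"meal_type": meal_type, "first_meal_calories": first_meal_calories}
```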
5. The method of claim 4, wherein the pushing, for the user according to the meal taking behavior information, dining information matched with the meal taking behavior information comprises:
determining human body parameter information of the user according to the pre-acquired depth image of the user, the human body parameter information comprising at least one of height, weight, and gender;
determining, according to the human body parameter information, reference calorie information that the user needs to ingest each day;
determining, according to the first meal calories and the reference calorie information, second meal calories that the user needs to ingest within a preset time period;
pushing the dining information in the preset time period for the user according to the second meal calories.
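Claim 5 does not fix how body parameters map to a daily reference intake. The sketch below uses the Mifflin-St Jeor resting-energy equation as one plausible choice, and reads the second-meal step as subtracting the first meal calories from a per-period share of the daily budget; the age default and the one-third share are assumptions not present in the claim.

```python
def reference_daily_calories(height_cm, weight_kg, gender, age=30):
    """Daily reference intake from body parameters (Mifflin-St Jeor,
    chosen here as an assumption; claim 5 lists only height, weight,
    and gender, so the age default is also assumed)."""
    base = 10.0 * weight_kg + 6.25 * height_cm - 5.0 * age
    return base + (5.0 if gender == "male" else -161.0)

def second_meal_calories(first_meal_calories, reference_calories, period_share=1/3):
    """Calories left to ingest in the preset time period, read here as the
    period's share of the daily reference minus the first meal calories."""
    return max(reference_calories * period_share - first_meal_calories, 0.0)
```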
6. The method of claim 5, wherein the pushing, for the user according to the meal taking behavior information, dining information matched with the meal taking behavior information further comprises:
determining dining preference information of the user according to the meal taking behavior information;
the pushing the dining information in the preset time period for the user according to the second meal calories comprises:
pushing the dining information in the preset time period for the user according to the dining preference information of the user and the second meal calories.
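One way to combine the dining preference information of claim 6 with the calorie budget of claim 5 is to filter candidate dishes by the remaining calories and rank them by a preference score; the candidate and preference schemas below are assumptions made for illustration.

```python
def push_dining_info(candidates, preferences, second_meal_calories):
    """Filter candidate dishes to those fitting the remaining calorie
    budget, then rank by the user's dining preference score.
    candidates: list of (dish_name, kcal) pairs (assumed schema);
    preferences: dict mapping dish_name to a score derived from past
    meal taking behavior (assumed schema)."""
    fitting = [(dish, kcal) for dish, kcal in candidates
               if kcal <= second_meal_calories]
    return sorted(fitting, key=lambda item: preferences.get(item[0], 0.0),
                  reverse=True)
```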
7. A dining information pushing device, characterized by comprising:
a first determining module, configured to determine skeletal joint point information of the user according to a pre-acquired depth image of the user, and to determine, according to the skeletal joint point information, a first angle between an arm and a torso of the user and a second angle between a hip-to-upper-body line of the user and a frontal orientation of the user in space;
a judging module, configured to judge, according to the first angle and the second angle, whether the user has performed a meal taking action;
a second determining module, configured to determine meal taking behavior information of the user if so, the meal taking behavior information comprising at least one of the meal type corresponding to the meal taking action and the first meal calories;
a pushing module, configured to push, for the user according to the meal taking behavior information, dining information matched with the meal taking behavior information.
8. A dining information pushing apparatus, characterized by comprising:
a memory storing computer program instructions;
a processor, wherein the computer program instructions, when executed by the processor, implement the dining information pushing method according to any one of claims 1 to 6.
9. A computer readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the dining information pushing method according to any one of claims 1 to 6.
CN201911345190.XA 2019-12-24 2019-12-24 Dining information pushing method, device, equipment and storage medium Active CN113034312B (en)

Priority Applications (1)

Application: CN201911345190.XA, priority/filing date 2019-12-24, title: Dining information pushing method, device, equipment and storage medium


Publications (2)

CN113034312A, published 2021-06-25
CN113034312B, granted 2023-07-04

Family

ID: 76451504

Family Applications (1)

CN201911345190.XA (Active), CN113034312B, priority/filing date 2019-12-24: Dining information pushing method, device, equipment and storage medium

Country Status (1)

CN: CN113034312B (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108108690B (en) * 2017-12-19 2022-02-11 深圳创维数字技术有限公司 Method, device, equipment and storage medium for monitoring diet
CN108549660B (en) * 2018-03-12 2022-08-02 维沃移动通信有限公司 Information pushing method and device
CN109447672A * 2018-08-30 2019-03-08 深圳壹账通智能科技有限公司 Application recommendation method, device, storage medium and computer equipment
CN109346153A * 2018-08-31 2019-02-15 北京唐冠天朗科技开发有限公司 A digitized dining system and method
CN110135957A * 2019-05-20 2019-08-16 梁志鹏 A dish recommendation method, device and storage medium for healthy dining in an intelligent restaurant



Legal Events

Code: Description
PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant