CN113111258A - Menu pushing method and device - Google Patents

Info

Publication number
CN113111258A
Authority
CN
China
Prior art keywords
user
pushing
emotional state
menu
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110352793.3A
Other languages
Chinese (zh)
Inventor
陈剑桥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Haier Technology Co Ltd
Haier Smart Home Co Ltd
Original Assignee
Qingdao Haier Technology Co Ltd
Haier Smart Home Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Haier Technology Co Ltd, Haier Smart Home Co Ltd filed Critical Qingdao Haier Technology Co Ltd
Priority to CN202110352793.3A priority Critical patent/CN113111258A/en
Publication of CN113111258A publication Critical patent/CN113111258A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9535Search customisation based on user profiles and personalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition

Abstract

The invention discloses a menu pushing method and device. The method comprises the following steps: acquiring behavior and physical-sign data of a user; determining, according to the behavior and physical-sign data, whether the user is in a predetermined emotional state; and, if so, pushing menu information to the user according to the predetermined emotional state. The method solves the problem that the pushing result cannot meet user needs when a menu is pushed, and achieves the effect of pushing menus matched to the user's emotional state.

Description

Menu pushing method and device
Technical Field
The invention relates to the field of data pushing, in particular to a menu pushing method and device.
Background
With economic development, people's lives have become more refined, while the pressures of work and life keep growing. Living under such pressure for a long time, people's diets tend toward extremes: either meals are too simple and lack nutrition, or people overindulge in rich food and drink, either of which can seriously damage health.
At present, most recipe systems focus on teaching users how to cook dishes they have already chosen. They provide detailed instructions for a specific recipe, but ignore scientifically grounded recipes for people under stress.
Most existing menu recommendation systems combine big data with local characteristics to recommend menus according to the preferences of the majority. These recipes, though popular and tasty, may not be the most desirable result for a user in a negative mood.
For the problem in the related art that the pushing result cannot meet user needs when a menu is pushed, no effective solution has yet been proposed.
Disclosure of Invention
The invention mainly aims to provide a menu pushing method and device, so as to solve the problem in the related art that the pushing result cannot meet user needs when a menu is pushed.
To achieve the above object, according to one aspect of the present invention, a menu pushing method is provided, comprising: acquiring behavior and physical-sign data of a user; determining, according to the behavior and physical-sign data, whether the user is in a predetermined emotional state; and, if so, pushing menu information to the user according to the predetermined emotional state.
Further, acquiring the behavior and physical-sign data of the user includes at least two of the following: acquiring the user's palm temperature through a temperature sensor on the mobile terminal; acquiring the user's palm humidity through a humidity sensor on the mobile terminal; acquiring shaking data of the mobile terminal; acquiring facial expression data of the user; acquiring volume, tone, speech-rate and breath-sound loudness data from the user's call voice; acquiring screen-switching frequency data of the mobile terminal; acquiring heartbeat data of the user; and acquiring sleep data of the user.
Further, the predetermined emotional state comprises an anxiety state, and determining whether the user is in the predetermined emotional state according to the behavior and physical-sign data comprises: judging that the user is in the anxiety state when any four of the following are satisfied: the shaking data of the mobile terminal exceeds a shaking reference value, where the shaking reference value is the shaking level of the mobile terminal while the user is calm; the screen-switching frequency of the mobile terminal exceeds a switching reference value, where the switching reference value is the screen-switching frequency while the user is calm; an increase in the user's palm temperature is detected; the user's facial expression is detected as: brow pressed down, facial muscles raised, lips pursed; an increase in the user's palm humidity is detected; call recordings within a recent preset duration show faster speech and louder breathing; and the user's heartbeat is detected to be faster and body temperature higher.
further, the predetermined emotional state comprises an angry state, and determining whether the user is in the predetermined emotional state according to the behavior and sign data comprises: determining whether the user is in the angry state by satisfying any four of: the facial expression of the user is detected as: the eyes are gazelle, the lips are closed tightly, the eyebrows are locked tightly, and the eyebrows are pressed downwards; judging whether the breath sound loudness is enhanced, the sound is increased and the occurrence of an uncivilized vocabulary is accompanied by the breath sound loudness enhancement and the sound increase through the communication voice recording within the latest preset time length; detecting that the heartbeat of the user is accelerated and the body temperature is increased; detecting an increase in the grip of the opponent's shell; an increase in the tidal humidity of the user's hand is detected.
Further, the predetermined emotional state includes a sad state, and determining whether the user is in the predetermined emotional state according to the behavior and physical-sign data includes: judging that the user is in the sad state when any three of the following are satisfied: poor sleep quality is detected; an increase in the screen-switching frequency of the mobile phone is detected; the user's facial expression is detected as: mouth slightly open with the corners turned down, lips still, cheeks raised and upper eyelids drooping; call recordings within a recent preset duration show weak breathing, lowered speaking volume, slow speech and repeated sighing words; and the grip on the phone case is detected to rise sharply and then fall sharply.
further, pushing menu information to the user according to the predetermined emotional state comprises at least one of: pushing a menu for making dishes corresponding to the preset emotional state to the user; a takeout sales platform for pushing dishes corresponding to the preset emotional states to the user; and a restaurant pushing dishes corresponding to the preset emotional state to the user.
Further, before pushing information of a menu corresponding to the predetermined emotional state to a user, the method further comprises: judging whether the frequency of pushing the menu information to the user in a preset time period exceeds a preset frequency or not; and if the number of the menus exceeds the preset threshold, pushing information of other menus in the menu list to be pushed to the user.
Further, determining whether the user is in a predetermined emotional state based on the behavioral and vital sign data comprises: and judging whether the user is in a preset emotional state or not through a trained recognition model in the artificial intelligence system.
To achieve the above object, according to another aspect of the present invention, a menu pushing device is also provided, comprising: an acquiring unit configured to acquire behavior and physical-sign data of a user; a judging unit configured to determine, according to the behavior and physical-sign data, whether the user is in a predetermined emotional state; and a pushing unit configured to push menu information to the user according to the predetermined emotional state when the judgment result is yes.
To achieve the above object, according to another aspect of the present invention, a computer-readable storage medium is also provided, which includes a stored program, wherein the menu pushing method according to the present invention is executed when the program is run by a processor.
The method acquires behavior and physical-sign data of a user; determines, according to the behavior and physical-sign data, whether the user is in a predetermined emotional state; and, if so, pushes menu information to the user according to the predetermined emotional state, thereby solving the problem that the pushing result cannot meet user needs when a menu is pushed, and achieving the effect of pushing menus according to the user's emotional state.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a flowchart of a recipe pushing method according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a data interaction process of the emotion management recipe recommendation system of the present embodiment; and
fig. 3 is a schematic diagram of a recipe pushing device according to an embodiment of the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description, claims and drawings of this application are used to distinguish between similar elements, and not necessarily to describe a particular sequential or chronological order. It should be understood that data so designated may be interchanged where appropriate, so that the embodiments of the application described herein can be implemented in orders other than those illustrated or described here. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such a process, method, article, or apparatus.
The embodiment of the invention provides a menu pushing method.
Fig. 1 is a flowchart of a recipe pushing method according to an embodiment of the present invention, as shown in fig. 1, the method includes the steps of:
step S102: acquiring behavior and physical sign data of a user;
step S104: judging whether the user is in a preset emotional state or not according to the behavior and sign data;
step S106: and if so, pushing menu information to the user according to the preset emotional state.
This embodiment acquires behavior and physical-sign data of a user; determines, according to the behavior and physical-sign data, whether the user is in a predetermined emotional state; and, if so, pushes menu information to the user according to the predetermined emotional state, thereby solving the problem that the pushing result cannot meet user needs when a menu is pushed, and achieving the effect of pushing menus according to the user's emotional state.
In this embodiment, the behavior and physical-sign data may cover multiple dimensions of the user, including speaking volume and rate, movement, facial expression, heart rate, body temperature, and palm temperature and humidity. The data may be acquired by sensors, such as a temperature-and-humidity sensor mounted on a phone case, a camera that captures the user's facial expression, or a sleep-monitoring module (such as a wristband) that records sleep data. Whether the user is in a predetermined emotional state is then judged comprehensively from these data, and, if so, menu information is pushed according to the corresponding emotional state. A menu can thus be pushed to match the user's emotional state, so that the user eats satisfying dishes and negative emotions are relieved.
It should be noted that the pushed menu information may be a single menu, or several menus ranked in order for the user to choose from.
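Steps S102–S106 above can be sketched as a small pipeline. This is an illustrative sketch only, not the patented implementation; the function names, the stubbed classifier, and the menu table are all hypothetical.

```python
# Illustrative sketch of the three-step flow: acquire data, classify the
# emotional state, push a menu. All names here are hypothetical.

def push_menu_for_user(acquire_data, classify_state, menus_by_state):
    """Acquire behavior/physical-sign data, classify the emotional
    state, and return the menu info to push (or None if no match)."""
    data = acquire_data()                     # step S102
    state = classify_state(data)              # step S104
    if state in menus_by_state:               # step S106
        return menus_by_state[state]
    return None

# Minimal usage with stubbed inputs:
menus = {"anxiety": ["chamomile congee"], "anger": ["cooling salad"]}
result = push_menu_for_user(
    acquire_data=lambda: {"heart_rate": 95, "palm_temp": 36.2},
    classify_state=lambda d: "anxiety" if d["heart_rate"] > 90 else "calm",
    menus_by_state=menus,
)
```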
Optionally, acquiring the behavior and physical-sign data of the user includes at least two of the following: acquiring the user's palm temperature through a temperature sensor on the mobile terminal; acquiring the user's palm humidity through a humidity sensor on the mobile terminal; acquiring shaking data of the mobile terminal; acquiring facial expression data of the user; acquiring volume, tone, speech-rate and breath-sound loudness data from the user's call voice; acquiring screen-switching frequency data of the mobile terminal; acquiring heartbeat data of the user; and acquiring sleep data of the user.
The behavior and physical-sign data can be acquired in various ways, either through sensors installed on the mobile terminal or through independent sensors.
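The acquired data enumerated above can be gathered into a single record. A minimal sketch assuming a Python implementation; the class and field names are hypothetical and merely mirror the list of sensors.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical container for the behavior and physical-sign data the
# method acquires; at least two of these fields would be populated.
@dataclass
class UserSignData:
    palm_temperature: Optional[float] = None    # temperature sensor on phone case
    palm_humidity: Optional[float] = None       # humidity sensor on phone case
    shake_level: Optional[float] = None         # phone shaking data
    facial_expression: Optional[str] = None     # camera / face-unlock capture
    call_speech_rate: Optional[float] = None    # from call voice recordings
    screen_switch_freq: Optional[float] = None  # screen on/off frequency
    heart_rate: Optional[float] = None
    sleep_quality: Optional[float] = None       # e.g. from a wristband

    def populated_count(self) -> int:
        """How many data dimensions were actually acquired."""
        return sum(v is not None for v in vars(self).values())

sample = UserSignData(palm_temperature=36.8, heart_rate=92.0)
```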
Optionally, the predetermined emotional state comprises an anxiety state, and determining whether the user is in the predetermined emotional state according to the behavior and physical-sign data comprises: judging that the user is in the anxiety state when any four of the following are satisfied: the shaking data of the mobile terminal exceeds a shaking reference value, where the shaking reference value is the shaking level of the mobile terminal while the user is calm; the screen-switching frequency of the mobile terminal exceeds a switching reference value, where the switching reference value is the screen-switching frequency while the user is calm; an increase in the user's palm temperature is detected; the user's facial expression is detected as: brow pressed down, facial muscles raised, lips pursed; an increase in the user's palm humidity is detected; call recordings within a recent preset duration show faster speech and louder breathing; and the user's heartbeat is detected to be faster and body temperature higher.
Optionally, the predetermined emotional state comprises an angry state, and determining whether the user is in the predetermined emotional state according to the behavior and physical-sign data comprises: judging that the user is in the angry state when any four of the following are satisfied: the user's facial expression is detected as: eyes glaring, lips pressed tight, brows knitted and pressed down; call recordings within a recent preset duration show louder breathing and a raised voice, accompanied by uncivil vocabulary; the user's heartbeat is detected to be faster and body temperature higher; an increase in the user's grip on the phone case is detected; and an increase in the user's palm humidity is detected.
Optionally, the predetermined emotional state includes a sad state, and determining whether the user is in the predetermined emotional state according to the behavior and physical-sign data includes: judging that the user is in the sad state when any three of the following are satisfied: poor sleep quality is detected; an increase in the screen-switching frequency of the mobile phone is detected; the user's facial expression is detected as: mouth slightly open with the corners turned down, lips still, cheeks raised and upper eyelids drooping; call recordings within a recent preset duration show weak breathing, lowered speaking volume, slow speech and repeated sighing words; and the grip on the phone case is detected to rise sharply and then fall sharply.
some behavior and sign data are obtained by comparing with a reference value when being evaluated, the reference value is acquired under a calm state of a user, and the emotional state can be judged to be satisfied when four or more indexes are satisfied.
In this embodiment, for the determination of facial expressions, a general determination standard may be adopted, for example, a standard in the field of expression recognition, and for closed lips, tight eyebrow, and pressed eyebrow, the determination of these expressions may be compared with the usual expressions of the user, or may be directly determined by the current facial picture of the user.
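The "any four of the following indicators" rule can be sketched as counting boolean predicates against calm-state reference values. The predicate list and field names below are hypothetical stand-ins for the sensor comparisons described above, not the patented model.

```python
# Sketch of the any-four-of-seven rule for the anxiety state. Each
# indicator is a predicate over a dict of (hypothetical) readings,
# compared where needed against calm-state reference values.

ANXIETY_INDICATORS = [
    lambda d: d.get("shake") is not None and d["shake"] > d.get("shake_ref", 0),
    lambda d: d.get("switch_freq", 0) > d.get("switch_ref", float("inf")),
    lambda d: d.get("palm_temp_rising", False),
    lambda d: d.get("expression") == "brow pressed, muscles raised, lips pursed",
    lambda d: d.get("palm_humidity_rising", False),
    lambda d: d.get("speech_fast_and_loud_breathing", False),
    lambda d: d.get("heart_fast_and_temp_up", False),
]

def is_anxious(data, threshold=4):
    """Return True when at least `threshold` indicators fire."""
    return sum(1 for ind in ANXIETY_INDICATORS if ind(data)) >= threshold

reading = {
    "shake": 3.2, "shake_ref": 1.0,       # phone shakes harder than calm baseline
    "palm_temp_rising": True,
    "palm_humidity_rising": True,
    "heart_fast_and_temp_up": True,
}
```

The anger and sadness rules differ only in their indicator lists and in the threshold (four versus three).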
Optionally, pushing the menu information to the user according to the predetermined emotional state comprises at least one of: pushing to the user a menu for cooking dishes corresponding to the predetermined emotional state; pushing to the user a takeout platform selling dishes corresponding to the predetermined emotional state; and pushing to the user a restaurant serving dishes corresponding to the predetermined emotional state.
In this scheme, a cooking menu can be pushed so that the user can prepare the dishes personally; in some situations, however, conditions prevent the user from cooking, so the method also pushes takeout platforms selling the dishes and restaurants where they can be eaten on site, leaving the user free to choose.
Further, when pushing a menu, more information can be considered beyond the user's mood and preferences, such as the current season, the weather, and whether the user is ill; the more comprehensive the consideration, the more likely the final push satisfies the user.
Optionally, before pushing the information of the menu corresponding to the predetermined emotional state, it is determined whether the number of times that menu has been pushed to the user within a predetermined time period exceeds a preset number; if it does, information of another menu from the to-be-pushed menu list is pushed instead.
The preset number can be set according to user needs, for example, pushing a given menu no more than twice per week. Although some menus clearly improve the user's mood, pushing the same one every time would tire the user, so each menu is pushed at most twice; if the model computes that menu as the optimal choice on the third push, the second-best menu is pushed instead, improving the user experience.
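The push-frequency cap with fallback to the second-best menu can be sketched as follows; `choose_recipe` and the twice-per-period default are illustrative assumptions based on the example above.

```python
from collections import Counter

# Sketch of the push cap: a recipe is pushed at most `max_pushes` times
# per period; beyond that, fall back to the next-ranked candidate from
# the to-be-pushed list. Names are hypothetical.

def choose_recipe(ranked_recipes, push_history, max_pushes=2):
    """Pick the best-ranked recipe not yet pushed `max_pushes` times this
    period; fall back to later candidates, else return the top one."""
    counts = Counter(push_history)
    for recipe in ranked_recipes:
        if counts[recipe] < max_pushes:
            return recipe
    return ranked_recipes[0] if ranked_recipes else None

history = ["braised pork", "braised pork"]   # already pushed twice this week
pick = choose_recipe(["braised pork", "tomato soup"], history)
```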
Optionally, determining whether the user is in the predetermined emotional state according to the behavior and sign data comprises: and judging whether the user is in a preset emotional state or not through the trained recognition model in the artificial intelligence system.
In this embodiment the computation is carried out mainly in an artificial intelligence system: data collected by each sensor can be transmitted periodically, through an NFC patch on the phone case, to a cloud platform (such as a smart-home APP); the APP adds the day's weather and air information and uploads the result to the artificial intelligence (AI) system; the AI system analyzes and learns negative moods from the sensor data and the information fed back by the APP; and the AI system then gives the few most suitable choices, and, if the user is unable to cook, tells the user where to eat or which takeout platform to order from.
Optionally, after menu information is pushed according to the predetermined emotional state, behavior and physical-sign data continue to be acquired to judge whether the user's emotional state has improved; if not, the menu pushing logic is adjusted, for example by changing parameters of the pushing model, so that the pushed menus can improve the user's emotional state.
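The feedback step, adjusting the pushing logic when no mood improvement is observed, could look like the sketch below under the assumption of a weighted ranking model; the weight-update rule and all names are hypothetical.

```python
# Sketch of the post-push feedback loop: sample the user's mood score
# after a push and adjust a (hypothetical) ranking weight for the pushed
# recipe when no improvement is observed.

def update_push_weight(weight, mood_before, mood_after,
                       step=0.1, min_weight=0.0):
    """Lower this recipe's ranking weight if the user's mood score did
    not improve after the push; otherwise reinforce it slightly."""
    if mood_after <= mood_before:          # no improvement observed
        return max(min_weight, weight - step)
    return weight + step

# Mood worsened after the push, so the recipe's weight is reduced:
w = update_push_weight(1.0, mood_before=0.4, mood_after=0.3)
```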
This embodiment not only provides catering suited to the user's current negative mood, but also, by incorporating traditional Chinese dietary therapy into the menus, helps relieve that mood, achieving eating that is both healthy and comforting.
The present embodiment also provides a preferred embodiment, and the following describes the technical solution of the present embodiment with reference to the preferred embodiment.
This embodiment uses AI and multi-sensor analysis to determine the type of the user's negative mood from physical signs and call records, and recommends a recipe better suited to relieving that mood.
The present embodiment relates to an emotion management recipe recommendation system based on an AI cloud system and sensor analysis, fig. 2 is a schematic diagram of a data interaction process of the emotion management recipe recommendation system of the present embodiment, and as shown in fig. 2, the interaction process includes the following steps:
1. All sensors transmit data periodically to the smart-home APP through the NFC patch on the phone case;
2. the APP adds the day's weather and air information and uploads the result to the AI system;
3. the AI system analyzes and learns negative moods from the sensor data and the information fed back by the APP;
4. the AI system gives the few most suitable choices, and, if the user is unable to cook, tells the user where to eat or which takeout platform to order from.
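Steps 1–2 of the interaction above amount to assembling an upload payload of sensor readings enriched with the day's weather and air data. A sketch under that assumption; the field names and payload structure are hypothetical.

```python
import json

# Hypothetical payload the smart-home APP assembles before uploading to
# the AI system: sensor readings (forwarded via the NFC patch, step 1)
# plus the day's weather and air data (added by the APP, step 2).

def build_upload_payload(sensor_readings, weather, air_quality):
    """Serialize one upload to the AI system as JSON."""
    return json.dumps({
        "sensors": sensor_readings,
        "weather": weather,
        "air": air_quality,
    })

payload = build_upload_payload(
    {"palm_temp": 36.7, "grip_force": 18.5},
    {"condition": "overcast", "temp_c": 12},
    {"aqi": 85},
)
```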
The three negative moods considered in this embodiment are: anxiety, anger, and sadness.
The user first needs to record his or her various states while in a calm mood, to serve as a baseline for comparison.
For each negative mood, data from the sensors are fused to make the judgment:
1. anxiety (compared with usual time) is judged as an anxiety state when any 4 of the following conditions occur simultaneously.
When the GPS signal is not greatly changed, the mobile phone shakes violently when the game is not started;
the switching frequency of the mobile phone is obviously increased;
the palm temperature rises;
facial expression: the eyebrow is slightly pressed down, the facial muscle is slightly lifted up, and the mouth is slightly closed;
the tidal humidity of the palm rises;
through the voice recording of the communication within the last 30 minutes, the voice speed is high, and the loudness of the breathing sound is enhanced;
the heartbeat is faster than usual and the body temperature rises slightly.
2. Anger (compared with usual): when any 4 of the following conditions occur simultaneously, the state is judged to be anger.
Facial expression: the eyes glare, the lips are pressed tight, and the brows are knitted and pressed down;
call recordings within the last 30 minutes show louder breathing and a noticeably raised voice, accompanied by uncivil words;
the heartbeat is faster than usual and the body temperature rises slightly;
the grip on the phone case increases noticeably;
the palm humidity rises.
3. Sadness (compared with usual): when any 3 of the following conditions occur simultaneously, the state is judged to be sadness.
The sleep-quality module detects poor sleep;
the screen-switching frequency of the phone increases noticeably;
facial expression: the mouth opens slightly with the corners turned down, the lips are still, and the cheeks rise while the upper eyelids droop;
call recordings within the last 30 minutes show weak breathing, slightly lowered speaking volume, slow speech, and repeated sighs;
the grip on the phone case rises sharply and then falls sharply.
Scene one: mood of people just after face test (24820); 24820;, restlessness
1. After the interview the user's palms sweat more, which can be detected by the palm temperature and humidity sensors on the phone case;
in addition, the user tends to grip the phone harder around an interview, another sign of stress that the grip sensor on the phone case can capture; while waiting for the interview, the phone screen is usually switched on and off repeatedly, which the phone records; the body-temperature and heartbeat sensors also detect changes in the user's state; and when the user unlocks the phone by face, the facial expression can be captured at that moment.
2. From these phenomena, the AI system, after learning and training, draws a conclusion from the various sensor indicators, identifies which negative emotion this is, and derives the recipe most suitable to eat together with nearby restaurants offering it.
3. A user who sees the push in the APP can then easily decide on a meal and confirm whether to cook, eat out, or order takeout.
4. When the user orders takeout through the APP, the delivery time confirms whether the user has started eating; when the user looks up a recipe through the APP, the user is asked whether the dish was completed, to obtain feedback on whether eating has begun.
5. These data are collected and sent to the AI system, which monitors whether the user's negative-mood indicators improve over the next several hours.
Scene two: mood anger during work
1. The data acquisition system records the call records of a user in one day, wherein the call records comprise tone variation, loudness variation, speech speed variation, call time length, call gap duration, call tone variation, and facial screenshots unlocked by using a face ID before starting to prepare for call;
2. the AI system obtains the current activity frequency and the passive mood index and the type of the user by combining the data with the current weather data analysis, the air temperature and the dryness;
3. the user can receive the pushed menu and greeting words at the time of eating;
4. when the user uses the APP to take the food on demand, whether the user starts to eat or not can be confirmed through the meal delivery time; when the user uses the APP to inquire about the menu, the user is asked whether to finish the menu to obtain feedback, and whether the user starts to eat food is determined.
5. These data are collected and sent to the AI system and it is monitored whether the user's negative mood indicators improve over the next few hours to correct the recipe push logic.
This embodiment comprehensively analyzes the psychological state of people with negative emotions through an AI system and multi-sensor technology, and relieves those negative states through diet. It can give particular attention to people in a negative mood, improving their state in the simplest way, and offers a technical reference for future intelligent healthy living.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than presented herein.
The embodiment of the invention provides a menu pushing device which can be used for executing the menu pushing method of the embodiment of the invention.
Fig. 3 is a schematic view of a recipe pushing apparatus according to an embodiment of the present invention, as shown in fig. 3, the apparatus including:
the acquiring unit 10 is used for acquiring behavior and physical sign data of a user;
the judging unit 20 is used for judging whether the user is in a preset emotional state according to the behavior and sign data;
and the pushing unit 30 is used for pushing the menu information to the user according to the preset emotional state when the judgment result is yes.
This embodiment employs the acquiring unit 10 to acquire the user's behavior and physical sign data, the judging unit 20 to judge whether the user is in a preset emotional state according to that data, and the pushing unit 30 to push menu information to the user according to the preset emotional state when the judgment is affirmative. This solves the problem of push results failing to meet the user's needs and achieves the effect of pushing menus according to the user's emotional state.
The menu pushing device comprises a processor and a memory, wherein the acquiring unit, the judging unit and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to realize corresponding functions.
The processor comprises a kernel, which calls the corresponding program unit from the memory. One or more kernels may be provided, and the menu is pushed according to the emotional state of the user by adjusting the kernel parameters.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
An embodiment of the present invention provides a storage medium on which a program is stored, the program implementing the menu pushing method when executed by a processor.
The embodiment of the invention provides a processor, which is used for running a program, wherein the menu pushing method is executed when the program runs.
An embodiment of the invention provides a device comprising a processor, a memory, and a bus, the processor and the memory being connected by and communicating over the bus; the processor calls the program instructions in the memory to execute the menu pushing method. The device here may be a server, a PC, a PAD, a mobile phone, etc.
The present application further provides a computer program product adapted to perform a program for initializing the following method steps when executed on a data processing device: acquiring behavior and physical sign data of a user; judging whether the user is in a preset emotional state or not according to the behavior and sign data; if the judgment result is yes, the menu information is pushed to the user according to the preset emotional state, and the problem that the pushing result cannot meet the requirements of the user when the menu is pushed is solved.
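One refinement of the pushing step (claim 7 below) avoids monotonous pushes: if the same menu has already been pushed more than a preset number of times within a preset period, other menus from a to-be-pushed list are used instead. A possible sketch of that guard follows; the window length, per-window limit, and recipe names are all assumptions for illustration:

```python
import time
from collections import deque

class RecipePusher:
    """Rotates menu pushes so no menu repeats beyond a preset count per time window.
    Window length, count limit, and the menu list are illustrative assumptions."""
    def __init__(self, recipes, window_s=86400, max_per_window=1):
        self.queue = deque(recipes)       # to-be-pushed menu list
        self.window_s = window_s          # preset time period, in seconds
        self.max_per_window = max_per_window  # preset push frequency
        self.pushed = []                  # (timestamp, menu) history

    def _recent_count(self, recipe, now):
        return sum(1 for t, r in self.pushed
                   if r == recipe and now - t < self.window_s)

    def push(self, now=None):
        now = time.time() if now is None else now
        # Skip any menu already pushed too often within the window (claim 7 logic).
        for _ in range(len(self.queue)):
            recipe = self.queue[0]
            self.queue.rotate(-1)
            if self._recent_count(recipe, now) < self.max_per_window:
                self.pushed.append((now, recipe))
                return recipe
        return None  # every menu in the list is exhausted for this window

pusher = RecipePusher(["congee", "hot soup"], window_s=100, max_per_window=1)
print(pusher.push(now=0))    # → congee
print(pusher.push(now=10))   # → hot soup
print(pusher.push(now=20))   # → None (both pushed within the window)
print(pusher.push(now=200))  # → congee (window has elapsed)
```

The `now` parameter exists only to make the window behavior easy to demonstrate; a deployment would rely on the wall clock.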
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A menu pushing method is characterized by comprising the following steps:
acquiring behavior and physical sign data of a user;
judging whether the user is in a preset emotional state or not according to the behavior and sign data;
and if so, pushing menu information to the user according to the preset emotional state.
2. The method of claim 1, wherein acquiring behavior and physical sign data of a user comprises at least two of:
acquiring palm temperature of a user through a temperature sensor on the mobile terminal;
acquiring palm humidity of a user through a humidity sensor on the mobile terminal;
acquiring shaking data of the mobile terminal;
acquiring facial expression data of a user;
acquiring volume, tone, speech speed and breath sound loudness data of user call voice;
acquiring screen switching frequency data of the mobile terminal;
acquiring heartbeat data of a user;
sleep data of a user is acquired.
3. The method of claim 1, wherein the preset emotional state comprises an anxiety state, and wherein judging whether the user is in the preset emotional state according to the behavior and sign data comprises: determining that the user is in the anxiety state when any four of the following are satisfied:
the shaking data of the mobile terminal is larger than a shaking reference value, wherein the shaking reference value is the shaking value of the mobile terminal in a calm state of a user;
the screen switching frequency of the mobile terminal exceeds a switching reference value, wherein the switching reference value is the screen switching frequency of the mobile terminal when the user is in a calm state;
detecting an increase in the user's palm temperature;
the facial expression of the user is detected as: lowered eyebrows, tensed facial muscles, and pursed lips;
detecting an increase in the humidity of the user's palm;
determining, from call voice recordings within the latest preset duration, that the speech speed has accelerated and the loudness of the breath sounds has increased;
detecting that the user's heartbeat has accelerated and body temperature has risen.
4. The method of claim 1, wherein the preset emotional state comprises an angry state, and wherein judging whether the user is in the preset emotional state according to the behavior and sign data comprises: determining that the user is in the angry state when any four of the following are satisfied:
the facial expression of the user is detected as: glaring eyes, tightly closed lips, and tightly knitted, lowered eyebrows;
determining, from call voice recordings within the latest preset duration, that the loudness of the breath sounds has increased, the voice has grown louder, and uncivil vocabulary has appeared;
detecting that the heartbeat of the user is accelerated and the body temperature is increased;
detecting an increase in the user's grip on the phone casing;
detecting an increase in the humidity of the user's palm.
5. The method of claim 1, wherein the preset emotional state comprises a sad state, and wherein judging whether the user is in the preset emotional state according to the behavior and sign data comprises: determining that the user is in the sad state when any three of the following are satisfied:
detecting that the user has poor sleep quality;
detecting an increase in the screen switching frequency of the mobile terminal;
the facial expression of the user is detected as: slightly opened mouth with drooping mouth corners, still lips, and both cheeks raised so that the upper eyelids droop;
determining, from call voice recordings within the latest preset duration, that the breath sounds are weak, the speaking volume has decreased, the speech speed is slow, and sighing words have appeared repeatedly;
detecting that the user's grip on the phone casing rises rapidly and then falls rapidly.
6. The method of claim 1, wherein pushing menu information to the user according to the preset emotional state comprises at least one of:
pushing to the user a menu for making dishes corresponding to the preset emotional state;
pushing to the user a takeout platform that sells dishes corresponding to the preset emotional state;
and pushing to the user a restaurant that serves dishes corresponding to the preset emotional state.
7. The method of claim 1, wherein before pushing information of a menu corresponding to the preset emotional state to the user, the method further comprises:
judging whether the frequency of pushing the menu information to the user within a preset time period exceeds a preset frequency;
and if the frequency exceeds the preset frequency, pushing information of other menus in a to-be-pushed menu list to the user.
8. The method of claim 1, wherein determining from the behavioral and vital sign data whether the user is in a predetermined emotional state comprises:
and judging whether the user is in a preset emotional state or not through a trained recognition model in the artificial intelligence system.
9. A menu pushing device, comprising:
the acquisition unit is used for acquiring behavior and physical sign data of a user;
the judging unit is used for judging whether the user is in a preset emotional state or not according to the behavior and sign data;
and the pushing unit is used for pushing menu information to the user according to the preset emotion state when the judgment result is yes.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored program, wherein the program, when executed by a processor, performs the menu pushing method of any one of claims 1 to 8.
CN202110352793.3A 2021-03-31 2021-03-31 Menu pushing method and device Pending CN113111258A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110352793.3A CN113111258A (en) 2021-03-31 2021-03-31 Menu pushing method and device


Publications (1)

Publication Number Publication Date
CN113111258A true CN113111258A (en) 2021-07-13

Family

ID=76713783

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110352793.3A Pending CN113111258A (en) 2021-03-31 2021-03-31 Menu pushing method and device

Country Status (1)

Country Link
CN (1) CN113111258A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103920291A (en) * 2014-04-29 2014-07-16 深圳市中兴移动通信有限公司 Method using mobile terminal as auxiliary information source and mobile terminal
CN106874265A (en) * 2015-12-10 2017-06-20 深圳新创客电子科技有限公司 A kind of content outputting method matched with user emotion, electronic equipment and server
CN107578805A (en) * 2016-07-05 2018-01-12 九阳股份有限公司 Cooking control method and equipment based on user emotion/state of mind
CN108648802A (en) * 2018-05-15 2018-10-12 四川斐讯信息技术有限公司 A kind of recipe recommendation method and system
US20180300460A1 (en) * 2017-04-18 2018-10-18 International Business Machines Corporation Appetite improvement system through memory association
CN108830265A (en) * 2018-08-29 2018-11-16 奇酷互联网络科技(深圳)有限公司 Method, communication terminal and the storage device that mood in internet exchange is reminded
CN110215218A (en) * 2019-06-11 2019-09-10 北京大学深圳医院 A kind of wisdom wearable device and its mood identification method based on big data mood identification model
CN110970113A (en) * 2018-09-30 2020-04-07 宁波方太厨具有限公司 Intelligent menu recommendation method based on user emotion
CN112489815A (en) * 2020-11-17 2021-03-12 中国电子科技集团公司电子科学研究院 Depression emotion monitoring method and device and readable storage medium


Similar Documents

Publication Publication Date Title
US20220395983A1 (en) Social robot with environmental control feature
CN107392124A (en) Emotion identification method, apparatus, terminal and storage medium
CN113553449A (en) Machine intelligent predictive communication and control system
CN104036776A (en) Speech emotion identification method applied to mobile terminal
CN108133055A (en) Intelligent dress ornament storage device and based on its storage, recommend method and apparatus
CN112655177B (en) Asynchronous co-viewing
CN103902046A (en) Intelligent prompting method and terminal
JPWO2019207896A1 (en) Information processing system, information processing method, and recording medium
CN111975772A (en) Robot control method, device, electronic device and storage medium
KR102015097B1 (en) Apparatus and computer readable recorder medium stored program for recognizing emotion using biometric data
CN109872800A (en) A kind of diet accompanies system and diet to accompany method
CN109278051A (en) Exchange method and system based on intelligent robot
US10108784B2 (en) System and method of objectively determining a user's personal food preferences for an individualized diet plan
CN107837075A (en) Dreamland monitoring device and monitoring method
CN107659710B (en) Intelligent terminal, playing control method thereof and device with storage function
KR102342863B1 (en) A method of serving content through emotional classification of streaming video
CN104933517A (en) Mental stress assessment method and system based on intelligent wearable equipment
CN113435518A (en) Feature fusion interaction method and device based on multiple modes
CN113111258A (en) Menu pushing method and device
CN106303939A (en) The method and device of healthalert
WO2022011509A1 (en) Method and apparatus for monitoring dietary behavior
CN211832366U (en) Pet monitoring device and pet monitoring system
EP3799407B1 (en) Initiating communication between first and second users
CN106411834A (en) Session method based on companion equipment, equipment and system
CN205334503U (en) Robot with face identification function

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210713