CN114489441A - Recipe display method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114489441A
Authority
CN
China
Prior art keywords
target, recipe, display, determining, information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210077912.3A
Other languages
Chinese (zh)
Inventor
郭颖珊
宋德超
吴云娅
文英
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai, Zhuhai Lianyun Technology Co Ltd filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN202210077912.3A priority Critical patent/CN114489441A/en
Publication of CN114489441A publication Critical patent/CN114489441A/en
Pending legal-status Critical Current

Classifications

    • G06F 3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04883 — GUI interaction using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text
    • G11B 19/025 — "Virtual" control panels, e.g. Graphical User Interface [GUI]
    • G16H 20/60 — ICT specially adapted for therapies or health-improving plans relating to nutrition control, e.g. diets

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Nutrition Science (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the disclosure relate to a recipe display method and apparatus, an electronic device, and a storage medium, wherein the method comprises: first acquiring feature information of a target object, then determining a recipe set based on the feature information, determining a target recipe from the recipe set, determining a target interaction mode corresponding to the target recipe, and finally sending the target recipe to a display device so that the display device displays the target recipe, the displayed target recipe being controlled in the target interaction mode. With this scheme, the user's feature information is collected and recognized, and different recipes are recommended to different recognized users according to their respective feature information; an interaction mode can be configured for the displayed recipe, and that interaction mode can then be used to control the display of the recipe. As a result, the user no longer needs to switch back and forth between cooking and checking the recipe during cooking; the viewing operation is simplified and the user experience is improved.

Description

Recipe display method and device, electronic equipment and storage medium
Technical Field
Embodiments of the disclosure relate to the technical field of smart homes, and in particular to a recipe display method and apparatus, an electronic device, and a storage medium.
Background
Nowadays, socioeconomic development is rapid and the public's standard of living improves year by year, so people have more time and energy to pursue quality of life. In their diet, many hope to cook satisfying meals; recipes can be obtained through the Internet or other channels, and cooking is carried out according to their contents.
However, whether a recipe is in a paper book or on an electronic device, viewing it requires both hands. During actual cooking the hands are occupied, so the user has to switch repeatedly between cooking and checking the recipe; the process is cumbersome and degrades the user experience.
Disclosure of Invention
In view of this, to solve the technical problem that repeatedly switching between cooking and recipe viewing makes the process cumbersome and degrades the user experience, embodiments of the present disclosure provide a recipe display method and apparatus, an electronic device, and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a recipe display method, where the method includes:
acquiring characteristic information of a target object;
determining a recipe set based on the characteristic information;
determining a target recipe from the recipe set;
determining a target interaction mode corresponding to the target recipe;
and sending the target recipe to a display device so that the display device displays the target recipe, and controlling the displayed target recipe in the target interaction mode.
Optionally, in the method of any embodiment of the present disclosure, the method further includes:
detecting action information of the target object while the display device displays the target recipe;
and determining the corresponding target interaction mode based on the action information.
Optionally, in the method according to any embodiment of the present disclosure, the controlling the displayed target recipe in the target interaction manner includes:
when the target interaction mode is gesture interaction, acquiring a target gesture action of the target object;
and controlling the displayed target recipe using the target gesture action, wherein the groups of control operations in the target recipe are configured with corresponding gesture actions.
Optionally, in the method according to any embodiment of the present disclosure, the controlling the displayed target recipe in the target interaction manner includes:
when the target interaction mode is human-computer interaction, acquiring a target trigger action of the target object, wherein the target trigger action is generated via a control button of the target recipe in the display interface;
and controlling the displayed target recipe by adopting the target triggering action.
Optionally, in the method according to any embodiment of the present disclosure, the controlling the displayed target recipe includes:
when the display form of the target recipe is text display, performing page turning control on the displayed target recipe;
or,
when the display form of the target recipe is video display, performing play control on the displayed target recipe, where the play control at least includes one of: fast forward, fast rewind, or pause.
Optionally, in the method according to any embodiment of the present disclosure, the determining a recipe set based on the feature information includes:
determining identity information of the target object based on the characteristic information;
and determining a recipe set corresponding to the identity information.
Optionally, in the method according to any embodiment of the present disclosure, the determining a target recipe from the recipe set includes:
selecting some or all of the recipes from the recipe set as the target recipe;
or,
acquiring kitchen ware information corresponding to a kitchen area;
and taking the recipe which is finished by using the kitchen ware corresponding to the kitchen ware information in the recipe set as a target recipe.
In a second aspect, an embodiment of the present disclosure provides a recipe display device, where the apparatus includes:
an acquisition unit configured to acquire feature information of a target object;
the collection unit is used for determining a recipe collection based on the characteristic information;
a recipe unit for determining a target recipe from the recipe set;
the interaction unit is used for determining a target interaction mode corresponding to the target recipe;
and the sending unit is used for sending the target recipe to a display device so that the display device displays the target recipe, and the displayed target recipe is controlled by adopting the target interaction mode.
Optionally, in an apparatus according to any embodiment of the present disclosure, the apparatus further includes:
a detection unit, configured to detect motion information of the target object during a process of displaying the target recipe by the display device;
and the determining unit is used for determining the corresponding target interaction mode based on the action information.
Optionally, in an apparatus according to any embodiment of the present disclosure, the sending unit includes:
the first gesture subunit is used for acquiring a target gesture action of a target object when the target interaction mode is gesture interaction;
and the second gesture subunit is used for controlling the displayed target recipe using the target gesture action, wherein the groups of control operations in the target recipe are configured with corresponding gesture actions.
Optionally, in an apparatus according to any embodiment of the present disclosure, the sending unit includes:
the first human-machine subunit is used for acquiring a target trigger action of a target object when the target interaction mode is human-machine interaction, and the target trigger action is generated by a control button of the target recipe in a display interface;
and the second man-machine subunit is used for controlling the displayed target recipe by adopting the target triggering action.
Optionally, in an apparatus according to any embodiment of the present disclosure, the sending unit includes:
the first control subunit is used for executing page turning control on the displayed target recipe when the display form of the target recipe is text display;
a second control subunit, configured to, when the display form of the target recipe is video display, perform playback control on the displayed target recipe, where the playback control includes at least one of: fast forward, fast rewind, or pause.
Optionally, in an apparatus according to any embodiment of the present disclosure, the aggregation unit includes:
an identity subunit, configured to determine identity information of the target object based on the feature information;
and the set subunit is used for determining the recipe set corresponding to the identity information.
Optionally, in the apparatus according to any embodiment of the present disclosure, the recipe unit includes:
a first recipe subunit configured to select a part or all of the recipes from the recipe set as the target recipe;
the second recipe subunit is used for acquiring kitchen ware information corresponding to the kitchen area; and taking the recipe which is finished by using the kitchen ware corresponding to the kitchen ware information in the recipe set as a target recipe.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including:
a memory for storing a computer program;
a processor, configured to execute the computer program stored in the memory, and when the computer program is executed, implement the recipe display method according to any embodiment of the recipe display method of the first aspect of the disclosure.
In a fourth aspect, embodiments of the disclosure provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the recipe display method according to any embodiment of the first aspect.
In a fifth aspect, embodiments of the disclosure provide a computer program comprising computer-readable code which, when run on a device, causes a processor in the device to execute instructions implementing the steps of the recipe display method of any embodiment of the first aspect.
According to the recipe display scheme provided by the embodiments of the disclosure, feature information of a target object is first acquired, a recipe set is determined based on the feature information, a target recipe is determined from the recipe set, a target interaction mode corresponding to the target recipe is determined, and finally the target recipe is sent to a display device so that the display device displays the target recipe, the displayed target recipe being controlled in the target interaction mode. With this scheme, the user's feature information is collected and recognized, and different recipes are recommended to different recognized users according to their respective feature information; an interaction mode can be configured for the displayed recipe, and that interaction mode can then be used to control the display of the recipe. As a result, the user no longer needs to switch back and forth between cooking and checking the recipe during cooking; the viewing operation is simplified and the user experience is improved.
Drawings
Fig. 1 is a schematic flow chart of a recipe display method according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart diagram of another recipe display method provided in the embodiment of the present disclosure;
fig. 3 is a schematic flow chart of another recipe display method provided in the embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a recipe display apparatus according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of parts and steps, numerical expressions, and values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
It will be understood by those skilled in the art that terms such as "first" and "second" in the embodiments of the present disclosure are used merely to distinguish one object, step, device, or module from another, and do not denote any particular technical meaning or logical order between them.
It is also understood that in embodiments of the present disclosure, "a plurality" may refer to two or more and "at least one" may refer to one, two or more.
It is also to be understood that any reference to any component, data, or structure in the embodiments of the disclosure, may be generally understood as one or more, unless explicitly defined otherwise or stated otherwise.
In addition, the term "and/or" in the present disclosure merely describes an association between objects, indicating that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. The character "/" in the present disclosure generally indicates that the associated objects before and after it are in an "or" relationship.
It should also be understood that the description of the various embodiments of the present disclosure emphasizes the differences between the various embodiments, and the same or similar parts may be referred to each other, so that the descriptions thereof are omitted for brevity.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
It should be noted that, in the present disclosure, the embodiments and the features of the embodiments may be combined with each other where there is no conflict. To facilitate understanding, the present disclosure is described in detail below with reference to the accompanying drawings in conjunction with the embodiments. The described embodiments are only some, not all, of the disclosed embodiments; all other embodiments obtained by a person skilled in the art from the disclosed embodiments without creative effort fall within the protection scope of the present disclosure.
Fig. 1 is a schematic flow chart of a recipe display method provided in an embodiment of the present disclosure, and as shown in fig. 1, the method specifically includes:
and S11, acquiring the characteristic information of the target object.
Wherein, the characteristic information may include: facial information, cooking frequency, age, taste preference, cooking-style preference, and the like. The acquisition manner may include: when an acquisition instruction is received, or when the target object is detected to be using the device, performing recognition and acquisition through an AR (Augmented Reality) device and/or acquiring through external input.
In the embodiment of the present disclosure, the feature information of the target object may be acquired.
In one example, after receiving the acquisition instruction, the feature information of the target object is acquired through external input.
In an example, the characteristic information of the target object may include: a face image of the target object, a daily cooking frequency, and a preference for spicy taste and the quick-fry cooking style.
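By way of illustration only, the characteristic information of S11 can be sketched as a simple record; the field names below are assumptions chosen for illustration and are not part of the disclosed embodiments:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class FeatureInfo:
    """Feature information of the target object (S11); field names are illustrative."""
    face_image: Optional[bytes]   # facial information, e.g. captured by the AR device
    cooking_frequency: str        # e.g. "daily"
    taste: str                    # taste preference, e.g. "spicy"
    cooking_style: str            # cooking-style preference, e.g. "quick-fry"


# Example matching the description: daily cooking, spicy taste, quick-fry style.
info = FeatureInfo(face_image=None, cooking_frequency="daily",
                   taste="spicy", cooking_style="quick-fry")
```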
And S12, determining a recipe set based on the characteristic information.
In the embodiment of the present disclosure, after the feature information is obtained, the recipe query is performed according to the feature information, and the recipe set is determined.
In an example, the characteristic information of the target object indicates that the user prefers a spicy taste and the quick-fry cooking style; spicy recipes and quick-fry recipes are then queried among pre-stored recipes or over the network, and the queried recipes are taken as the recipe set.
In an example, the set of recipes can include: boiled sliced meat, stir-fried squid, and the like.
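The query of S12 can be sketched as follows; the recipe records, tag names, and the union semantics (a recipe matches if it satisfies either the preferred taste or the preferred cooking style) are illustrative assumptions consistent with the example above:

```python
def determine_recipe_set(recipes, taste, style):
    """S12 sketch: union of recipes matching the preferred taste or cooking style."""
    return [r for r in recipes
            if taste in r["tastes"] or style in r["styles"]]


# Illustrative recipe store; records and tags are not from the patent.
RECIPES = [
    {"name": "boiled sliced meat", "tastes": {"spicy"}, "styles": {"boil"}},
    {"name": "stir-fried squid",   "tastes": {"spicy"}, "styles": {"quick-fry"}},
    {"name": "steamed fish",       "tastes": {"mild"},  "styles": {"steam"}},
]

# Matches the worked example: spicy + quick-fry yields both spicy dishes.
recipe_set = determine_recipe_set(RECIPES, taste="spicy", style="quick-fry")
```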
And S13, determining the target recipe from the recipe set.
In the embodiment of the present disclosure, a target recipe may be determined from a recipe set determined according to the target object feature information.
In an example, the set of recipes can include: and (3) boiling sliced meat, stir-frying squid and the like, and determining the boiled sliced meat as a target recipe or determining the boiled sliced meat and the stir-fried squid as the target recipe.
And S14, determining a target interaction mode corresponding to the target recipe.
Wherein, the target recipe may include: graphic-text recipes, video recipes, audio recipes, and the like. The interaction mode may include: touch interaction, motion interaction, and the like.
In the embodiment of the disclosure, after the target recipe is determined, the target interaction mode may be determined according to specific content or form of the target recipe.
In an example, the target recipe may be a video recipe, and the corresponding target interaction mode may be determined to be motion interaction.
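The mapping of S14 from presentation form to interaction mode can be sketched as a lookup table; the concrete assignment below (video to gesture, text to on-screen buttons) is one plausible reading of the examples in this description, not a definitive specification:

```python
# Hypothetical assignment of interaction modes to recipe presentation forms (S14).
INTERACTION_BY_FORM = {
    "text":  "human-computer",  # on-screen control buttons in the display interface
    "video": "gesture",         # hands-free motion/gesture interaction
    "audio": "gesture",
}


def target_interaction_mode(recipe_form):
    """Return the interaction mode for a recipe form; fall back to buttons."""
    return INTERACTION_BY_FORM.get(recipe_form, "human-computer")
```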
And S15, sending the target recipe to a display device so that the display device can display the target recipe, and controlling the displayed target recipe in the target interaction mode.
Wherein, the display device can include: AR devices, etc.
In the embodiment of the disclosure, after the target interaction mode is determined, the target recipe is sent to the display device, the display device displays the target recipe, and the display device controls the target recipe according to the target interaction mode.
In an example, the target recipe is a video recipe and the target interaction mode is motion interaction; the video recipe is sent to the AR device, which presents it as an augmented display on its display interface and recognizes the motion interaction actions to control the video recipe accordingly.
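The overall flow of S11 to S15 can be wired together as the following sketch, in which each helper is a stub standing in for the behaviour described above:

```python
def display_recipe(acquire, build_set, pick, choose_mode, send):
    """Sketch of the Fig. 1 flow; each argument is a stub for one step."""
    features = acquire()              # S11: acquire feature information
    recipe_set = build_set(features)  # S12: determine the recipe set
    recipe = pick(recipe_set)         # S13: determine the target recipe
    mode = choose_mode(recipe)        # S14: determine the interaction mode
    return send(recipe, mode)         # S15: hand off to the display device


# Illustrative stubs only; real implementations follow the descriptions above.
result = display_recipe(
    acquire=lambda: {"taste": "spicy"},
    build_set=lambda f: ["boiled sliced meat", "stir-fried squid"],
    pick=lambda s: s[0],
    choose_mode=lambda r: "gesture",
    send=lambda r, m: (r, m),
)
```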
According to the recipe display method provided by the embodiments of the disclosure, feature information of a target object is first acquired, a recipe set is determined based on the feature information, a target recipe is determined from the recipe set, a target interaction mode corresponding to the target recipe is determined, and finally the target recipe is sent to a display device so that the display device displays the target recipe, the displayed target recipe being controlled in the target interaction mode. With this scheme, the user's feature information is collected and recognized, and different recipes are recommended to different recognized users according to their respective feature information; an interaction mode can be configured for the displayed recipe, and that interaction mode can then be used to control the display of the recipe. As a result, the user no longer needs to switch back and forth between cooking and checking the recipe during cooking; the viewing operation is simplified and the user experience is improved.
Fig. 2 is a schematic flow chart of another recipe display method provided in the embodiment of the present disclosure, and as shown in fig. 2, the method specifically includes:
s201, acquiring characteristic information of the target object.
S202, determining the identity information of the target object based on the characteristic information.
And S203, determining a recipe set corresponding to the identity information.
S201 has already been described with reference to Fig. 1 and is not repeated here; S202 and S203 are described together below:
In the embodiment of the present disclosure, after the feature information has been obtained, the identity of the target object is further confirmed from the feature information, and then the recipe set corresponding to that identity information is confirmed.
In an example, facial feature information of a target object is acquired through an AR device; from the facial feature information the target object can be confirmed to be user A, who prefers a spicy taste and the quick-fry cooking style. Several recipes are obtained by querying with these preferences, and the obtained recipes are confirmed as the recipe set.
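The identity confirmation of S202 and the preference lookup of S203 can be sketched as follows; the nearest-neighbour matching over a two-dimensional "facial embedding" is a deliberately reduced stand-in for a real face-recognition model on the AR device, and all names and values are illustrative:

```python
# Hypothetical user store: stored facial embedding plus dining preferences.
KNOWN_USERS = {
    "user_a": {"embedding": (0.9, 0.1), "taste": "spicy", "style": "quick-fry"},
    "user_b": {"embedding": (0.1, 0.8), "taste": "mild",  "style": "steam"},
}


def identify(embedding):
    """S202 sketch: return the known user whose stored embedding is closest."""
    def dist(e):
        return sum((a - b) ** 2 for a, b in zip(e, embedding))
    return min(KNOWN_USERS, key=lambda u: dist(KNOWN_USERS[u]["embedding"]))


# S203 sketch: the identified user's preferences drive the recipe-set query.
user = identify((0.85, 0.15))   # embedding captured by the AR device
prefs = KNOWN_USERS[user]
```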
And S204, selecting part or all of the recipes from the recipe set as the target recipes.
S205, obtaining kitchen ware information corresponding to a kitchen area; and taking the recipe which is finished by using the kitchen ware corresponding to the kitchen ware information in the recipe set as a target recipe.
Wherein, kitchen ware information can include: woks, ovens, microwave ovens, and the like.
S204 and S205 are explained collectively as follows:
In the embodiment of the disclosure, after the recipe set is determined, the target recipe can be determined through either of two steps: some or all of the recipes can be selected directly from the recipe set as the target recipe, or the kitchenware information in the kitchen can be acquired and the recipes in the recipe set that can be completed with the kitchenware available in the kitchen taken as the target recipe.
In an example, there are several recipes in the recipe set, and one, any subset, or all of them may be selected as target recipes. Alternatively, the kitchenware information in the kitchen may be acquired and compared with the kitchenware each recipe in the set requires during cooking; the recipes whose required kitchenware matches the kitchenware available in the kitchen are then determined as target recipes.
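The kitchenware matching of S205 can be sketched as a subset test: a recipe is a candidate target recipe only if every utensil it requires is available in the kitchen. The utensil names and recipe records are illustrative:

```python
def recipes_cookable(recipe_set, kitchen_utensils):
    """S205 sketch: keep recipes whose required utensils are all available."""
    available = set(kitchen_utensils)
    return [r for r in recipe_set if set(r["utensils"]) <= available]


# Illustrative recipe set with per-recipe utensil requirements.
RECIPE_SET = [
    {"name": "boiled sliced meat", "utensils": {"wok"}},
    {"name": "roast chicken",      "utensils": {"oven"}},
]

# The kitchen has a wok and a microwave but no oven, so only the first matches.
targets = recipes_cookable(RECIPE_SET, kitchen_utensils={"wok", "microwave"})
```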
And S206, sending the target recipe to a display device so that the display device can display the target recipe, and controlling the displayed target recipe in the target interaction mode.
In the embodiment of the disclosure, after the target recipe is determined, the target recipe is sent to the display device, and the display device displays the target recipe and correspondingly controls the target recipe according to the target interaction mode.
In an example, after receiving the target recipe, the AR device displays the target recipe on an AR interface, and correspondingly controls the displayed target recipe according to an indication of a target interaction manner.
S207, when the target interaction mode is gesture interaction, acquiring a target gesture action of a target object; and controlling the displayed target recipe by adopting the target gesture actions, wherein a plurality of groups of control operations in the target recipe are configured with a plurality of corresponding gesture actions.
S208, when the target interaction mode is human-computer interaction, acquiring a target trigger action of a target object, wherein the target trigger action is generated by a control button of the target recipe in a display interface; and controlling the displayed target recipe by adopting the target triggering action.
S207 and S208 are explained collectively as follows:
in the embodiment of the present disclosure, the target interaction manner may include: gesture interaction and man-machine interaction.
When the target interaction mode is gesture interaction, the target gesture action of the target object is acquired, and the target recipe is controlled according to the control instruction corresponding to the target gesture; the gesture interaction comprises more than one gesture action and the corresponding instructions. When the target interaction mode is human-computer interaction, corresponding control buttons are generated in the display interface of the target recipe, the target trigger operation of the target object on a control button is acquired, and the target recipe is controlled according to the control button corresponding to the target trigger operation.
In an example, the target interaction mode is gesture interaction, the AR device acquires a gesture action of the target object, the target object makes a gesture action corresponding to control to be performed within an acquirable range of the AR device, the AR device acquires a gesture interaction instruction corresponding to the gesture action, then the target recipe is controlled by the gesture interaction instruction, and different gesture actions correspond to different gesture interaction instructions.
In another example, the target interaction mode is human-computer interaction and the display interface of the AR device generates a control button. When the target recipe needs to be controlled, the user clicks the control button; the AR device recognizes the trigger operation of the target object and then controls the target recipe according to the control instruction of the triggered button.
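The gesture dispatch of S207 can be sketched as a lookup table from recognized gestures to control instructions; the gesture vocabulary below is hypothetical, since the description only states that multiple control operations are configured with corresponding gesture actions:

```python
# Hypothetical gesture vocabulary; the patent does not name specific gestures.
GESTURE_COMMANDS = {
    "swipe_left":  "next_page",
    "swipe_right": "previous_page",
    "palm_open":   "pause",
    "fist":        "resume",
}


def handle_gesture(gesture):
    """S207 sketch: map a recognized gesture to its control instruction."""
    command = GESTURE_COMMANDS.get(gesture)
    if command is None:
        return "ignored"   # unrecognized gesture: the recipe display is unchanged
    return command
```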
S209, when the display form of the target recipe is text display, performing page turning control on the displayed target recipe; or, when the display form of the target recipe is a video display, performing play control on the displayed target recipe, where the play control at least includes one of: fast forward, fast rewind, or pause.
In the embodiment of the present disclosure, the presentation form of the target recipe may include image-text display and video display. When the presentation form is image-text display, page-turning control is performed on the target recipe. When the presentation form is video display, video playback operations such as fast forward, fast rewind, or pause are performed on the target recipe.
In an example, the target recipe is presented as a video: the AR device plays the video on the display interface; if playback needs to be paused, the target object performs the corresponding interactive action, and the AR device, after detecting the action, controls the target recipe to pause playback.
In another example, the target recipe is presented as image-text: the AR device displays the image-text on the display interface; when the target object needs to turn a page of the target recipe, the target object performs the corresponding interactive action, and the AR device, after detecting the action, controls the target recipe to turn the page.
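The display-form-dependent controls above (page turning for image-text, playback control for video) can be sketched as a small state holder. This is a hypothetical sketch; the class and method names are not from the patent.

```python
# Hypothetical sketch of display-form-dependent control: page turning
# applies only to image-text display, playback control only to video.

class RecipeDisplay:
    def __init__(self, form: str, pages: int = 1):
        self.form = form          # "text" (image-text) or "video"
        self.page = 0             # current page, text display only
        self.paused = False       # playback state, video display only
        self.pages = pages

    def turn_page(self, direction: int) -> int:
        """Turn forward (+1) or backward (-1), clamped to valid pages."""
        if self.form != "text":
            raise RuntimeError("page turning only applies to image-text display")
        self.page = max(0, min(self.pages - 1, self.page + direction))
        return self.page

    def play_control(self, op: str) -> bool:
        """Apply a playback operation; returns whether playback is paused."""
        if self.form != "video":
            raise RuntimeError("playback control only applies to video display")
        if op == "pause":
            self.paused = True
        elif op in ("fast_forward", "fast_rewind"):
            self.paused = False
        return self.paused
```

Rejecting a control that does not match the current display form mirrors the scheme's separation of the two presentation forms.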
Optionally, in this disclosure, in a process of displaying the target recipe by the display device, motion information of the target object is detected, and a corresponding target interaction manner is determined based on the motion information.
Further, the action information of the target object may be acquired: while the target recipe is displayed, the action information of the target object is detected, and the corresponding interaction mode is determined from the detected action information. If the acquired action information is a gesture action, the target interaction mode is determined to be gesture interaction; if the acquired action information is a click action, the target interaction mode is determined to be human-computer interaction.
Alternatively, when the target recipe is displayed, if its presentation form is image-text display, the interaction mode is determined to be human-computer interaction and the trigger operation of the target object is acquired; if its presentation form is video display, the target interaction mode is determined to be gesture interaction and the gesture action of the target user is acquired.
In an example, when the action acquired by the AR device while the target recipe is displayed is a gesture action, the target interaction mode is determined to be gesture interaction, the gesture action of the target object is detected, and the target recipe is controlled according to it.
In another example, the AR device displays the target recipe in video form and plays it; the target interaction mode is determined to be human-computer interaction, a control button is generated on the display interface of the AR device, the trigger operation of the target object is detected, and the target recipe is controlled according to that trigger operation.
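The two ways of determining the target interaction mode described above can be sketched as two small selectors: one driven by the detected action information, one driven by the recipe's presentation form. This is a sketch under assumed label names, not the patent's implementation.

```python
# Sketch (assumed label names) of determining the target interaction mode,
# either from detected action information or from the display form.

def mode_from_action(action: str) -> str:
    """Gesture action -> gesture interaction; click action -> human-computer."""
    return "gesture" if action == "gesture_action" else "human_computer"

def mode_from_display_form(form: str) -> str:
    """Video display -> gesture interaction; image-text -> human-computer."""
    return "gesture" if form == "video" else "human_computer"
```

Either selector could feed the dispatch step; which one is used is an optional design choice in the scheme.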
According to the recipe display method provided by the embodiment of the present disclosure, recipes can be recommended according to the user's current cooking environment by recognizing the kitchen ware information in the target object's kitchen, and the interaction mode can be selected automatically by recognizing the display form of the target recipe or the action information of the target user. Thus the recipe recommendation is more flexible, the operation is convenient, and the user experience is improved.
Fig. 3 is a schematic flow chart of another recipe display method provided in the embodiment of the present disclosure, and as shown in fig. 3, the method specifically includes:
the device involved in the embodiments of the present disclosure may include: electronic equipment such as mobile phones, cameras, AR glasses (also display equipment) and the like.
The user (i.e., the target object) networks smart kitchen appliances (i.e., the kitchen ware information), such as a smart range hood and a smart chopping board, and configures them to the user side. The networking process may be as follows: the user installs the client on a mobile phone and logs in to an account; the appliance broadcasts a Bluetooth signal containing its device information; the client searches for the Bluetooth signal sent by the appliance and connects to it; the client then uploads the user's account and the device information to the server, completing the device networking operation.
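The networking flow above can be sketched as two steps: discovering the appliance's broadcast and binding it to the account on the server. The message shapes and field names below are hypothetical, chosen only to illustrate the flow.

```python
# Minimal sketch of the device-networking flow: the client discovers the
# appliance's Bluetooth broadcast, then reports the account and device
# information to the server. Message shapes are hypothetical.

def discover(broadcasts):
    """Return device info parsed from the first appliance broadcast seen."""
    for msg in broadcasts:
        if msg.get("type") == "appliance":
            return {"device_id": msg["device_id"], "model": msg["model"]}
    return None

def register(server_db, account, device):
    """Bind the discovered device to the user's account on the server."""
    server_db.setdefault(account, []).append(device)
    return server_db

# usage: one broadcast from a smart range hood, bound to one account
broadcasts = [{"type": "appliance", "device_id": "hood-01", "model": "range_hood"}]
server_db = {}
dev = discover(broadcasts)
register(server_db, "user@example.com", dev)
```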
First, the user enrolls face recognition on the mobile phone and enters his or her age, cooking frequency, taste type, and favorite recipe names (i.e., the feature information).
Second, when the user stands in the kitchen, the camera performs face recognition (i.e., acquires the feature information of the target object). It first determines whether the input image or video contains a face; if so, it further gives the position and size of each face and the positions of the main facial organs. Based on this information, the identity features of each face are extracted and compared against known faces, thereby identifying each face (i.e., determining the identity information of the target object based on the feature information).
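The identification step above (comparing extracted face features against known faces) can be sketched as a nearest-neighbor match with a distance threshold. This is a generic sketch of the comparison step only; the feature vectors, threshold value, and function names are assumptions, not the patent's algorithm.

```python
# Hypothetical sketch of face identification by comparing an extracted
# feature vector against enrolled users (nearest neighbor + threshold).

import math

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(feature, enrolled, threshold=0.6):
    """enrolled: {identity: feature_vector}. Returns the closest identity,
    or None when no enrolled face is within the threshold."""
    best, best_dist = None, float("inf")
    for identity, ref in enrolled.items():
        d = euclidean(feature, ref)
        if d < best_dist:
            best, best_dist = identity, d
    return best if best_dist <= threshold else None
```

Returning None for unknown faces lets the system fall back to default recommendations instead of misattributing a profile.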
Third, the recipe system records recipe information in advance, including the recipe name and the taste type corresponding to the recipe, and prepares the AR presentation material. Plane-click (i.e., human-computer interaction) and air-gesture (i.e., gesture interaction) interaction actions are configured for the AR presentation material.
Fourth, the user's identity information is obtained through face recognition, from which the user's age, cooking frequency, taste type, and favorite recipe names are retrieved. Recommended recipes are projected onto the wall by the camera's AR projection. The user selects a recipe, which is then displayed by AR; interaction is carried out through the preset air-gesture actions (i.e., the displayed target recipe is controlled in the target interaction mode), and the AR projection provides follow-up cooking guidance.
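The recommendation steps above can be sketched as a two-stage filter: match recipes to the identified user's profile, then keep only recipes whose required kitchen ware is present. The data shapes and field names below are illustrative assumptions.

```python
# Sketch of the recommendation step: filter recipes by the user's taste
# type, then by available kitchen ware. All field names are illustrative.

def recommend(recipes, profile, kitchenware):
    """Return names of recipes matching the profile and cookable with
    the kitchen ware available in the user's kitchen."""
    out = []
    for r in recipes:
        if r["taste"] != profile["taste_type"]:
            continue
        if not set(r["requires"]).issubset(kitchenware):
            continue
        out.append(r["name"])
    return out

# usage: only the spicy recipe matches, and its kitchen ware is available
recipes = [
    {"name": "steamed fish", "taste": "light", "requires": {"steamer"}},
    {"name": "stir-fry", "taste": "spicy", "requires": {"wok", "range_hood"}},
]
profile = {"taste_type": "spicy"}
```

The kitchen-ware filter corresponds to the optional step of taking, as the target recipe, a recipe that can be completed with the recognized kitchen ware.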
According to the recipe display scheme provided by the embodiment of the present disclosure, the feature information of a target object is first acquired, a recipe set is determined based on the feature information, a target recipe is determined from the recipe set, a target interaction mode corresponding to the target recipe is determined, and finally the target recipe is sent to the display device, so that the display device displays the target recipe and the displayed target recipe is controlled in the target interaction mode. With this scheme, the feature information of the user is acquired and recognized, different recipes are recommended for different recognized users according to their different feature information, an interaction mode can be set for the displayed recipe, and the display of the recipe can be controlled through that interaction mode. As a result, the user does not need to switch back and forth between cooking and checking the recipe during cooking, the checking operation is simplified, and the user experience is improved.
Fig. 4 is a schematic structural diagram of a recipe display device provided in an embodiment of the present disclosure, which specifically includes:
an acquisition unit 401 configured to acquire feature information of a target object;
a set unit 402, configured to determine a recipe set based on the feature information;
a recipe unit 403, configured to determine a target recipe from the recipe set;
an interaction unit 404, configured to determine a target interaction manner corresponding to the target recipe;
a sending unit 405, configured to send the target recipe to a display device, so that the display device displays the target recipe, and controls the displayed target recipe in the target interaction manner.
Optionally, in the apparatus according to any embodiment of the present disclosure, the apparatus further includes (not shown in the figure):
a detection unit, configured to detect motion information of the target object during a process of displaying the target recipe by the display device;
and the determining unit is used for determining the corresponding target interaction mode based on the action information.
Optionally, in an apparatus according to any embodiment of the present disclosure, the sending unit 405 includes (not shown in the figure):
the first gesture subunit is used for acquiring a target gesture action of a target object when the target interaction mode is gesture interaction;
and the second gesture subunit is used for controlling the displayed target recipe by adopting the target gesture action, wherein multiple groups of control operations in the target recipe are configured with corresponding gesture actions.
Optionally, in an apparatus according to any embodiment of the present disclosure, the sending unit 405 includes (not shown in the figure):
the first human-machine subunit is used for acquiring a target trigger action of the target object when the target interaction mode is human-computer interaction, the target trigger action being performed on a control button of the target recipe generated in the display interface;
and the second man-machine subunit is used for controlling the displayed target recipe by adopting the target triggering action.
Optionally, in an apparatus according to any embodiment of the present disclosure, the sending unit 405 includes (not shown in the figure):
the first control subunit is used for executing page turning control on the displayed target recipe when the display form of the target recipe is text display;
a second control subunit, configured to, when the presentation form of the target recipe is a video presentation, perform playback control on the presented target recipe, where the playback control at least includes one of: fast forward, fast rewind, or pause.
Optionally, in an apparatus according to any embodiment of the present disclosure, the aggregation unit 402 includes (not shown in the figure):
an identity subunit, configured to determine identity information of the target object based on the feature information;
and the set subunit is used for determining the recipe set corresponding to the identity information.
Optionally, in the apparatus according to any embodiment of the present disclosure, the recipe unit 403 includes (not shown in the figure):
a first recipe subunit configured to select a part or all of the recipes from the recipe set as the target recipe;
the second recipe subunit is used for acquiring kitchen ware information corresponding to the kitchen area, and for using, as the target recipe, a recipe in the recipe set whose cooking can be completed with the kitchen ware corresponding to the kitchen ware information.
The recipe display device provided in this embodiment may be the recipe display device shown in fig. 4, and may perform all the steps of the recipe display method shown in fig. 1 to 3, so as to achieve the technical effects of the recipe display method shown in fig. 1 to 3, and please refer to the description related to fig. 1 to 3 for brevity, which is not described herein again.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure, where the electronic device 500 shown in fig. 5 includes: at least one processor 501, memory 502, at least one network interface 504, and other user interfaces 503. The various components in the electronic device 500 are coupled together by a bus system 505. It is understood that the bus system 505 is used to enable connection communications between these components. The bus system 505 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 505 in FIG. 5.
The user interface 503 may include, among other things, a display, a keyboard or a pointing device (e.g., a mouse, trackball), a touch pad or a touch screen, among others.
It is to be understood that the memory 502 in embodiments of the present disclosure may be either volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. Volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), SyncLink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 502 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 502 stores elements, executable units or data structures, or a subset thereof, or an expanded set thereof as follows: an operating system 5021 and application programs 5022.
The operating system 5021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic services and processing hardware-based tasks. The application 5022 includes various applications, such as a Media Player (Media Player), a Browser (Browser), and the like, for implementing various application services. A program implementing the method of the embodiments of the present disclosure may be included in the application program 5022.
In the embodiment of the present disclosure, by calling a program or an instruction stored in the memory 502, specifically, a program or an instruction stored in the application 5022, the processor 501 is configured to execute the method steps provided by the method embodiments, for example, including:
acquiring characteristic information of a target object;
determining a recipe set based on the characteristic information;
determining a target recipe from the recipe set;
determining a target interaction mode corresponding to the target recipe;
and sending the target recipe to a display device so that the display device displays the target recipe, and controlling the displayed target recipe in the target interaction mode.
Optionally, in the method of any embodiment of the present disclosure, the method further includes:
detecting the action information of the target object in the process of displaying the target recipe by the display equipment;
and determining the corresponding target interaction mode based on the action information.
Optionally, in the method according to any embodiment of the present disclosure, the controlling the displayed target recipe in the target interaction manner includes:
when the target interaction mode is gesture interaction, acquiring target gesture action of a target object;
and controlling the displayed target recipe by adopting the target gesture actions, wherein a plurality of groups of control operations in the target recipe are configured with a plurality of corresponding gesture actions.
Optionally, in a method according to any embodiment of the present disclosure, the controlling the displayed target recipe in the target interaction manner includes:
when the target interaction mode is human-computer interaction, acquiring a target trigger action of a target object, wherein the target trigger action is generated by a control button of the target recipe in a display interface;
and controlling the displayed target recipe by adopting the target triggering action.
Optionally, in the method according to any embodiment of the present disclosure, the controlling the displayed target recipe includes:
when the display form of the target recipe is text display, performing page turning control on the displayed target recipe;
or,
when the display form of the target recipe is a video display, performing play control on the displayed target recipe, where the play control at least includes one of: fast forward, fast rewind, or pause.
Optionally, in the method according to any embodiment of the present disclosure, the determining a recipe set based on the feature information includes:
determining identity information of the target object based on the characteristic information;
and determining a recipe set corresponding to the identity information.
Optionally, in the method according to any embodiment of the present disclosure, the determining a target recipe from the recipe set includes:
selecting a part or all of the recipes from the recipe set as the target recipe;
or,
acquiring kitchen ware information corresponding to a kitchen area;
and taking the recipe which is finished by using the kitchen ware corresponding to the kitchen ware information in the recipe set as a target recipe.
The method disclosed in the embodiments of the present disclosure may be applied to the processor 501, or implemented by the processor 501. The processor 501 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 501. The processor 501 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present disclosure. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present disclosure may be directly implemented by a hardware decoding processor, or by a combination of hardware and software units in the decoding processor. The software units may be located in RAM, flash memory, ROM, PROM, EEPROM, registers, or other storage media well known in the art. The storage medium is located in the memory 502, and the processor 501 reads the information in the memory 502 and completes the steps of the method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented by means of units performing the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
The electronic device provided in this embodiment may be the electronic device shown in fig. 5, and may perform all the steps of the recipe display method shown in fig. 1 to 3, so as to achieve the technical effect of the recipe display method shown in fig. 1 to 3, and for brevity, please refer to the description related to fig. 1 to 3, which is not repeated herein.
The disclosed embodiments also provide a storage medium (computer-readable storage medium). The storage medium herein stores one or more programs. Among others, the storage medium may include volatile memory, such as random access memory; the memory may also include non-volatile memory, such as read-only memory, flash memory, a hard disk, or a solid state disk; the memory may also comprise a combination of memories of the kind described above.
When the one or more programs in the storage medium are executable by the one or more processors, the recipe presentation method executed on the electronic device side is implemented.
The processor is configured to execute the recipe display program stored in the memory to implement the following steps of the recipe display method executed on the electronic device side:
acquiring characteristic information of a target object;
determining a recipe set based on the characteristic information;
determining a target recipe from the recipe set;
determining a target interaction mode corresponding to the target recipe;
and sending the target recipe to a display device so that the display device displays the target recipe, and controlling the displayed target recipe in the target interaction mode.
Optionally, in the method of any embodiment of the present disclosure, the method further includes:
detecting the action information of the target object in the process of displaying the target recipe by the display equipment;
and determining the corresponding target interaction mode based on the action information.
Optionally, in the method according to any embodiment of the present disclosure, the controlling the displayed target recipe in the target interaction manner includes:
when the target interaction mode is gesture interaction, acquiring target gesture action of a target object;
and controlling the displayed target recipe by adopting the target gesture actions, wherein a plurality of groups of control operations in the target recipe are configured with a plurality of corresponding gesture actions.
Optionally, in the method according to any embodiment of the present disclosure, the controlling the displayed target recipe in the target interaction manner includes:
when the target interaction mode is human-computer interaction, acquiring a target trigger action of a target object, wherein the target trigger action is generated by a control button of the target recipe in a display interface;
and controlling the displayed target recipe by adopting the target triggering action.
Optionally, in the method according to any embodiment of the present disclosure, the controlling the displayed target recipe includes:
when the display form of the target recipe is text display, performing page turning control on the displayed target recipe;
or,
when the display form of the target recipe is video display, performing play control on the displayed target recipe, where the play control at least includes one of: fast forward, fast rewind, or pause.
Optionally, in a method according to any embodiment of the present disclosure, the determining a recipe set based on the feature information includes:
determining identity information of the target object based on the characteristic information;
and determining a recipe set corresponding to the identity information.
Optionally, in the method according to any embodiment of the present disclosure, the determining a target recipe from the recipe set includes:
selecting a part or all of the recipes from the recipe set as the target recipe;
or,
acquiring kitchen ware information corresponding to a kitchen area;
and taking the recipe which is finished by using the kitchen ware corresponding to the kitchen ware information in the recipe set as a target recipe.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), Read-Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The above-mentioned embodiments, objects, technical solutions and advantages of the present disclosure are described in further detail, it should be understood that the above-mentioned embodiments are merely illustrative of the present disclosure and are not intended to limit the scope of the present disclosure, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (10)

1. A recipe display method, characterized in that the method comprises:
acquiring characteristic information of a target object;
determining a recipe set based on the feature information;
determining a target recipe from the set of recipes;
determining a target interaction mode corresponding to the target recipe;
and sending the target recipe to display equipment so that the display equipment displays the target recipe, and controlling the displayed target recipe in the target interaction mode.
2. The method of claim 1, further comprising:
detecting the action information of the target object in the process of displaying the target recipe by the display equipment;
and determining the corresponding target interaction mode based on the action information.
3. The method of claim 2, wherein the controlling the target recipe displayed in the target interaction manner comprises:
when the target interaction mode is gesture interaction, acquiring target gesture action of a target object;
and controlling the displayed target recipe by adopting the target gesture actions, wherein a plurality of groups of control operations in the target recipe are configured with a plurality of corresponding gesture actions.
4. The method of claim 2, wherein the controlling the target recipe displayed in the target interaction manner comprises:
when the target interaction mode is human-computer interaction, acquiring a target trigger action of a target object, wherein the target trigger action is generated by a control button of the target recipe in a display interface;
and controlling the displayed target recipe by adopting the target trigger action.
5. The method of claim 1, wherein said controlling said target recipe for presentation comprises:
when the display form of the target recipe is text display, performing page turning control on the displayed target recipe;
or,
when the display form of the target recipe is video display, performing play control on the displayed target recipe, wherein the play control at least comprises one of the following steps: fast forward, fast rewind, or pause.
6. The method of claim 1, wherein determining the recipe set based on the characteristic information comprises:
determining identity information of the target object based on the characteristic information;
and determining a recipe set corresponding to the identity information.
7. The method of claim 6, wherein determining a target recipe from the set of recipes comprises:
selecting part or all of the recipes from the recipe set as the target recipes;
or,
acquiring kitchen ware information corresponding to a kitchen area;
and taking the recipe which is finished by using the kitchen ware corresponding to the kitchen ware information to perform cooking in the recipe set as a target recipe.
8. A recipe display device, characterized in that the device comprises:
an acquisition unit configured to acquire feature information of a target object;
a set unit for determining a recipe set based on the feature information;
a recipe unit for determining a target recipe from the set of recipes;
the interaction unit is used for determining a target interaction mode corresponding to the target recipe;
and the sending unit is used for sending the target recipe to display equipment so that the display equipment displays the target recipe, and the displayed target recipe is controlled by adopting the target interaction mode.
9. An electronic device, comprising:
a memory for storing a computer program;
a processor for executing the computer program stored in the memory, and when the computer program is executed, implementing the recipe display method of any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the recipe presentation method according to any one of the claims 1-7.
CN202210077912.3A 2022-01-21 2022-01-21 Recipe display method and device, electronic equipment and storage medium Pending CN114489441A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210077912.3A CN114489441A (en) 2022-01-21 2022-01-21 Recipe display method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114489441A true CN114489441A (en) 2022-05-13

Family

ID=81473395

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210077912.3A Pending CN114489441A (en) 2022-01-21 2022-01-21 Recipe display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114489441A (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8990274B1 (en) * 2012-05-10 2015-03-24 Audible, Inc. Generating a presentation associated with a set of instructions
CN108257441A (en) * 2017-12-31 2018-07-06 武汉烽火云创软件技术有限公司 Support the virtual culinary art training system and method based on motion sensing manipulation of more menus
US20190130786A1 (en) * 2017-10-27 2019-05-02 Sundaresan Natarajan Kumbakonam System and method for generating a recipe player
CN109996148A (en) * 2017-12-29 2019-07-09 青岛有屋科技有限公司 A kind of intelligent kitchen multimedia play system
CN110109596A (en) * 2019-05-08 2019-08-09 芋头科技(杭州)有限公司 Recommended method, device and the controller and medium of interactive mode
CN110874200A (en) * 2018-08-29 2020-03-10 阿里巴巴集团控股有限公司 Interaction method, device, storage medium and operating system
US20200233897A1 (en) * 2019-01-18 2020-07-23 Haier Us Appliance Solutions, Inc. Cooking engagement system equipped with a recipe application for combining third party recipe content
CN111444982A (en) * 2020-04-17 2020-07-24 文思海辉智科科技有限公司 Information processing method and device, electronic equipment and readable storage medium
CN111459054A (en) * 2020-04-14 2020-07-28 珠海格力电器股份有限公司 Recipe pushing method, equipment, storage medium and kitchen appliance
CN111723278A (en) * 2019-03-19 2020-09-29 佛山市顺德区美的电热电器制造有限公司 Menu recommendation method, device, recommendation system and related equipment
CN112017754A (en) * 2019-05-31 2020-12-01 青岛海尔智慧厨房电器有限公司 Menu recommendation method and device, range hood and storage medium
CN112256181A (en) * 2020-10-26 2021-01-22 北京达佳互联信息技术有限公司 Interaction processing method and device, computer equipment and storage medium
US20210082308A1 (en) * 2019-09-13 2021-03-18 Guangdong Midea Kitchen Appliances Manufacturing Co., Ltd. System and method for providing intelligent assistance for food preparation
CN112558753A (en) * 2019-09-25 2021-03-26 佛山市顺德区美的电热电器制造有限公司 Multimedia interaction mode switching method and device, terminal and storage medium
CN112579873A (en) * 2019-09-27 2021-03-30 北京安云世纪科技有限公司 Cooking recipe recommendation method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
EP3661187B1 (en) Photography method and mobile terminal
US20220292590A1 (en) Two-dimensional code identification method and device, and mobile terminal
CN107632895B (en) Information sharing method and mobile terminal
JP6270982B2 (en) Interactive input for background tasks
US20170024226A1 (en) Information processing method and electronic device
RU2654145C2 (en) Information search method and device and computer readable recording medium thereof
CN107678644B (en) Image processing method and mobile terminal
US9542949B2 (en) Satisfying specified intent(s) based on multimodal request(s)
CN114020203A (en) User interface for content streaming
US10712936B2 (en) First electronic device and information processing method applicable to first or second electronic device comprising a first application
CN106383638B (en) Payment mode display method and mobile terminal
CN106339436B (en) Picture-based shopping method and mobile terminal
WO2017161904A1 (en) Method and device for displaying wallpaper image
CN109218819B (en) Video preview method and mobile terminal
CN108366169B (en) Notification message processing method and mobile terminal
CN111309214A (en) Video interface setting method and device, electronic equipment and storage medium
CN107454255B (en) Lyric display method, mobile terminal and computer readable storage medium
CN113220178B (en) Application program control method and device
CN111079016A (en) Short video recommendation method and device and electronic equipment
US20180275756A1 (en) System And Method Of Controlling Based On A Button Having Multiple Layers Of Pressure
US10564812B2 (en) Information processing method and electronic device
WO2020000975A1 (en) Video capturing method, client, terminal, and medium
US9971413B2 (en) Positioning method and apparatus
EP3887930A1 (en) Methods, systems, and media for navigating user interfaces
CN111048126B (en) Menu broadcasting method, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination