CN113848745A - Menu generation method and device

Menu generation method and device

Info

Publication number
CN113848745A
Authority
CN
China
Prior art keywords
cooking
menu
intelligent
recipe
program
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111223281.3A
Other languages
Chinese (zh)
Inventor
龚连发
荀晶
李冰
秦文东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Home Appliances Group Co Ltd
Hisense Guangdong Kitchen and Bath System Co Ltd
Original Assignee
Hisense Home Appliances Group Co Ltd
Hisense Guangdong Kitchen and Bath System Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Home Appliances Group Co Ltd, Hisense Guangdong Kitchen and Bath System Co Ltd filed Critical Hisense Home Appliances Group Co Ltd
Priority to CN202111223281.3A
Publication of CN113848745A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 15/00 - Systems controlled by a computer
    • G05B 15/02 - Systems controlled by a computer electric
    • G05B 2219/00 - Program-control systems
    • G05B 2219/20 - Pc systems
    • G05B 2219/26 - Pc applications
    • G05B 2219/2643 - Oven, cooking

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Embodiments of the present application provide a recipe generation method and device, relate to the technical field of intelligent kitchen appliances, and can conveniently and quickly convert an ordinary recipe into an intelligent recipe that an intelligent kitchen appliance can execute, so as to provide more intelligent recipes for a user. The method includes: acquiring a long picture of an ordinary recipe, where the long picture records all cooking steps in the ordinary recipe; performing text recognition on the long picture to acquire text information of all the cooking steps in the ordinary recipe; processing the text information of all the cooking steps to obtain a cooking program; and generating an intelligent recipe according to the cooking program.

Description

Menu generation method and device
Technical Field
The present application relates to the technical field of intelligent kitchen appliances, and in particular to a recipe generation method and device.
Background
As people's living standards improve, demands on quality of life keep rising, and various intelligent kitchen appliances are appearing in people's kitchens to provide a better cooking experience. An intelligent kitchen appliance can be used together with an intelligent recipe to cook food automatically, which simplifies manual operation and provides a more convenient cooking mode for the user.
Most intelligent recipes are preset according to the cooking modes of the intelligent kitchen appliance, the recipe structure is simple, and the types of dishes the intelligent kitchen appliance can make according to these intelligent recipes are limited. When a user wants to try a different, new dish, the user may look up an existing ordinary recipe. However, because an ordinary recipe is not adapted to the intelligent kitchen appliance, the intelligent kitchen appliance cannot automatically cook the corresponding dish according to the ordinary recipe, and the user's demand for dish diversity cannot be met.
Disclosure of Invention
Embodiments of the present application provide a recipe generation method and device, which can conveniently and quickly convert an ordinary recipe into an intelligent recipe executable by an intelligent kitchen appliance and improve the diversity of intelligent recipes.
In a first aspect, an embodiment of the present application provides a recipe generation method, including: acquiring a long picture of an ordinary recipe, where the long picture records all cooking steps in the ordinary recipe; performing text recognition on the long picture to acquire text information of all the cooking steps in the ordinary recipe; processing the text information of all the cooking steps to obtain a cooking program; and generating an intelligent recipe according to the cooking program.
Based on the above technical solution, text recognition is performed on the long picture that records all the cooking steps in the ordinary recipe, so that the text information of all the cooking steps is obtained. The text information is then processed to obtain a cooking program that the intelligent kitchen appliance can execute directly, and an intelligent recipe matching the intelligent kitchen appliance is generated. Unlike current intelligent recipes, which are preset according to the cooking modes of the intelligent kitchen appliance, this solution only needs to acquire the long picture of the ordinary recipe to convert the ordinary recipe into an intelligent recipe conveniently and quickly, thereby providing a convenient cooking mode for the user.
In addition, this solution can convert a large number of existing ordinary recipes into intelligent recipes, which benefits the scalability of intelligent recipes and makes them more diverse, so as to meet the user's demand for trying different new dishes and improve the user experience.
In a second aspect, a recipe execution method is provided, including: receiving an instruction for instructing execution of an intelligent recipe, where the intelligent recipe includes a cooking program and the cooking program includes control programs for a plurality of cooking stages; in the process of executing the intelligent recipe, if the intelligent recipe contains target reminder information that needs to be broadcast before the control program of a target cooking stage to be executed, broadcasting the target reminder information, where the target cooking stage is any one of the plurality of cooking stages; receiving a confirmation instruction, where the confirmation instruction instructs execution of the control program of the target cooking stage; and executing the control program of the target cooking stage in response to the user's confirmation instruction.
Based on this technical solution, the intelligent kitchen appliance can broadcast relevant reminder information during automatic cooking, so that the user can perform the related auxiliary cooking operations in time and the cooking device can cook dishes that meet expectations based on the recipe.
In a third aspect, a recipe generation apparatus is provided, including: a transceiver unit, configured to acquire a long picture of an ordinary recipe, where the long picture records all cooking steps in the ordinary recipe; a recognition unit, configured to perform text recognition on the long picture and acquire text information of all the cooking steps in the ordinary recipe; and a processing unit, configured to process the text information of all the cooking steps to obtain a cooking program, and further configured to generate an intelligent recipe according to the cooking program.
In a fourth aspect, a recipe execution apparatus is provided, including: a transceiver unit, configured to receive an instruction for instructing execution of an intelligent recipe, where the intelligent recipe includes a cooking program and the cooking program includes control programs for a plurality of cooking stages; a broadcast unit, configured to broadcast target reminder information when, during execution of the intelligent recipe, the intelligent recipe contains target reminder information that needs to be broadcast before the control program of a target cooking stage to be executed, where the target cooking stage is any one of the plurality of cooking stages; the transceiver unit being further configured to receive a confirmation instruction, where the confirmation instruction instructs execution of the control program of the target cooking stage; and a processing unit, configured to execute the control program of the target cooking stage in response to the user's confirmation instruction.
In a fifth aspect, an electronic device is provided, comprising: one or more processors and memory; wherein the memory has stored therein one or more computer programs, the one or more computer programs comprising instructions, which when executed by the electronic device, cause the electronic device to perform any of the methods provided by the first aspect.
In a sixth aspect, there is provided an intelligent kitchen appliance, comprising: one or more processors and memory; wherein the memory has stored therein one or more computer programs, the one or more computer programs comprising instructions which, when executed by the smart kitchen appliance, cause the smart kitchen appliance to perform any one of the methods provided in the second aspect above.
In a seventh aspect, a computer-readable storage medium is provided, comprising computer instructions which, when run on a computer, cause the computer to perform any one of the methods provided in the first aspect or the second aspect.
In an eighth aspect, there is provided a computer program product comprising computer instructions which, when run on a computer, cause the computer to perform any one of the methods provided in the first or second aspects above.
For the technical effects of any possible implementation of the third to eighth aspects, reference may be made to the analysis of the corresponding beneficial effects in the first aspect or the second aspect, and details are not repeated here.
Drawings
fig. 1 is a schematic diagram of a system according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a server according to an embodiment of the present application;
fig. 4 is a flowchart of a recipe generation method according to an embodiment of the present application;
fig. 5 is a schematic diagram of a long picture of an ordinary recipe according to an embodiment of the present application;
fig. 6 is a schematic diagram of a display interface of a terminal device according to an embodiment of the present application;
fig. 7 is a schematic diagram of a display interface of another terminal device according to an embodiment of the present application;
fig. 8 is a schematic diagram of a long picture of another ordinary recipe according to an embodiment of the present application;
fig. 9 is a schematic diagram of a picture of an ordinary recipe according to an embodiment of the present application;
fig. 10 is a schematic diagram of another picture of an ordinary recipe according to an embodiment of the present application;
fig. 11 is a schematic diagram of a display interface of another terminal device according to an embodiment of the present application;
fig. 12 is a flowchart of another recipe generation method according to an embodiment of the present application;
fig. 13 is a flowchart of another recipe generation method according to an embodiment of the present application;
fig. 14 is a flowchart of another recipe generation method according to an embodiment of the present application;
fig. 15 is a schematic composition diagram of a recipe generation apparatus according to an embodiment of the present application;
fig. 16 is a schematic composition diagram of a recipe execution apparatus according to an embodiment of the present application.
Detailed Description
In the description of this application, "/" means "or" unless otherwise stated; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. Further, "at least one" means one or more, and "a plurality of" means two or more. The terms "first", "second", and the like are used to distinguish between different objects and do not limit the number or the order of execution.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
As described in the background, a large number of ordinary recipes already exist, but they are not adapted to intelligent kitchen appliances, so an intelligent kitchen appliance cannot automatically make the corresponding dish according to an ordinary recipe.
In view of the above technical problem, an embodiment of the present application provides a recipe generation method, which performs character recognition on a long picture of an ordinary recipe to obtain text information of all cooking steps in the ordinary recipe, and processes the text information of all the cooking steps to obtain a cooking program that an intelligent kitchen appliance can execute directly. Therefore, an intelligent recipe matching the intelligent kitchen appliance can be generated according to the obtained cooking program.
On this basis, a large number of existing ordinary recipes can be converted into intelligent recipes, which benefits the scalability of intelligent recipes and makes them more diverse, so as to meet the user's demand for trying different new dishes and improve the user's experience of using the intelligent kitchen appliance.
Fig. 1 schematically illustrates a system to which the method provided by the embodiment of the present application is applied. As shown in fig. 1, the system includes a terminal device and an intelligent kitchen appliance. The terminal device can be connected with the intelligent kitchen appliance so that the two can communicate with each other.
In some embodiments, an application related to the intelligent kitchen appliance may be installed on the terminal device, and the user may control the intelligent kitchen appliance through this application. In the embodiment of the present application, the user can upload a long picture of an ordinary recipe to the application, so that the ordinary recipe is converted, through the application, into an intelligent recipe that the intelligent kitchen appliance can directly recognize.
Illustratively, the terminal device may be any form of mobile terminal, such as a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), an augmented reality (AR) device, or a virtual reality (VR) device on which the application related to the intelligent kitchen appliance can be installed.
For example, the intelligent kitchen appliance may be a cooking device with an automatic cooking function, such as an intelligent rice cooker, an intelligent electric frying pan, an intelligent oven, an intelligent steam-and-bake oven, or a cooking robot, which is not limited here. The intelligent kitchen appliance can acquire an intelligent recipe, execute the received intelligent recipe, and complete the cooking of the corresponding dish according to the intelligent recipe.
Optionally, as shown in fig. 1, the system may further include a server, which may be a management server of the application related to the intelligent kitchen appliance. The server can be connected with the terminal device and the intelligent kitchen appliance respectively, so that the terminal device can exchange information with the intelligent kitchen appliance through the server.
Illustratively, the server may be a device having data processing capabilities as well as data storage capabilities. The server may be, for example, one server, or a server cluster composed of a plurality of servers, or one cloud computing service center, which is not limited to this.
Taking a mobile phone as an example of the terminal device in the embodiment of the present application, the general hardware architecture of the mobile phone is described below with reference to fig. 2.
As shown in fig. 2, the mobile phone 100 may specifically include: processor 101, Radio Frequency (RF) circuitry 102, memory 103, touch screen 104, bluetooth device 105, one or more sensors 106, Wi-Fi device 107, positioning device 108, audio circuitry 109, peripherals interface 110, and power system 111. These components may communicate over one or more communication buses or signal lines (not shown in fig. 2). Those skilled in the art will appreciate that the hardware configuration shown in fig. 2 is not intended to be limiting of the handset 100, and that the handset 100 may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
The processor 101 is a control center of the mobile phone 100, connects various parts of the mobile phone 100 by various interfaces and lines, and executes various functions of the mobile phone 100 and processes data by running or executing an application program (hereinafter, may be abbreviated as App) stored in the memory 103 and calling data stored in the memory 103. In some embodiments, processor 101 may include one or more processing units.
The RF circuit 102 may be used to receive and send wireless signals during information transmission and reception or during a call. In particular, the RF circuit 102 may receive downlink data from a base station and deliver it to the processor 101 for processing, and may send uplink data to the base station. Typically, the radio frequency circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency circuit 102 may also communicate with other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to the global system for mobile communications, general packet radio service, code division multiple access, wideband code division multiple access, long term evolution, email, and the short message service.
The memory 103 is used to store application programs and data, and the processor 101 executes various functions and data processing of the mobile phone 100 by running the application programs and data stored in the memory 103. The memory 103 mainly includes a program storage area and a data storage area: the program storage area can store an operating system and application programs (such as a sound playing function and an image playing function) required by at least one function, and the data storage area can store data (e.g., audio data, a phonebook, etc.) created from use of the mobile phone 100. Further, the memory 103 may include high-speed random access memory, and may also include non-volatile memory, such as a magnetic disk storage device, a flash memory device, or another non-volatile solid state storage device. The memory 103 may store various operating systems, such as the iOS operating system developed by Apple and the Android operating system developed by Google.
The touch screen 104 may include a touch pad 104-1 and a display 104-2. The touch pad 104-1 can capture touch events performed on or near it by the user of the mobile phone 100 (for example, an operation performed on or near the touch pad 104-1 with a finger, a stylus, or any other suitable object) and send the captured touch information to another component, such as the processor 101.
In the embodiment of the present application, the mobile phone 100 may further have a fingerprint recognition function. For example, the fingerprint acquisition device 112 may be disposed on the back of the mobile phone 100 (e.g., below the rear camera) or on the front of the mobile phone 100 (e.g., below the touch screen 104). For another example, the fingerprint acquisition device 112 may be configured in the touch screen 104 to implement the fingerprint recognition function, that is, the fingerprint acquisition device 112 may be integrated with the touch screen 104 to implement the fingerprint recognition function of the mobile phone 100. In this case, the fingerprint acquisition device 112 disposed in the touch screen 104 may be a part of the touch screen 104 or may be disposed in the touch screen 104 in another manner. The main component of the fingerprint acquisition device 112 in the present embodiment is a fingerprint sensor, which may employ any type of sensing technology, including but not limited to optical, capacitive, piezoelectric, or ultrasonic sensing technologies.
In the embodiment of the present application, the mobile phone 100 may further include a Bluetooth device 105 for enabling data exchange between the mobile phone 100 and other short-range terminals (e.g., another mobile phone, a smart watch, etc.). The Bluetooth device in the embodiment of the present application may be an integrated circuit or a Bluetooth chip.
The handset 100 may also include at least one sensor 106, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display of the touch screen 104 according to the brightness of ambient light, and a proximity sensor that turns off the power of the display when the mobile phone 100 is moved to the ear. As one of the motion sensors, the acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when stationary, and can be used for applications (such as horizontal and vertical screen switching, related games, magnetometer attitude calibration) for recognizing the attitude of the mobile phone 100, and related functions (such as pedometer and tapping) for vibration recognition; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone 100, further description is omitted here.
The Wi-Fi device 107 is used to provide network access for the mobile phone 100 according to Wi-Fi related standard protocols. The mobile phone 100 can access a Wi-Fi access point through the Wi-Fi device 107, which helps the user send and receive e-mails, browse web pages, access streaming media, and so on, and provides the user with wireless broadband Internet access. In other embodiments, the Wi-Fi device 107 can also act as a Wi-Fi wireless access point and provide Wi-Fi network access to other terminals.
The positioning device 108 is used to provide a geographical position for the mobile phone 100. It is understood that the positioning device 108 may be a receiver of a positioning system such as the Global Positioning System (GPS), the Beidou satellite navigation system, or the Russian GLONASS system. After receiving the geographical location sent by the positioning system, the positioning device 108 sends the information to the processor 101 for processing or to the memory 103 for storage. In still other embodiments, the positioning device 108 may also be a receiver of an Assisted Global Positioning System (AGPS), in which an assistance positioning server helps the positioning device 108 perform ranging and positioning services; in this case, the assistance positioning server communicates over a wireless communication network with the positioning device 108 (i.e., the GPS receiver) of a terminal such as the mobile phone 100 to provide positioning assistance. In other embodiments, the positioning device 108 may also use a positioning technology based on Wi-Fi access points. Because each Wi-Fi access point has a globally unique MAC address, the terminal can, with Wi-Fi enabled, scan and collect the broadcast signals of surrounding Wi-Fi access points and thereby obtain the MAC addresses broadcast by those access points; the terminal sends data capable of identifying the Wi-Fi access points (such as the MAC addresses) to a location server through the wireless communication network, and the location server retrieves the geographical location of each Wi-Fi access point, calculates the geographical location of the terminal according to the strength of the Wi-Fi broadcast signals, and sends it to the positioning device 108 of the terminal.
The audio circuit 109, the speaker 113, and the microphone 114 can provide an audio interface between the user and the mobile phone 100. The audio circuit 109 may convert received audio data into an electrical signal and transmit it to the speaker 113, which converts it into a sound signal for output; on the other hand, the microphone 114 converts a collected sound signal into an electrical signal, which the audio circuit 109 receives and converts into audio data, and the audio data is then output to the RF circuit 102 to be sent to, for example, another mobile phone, or output to the memory 103 for further processing.
The peripherals interface 110 is used to provide various interfaces for external input/output devices (e.g., a keyboard, a mouse, an external display, an external memory, a SIM card, etc.). For example, the terminal is connected to a mouse or a display through a universal serial bus (USB) interface, connected to a subscriber identity module (SIM) card provided by a telecommunications carrier through metal contacts in the SIM card slot, and communicates with other terminals through the interface of the Wi-Fi device 107, a near field communication (NFC) interface, a Bluetooth module interface, and the like. The peripherals interface 110 may be used to couple the aforementioned external input/output peripheral devices to the processor 101 and the memory 103.
The mobile phone 100 may further include a power supply device 111 (such as a battery and a power management chip) for supplying power to each component, and the battery may be logically connected to the processor 101 through the power management chip, so as to implement functions of managing charging, discharging, and power consumption through the power supply device 111.
Although not shown in fig. 2, the mobile phone 100 may further include a camera (front camera and/or rear camera), a flash, a micro-projector, a Near Field Communication (NFC) device, and the like, which are not described in detail herein.
The general hardware architecture of the server in fig. 1 is described below in conjunction with fig. 3.
As shown in fig. 3, server 200 includes a processor 201, a communication line 202, and a communication interface 203.
Further, the server 200 may also include a memory 204. The processor 201, the memory 204 and the communication interface 203 may be connected via a communication line 202.
The processor 201 may be a central processing unit (CPU), a general-purpose processor, a network processor (NP), a digital signal processor (DSP), a microprocessor, a microcontroller, a programmable logic device (PLD), or any combination thereof. The processor 201 may also be any other device with a processing function, such as a circuit, a device, or a software module, which is not limited here.
A communication line 202 for transmitting information between the respective components included in the server 200.
A communication interface 203 for communicating with other devices or other communication networks. The other communication network may be an ethernet, a Radio Access Network (RAN), a Wireless Local Area Network (WLAN), or the like. The communication interface 203 may be a module, a circuit, a transceiver, or any device capable of enabling communication.
A memory 204 for storing instructions. Wherein the instructions may be a computer program.
The memory 204 may be a read-only memory (ROM) or another type of static storage device that can store static information and/or instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and/or instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or another optical disc storage (including a compact disc, a laser disc, an optical disc, a digital versatile disc, a Blu-ray disc, etc.), a magnetic disk storage medium or another magnetic storage device, and the like, which is not limited here.
It is noted that the memory 204 may exist separately from the processor 201 or may be integrated with the processor 201. The memory 204 may be used for storing instructions or program code or some data etc. The memory 204 may be located inside the server 200 or outside the server 200, without limitation.
The processor 201 is configured to execute the instructions stored in the memory 204 to implement the methods provided by the following embodiments of the present application. For example, the processor 201 may execute the instructions stored in the memory 204 to implement the steps performed by the server in the embodiments described below.
In one example, processor 201 may include one or more CPUs, such as CPU0 and CPU1 in fig. 3.
As an alternative implementation, the server 200 includes multiple processors, for example, the processor 207 may be included in addition to the processor 201 in fig. 3.
As an alternative implementation, the server 200 further includes an output device 205 and an input device 206. Illustratively, the input device 206 is a keyboard, mouse, microphone, or joystick, among other devices, and the output device 205 is a display screen, speaker, among other devices.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
As shown in fig. 4, an embodiment of the present application provides a recipe generation method, including the following steps:
S101, the terminal device obtains a long picture of an ordinary recipe.
The long picture of the ordinary recipe records all cooking steps in the ordinary recipe. In addition, the long picture may also record other information, such as the ingredient information of the ordinary recipe.
For example, fig. 5 shows a long picture of an ordinary recipe for honey chicken wings. The long picture contains the full content of steps 1 to 7 for preparing the honey chicken wings, up to the point where the honey chicken wings are finished. In addition, as shown in fig. 5, the long picture may further include other information, such as the ingredient information for preparing the honey chicken wings.
As a possible implementation, the terminal device may directly obtain the long picture of the ordinary recipe.
For example, the long picture of the ordinary recipe may be the long picture shown in fig. 5, obtained through the long screenshot or scrolling screenshot function of the terminal device while the user reads the cooking steps of the ordinary recipe on the terminal device; the long picture carries the text information of the cooking steps of the ordinary recipe.
As another possible implementation, the terminal device may obtain multiple pictures of the ordinary recipe and splice them to generate the long picture of the ordinary recipe.
Each of the multiple pictures records part of the content of the ordinary recipe.
It should be understood that the multiple pictures of the ordinary recipe carry the same kind of content as described for the long picture: they also describe information related to the ordinary recipe, such as the cooking steps and the ingredient information. For example, the multiple pictures may be screenshots taken on the terminal device while the user reads the cooking steps of the ordinary recipe, and these screenshots carry the text information of the cooking steps.
When a user cooks with an intelligent kitchen appliance, the user may not find the desired dish in the existing intelligent recipes of the appliance, because the types of dishes the appliance can make according to its intelligent recipes are very limited. In this case, the user can instruct the terminal device to obtain multiple pictures of the ordinary recipe of that dish and convert the ordinary recipe into an intelligent recipe.
For example, when the user wants to make honey chicken wings using an intelligent oven, the user may open the recipe interface of the intelligent oven on the terminal device. As shown in (a) of fig. 6, the terminal device may display basic descriptions of the intelligent recipes, such as the recipe names and cover pictures of "braised pork in brown sauce", "cola chicken wings", and "roast steak", and the user may browse all the intelligent recipes of the intelligent oven on this interface. If the user does not find an intelligent recipe for honey chicken wings on the recipe interface of the intelligent oven, the user can click the add button 11 to upload pictures of an ordinary recipe for honey chicken wings that the user has found.
As shown in (b) of fig. 6, in response to the user clicking the add button 11, the terminal device may display an ordinary recipe upload interface including a picture add button 12 and a prompt box 13. As shown in (b) of fig. 6, the prompt box 13 may include "please upload the complete cooking steps" and "please upload the pictures in the order of the screenshots", and it prompts the user to upload the pictures of the ordinary recipe as required. If the user is ready to upload multiple pictures of the ordinary recipe for honey chicken wings, the user can click the picture add button 12. In response to this operation, the terminal device may display an album interface, which may include one or more photos stored on the terminal device. The user can select the pictures of the ordinary recipe in the order of the screenshots. Accordingly, the terminal device receives the user's selection of the photos and, in response, as shown in (a) of fig. 7, may display the multiple pictures of the ordinary recipe from left to right and from top to bottom in the order in which the user selected them.
In addition, as shown in (a) of fig. 7, the terminal device may further display a selection button 21 and a prompt "please check the pictures that include cooking steps" to remind the user to mark the pictures containing cooking steps, so that the terminal device can extract the text information of the cooking steps. If a picture uploaded by the user records cooking steps of the ordinary recipe for honey chicken wings, the user can click the selection button 21 corresponding to that picture. As shown in (b) of fig. 7, in response to the user clicking the selection buttons corresponding to picture 2, picture 3, and picture 4, the terminal device may display a selected mark on picture 2, picture 3, and picture 4.
Referring to (b) of fig. 7, the terminal device may further display a confirm button 22. After the user finishes checking picture 2, picture 3, and picture 4 and confirms that the uploaded pictures are correct, the user can click the confirm button 22 to finish uploading the multiple pictures of the ordinary recipe for honey chicken wings. Accordingly, in response to the user clicking the confirm button 22, the terminal device acquires the multiple pictures of the ordinary recipe for honey chicken wings.
Optionally, after acquiring the multiple pictures of the ordinary recipe, the terminal device may splice them in the upload order to generate the long picture of the ordinary recipe.
Alternatively, after acquiring the multiple pictures of the ordinary recipe, the terminal device may identify the at least one picture carrying the selected mark and splice the identified picture(s) to generate the long picture of the ordinary recipe. In this way, the terminal device can more accurately obtain the cooking steps of the ordinary recipe from the long picture.
For example, taking picture 1, picture 2, picture 3, and picture 4 shown in (a) of fig. 7 as an example, as shown in (a) of fig. 8, the terminal device splices picture 1, picture 2, picture 3, and picture 4 into a complete long picture from top to bottom according to the user's upload order.
Alternatively, as shown in (b) of fig. 8, the terminal device splices picture 2, picture 3, and picture 4, which carry the selected mark, into a complete long picture from top to bottom according to the user's upload order.
Optionally, before splicing the acquired pictures, the terminal device may perform a pixel comparison between every two adjacent pictures. If an overlapping part exists between two adjacent pictures, the terminal device crops the later of the two pictures to cut off the part that overlaps the earlier picture. The terminal device then uses the cropped pictures to splice the long picture.
For example, as shown in fig. 9, when the terminal device splices (a), (b), and (c) in fig. 9, it may first compare the pixels of (a) and (b) in fig. 9. If an overlapping part 1 exists between (a) and (b) in fig. 9, the terminal device may crop (b) in fig. 9 to cut off the overlapping part 1, so as to obtain (b2) in fig. 10.
Similarly, the terminal device may compare the pixels of (b) and (c) in fig. 9. If an overlapping part 2 exists between (b) and (c) in fig. 9, the terminal device may crop (c) in fig. 9 to cut off the overlapping part 2, so as to obtain (c2) in fig. 10.
The terminal device then splices (a) in fig. 9, (b2) in fig. 10, and (c2) in fig. 10 in the upload order of the pictures, so as to obtain the long picture of the ordinary recipe for honey chicken wings shown in fig. 5.
It should be understood that the long picture of the ordinary recipe for honey chicken wings acquired by the terminal device records the cooking steps completely and without repetition, which makes it easier for the terminal device to recognize the content of the long picture.
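For illustration, a minimal sketch of the overlap detection and splicing described above is given below. The use of Pillow and NumPy, the equal-width screenshot assumption, and the exact row-matching heuristic are assumptions made for this example and are not specified by the embodiment.

```python
# Minimal sketch of the overlap detection and splicing described above.
# Assumptions: screenshots have equal width, Pillow/NumPy are available, and an
# exact row-match heuristic is enough to find the overlapping part.
import numpy as np
from PIL import Image

def find_overlap_rows(prev: np.ndarray, nxt: np.ndarray, max_overlap: int = 400) -> int:
    """Number of rows at the top of `nxt` that duplicate the bottom of `prev`."""
    limit = min(max_overlap, prev.shape[0], nxt.shape[0])
    for rows in range(limit, 0, -1):
        if np.array_equal(prev[-rows:], nxt[:rows]):
            return rows
    return 0

def splice_long_picture(paths):
    """Splice screenshots top-to-bottom in upload order, cropping overlaps."""
    images = [np.asarray(Image.open(p).convert("RGB")) for p in paths]
    pieces = [images[0]]
    for prev, nxt in zip(images, images[1:]):
        overlap = find_overlap_rows(prev, nxt)
        pieces.append(nxt[overlap:])  # cut off the part that repeats the previous picture
    return Image.fromarray(np.vstack(pieces))

# splice_long_picture(["picture2.png", "picture3.png", "picture4.png"]).save("long_picture.png")
```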
S102, the terminal device performs text recognition on the long picture to acquire the text information of all the cooking steps in the ordinary recipe.
As a possible implementation, after acquiring the long picture of the ordinary recipe, the terminal device may perform text recognition on it by means of optical character recognition (OCR) to obtain all the text information on the long picture of the ordinary recipe. The terminal device then screens all the text information on the long picture to obtain the text information of all the cooking steps in the ordinary recipe.
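As an illustration of this text recognition step, the sketch below runs OCR over the spliced long picture. The choice of the pytesseract engine and the simplified Chinese language model is an assumption made for this example; the embodiment does not name a specific OCR engine.

```python
# Minimal OCR sketch; pytesseract and the "chi_sim" language model are
# assumptions made for this example, not requirements of the embodiment.
from PIL import Image
import pytesseract

def recognize_long_picture(path: str) -> str:
    """Return all text recognized on the long picture of the ordinary recipe."""
    return pytesseract.image_to_string(Image.open(path), lang="chi_sim")

# full_text = recognize_long_picture("long_picture.png")
```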
For example, after acquiring all the text information on the long picture of the ordinary recipe, the terminal device may identify step-related keywords in that text information. Illustratively, a step-related keyword may be "step 1", "the first step", and the like.
Assuming that N step-related keywords are identified, then for the first N-1 cooking steps, the terminal device may determine the text between the keyword of the i-th step and the keyword of the (i+1)-th step as the text information of the i-th cooking step, where i is any positive integer greater than or equal to 1 and less than or equal to N-1.
For the N-th step, the terminal device may take all the text following the keyword of the N-th step as the text information of the N-th cooking step. Alternatively, the terminal device may take the text from the keyword of the N-th step up to a keyword indicating the end as the text information of the N-th cooking step. For example, the keyword indicating the end may be "completed", "end", "cooking finished", and the like, which is not limited here.
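The segmentation rule above can be sketched as follows. The step-keyword and end-keyword patterns are illustrative assumptions for this example and are not limited by the embodiment.

```python
# Minimal sketch of splitting the recognized text into per-step segments using
# step-related keywords; the keyword patterns below are illustrative assumptions.
import re

STEP_PATTERN = re.compile(r"(?:步骤|第)\s*(\d+)\s*步?|step\s*(\d+)", re.IGNORECASE)
END_KEYWORDS = ("完成", "结束", "cooking finished")

def split_cooking_steps(full_text: str):
    """Return a list of (step_number, step_text) pairs."""
    matches = list(STEP_PATTERN.finditer(full_text))
    steps = []
    for i, m in enumerate(matches):
        start = m.end()
        end = matches[i + 1].start() if i + 1 < len(matches) else len(full_text)
        text = full_text[start:end]
        if i + 1 == len(matches):  # N-th step: optionally stop at an end keyword
            for keyword in END_KEYWORDS:
                position = text.find(keyword)
                if position != -1:
                    text = text[:position]
                    break
        steps.append((int(m.group(1) or m.group(2)), text.strip()))
    return steps
```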
S103, the terminal device processes the text information of all the cooking steps in the ordinary recipe to obtain a cooking program.
In some embodiments, after obtaining the text information of the cooking steps, the terminal device may classify it into text information of cooking steps to be executed by the intelligent kitchen appliance and text information of cooking steps that require the user's operation. The terminal device can then generate the cooking program according to the text information of the cooking steps to be executed by the intelligent kitchen appliance.
It should be understood that the generated cooking program is adapted to the intelligent kitchen appliance, and the intelligent kitchen appliance can execute the cooking program to cook the dish automatically.
In the present embodiment, a cooking program generally includes control programs for a plurality of cooking stages.
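One possible way to separate appliance-executed steps from user-operated steps is sketched below. The action keyword list is an assumption made for this example; the embodiment does not prescribe a specific classification rule.

```python
# Minimal sketch of classifying cooking steps; the appliance-action keyword list
# is an illustrative assumption, not a list taken from the embodiment.
APPLIANCE_ACTIONS = ("烤", "bake", "roast", "上下加热", "preheat")

def classify_steps(steps):
    """steps: list of (number, text). Returns (appliance_steps, user_steps)."""
    appliance_steps, user_steps = [], []
    for number, text in steps:
        if any(keyword in text for keyword in APPLIANCE_ACTIONS):
            appliance_steps.append((number, text))
        else:
            # Steps without a recognized appliance action (e.g. "marinate for
            # 30 minutes") are left to the user as manual operations.
            user_steps.append((number, text))
    return appliance_steps, user_steps
```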
S104, the terminal device generates an intelligent recipe according to the cooking program.
In some embodiments, the terminal device directly generates the intelligent recipe according to the cooking program.
In other embodiments, because text recognition may contain errors that lead to errors in the obtained cooking program, the terminal device may further receive a user's editing operation on the cooking program and, in response, edit the cooking program to obtain an edited cooking program. The terminal device then generates the intelligent recipe according to the edited cooking program.
Illustratively, after generating the cooking program, the terminal device may display a cooking program editing interface. As shown in (a) of fig. 11, the cooking program editing interface includes the cooking program of the honey chicken wings generated by the terminal device, as well as an edit button 41 and a confirm button 42. If the user finds that the cooking program generated by the terminal device is wrong, the user can click the edit button 41 to modify the text of the cooking program. After the modification is completed, the user may click the confirm button 42 to instruct the terminal device to generate the intelligent recipe.
Optionally, after generating the intelligent recipe, the terminal device may send it to the server automatically or under the user's instruction, so that the server stores the intelligent recipe.
Optionally, after the intelligent recipe is generated, the user may set the recipe to be visible only to himself or herself, only to friends, or to all users. The terminal device receives the user's visibility permission setting operation and, in response, sends a visibility permission setting instruction to the server, which instructs the server to configure the visibility permission of the recipe. For example, if the user sets the recipe to be visible only to himself or herself, the server does not send the recipe to other users.
In some embodiments, the terminal device may obtain recipe information other than the cooking steps from the acquired multiple pictures of the ordinary recipe and add the obtained recipe information to the intelligent recipe.
In one example, if the long picture of the ordinary recipe is spliced from all the pictures uploaded by the user, the terminal device may, after performing text recognition on the long picture and acquiring the text information of all the cooking steps, acquire the text information other than the cooking steps and add it to the intelligent recipe.
Taking the long picture of the ordinary recipe shown in fig. 5 as an example, the terminal device obtains the ingredient information of the honey chicken wings from all the text information on the long picture and adds the ingredient information to the intelligent recipe.
In another example, if the long picture of the ordinary recipe is spliced only from the pictures checked by the user, the terminal device may perform character recognition on the pictures not checked by the user, acquire the text information in those pictures, and add it to the intelligent recipe.
Optionally, the user may also edit information such as the recipe cover and the recipe name.
Illustratively, as shown in (b) of fig. 11, the terminal device may display a recipe editing interface including a recipe cover edit box 41 and a recipe name edit button 42. The recipe cover edit box 41 may include a cover add option 411. If the user is ready to add a recipe cover, the user may click the cover add option 411. The terminal device receives the user's click on the cover add option 411 and, in response, may display an album interface that may include one or more photos stored on the terminal device, from which the user can select a photo as the recipe cover. If the user is ready to edit the recipe name, the user may click the recipe name edit button 42 and enter the recipe name.
Compared with the current practice in which intelligent recipes are mostly preset according to the cooking modes of the intelligent kitchen appliance, this solution only needs to acquire the long picture of the ordinary recipe to convert the ordinary recipe into an intelligent recipe conveniently and quickly, thereby providing a convenient cooking mode for the user. In addition, this solution can convert a large number of existing ordinary recipes into intelligent recipes, which benefits the scalability of intelligent recipes and makes them more diverse, so as to meet the user's demand for trying different new dishes and improve the user experience.
As an alternative embodiment, as shown in fig. 12, the step S103 may be implemented as the following steps:
S1031, the terminal device performs keyword recognition on the text information of all the cooking steps in the ordinary recipe to obtain the recognized keywords.
The keywords include keywords of a first type, a second type, and a third type.
A keyword of the first type reflects a working mode of the intelligent kitchen appliance, such as upper heating, lower heating, upper and lower heating, or stewing, or an action performed by the intelligent kitchen appliance, such as baking, frying, boiling, stir-frying, or braising.
A keyword of the second type is a combination of a number and a temperature unit. The number may be written in Chinese or as an Arabic numeral, and the temperature unit may be written in Chinese or with Latin letters. For example, the recognized keywords of the second type may be 200 degrees Celsius, 25 degrees Celsius, 180 degrees Celsius, and the like.
A keyword of the third type is a combination of a number and a time unit. The number may be written in Chinese or as an Arabic numeral, and the time unit may be written in Chinese or with Latin letters. Illustratively, the recognized keywords of the third type may be 1 hour, 20 minutes, 3 hours, and the like.
Optionally, the terminal device may determine the recognition order of the first-type, second-type, and third-type keywords according to the category of the intelligent kitchen appliance.
In one example, taking the intelligent oven as an example, the terminal device may first perform second-type keyword recognition on the text information of all the cooking steps in the ordinary recipe to obtain the recognized second-type keywords, and then perform first-type and third-type keyword recognition on the text information to obtain the first-type and third-type keywords corresponding to each second-type keyword.
It should be noted that a cooking program of the intelligent oven is composed of a first-type keyword, a second-type keyword, and a third-type keyword. However, the ordinary recipe may include a cooking step operated by the user, such as "marinate the steak for 30 minutes", which carries a third-type keyword. In addition, the ordinary recipe may also include first-type keywords such as "roast" or "bake" that do not refer to a specific working mode of the intelligent oven.
Therefore, in some embodiments, to improve the accuracy and efficiency of the recognition, when generating an intelligent recipe for the intelligent oven, the terminal device may first recognize the second-type keywords and then recognize the first-type and third-type keywords.
Taking the above recipe for honey chicken wings as an example, the terminal device may first identify three second-type keywords: "160 degrees", "175 degrees", and "225 degrees".
The terminal device may then perform first-type and third-type keyword recognition on the text information of all the cooking steps in the ordinary recipe for honey chicken wings based on the recognized second-type keywords "160 degrees", "175 degrees", and "225 degrees", and, following a proximity rule, obtain the first-type and third-type keywords that appear close to each second-type keyword.
The terminal device may finally determine that the following keywords are recognized from the text information of all the cooking steps in the ordinary recipe for honey chicken wings:
1. "160 degrees", "bake", "10 minutes".
2. "175 degrees", "bake", "8 minutes".
3. "225 degrees", "bake", "2 minutes".
In another example, taking the intelligent rice cooker as an example, the terminal device may first perform first-type keyword recognition on the text information of all the cooking steps in the ordinary recipe to obtain the recognized first-type keywords, and then perform second-type or third-type keyword recognition on the text information to obtain the second-type or third-type keywords corresponding to each first-type keyword.
It should be noted that a cooking program of the intelligent rice cooker may be composed of a first-type keyword and a third-type keyword, for example "stew for 30 minutes", which contains no second-type keyword. Moreover, taking the porridge mode of the intelligent rice cooker as an example, the intelligent rice cooker can stop running automatically according to the doneness of the ingredients when cooking is finished, without setting a working time.
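The three keyword types can be recognized, for example, with simple pattern matching as sketched below. The mode vocabulary and the regular expressions are assumptions made for this example and are not limited by the embodiment.

```python
# Minimal sketch of recognizing the three keyword types in one step's text;
# the mode word list and the unit patterns are illustrative assumptions.
import re

MODE_WORDS = ("上下加热", "上加热", "下加热", "烤", "炸", "煮", "炖",
              "bake", "roast", "fry", "boil", "stew")
TEMP_PATTERN = re.compile(r"(\d+)\s*(?:度|摄氏度|℃|°C|degrees)")
TIME_PATTERN = re.compile(r"(\d+)\s*(分钟|小时|minutes?|hours?)")

def extract_keywords(step_text: str) -> dict:
    mode = next((w for w in MODE_WORDS if w in step_text), None)
    temp = TEMP_PATTERN.search(step_text)
    time = TIME_PATTERN.search(step_text)
    return {
        "mode": mode,                                         # first-type keyword
        "temperature": int(temp.group(1)) if temp else None,  # second-type keyword
        "time": time.group(0) if time else None,              # third-type keyword
    }

# extract_keywords("bake at 160 degrees for 10 minutes")
# -> {"mode": "bake", "temperature": 160, "time": "10 minutes"}
```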
S1032, the terminal device generates a cooking program according to the recognized keywords.
If the first-type keywords recognized by the terminal device in the text information of all the cooking steps in the ordinary recipe are generic actions such as "bake", the terminal device can automatically match them to the default working mode of the intelligent oven, "upper and lower heating". For the recipe for honey chicken wings, the resulting cooking program is shown in Table 1.
TABLE 1
Cooking stage      Cooking program
Cooking stage 1    160 °C + upper and lower heating + 10 minutes
Cooking stage 2    175 °C + upper and lower heating + 8 minutes
Cooking stage 3    225 °C + upper and lower heating + 2 minutes
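A sketch of assembling the recognized keywords into stage entries such as the rows of Table 1 is given below. The stage data structure and the generic-action set are assumptions made for this example; only the fallback to the oven's default "upper and lower heating" mode follows the description above.

```python
# Minimal sketch of generating the cooking program from the recognized keywords;
# `keyword_sets` is a list of dicts as returned by extract_keywords() above.
from dataclasses import dataclass

DEFAULT_OVEN_MODE = "upper and lower heating"
GENERIC_ACTIONS = {"bake", "roast", "烤", "烘烤"}

@dataclass
class CookingStage:
    mode: str
    temperature: int
    time: str

def build_cooking_program(keyword_sets):
    program = []
    for kw in keyword_sets:
        if kw["temperature"] is None:   # for an oven, a stage is anchored on a temperature keyword
            continue
        mode = kw["mode"]
        if mode is None or mode in GENERIC_ACTIONS:
            mode = DEFAULT_OVEN_MODE    # e.g. "bake" is mapped to the default working mode
        program.append(CookingStage(mode, kw["temperature"], kw["time"]))
    return program
```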
In some embodiments, after generating the cooking program, the terminal device may determine the reminder information corresponding to each cooking stage of the cooking program.
Specifically, a cooking program generally includes control programs for a plurality of cooking stages. The terminal device can extract, from the text information of all the cooking steps in the ordinary recipe, the reminder information corresponding to at least one of the plurality of cooking stages. The terminal device then generates the intelligent recipe according to the cooking program and the reminder information corresponding to the at least one cooking stage.
For example, according to the positions of the keywords of each cooking stage of the cooking program in the text information of all the cooking steps of the honey chicken wings, the text between the keywords of two adjacent cooking stages can be determined and used as the reminder information corresponding to the later of the two cooking stages. It should be understood that all text before the keyword of the first cooking stage can be used as the reminder information corresponding to the first cooking stage.
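The reminder extraction rule above can be sketched as follows. Representing the stage keyword positions as character offsets is an assumption made for this example.

```python
# Minimal sketch of attaching reminder information to each cooking stage.
# stage_keyword_positions: character offsets of each stage's keyword in the
# step text, in stage order (an assumed representation for this example).
def attach_reminders(steps_text: str, stage_keyword_positions):
    reminders = []
    previous = 0
    for position in stage_keyword_positions:
        # Text between the previous stage's keyword and this stage's keyword
        # becomes the reminder for this stage; for the first stage this is
        # everything before its keyword.
        reminders.append(steps_text[previous:position].strip())
        previous = position
    return reminders
```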
Based on this embodiment, the terminal device can recognize keywords in the text information of all the cooking steps in the ordinary recipe, so that the text information of the cooking program can be obtained quickly and the cooking program for the intelligent kitchen appliance can be generated conveniently and quickly.
It should be noted that the recipe generation methods provided in fig. 4 and fig. 12 are also applicable to the server. Taking step S101 as an example, after the terminal device acquires the multiple pictures of the ordinary recipe, it may send them to the server, and the server splices the received pictures of the ordinary recipe to obtain the long picture of the ordinary recipe.
As an alternative embodiment, as shown in fig. 13, an embodiment of the present application further provides an execution method of an intelligent recipe, where the method includes the following steps:
S201, the intelligent kitchen appliance receives an instruction instructing execution of the intelligent recipe.
For example, the user may input, on the terminal device, an operation to start executing the intelligent recipe; in response to the operation, the terminal device sends to the intelligent kitchen appliance an instruction to execute the intelligent recipe. Correspondingly, the intelligent kitchen appliance receives the execution instruction sent by the terminal device.
S202, in response to the instruction instructing execution of the intelligent recipe, the intelligent kitchen appliance executes the cooking program in the intelligent recipe.
After receiving the instruction for executing the intelligent recipe sent by the terminal device, the intelligent kitchen appliance responds to the instruction and obtains the intelligent recipe corresponding to it. The intelligent recipe comprises a cooking program and the reminding information corresponding to each cooking stage of the cooking program.
Further, the intelligent kitchen appliance executes the cooking program in the intelligent recipe, and whenever execution reaches any cooking stage, it broadcasts the reminding information corresponding to that cooking stage.
The cooking program generally comprises control programs for a plurality of cooking stages, and during cooking with the intelligent recipe the user may also need to perform some auxiliary operations before the intelligent kitchen appliance executes the control programs of certain cooking stages. In order to avoid conflict between the user's auxiliary operation and the execution of the control program of such a cooking stage by the intelligent kitchen appliance, as shown in fig. 14, step S202 may be implemented as:
S2021, the intelligent kitchen appliance judges whether the intelligent recipe contains target reminding information that needs to be broadcast before the control program of the target cooking stage is executed.
Since reminding information does not need to be broadcast before every cooking stage, the intelligent kitchen appliance can determine, before executing the control program of any cooking stage, whether that cooking stage has corresponding reminding information. If so, S2022-S2024 are executed; if not, S2024 is executed directly.
S2022, the intelligent kitchen appliance broadcasts the target reminding information.
Taking the above cooking program of the honey chicken wings as an example, before the program "160°C + upper and lower heating + 10 minutes" of cooking stage 1 is executed, the intelligent kitchen appliance can broadcast the text information corresponding to cooking stage 1.
S2023, the intelligent kitchen appliance receives a confirmation instruction from the user.
For example, after the intelligent kitchen appliance broadcasts the target reminding information, in order to ensure that the user has completed the cooking steps in the reminding information that require the user's operation, the intelligent kitchen appliance may broadcast voice information such as "Do you want to continue cooking?" or send a first instruction to the terminal device, where the first instruction is used to instruct the terminal device to display text information such as "Do you want to continue cooking?" on its screen, so as to ask the user whether to execute the control program of the target cooking stage immediately.
Further, the intelligent kitchen appliance may directly receive the confirmation instruction to continue cooking input by the user and execute step S2024 according to the confirmation instruction.
Alternatively, the terminal device may receive the confirmation instruction to continue cooking input by the user and send it to the intelligent kitchen appliance. Accordingly, the intelligent kitchen appliance receives the user's confirmation instruction and starts to execute step S2024 described below.
Optionally, after the intelligent kitchen appliance asks the user whether to execute the control program of the target cooking stage immediately, if no confirmation instruction is received from the user, the intelligent kitchen appliance may ask again after a preset time period.
S2024, the intelligent kitchen appliance executes the control program of the target cooking stage.
Further, steps S2021-S2024 are repeatedly performed until the control programs of all cooking stages have been completed.
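The loop over S2021-S2024 could be organized as below; this is a minimal sketch under the assumption that broadcast(), wait_for_confirmation() and run_stage() are placeholder stubs for the appliance's voice output, its link to the user or terminal device, and its heating control, and that a 30-second re-query interval stands in for the "preset time period".

```python
import time

QUERY_INTERVAL_S = 30  # assumed "preset time period" before asking the user again

def broadcast(message: str) -> None:
    print(f"[voice] {message}")                   # stand-in for the appliance's voice broadcast

def wait_for_confirmation(timeout_s: int) -> bool:
    time.sleep(min(timeout_s, 1))                 # stand-in for waiting on a user/terminal instruction
    return True                                   # assume the user confirms in this sketch

def run_stage(program: str) -> None:
    print(f"[appliance] executing: {program}")    # stand-in for the actual heating control

def execute_intelligent_recipe(stages: list) -> None:
    for stage in stages:                          # repeat S2021-S2024 for every cooking stage
        reminder = stage.get("reminder")          # S2021: does this stage have reminding information?
        if reminder:
            broadcast(reminder)                   # S2022: broadcast the target reminding information
            broadcast("Do you want to continue cooking?")
            while not wait_for_confirmation(QUERY_INTERVAL_S):   # S2023: wait for confirmation
                broadcast("Do you want to continue cooking?")    # re-query after the preset period
        run_stage(stage["program"])               # S2024: execute this stage's control program

execute_intelligent_recipe([
    {"reminder": "Marinate the chicken wings and place them in the oven.",
     "program": "160°C + upper and lower heating + 10 minutes"},
    {"program": "175°C + upper and lower heating + 8 minutes"},
])
```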
Based on the above embodiment, the intelligent kitchen appliance can broadcast the relevant reminding information during automatic cooking, so that the user can perform the related auxiliary cooking operations in time and the cooking device can, based on the intelligent recipe, cook dishes that meet expectations.
The above description has presented the scheme provided herein primarily from a methodological perspective. It is understood that, in order to realize the above functions, the terminal device and the server include corresponding hardware structures and/or software modules for performing each function. Those of skill in the art will readily appreciate that, in conjunction with the exemplary algorithm steps described for the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the present application, the terminal device and the server may be divided into functional modules according to the above method examples. For example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the present application is schematic and is only a logical function division; there may be other division manners in actual implementation.
Fig. 15 is a schematic composition diagram of a recipe generation apparatus according to an embodiment of the present application. As shown in fig. 15, the recipe generation apparatus 1000 includes a transceiver unit 1001, a processing unit 1002, and an identification unit 1003.
A transceiver unit 1001 for obtaining a long picture of a general recipe, the long picture being used for recording all cooking steps in the general recipe;
the identification unit 1003 is used for performing text identification on the long picture and acquiring text information of all cooking steps in the common menu;
the processing unit 1002 is used for processing the text information of all cooking steps in the common recipe to obtain a cooking program;
the processing unit 1002 is further configured to generate an intelligent recipe according to the cooking program.
In some embodiments, the transceiver unit 1001 is further configured to obtain multiple pictures of the common recipe captured by the user, where each of the multiple pictures is used to record part of the content of the common recipe; the processing unit 1002 is further configured to splice the multiple pictures to generate the long picture.
In some embodiments, the identification unit 1003 is specifically configured to perform keyword identification on the text information of all cooking steps in the common recipe and obtain the identified keywords; the processing unit 1002 is specifically configured to generate the cooking program according to the identified keywords.
In some embodiments, the keywords include a first type keyword, a second type keyword, and a third type keyword, the first type keyword is used for reflecting an operation mode of the smart kitchen appliance, the second type keyword is a combination word of a number and a temperature unit, and the third type keyword is a combination word of a number and a time unit.
In some embodiments, the processing unit 1002 is specifically configured to: responding to the editing operation of a user, and editing the cooking program to obtain the edited cooking program; and generating an intelligent menu according to the edited cooking program.
In some embodiments, the cooking program includes a control program for a plurality of cooking stages, and the processing unit 1002 is further configured to extract a reminding message corresponding to at least one of the plurality of cooking stages from text messages of all cooking steps in the common recipe; and generating an intelligent menu according to the cooking program and the reminding information corresponding to at least one cooking stage in the plurality of cooking stages.
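Purely as an illustration of the module split described above (the class and method names are assumptions, not the patent's implementation), the apparatus could be organized as follows:

```python
# Illustrative skeleton of the recipe generation apparatus 1000 and its functional units.
class TransceiverUnit:                      # unit 1001
    def obtain_long_picture(self, pictures: list) -> bytes:
        ...

class IdentificationUnit:                   # unit 1003
    def recognize_text(self, long_picture: bytes) -> str:
        ...                                 # e.g. text recognition (OCR) over the long picture

class ProcessingUnit:                       # unit 1002
    def build_cooking_program(self, step_text: str) -> list:
        ...
    def generate_intelligent_recipe(self, program: list, reminders: list = None) -> dict:
        ...

class RecipeGenerationApparatus:
    """Wires the three units together in the order of the method steps described above."""
    def __init__(self):
        self.transceiver = TransceiverUnit()
        self.identifier = IdentificationUnit()
        self.processor = ProcessingUnit()
```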
Fig. 16 is a schematic composition diagram of an apparatus for executing a recipe provided in an embodiment of the present application. As shown in fig. 16, the recipe execution apparatus 2000 includes a transceiver unit 2001, a processing unit 2002, and a broadcasting unit 2003.
A transceiving unit 2001 for receiving an instruction instructing execution of an intelligent recipe including a cooking program including a control program of a plurality of cooking stages.
In the process of executing the intelligent recipe, the broadcasting unit 2003 is configured to broadcast the target reminding information in the case that the intelligent recipe contains target reminding information that needs to be broadcast before the control program of the target cooking stage to be executed currently, where the target cooking stage is any one of the plurality of cooking stages.
The transceiver unit 2001 is further configured to receive a confirmation instruction from the user, where the confirmation instruction is used to instruct execution of the control program of the target cooking stage.
A processing unit 2002 for executing a control program of the target cooking stage in response to a confirmation instruction of the user.
The units in fig. 15 or fig. 16 may also be referred to as modules; for example, the processing unit may be referred to as a processing module. In the embodiments shown in fig. 15 or fig. 16, the names of the respective units may differ from those shown in the figures; for example, the transceiver unit may also be referred to as a communication unit.
The respective units in fig. 15 or fig. 16, if implemented in the form of software functional modules and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium, which includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The storage medium storing the computer software product includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Embodiments of the present application further provide a computer-readable storage medium, where the computer-readable storage medium includes computer-executable instructions, and when the computer-executable instructions are executed on a computer, the computer is caused to execute any one of the methods provided in the above embodiments.
An embodiment of the present application further provides a computer program product, which can be directly loaded into a memory and contains software code; after being loaded and executed by a computer, the computer program product can implement any one of the methods provided by the above embodiments.
An embodiment of the present application further provides a chip, including: a processing circuit and a transceiver pin, the processing circuit being coupled to the memory via the transceiver pin, the processing circuit causing any of the methods provided by the above embodiments to be performed when the processing circuit executes a computer program or computer-executable instructions in the memory.
Those skilled in the art will recognize that, in one or more of the examples described above, the functions described in the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on or transmitted over a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When software is used, the implementation may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer-executable instructions. When the computer-executable instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are generated wholly or partially. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer-executable instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
While the present application has been described in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a review of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely exemplary of the present application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the present application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method of generating a recipe, the method comprising:
acquiring a long picture of a common menu, wherein the long picture is used for recording all cooking steps in the common menu;
performing text recognition on the long picture to acquire text information of all cooking steps in the common menu;
processing the text information of all cooking steps in the common menu to obtain a cooking program;
and generating an intelligent menu according to the cooking program.
2. The method of claim 1, wherein obtaining a long picture of a common recipe comprises:
acquiring a plurality of pictures of the common menu, wherein each picture in the plurality of pictures is used for recording partial content of the common menu;
and splicing the plurality of pictures to generate the long picture.
3. The method of claim 1, wherein said processing the text information of all cooking steps in said common menu to obtain a cooking program comprises:
performing keyword identification on the text information of all cooking steps in the common menu to obtain identified keywords;
and generating the cooking program according to the identified key words.
4. The method of claim 3, wherein the keywords comprise a first type keyword, a second type keyword and a third type keyword, the first type keyword is used for reflecting an operation mode of the intelligent kitchen appliance, the second type keyword is a combination word of a number and a temperature unit, and the third type keyword is a combination word of a number and a time unit.
5. The method of any of claims 1 to 4, wherein said generating an intelligent recipe according to said cooking program comprises:
responding to the editing operation of a user, editing the cooking program to obtain the edited cooking program;
and generating the intelligent menu according to the edited cooking program.
6. The method of claim 1, wherein the cooking program comprises a control program for a plurality of cooking stages; the method further comprises the following steps:
extracting reminding information corresponding to at least one cooking stage in the plurality of cooking stages from text information of all cooking steps in the common menu;
the generating of the intelligent recipe according to the cooking program comprises:
and generating the intelligent menu according to the cooking program and the reminding information corresponding to at least one cooking stage in the plurality of cooking stages.
7. A method of executing a recipe, the method comprising:
receiving an instruction instructing execution of an intelligent recipe, the intelligent recipe including a cooking program, the cooking program including a control program for a plurality of cooking stages;
in the process of executing the intelligent menu, if target reminding information needing to be broadcasted before a control program of a target cooking stage to be executed currently exists in the intelligent menu, the target reminding information is broadcasted, wherein the target cooking stage is any one of the plurality of cooking stages;
receiving a confirmation instruction of a user, wherein the confirmation instruction of the user is used for instructing to execute a control program of the target cooking stage;
and responding to the confirmation instruction of the user, and executing the control program of the target cooking stage.
8. An apparatus for generating a recipe, the apparatus comprising:
the system comprises a receiving and sending unit, a processing unit and a processing unit, wherein the receiving and sending unit is used for obtaining a long picture of a common menu, and the long picture is used for recording all cooking steps in the common menu;
the identification unit is used for carrying out text identification on the long picture and acquiring text information of all cooking steps in the common menu;
the processing unit is used for processing the text information of all cooking steps in the common menu to obtain at least one cooking program;
the processing unit is further used for generating an intelligent menu according to the cooking program.
9. An apparatus for executing a recipe, the apparatus comprising:
a transceiver unit for receiving an instruction instructing execution of an intelligent recipe, the intelligent recipe including a cooking program, the cooking program including a control program for a plurality of cooking stages;
the broadcasting unit is used for broadcasting the target reminding information under the condition that the target reminding information which needs to be broadcasted before the control program of the target cooking stage to be executed currently exists in the intelligent menu in the process of executing the intelligent menu, wherein the target cooking stage is any one of the plurality of cooking stages;
the transceiver unit is further configured to receive a confirmation instruction, where the confirmation instruction is used to instruct execution of the control program of the target cooking stage;
and the processing unit is used for responding to a confirmation instruction of a user and executing the control program of the target cooking stage.
10. A computer-readable storage medium comprising computer instructions which, when executed on a computer, cause the computer to perform the method of any of claims 1 to 7.
CN202111223281.3A 2021-10-20 2021-10-20 Menu generation method and device Pending CN113848745A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111223281.3A CN113848745A (en) 2021-10-20 2021-10-20 Menu generation method and device


Publications (1)

Publication Number Publication Date
CN113848745A true CN113848745A (en) 2021-12-28

Family

ID=78982376


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100936048B1 (en) * 2009-03-26 2010-01-08 현대통신 주식회사 Tv for providing a cooking information in kitchen
US20150066516A1 (en) * 2013-09-03 2015-03-05 Panasonic Intellectual Property Corporation Of America Appliance control method, speech-based appliance control system, and cooking appliance
WO2019000988A1 (en) * 2017-06-30 2019-01-03 广东美的厨房电器制造有限公司 Recipe recommendation method, device and system, storage medium and computer device
CN110781874A (en) * 2019-09-17 2020-02-11 张玮 Method for converting electronic menu into menu of cooking machine and menu making terminal
CN110968748A (en) * 2019-11-29 2020-04-07 珠海优特智厨科技有限公司 Electronic menu processing method, device and system
CN113238508A (en) * 2021-05-27 2021-08-10 海信家电集团股份有限公司 Cooking reminding method and device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination