CN114343437A - Auxiliary cooking system and method based on voice recognition - Google Patents

Info

Publication number
CN114343437A
CN114343437A
Authority
CN
China
Prior art keywords
voice
user
instruction
intelligent kitchen
menu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210040707.XA
Other languages
Chinese (zh)
Other versions
CN114343437B (en
Inventor
糜鑫
邱栋泉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yolanda Technology Co ltd
Original Assignee
Shenzhen Yolanda Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yolanda Technology Co ltd filed Critical Shenzhen Yolanda Technology Co ltd
Priority to CN202210040707.XA priority Critical patent/CN114343437B/en
Publication of CN114343437A publication Critical patent/CN114343437A/en
Application granted granted Critical
Publication of CN114343437B publication Critical patent/CN114343437B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the application disclose an auxiliary cooking system and method based on voice recognition, belonging to the technical field of cooking auxiliary equipment. The method includes: acquiring a user menu selection instruction; acquiring a target menu based on the user menu selection instruction and displaying the target menu, the target menu comprising a plurality of operation steps; acquiring a device binding instruction; establishing a communication connection with at least one intelligent kitchen device based on the device binding instruction; acquiring a voice assistance activation instruction; acquiring a menu voice packet corresponding to the target menu based on the voice assistance activation instruction, the menu voice packet comprising voice files corresponding respectively to the plurality of operation steps of the target menu; and repeatedly acquiring user voice instructions and, based on each user voice instruction, broadcasting at least one voice file and/or controlling the at least one intelligent kitchen device to complete a target operation, until the user finishes cooking. The system and method have the advantages of assisting the user in cooking and improving the user's cooking experience.

Description

Auxiliary cooking system and method based on voice recognition
Technical Field
The invention mainly relates to the technical field of cooking auxiliary equipment, in particular to an auxiliary cooking system and method based on voice recognition.
Background
With the continuous development and progress of science and technology, more and more people are trying baking and drink-making at home. As users demand ever greater precision in weighing the ingredients prepared for baking and drink-making, intelligent kitchen devices and software that assist users in cooking have appeared.
However, existing intelligent kitchen devices and cooking-assistance software still require manual control. For example, the user must manually tap within the software to view the next operation step of a recipe; as another example, during preparation the user must manually operate the intelligent kitchen device to perform certain operations. This is inconvenient.
Therefore, it is desirable to provide an auxiliary cooking system and method based on voice recognition that assist the user in cooking and improve the user's cooking experience.
Disclosure of Invention
One of the embodiments of the present specification provides an auxiliary cooking method based on voice recognition, including: acquiring a user menu selection instruction; acquiring a target menu based on the user menu selection instruction and displaying the target menu, the target menu comprising a plurality of operation steps; acquiring a device binding instruction; establishing a communication connection with at least one intelligent kitchen device based on the device binding instruction; acquiring a voice assistance activation instruction; acquiring a menu voice packet corresponding to the target menu based on the voice assistance activation instruction, the menu voice packet comprising voice files corresponding respectively to the plurality of operation steps of the target menu; and repeatedly acquiring a user voice instruction and, based on the user voice instruction, broadcasting at least one voice file and/or controlling the at least one intelligent kitchen device to complete a target operation, until the user finishes cooking.
In some embodiments, establishing a communication connection with at least one intelligent kitchen device based on the device binding instruction includes: judging, based on the device binding instruction, whether the user terminal is already bound to at least one intelligent kitchen device; if the user terminal is bound to at least one intelligent kitchen device, acquiring and displaying the device information of the bound intelligent kitchen device and sending a new-binding prompt, judging whether at least one new intelligent kitchen device needs to be bound based on the user's feedback to the new-binding prompt, and, if so, establishing a communication connection with the at least one new intelligent kitchen device via Bluetooth based on that feedback; if the user terminal is not bound to any intelligent kitchen device, sending a binding prompt, judging whether at least one intelligent kitchen device needs to be bound based on the user's feedback to the binding prompt, and, if so, establishing a communication connection with the at least one intelligent kitchen device via Bluetooth based on that feedback.
In some embodiments, broadcasting at least one voice file based on the user voice instruction includes: replaying the voice file corresponding to the current operation step, or playing the voice file corresponding to the next operation step, based on the user voice instruction.
In some embodiments, the at least one intelligent kitchen device comprises at least one of a kitchen scale and kitchen processing equipment; the target operation includes at least one of zeroing the kitchen scale, switching the kitchen scale's measurement unit, controlling a processing temperature of the kitchen processing equipment, and controlling an operating time of the kitchen processing equipment.
In some embodiments, the method further comprises: acquiring the operation steps completed by the user, and generating a progress bar based on the completed operation steps; and displaying the progress bar.
One of the embodiments of the present specification provides an auxiliary cooking system based on voice recognition, including: a menu instruction acquisition module for acquiring a user menu selection instruction; a target menu acquisition module for acquiring a target menu based on the user menu selection instruction and displaying the target menu, the target menu comprising a plurality of operation steps; a binding instruction acquisition module for acquiring a device binding instruction; a device binding module for establishing a communication connection with at least one intelligent kitchen device based on the device binding instruction; an auxiliary instruction acquisition module for acquiring a voice assistance activation instruction; and a voice auxiliary module for acquiring a menu voice packet corresponding to the target menu based on the voice assistance activation instruction, the menu voice packet comprising voice files corresponding respectively to the plurality of operation steps of the target menu. The voice auxiliary module is further used for repeatedly acquiring a user voice instruction based on the voice assistance activation instruction and, based on the user voice instruction, broadcasting at least one voice file and/or controlling the at least one intelligent kitchen device to complete a target operation, until the user finishes cooking.
In some embodiments, the device binding module is further configured to: judge, based on the device binding instruction, whether the user terminal is already bound to at least one intelligent kitchen device; if the user terminal is bound to at least one intelligent kitchen device, acquire and display the device information of the bound intelligent kitchen device and send a new-binding prompt, judge whether at least one new intelligent kitchen device needs to be bound based on the user's feedback to the new-binding prompt, and, if so, establish a communication connection with the at least one new intelligent kitchen device via Bluetooth based on that feedback; and if the user terminal is not bound to any intelligent kitchen device, send a binding prompt, judge whether at least one intelligent kitchen device needs to be bound based on the user's feedback to the binding prompt, and, if so, establish a communication connection with the at least one intelligent kitchen device via Bluetooth based on that feedback.
In some embodiments, the voice auxiliary module is further configured to: replay the voice file corresponding to the current operation step, or play the voice file corresponding to the next operation step, based on the user voice instruction.
In some embodiments, the at least one intelligent kitchen device comprises at least one of a kitchen scale and kitchen processing equipment, and the target operation includes at least one of zeroing the kitchen scale, switching the kitchen scale's measurement unit, controlling a processing temperature of the kitchen processing equipment, and controlling an operating time of the kitchen processing equipment.
In some embodiments, the system further comprises a progress display module configured to acquire the operation steps completed by the user, generate a progress bar based on the completed operation steps, and display the progress bar.
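As a rough illustration of the progress display module's logic (the function names and the textual bar format below are assumptions for illustration only, not part of the disclosure), the progress-bar generation can be sketched as:

```python
def progress_fraction(completed_steps: int, total_steps: int) -> float:
    """Fraction of the recipe's operation steps completed by the user."""
    if total_steps <= 0:
        raise ValueError("recipe must have at least one step")
    return min(completed_steps, total_steps) / total_steps

def render_progress_bar(completed_steps: int, total_steps: int, width: int = 10) -> str:
    """Render a simple textual progress bar; a real APP would draw a widget."""
    filled = round(progress_fraction(completed_steps, total_steps) * width)
    return "[" + "#" * filled + "-" * (width - filled) + "]"
```

For example, a recipe with 10 operation steps of which 5 are complete would render as a half-filled bar.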
Drawings
The present application will be further explained by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals are used to indicate like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of a speech recognition-based assisted cooking system according to some embodiments of the present application;
FIG. 2 is an exemplary block diagram of a computing device shown in accordance with some embodiments of the present application;
FIG. 3 is an exemplary flow diagram of a method of assisted cooking based on speech recognition according to some embodiments of the present application.
In the figures: 100, application scenario; 110, processing device; 120, network; 130, user terminal; 140, storage device; 210, processor; 220, read-only memory; 230, random access memory; 240, communication port; 250, input/output interface; 260, hard disk.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. The drawings in the following description are merely examples or embodiments of the application, from which a person of ordinary skill in the art can, without inventive effort, apply the application to other similar scenarios. These exemplary embodiments are provided solely to enable those skilled in the relevant art to better understand and implement the invention, and are not intended to limit its scope in any way. Unless otherwise apparent from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "apparatus", "unit", and/or "module" as used herein are terms for distinguishing different components, elements, parts, portions, or assemblies at different levels. Other words may be substituted if they accomplish the same purpose.
As used in this application and the appended claims, the singular forms "a", "an", and "the" include plural referents unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Although various references are made herein to certain modules or units in a system according to embodiments of the present application, any number of different modules or units may be used and run on a client and/or server. The modules are merely illustrative and different aspects of the systems and methods may use different modules.
Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the operations are not necessarily performed in the exact order shown. Rather, the various steps may be processed in reverse order or simultaneously. Other operations may also be added to these processes, or one or more steps may be removed from them.
Fig. 1 is a schematic diagram of an application scenario 100 of a speech recognition-based assisted cooking system according to some embodiments of the present application.
As shown in fig. 1, the application scenario 100 may include a processing device 110, a network 120, a user terminal 130, and a storage device 140.
In some embodiments, the processing device 110 may be used to process information and/or data related to assisted cooking. For example, the processing device 110 may be used to recognize a user voice instruction. In some embodiments, the processing device 110 may be local or remote. For example, processing device 110 may access information and/or data stored in the user terminal 130 and the storage device 140 via the network 120. In some embodiments, processing device 110 may be directly connected to the user terminal 130 and the storage device 140 to access the information and/or data stored therein. In some embodiments, the processing device 110 may execute on a cloud platform. In some embodiments, the processing device 110 may include a processor 210, and the processor 210 may include one or more sub-processors (e.g., a single-core processing device or a multi-core processing device). Merely by way of example, the processor 210 may comprise a central processing unit (CPU), an application-specific integrated circuit (ASIC), the like, or any combination thereof.
The network 120 may facilitate the exchange of data and/or information in the application scenario 100. In some embodiments, one or more components in the application scenario 100 (e.g., the processing device 110, the user terminal 130, and the storage device 140) may send data and/or information to other components in the application scenario 100 via the network 120. For example, the processing device 110 may transmit the target recipe to the user terminal 130 via the network 120. In some embodiments, the network 120 may be any type of wired or wireless network. For example, the network 120 may include a cable network, a wired network, a fiber-optic network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a Bluetooth network, a ZigBee network, a near-field communication (NFC) network, the like, or any combination thereof.
The user terminal 130 is a terminal device used by a user. In some embodiments, the user terminal 130 may acquire information or data related to auxiliary cooking. For example, the user terminal 130 may obtain a user menu selection instruction; the target recipe is acquired from the processing device 110 through the network 120 based on the user recipe selection instruction, and the target recipe is displayed. For another example, the user terminal 130 may obtain a device binding instruction; and establishing communication connection with at least one intelligent kitchen device based on the device binding instruction. In some embodiments, the user terminal 130 may include one or any combination of a mobile device, a tablet, a laptop, and the like.
In some embodiments, the user terminal 130 may be configured with an auxiliary cooking system based on voice recognition to assist the user in cooking, and the auxiliary cooking system based on voice recognition may be used as an application software (APP) on the user terminal 130.
In some embodiments, a voice recognition-based auxiliary cooking system may include a recipe instruction acquisition module, a target recipe acquisition module, a binding instruction acquisition module, an equipment binding module, an auxiliary instruction acquisition module, and a voice auxiliary module.
The menu instruction acquisition module can be used for acquiring a menu selection instruction of a user.
The target menu obtaining module may be configured to obtain a target menu based on the user menu selection instruction and display the target menu, where the target menu includes a plurality of operation steps.
The binding instruction obtaining module may be configured to obtain a device binding instruction.
The device binding module may be configured to establish a communication connection with at least one smart kitchen device based on the device binding instruction.
The auxiliary instruction acquisition module may be used to acquire a voice assistance activation instruction.
The voice auxiliary module may be used to acquire a menu voice packet corresponding to the target menu based on the voice assistance activation instruction, the menu voice packet comprising voice files corresponding respectively to the plurality of operation steps of the target menu. The voice auxiliary module is further used for repeatedly acquiring a user voice instruction based on the voice assistance activation instruction and, based on the user voice instruction, broadcasting at least one voice file and/or controlling at least one intelligent kitchen device to complete a target operation, until the user finishes cooking.
For more description of the menu instruction obtaining module, the target menu obtaining module, the binding instruction obtaining module, the device binding module, the auxiliary instruction obtaining module, and the voice auxiliary module, reference may be made to fig. 3 and related description thereof, which are not repeated herein.
In some embodiments, the storage device 140 may be connected to the network 120 to enable communication with one or more components of the application scenario 100 (e.g., the processing device 110, the user terminal 130, etc.). One or more components of the application scenario 100 may access data or instructions stored in the storage device 140 via the network 120. In some embodiments, the storage device 140 may be directly connected to, or communicate with, one or more components in the application scenario 100 (e.g., the processing device 110, the user terminal 130). In some embodiments, the storage device 140 may be part of the processing device 110. In some embodiments, the processing device 110 may also be located in the user terminal 130.
It should be noted that the foregoing description is provided for illustrative purposes only and is not intended to limit the scope of the present application. Many variations and modifications will occur to those skilled in the art in light of the teachings herein. The features, structures, methods, and other characteristics of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, the storage device 140 may be a data storage device on a cloud computing platform, such as a public cloud, a private cloud, a community cloud, a hybrid cloud, and the like. However, such changes and modifications do not depart from the scope of the present application.
FIG. 2 is an exemplary block diagram of a computing device shown in accordance with some embodiments of the present application.
In some embodiments, processing device 110 and/or user terminal 130 may be implemented on computing device 200. For example, processing device 110 may implement and execute the voice-recognition-based auxiliary cooking method disclosed herein on computing device 200.
As shown in fig. 2, computing device 200 may include a processor 210, a read only memory 220, a random access memory 230, a communication port 240, an input/output interface 250, and a hard disk 260.
The processor 210 may execute computing instructions (program code) and perform the functions of the processing device 110 described herein. The computing instructions may include programs, objects, components, data structures, procedures, modules, and functions (which perform the specific functions described herein). For example, the processor 210 may recognize a user voice instruction and, based on the user voice instruction, broadcast a voice file or control an intelligent kitchen device to complete a target operation. In some embodiments, the processor 210 may include a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field-programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device, any circuit or processor capable of executing one or more functions, or the like, or any combination thereof. For illustration only, the computing device 200 in FIG. 2 depicts only one processor, but it should be noted that the computing device 200 in the present application may also include multiple processors.
The memory of the computing device 200 (e.g., the read-only memory (ROM) 220, the random access memory (RAM) 230, the hard disk 260, etc.) may store data/information obtained from any other component of the application scenario 100, for example, a target recipe retrieved from the storage device 140. Exemplary ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), digital versatile disc ROM, and the like. Exemplary RAM may include dynamic RAM (DRAM), double-data-rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), and the like.
The input/output interface 250 may be used to input or output signals, data, or information. In some embodiments, the input/output interface 250 may enable a user to interface with the computing device 200. For example, the user inputs a user recipe selection instruction to the computing device 200 via the input/output interface 250. In some embodiments, input/output interface 250 may include an input device and an output device. Exemplary input devices may include a keyboard, mouse, touch screen, microphone, and the like, or any combination thereof. Exemplary output devices may include a display device, speakers, printer, projector, etc., or any combination thereof. Exemplary display devices may include Liquid Crystal Displays (LCDs), Light Emitting Diode (LED) based displays, flat panel displays, curved displays, television equipment, Cathode Ray Tubes (CRTs), and the like, or any combination thereof. The communication port 240 may be connected to a network for data communication. The connection may be a wired connection, a wireless connection, or a combination of both. The wired connection may include an electrical cable, an optical cable, or a telephone line, among others, or any combination thereof. The wireless connection may include bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, mobile networks (e.g., 3G, 4G, or 5G, etc.), and the like, or any combination thereof. In some embodiments, the communication port 240 may be a standardized port, such as RS232, RS485, and the like. In some embodiments, the communication port 240 may be a specially designed port.
For purposes of illustration only, computing device 200 is depicted with a single central processor and/or processor. However, it should be noted that the computing device 200 in the present application may include multiple central processing units and/or processors; thus, the operations and/or methods described in the present application as implemented by one central processing unit and/or processor may also be implemented by multiple central processing units and/or processors, jointly or independently. For example, the central processor and/or processor of computing device 200 may perform steps A and B; alternatively, steps A and B may be performed by two different central processors and/or processors in computing device 200, either jointly or separately (e.g., a first processor performs step A and a second processor performs step B, or the first and second processors perform steps A and B together).
Fig. 3 is an exemplary flowchart of a method of assisted cooking based on speech recognition according to some embodiments of the present application. As shown in fig. 3, the voice recognition-based auxiliary cooking method may include the following steps. In some embodiments, the voice recognition based assisted cooking method may be performed by the user terminal 130 or the computing device 200.
Step 310, obtaining a menu selection instruction of the user. In some embodiments, step 310 may be performed by the recipe instruction acquisition module.
The user menu selection instruction represents the target menu selected by the user. In some embodiments, the user may input the user menu selection instruction to the menu instruction acquisition module by voice input, tapping the screen, or the like. For example, icons of multiple recipes may be displayed on the display screen of the user terminal 130 or the computing device 200; when the user taps the icon of a recipe, the user terminal 130 or the computing device 200 automatically obtains the user menu selection instruction. Illustratively, the user opens the APP, browses today's menu, selects a favorite recipe as the target menu, enters the tutorial card, and begins preparing to cook.
Step 320, acquiring a target menu based on the user menu selection instruction and displaying the target menu. In some embodiments, step 320 may be performed by the target recipe acquisition module.
In some embodiments, the target recipe acquisition module may acquire the target recipe from the processing device 110, the user terminal 130, the storage device 140, or an external data source based on the user menu selection instruction, wherein the target recipe includes a plurality of operation steps. After acquiring the target recipe, the target recipe acquisition module may display it on the display screen of the user terminal 130 or the computing device 200.
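As a minimal sketch of this step, the lookup might resolve a recipe identifier carried by the selection instruction against a recipe store; the dataclass, store contents, and field names below are hypothetical stand-ins for the processing device 110 or storage device 140, not the patent's actual data model:

```python
from dataclasses import dataclass

@dataclass
class Recipe:
    recipe_id: str
    name: str
    steps: list  # ordered operation steps displayed to the user

# Hypothetical in-memory store standing in for a local or remote data source.
RECIPE_STORE = {
    "r001": Recipe("r001", "Sponge Cake",
                   ["Weigh 100 g flour", "Whisk 3 eggs", "Bake 25 min at 170 C"]),
}

def get_target_recipe(selection_instruction: dict) -> Recipe:
    """Resolve a user menu selection instruction to the target recipe."""
    recipe_id = selection_instruction["recipe_id"]
    recipe = RECIPE_STORE.get(recipe_id)
    if recipe is None:
        raise KeyError(f"unknown recipe: {recipe_id}")
    return recipe
```

The returned recipe's operation steps would then be rendered on the display screen.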
Step 330, obtain the device binding instruction. In some embodiments, step 330 may be performed by the binding instruction fetch module.
The device binding instruction indicates that the user wants the user terminal 130 or the computing device 200 to establish a communication connection with at least one intelligent kitchen device. The intelligent kitchen device may be, for example, a kitchen scale or kitchen processing equipment: the kitchen scale may be used to weigh ingredients, and the kitchen processing equipment may be used to process ingredients. Kitchen processing equipment may include ovens, stoves, rice cookers, and the like. In some embodiments, the user may input the device binding instruction to the binding instruction acquisition module by voice input, tapping the screen, or the like.
Step 340, establishing a communication connection with at least one intelligent kitchen device based on the device binding instruction. In some embodiments, step 340 may be performed by the device binding module.
In some embodiments, the device binding module establishing a communication connection with at least one smart kitchen device based on the device binding instruction may include:
judging whether the user terminal is bound with at least one intelligent kitchen device or not based on the device binding instruction;
if the user terminal is bound to at least one intelligent kitchen device, acquiring and displaying the device information of the bound intelligent kitchen device and sending a new-binding prompt, where the new-binding prompt may be a voice prompt or an identifier or text displayed on the display screen; judging whether at least one new intelligent kitchen device needs to be bound based on the user's feedback to the new-binding prompt; and, if so, establishing a communication connection with the at least one new intelligent kitchen device via Bluetooth based on that feedback. In some embodiments, the user may input the feedback to the new-binding prompt to the device binding module by voice input, tapping the screen, or the like;
if the user terminal is not bound to any intelligent kitchen device, sending a binding prompt, where the binding prompt may be a voice prompt or an identifier or text displayed on the display screen; judging whether at least one intelligent kitchen device needs to be bound based on the user's feedback to the binding prompt; and, if so, establishing a communication connection with the at least one intelligent kitchen device via Bluetooth based on that feedback. In some embodiments, the user may input the feedback to the binding prompt to the device binding module by voice input, tapping the screen, or the like.
Illustratively, the device binding module detects whether an intelligent kitchen device has already been bound; if not, the user taps the plus sign in the APP tutorial to add a device. The module then detects whether Bluetooth is enabled; if not, it prompts the user to enable Bluetooth. It next detects whether a device is connected; if not, it prompts the user to power on the intelligent kitchen device. Once the device is powered on, the APP searches via Bluetooth and connects to the intelligent kitchen device automatically; when the tutorial interface displays the device's icon, the connection is successful.
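The binding flow described above can be sketched as a short sequence of state checks. This is a minimal illustration only; the state keys and prompt texts are assumptions, since the source does not specify the APP's internal interfaces:

```python
# Hypothetical sketch of the device-binding flow: check bound state,
# Bluetooth state, and device power state, prompting the user as needed.
# All key names and prompt strings are illustrative assumptions.

def bind_device(state, prompts):
    """Walk the binding checks. `state` is a dict of booleans;
    `prompts` collects the prompts shown to the user."""
    if not state.get("device_bound"):
        prompts.append("tap '+' in the APP tutorial to add a device")
    if not state.get("bluetooth_on"):
        prompts.append("please enable Bluetooth")
        state["bluetooth_on"] = True   # assume the user complies
    if not state.get("device_powered"):
        prompts.append("please power on the intelligent kitchen device")
        state["device_powered"] = True
    # The APP then searches via Bluetooth and connects automatically.
    state["connected"] = state["bluetooth_on"] and state["device_powered"]
    return state["connected"]
```

Starting from an empty state, all three prompts are issued and the sketch ends with a successful connection, mirroring the tutorial sequence above.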
Step 350: acquiring a voice assistance start instruction. In some embodiments, step 350 may be performed by the auxiliary instruction acquisition module.
The voice assistance start instruction indicates that the user requires voice assistance from the user terminal 130 or the computing device 200 during cooking. In some embodiments, the user may input the voice assistance start instruction to the auxiliary instruction acquisition module by voice input, tapping the screen, or the like.
In some embodiments, after acquiring the voice assistance start instruction, the user terminal 130 or the computing device 200 may perform operations based on the user's voice instructions.
Step 360: acquiring, based on the voice assistance start instruction, a recipe voice packet corresponding to the target recipe, where the recipe voice packet includes voice files corresponding respectively to a plurality of operation steps of the target recipe. In some embodiments, step 360 may be performed by the voice assistance module.
In some embodiments, the voice assistance module may obtain the recipe voice packet corresponding to the target recipe from the processing device 110, the user terminal 130, the storage device 140, or an external data source based on the voice assistance start instruction.
Step 370: based on the voice assistance start instruction, repeatedly performing: acquiring a user voice instruction, and, based on the user voice instruction, voice-broadcasting at least one voice file and/or controlling at least one intelligent kitchen device to complete a target operation, until the user finishes cooking. In some embodiments, step 370 may be performed by the voice assistance module.
In some embodiments, the user terminal 130 or the computing device 200 may include a microphone for collecting the user's voice instructions. The voice assistance module may perform voice recognition on the user voice instruction. For example, on iOS, the voice assistance module may rely on the microphone together with the native Siri speech libraries for voice listening and recognition. After voice recognition, the user terminal 130 or the computing device 200 may extract keywords from the recognition result and match the operation corresponding to the user voice instruction based on the keywords.
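The keyword extraction and matching step just described might be sketched as a simple lookup over the recognized text. The keyword table and operation names below are illustrative assumptions; the source states only that keywords are extracted and matched to operations:

```python
# Illustrative keyword-to-operation matching over a recognized utterance.
# The keyword table is a hypothetical stand-in for the module's real mapping.

INTENTS = {
    "broadcast again": "REPLAY_CURRENT_STEP",
    "next step": "SWITCH_NEXT_STEP",
    "zero": "TARE_SCALE",
    "kilogram": "SWITCH_UNIT_KG",
}

def match_operation(recognized_text):
    """Return the first operation whose keyword occurs in the text, else None."""
    text = recognized_text.lower()
    for keyword, operation in INTENTS.items():
        if keyword in text:
            return operation
    return None
```

For instance, the utterance "switch next step" matches the `next step` keyword, and an utterance with no known keyword yields no operation.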
In some embodiments, the voice assistance module's voice-broadcasting of the at least one voice file based on the user voice instruction may include:
replaying the voice file corresponding to the current operation step, or playing the voice file corresponding to the next operation step, based on the user voice instruction.
For example, before each step starts, the voice assistance module broadcasts the voice file corresponding to the current step. If the user says "broadcast again", the voice assistance module broadcasts the voice file again; if the user says "switch to the next step", the voice assistance module automatically switches to the next operation step and broadcasts the corresponding voice file.
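The repeat/next navigation over the recipe voice packet can be sketched as follows. The file names and the `play` interface are hypothetical; actual playback would hand the file to an audio player:

```python
# Minimal step navigator over a recipe voice packet (here, a list of
# voice file names). play() returns the file that would be broadcast.

class VoiceGuide:
    def __init__(self, voice_files):
        self.voice_files = voice_files
        self.index = 0

    def play(self):
        return self.voice_files[self.index]

    def handle(self, command):
        if command == "broadcast again":
            return self.play()                 # replay the current step
        if command == "switch next step":
            if self.index < len(self.voice_files) - 1:
                self.index += 1                # advance to the next step
            return self.play()
        return None                            # unrecognized command
```

Note that the sketch clamps at the last step rather than advancing past the end of the recipe; the source does not specify the end-of-recipe behavior.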
In some embodiments, the voice assistance module's controlling of the at least one intelligent kitchen device to complete the target operation based on the user voice instruction includes at least one of: zeroing the kitchen scale, switching the weighing unit of the kitchen scale, controlling the processing temperature of the kitchen processing device, and controlling the operating time of the kitchen processing device.
For example, when the user says "switch to the next step", the voice assistance module automatically switches to the next operation step, broadcasts the corresponding voice file, and zeroes the kitchen scale. As another example, when the user says "switch the weighing unit to kilograms", the voice assistance module controls the kitchen scale to switch its weighing unit to kilograms.
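The scale operations just described (zeroing and unit switching) could be modeled as below. The command interface is an assumption for illustration; the real device would receive these commands over the Bluetooth connection, whose protocol the source does not disclose:

```python
# Hypothetical kitchen-scale controller supporting tare (zero setting)
# and weighing-unit switching, as in the voice commands above.

class KitchenScale:
    GRAMS_PER_UNIT = {"g": 1.0, "kg": 1000.0}

    def __init__(self):
        self.raw_grams = 0.0   # load currently reported by the sensor
        self.offset = 0.0      # tare offset
        self.unit = "g"

    def tare(self):
        """'Control kitchen scale zero setting': current load reads as zero."""
        self.offset = self.raw_grams

    def switch_unit(self, unit):
        """'Control kitchen scale switching units' (e.g. to kilograms)."""
        if unit in self.GRAMS_PER_UNIT:
            self.unit = unit

    def reading(self):
        return (self.raw_grams - self.offset) / self.GRAMS_PER_UNIT[self.unit]
```

For example, taring with a 250 g bowl on the scale and then adding 1500 g of ingredients yields a reading of 1.5 after switching the unit to kilograms.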
In some embodiments, for the current operation step, the voice assistance module controls the kitchen processing device to automatically set the values required for cooking (for example, at least one of the processing temperature and the operating time of the kitchen processing device) by recognizing the keywords of the current operation step in the target recipe, converting the numerical values in those keywords into data, and transmitting the data to the kitchen processing device.
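Extracting those numeric settings from a step's text might look like the following pattern-matching sketch. The step phrasing and the setting names are illustrative assumptions; the source does not specify the recipe text format:

```python
import re

# Illustrative extraction of processing temperature and operating time
# from a recipe step's text, prior to sending them to the processing device.

def extract_settings(step_text):
    settings = {}
    temp = re.search(r"(\d+)\s*°?C", step_text)       # e.g. "180°C"
    if temp:
        settings["temperature_c"] = int(temp.group(1))
    minutes = re.search(r"(\d+)\s*minutes?", step_text)  # e.g. "25 minutes"
    if minutes:
        settings["time_min"] = int(minutes.group(1))
    return settings
```

A step with no numeric keywords simply yields no settings, leaving the device's current values unchanged.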
In some embodiments, the progress display module may further acquire the operation steps completed by the user, generate a progress bar based on the completed operation steps, and display the progress bar. For example, during cooking the user follows the prompts, and the kitchen scale and the kitchen processing device transmit the user's operation data to the user terminal 130 or the computing device 200, which converts the data into a progress bar reflecting the user's progress. When an operation is completed, the progress bar finishes loading and the step is shown as completed; the user terminal 130 or the computing device 200 then broadcasts the next step by voice. Each step is broadcast in pace with the user's progress until cooking is finished, which effectively reduces errors caused by the user's subjective judgment.
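The progress bar derived from completed operation steps can be sketched as below; the textual rendering is an assumption, since the source only says a progress bar is generated and displayed:

```python
# Illustrative progress bar: completed steps over total steps.
# Integer arithmetic is used for the fill width to avoid float rounding.

def progress_bar(completed_steps, total_steps, width=10):
    """Return (fraction_done, textual bar) for the progress display."""
    done = min(completed_steps, total_steps)
    filled = done * width // total_steps
    bar = "#" * filled + "-" * (width - filled)
    return done / total_steps, f"[{bar}] {done}/{total_steps} steps"
```

With 3 of 5 steps done this renders `[######----] 3/5 steps`; when the last step completes, the bar is fully loaded, matching the "step completed" state described above.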
Having thus described the basic concepts, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be considered merely illustrative and not restrictive of the present application. Various modifications, improvements, and adaptations of the present application may occur to those skilled in the art, although they are not explicitly described herein. Such modifications, improvements, and adaptations are suggested in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereon. Accordingly, various aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present application may be represented as a computer product, including computer readable program code, embodied in one or more computer readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, or Python; a conventional procedural programming language such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP; a dynamic programming language such as Python, Ruby, or Groovy; or other programming languages. The program code may run entirely on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or to an external computer (e.g., through the Internet), or used in a cloud computing environment, or offered as a service, such as software as a service (SaaS).
Additionally, the order in which elements and sequences of the processes described herein are processed, the use of alphanumeric characters, or the use of other designations, is not intended to limit the order of the processes and methods described herein, unless explicitly claimed. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as requiring more features than are expressly recited in each claim. Indeed, claimed subject matter may lie in fewer than all of the features of a single embodiment disclosed above.
Numerals describing the number of components, attributes, etc. are used in some embodiments, it being understood that such numerals used in the description of the embodiments are modified in some instances by the use of the modifier "about", "approximately" or "substantially". Unless otherwise indicated, "about", "approximately" or "substantially" indicates that the number allows a variation of ± 20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameter should take into account the specified significant digits and employ a general digit preserving approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the range are approximations, in the specific examples, such numerical values are set forth as precisely as possible within the scope of the application.
The entire contents of each patent, patent application publication, and other material cited in this application, such as articles, books, specifications, publications, and documents, are hereby incorporated by reference into this application, except for any application history documents that are inconsistent with or in conflict with the present disclosure, and any documents (currently or later appended to this application) that would limit the broadest scope of the claims. It is noted that if the descriptions, definitions, and/or use of terms in the incorporated materials are inconsistent with or contrary to those of this application, the descriptions, definitions, and/or use of terms in this application shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of the present application. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the present application can be viewed as being consistent with the teachings of the present application. Accordingly, the embodiments of the present application are not limited to only those embodiments explicitly described and depicted herein.

Claims (10)

1. An auxiliary cooking method based on voice recognition, for execution on a user terminal, characterized by comprising:
acquiring a user recipe selection instruction;
acquiring a target recipe based on the user recipe selection instruction, and displaying the target recipe, wherein the target recipe comprises a plurality of operation steps;
acquiring a device binding instruction;
establishing a communication connection with at least one intelligent kitchen device based on the device binding instruction;
acquiring a voice assistance start instruction;
acquiring, based on the voice assistance start instruction, a recipe voice packet corresponding to the target recipe, wherein the recipe voice packet comprises voice files corresponding respectively to the plurality of operation steps of the target recipe;
and repeatedly executing: acquiring a user voice instruction, and voice-broadcasting at least one voice file and/or controlling the at least one intelligent kitchen device to complete a target operation based on the user voice instruction, until the user finishes cooking.
2. The auxiliary cooking method based on voice recognition of claim 1, wherein establishing the communication connection with at least one intelligent kitchen device based on the device binding instruction comprises:
judging, based on the device binding instruction, whether the user terminal is already bound to at least one intelligent kitchen device;
if the user terminal is already bound to at least one intelligent kitchen device, acquiring and displaying the device information of the bound intelligent kitchen device and issuing a new-binding prompt; judging, based on the user's feedback on the new-binding prompt, whether at least one new intelligent kitchen device needs to be bound; and if so, establishing a communication connection with the at least one new intelligent kitchen device via Bluetooth based on that feedback;
if the user terminal is not bound to any intelligent kitchen device, issuing a binding prompt; judging, based on the user's feedback on the binding prompt, whether at least one intelligent kitchen device needs to be bound; and if so, establishing a communication connection with the at least one intelligent kitchen device via Bluetooth based on that feedback.
3. The auxiliary cooking method based on voice recognition of claim 1, wherein voice-broadcasting at least one voice file based on the user voice instruction comprises:
replaying the voice file corresponding to the current operation step, or playing the voice file corresponding to the next operation step, based on the user voice instruction.
4. The method of claim 1, wherein the at least one smart kitchen device comprises at least one of a kitchen scale and a kitchen processing device;
the target operation comprises at least one of: controlling the kitchen scale to zero, controlling the kitchen scale to switch weighing units, controlling a processing temperature of the kitchen processing equipment, and controlling an operating time of the kitchen processing equipment.
5. The voice recognition-based auxiliary cooking method according to any one of claims 1 to 4, further comprising:
acquiring the operation steps completed by the user, and generating a progress bar based on the completed operation steps;
and displaying the progress bar.
6. An auxiliary cooking system based on voice recognition, for execution on a user terminal, characterized by comprising:
a recipe instruction acquisition module, used for acquiring a user recipe selection instruction;
a target recipe acquisition module, used for acquiring a target recipe based on the user recipe selection instruction and displaying the target recipe, wherein the target recipe comprises a plurality of operation steps;
a binding instruction acquisition module, used for acquiring a device binding instruction;
a device binding module, used for establishing a communication connection with at least one intelligent kitchen device based on the device binding instruction;
an auxiliary instruction acquisition module, used for acquiring a voice assistance start instruction;
a voice assistance module, used for acquiring, based on the voice assistance start instruction, a recipe voice packet corresponding to the target recipe, wherein the recipe voice packet comprises voice files corresponding respectively to the plurality of operation steps of the target recipe; and further used for repeatedly executing: acquiring a user voice instruction, and voice-broadcasting at least one voice file and/or controlling the at least one intelligent kitchen device to complete a target operation based on the user voice instruction, until the user finishes cooking.
7. The voice recognition-based assisted cooking system of claim 6, wherein the device binding module is further configured to:
judge, based on the device binding instruction, whether the user terminal is already bound to at least one intelligent kitchen device;
if the user terminal is already bound to at least one intelligent kitchen device, acquire and display the device information of the bound intelligent kitchen device and issue a new-binding prompt; judge, based on the user's feedback on the new-binding prompt, whether at least one new intelligent kitchen device needs to be bound; and if so, establish a communication connection with the at least one new intelligent kitchen device via Bluetooth based on that feedback;
if the user terminal is not bound to any intelligent kitchen device, issue a binding prompt; judge, based on the user's feedback on the binding prompt, whether at least one intelligent kitchen device needs to be bound; and if so, establish a communication connection with the at least one intelligent kitchen device via Bluetooth based on that feedback.
8. The voice recognition-based assisted cooking system of claim 6, wherein the voice assistance module is further configured to:
replay the voice file corresponding to the current operation step, or play the voice file corresponding to the next operation step, based on the user voice instruction.
9. The voice recognition-based assisted cooking system of claim 6, wherein the target operation comprises at least one of: controlling the kitchen scale to zero, controlling the kitchen scale to switch weighing units, controlling a processing temperature of the kitchen processing equipment, and controlling an operating time of the kitchen processing equipment.
10. The voice recognition-based auxiliary cooking system according to any one of claims 6 to 9, further comprising a progress display module used for acquiring the operation steps completed by the user, generating a progress bar based on the completed operation steps, and displaying the progress bar.
CN202210040707.XA 2022-01-14 2022-01-14 Auxiliary cooking system and method based on voice recognition Active CN114343437B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210040707.XA CN114343437B (en) 2022-01-14 2022-01-14 Auxiliary cooking system and method based on voice recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210040707.XA CN114343437B (en) 2022-01-14 2022-01-14 Auxiliary cooking system and method based on voice recognition

Publications (2)

Publication Number Publication Date
CN114343437A true CN114343437A (en) 2022-04-15
CN114343437B CN114343437B (en) 2023-08-04

Family

ID=81108831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210040707.XA Active CN114343437B (en) 2022-01-14 2022-01-14 Auxiliary cooking system and method based on voice recognition

Country Status (1)

Country Link
CN (1) CN114343437B (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010073929A (en) * 2000-01-22 2001-08-03 윤종용 Apparatus For Cooking Control With Voice Recognition Of a Microwave Oven And Method The Same
JP2001218678A (en) * 2000-02-10 2001-08-14 Tiger Vacuum Bottle Co Ltd Electric rice cooker
JP2003108183A (en) * 2001-09-28 2003-04-11 Toshiba Corp Voice controller and heating cooker
CN101411592A (en) * 2007-10-19 2009-04-22 深圳市壹声通语音科技有限公司 Intelligent cooking apparatus with sound control recognition function
WO2017166317A1 (en) * 2016-04-01 2017-10-05 深圳市赛亿科技开发有限公司 Smart cooking device, and smart cooking system and assistant method thereof
CN108903587A (en) * 2018-05-29 2018-11-30 福来宝电子(深圳)有限公司 Voice auxiliary cooking method, speech processing device and computer readable storage medium
CN109067997A (en) * 2018-08-31 2018-12-21 上海与德通讯技术有限公司 The method and mobile terminal of voice guidance culinary art
CN109215644A (en) * 2017-07-07 2019-01-15 佛山市顺德区美的电热电器制造有限公司 A kind of control method and device
JP2019074215A (en) * 2017-10-12 2019-05-16 株式会社パロマ Cooking stove
KR20190118385A (en) * 2018-04-10 2019-10-18 (주)소닉더치코리아 Cold Brew Coffee Extractor Adjusting Sonic Vibration Based on Voice Recognizing Order
CN110522289A (en) * 2018-05-25 2019-12-03 潘通 A kind of control method of intelligent electric cooker, device and intelligent electric cooker
CN110786757A (en) * 2019-09-16 2020-02-14 上海纯米电子科技有限公司 Kitchen appliance with intelligent voice control and broadcast functions and control method thereof
CN112128810A (en) * 2020-09-18 2020-12-25 珠海格力电器股份有限公司 Cooking appliance control method and device, cooking appliance and storage medium
CN112256230A (en) * 2020-10-16 2021-01-22 广东美的厨房电器制造有限公司 Menu interaction method and system and storage medium
CN113475943A (en) * 2021-07-09 2021-10-08 海信家电集团股份有限公司 Menu execution method and device
CN114568948A (en) * 2020-11-30 2022-06-03 浙江绍兴苏泊尔生活电器有限公司 Cooking control method, device and system


Also Published As

Publication number Publication date
CN114343437B (en) 2023-08-04

Similar Documents

Publication Publication Date Title
CN108831469B (en) Voice command customizing method, device and equipment and computer storage medium
US10284705B2 (en) Method and apparatus for controlling smart device, and computer storage medium
KR20200012933A (en) Shortened voice user interface for assistant applications
CN106681160A (en) Method and device for controlling intelligent equipment
EP3660854A1 (en) Triage dialogue method, device, and system
CN107943796A (en) A kind of interpretation method and device, terminal, readable storage medium storing program for executing
TWI798652B (en) A method and system for automatically generating a data collection module
US10599402B2 (en) Techniques to configure a web-based application for bot configuration
WO2016192188A1 (en) Method and terminal for adjusting cpu operation parameter
WO2018121206A1 (en) Verification code data processing method, apparatus and storage medium
US11269938B2 (en) Database systems and methods for conversational database interaction
CN109447851A (en) One kind prepares for a meal, vegetable providing method, device and equipment
WO2023098732A1 (en) Cross-device interaction method and apparatus, electronic device, and storage medium
CN109101309B (en) Method and apparatus for updating user interface
US20240086631A1 (en) Table processing method and apparatus, electronic device, medium and program product
CN110968367A (en) E-commerce commodity field configuration method, device, server and storage medium
US20190311647A1 (en) Methods and Systems for Conversationalization of Recipes
CN110364155A (en) Voice control error-reporting method, electric appliance and computer readable storage medium
CN114343437B (en) Auxiliary cooking system and method based on voice recognition
EP3848801B1 (en) Speech interaction method and apparatuses
CN110909522B (en) Data processing method and device, electronic equipment and medium
US11971977B2 (en) Service providing apparatus
CN103488512B (en) program interface display processing method and device
CN113204495A (en) Automatic testing method and device based on B/S architecture, storage medium and equipment
KR101372837B1 (en) Method for making electronic documents by hybrid basis using direct inputs and voice commands, and computer-readable recording medium for the same

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant