CN114343437B - Auxiliary cooking system and method based on voice recognition - Google Patents


Info

Publication number
CN114343437B
CN114343437B
Authority
CN
China
Prior art keywords: voice, user, instruction, intelligent kitchen, menu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210040707.XA
Other languages
Chinese (zh)
Other versions
CN114343437A (en)
Inventor
糜鑫
邱栋泉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yolanda Technology Co ltd
Original Assignee
Shenzhen Yolanda Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yolanda Technology Co ltd filed Critical Shenzhen Yolanda Technology Co ltd
Priority to CN202210040707.XA priority Critical patent/CN114343437B/en
Publication of CN114343437A publication Critical patent/CN114343437A/en
Application granted granted Critical
Publication of CN114343437B publication Critical patent/CN114343437B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The embodiments of the present application disclose an auxiliary cooking system and method based on voice recognition, belonging to the technical field of cooking auxiliary equipment. The method comprises: acquiring a menu selection instruction of a user; acquiring a target menu based on the user menu selection instruction and displaying the target menu, wherein the target menu comprises a plurality of operation steps; acquiring a device binding instruction; establishing a communication connection with at least one intelligent kitchen device based on the device binding instruction; acquiring a voice auxiliary starting instruction; acquiring, based on the voice auxiliary starting instruction, a menu voice packet corresponding to the target menu, wherein the menu voice packet comprises voice files respectively corresponding to the plurality of operation steps of the target menu; and repeatedly acquiring a user voice instruction and, based on the user voice instruction, voice-broadcasting at least one voice file and/or controlling the at least one intelligent kitchen device to complete a target operation, until the user completes cooking. The system and method assist the user in cooking and improve the user's cooking experience.

Description

Auxiliary cooking system and method based on voice recognition
Technical Field
The invention mainly relates to the technical field of cooking auxiliary equipment, in particular to an auxiliary cooking system and method based on voice recognition.
Background
With the continuous development and progress of technology, more and more people are trying to bake and prepare drinks by themselves. As people's requirements for the weighing accuracy of the food materials prepared for baking and drink making grow ever higher, intelligent kitchen devices and software for assisting users in cooking have appeared.
However, existing intelligent kitchen devices and software for assisting a user in cooking still need to be controlled manually. For example, the user needs to manually tap the software to view the next operation step of a recipe; as another example, during preparation, the user is required to manually control the intelligent kitchen device to perform certain operations. Such use is relatively inconvenient.
Therefore, it is desirable to provide an auxiliary cooking system and method based on voice recognition, which are used for assisting a user in cooking and improving the cooking experience of the user.
Disclosure of Invention
One of the embodiments of the present disclosure provides an auxiliary cooking method based on voice recognition, including: acquiring a menu selection instruction of a user; acquiring a target menu based on the user menu selection instruction and displaying the target menu, wherein the target menu comprises a plurality of operation steps; acquiring a device binding instruction; establishing a communication connection with at least one intelligent kitchen device based on the device binding instruction; acquiring a voice auxiliary starting instruction; acquiring, based on the voice auxiliary starting instruction, a menu voice packet corresponding to the target menu, wherein the menu voice packet comprises voice files respectively corresponding to the plurality of operation steps of the target menu; and repeatedly acquiring a user voice instruction and, based on the user voice instruction, voice-broadcasting at least one voice file and/or controlling the at least one intelligent kitchen device to complete a target operation, until the user completes cooking.
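The claimed method is a linear setup followed by a voice-driven loop. As a minimal Python sketch (all names, data shapes, and the keyword commands are illustrative assumptions, not from the patent):

```python
# Hypothetical sketch of the claimed flow; names and commands are illustrative.
def assisted_cooking_session(get_instruction, recipes, voice_packets, devices):
    # Steps 1-2: acquire the user's menu selection and fetch the target menu.
    recipe_id = get_instruction("recipe_selection")
    target_recipe = recipes[recipe_id]            # list of operation steps
    # Steps 3-4: acquire a device binding instruction; bind kitchen devices.
    get_instruction("device_binding")
    bound_devices = list(devices)                 # stand-in for Bluetooth binding
    # Steps 5-6: acquire the voice-assist start instruction; load the voice packet.
    get_instruction("voice_assist_on")
    voice_files = voice_packets[recipe_id]        # one voice file per step
    # Step 7: repeat on user voice instructions until cooking is complete.
    step, log = 0, []
    while step < len(target_recipe):
        command = get_instruction("voice")
        if command == "next":                     # broadcast the next step's file
            log.append(("play", voice_files[step]))
            step += 1
        elif command == "repeat":                 # re-broadcast the last played file
            log.append(("play", voice_files[max(step - 1, 0)]))
        elif bound_devices:                       # e.g. "zero" -> target operation
            log.append(("device", command))
    return log
```

A driver would supply `get_instruction` as a wrapper around the APP's input and speech-recognition layer.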
In some embodiments, establishing a communication connection with at least one intelligent kitchen device based on the device binding instruction includes: judging, based on the device binding instruction, whether the user terminal is bound to at least one intelligent kitchen device; if the user terminal is bound to at least one intelligent kitchen device, acquiring and displaying device information of the bound intelligent kitchen device, issuing a newly-added binding prompt, judging whether at least one newly-added intelligent kitchen device needs to be bound based on the user's feedback on the newly-added binding prompt, and if so, establishing a communication connection with the at least one newly-added intelligent kitchen device via Bluetooth based on that feedback; if the user terminal is not bound to at least one intelligent kitchen device, issuing a binding prompt, judging whether at least one intelligent kitchen device needs to be bound based on the user's feedback on the binding prompt, and if so, establishing a communication connection with the at least one intelligent kitchen device via Bluetooth based on that feedback.
In some embodiments, the voice broadcasting of at least one of the voice files based on the user voice instruction includes: based on the user's voice instruction, replaying the voice file corresponding to the current operation step or playing the voice file corresponding to the next operation step.
In some embodiments, the at least one intelligent kitchen device comprises at least one of a kitchen scale and kitchen processing equipment; the target operation includes at least one of controlling the kitchen scale to zero, controlling the kitchen scale to switch units, controlling a processing temperature of the kitchen processing equipment, and controlling a working time of the kitchen processing equipment.
In some embodiments, the method further comprises: acquiring the completed operation steps of the user, and generating a progress bar based on the completed operation steps; and displaying the progress bar.
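The progress-bar embodiment can be illustrated with a small helper; the textual bar form and its parameters are assumptions, since the patent does not specify how the bar is rendered:

```python
def render_progress_bar(completed_steps, total_steps, width=20):
    """Render a textual progress bar from the user's completed operation steps.

    Illustrative only: the patent describes generating and displaying a
    progress bar but does not specify its form.
    """
    done = min(len(completed_steps), total_steps)
    filled = width * done // total_steps           # filled portion of the bar
    percent = 100 * done // total_steps            # integer completion percentage
    return "[" + "#" * filled + "-" * (width - filled) + "] " + str(percent) + "%"
```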
One of the embodiments of the present specification provides an auxiliary cooking system based on voice recognition, comprising: a menu instruction acquisition module, configured to acquire a menu selection instruction of a user; a target menu acquisition module, configured to acquire a target menu based on the user menu selection instruction and display the target menu, the target menu comprising a plurality of operation steps; a binding instruction acquisition module, configured to acquire a device binding instruction; a device binding module, configured to establish a communication connection with at least one intelligent kitchen device based on the device binding instruction; an auxiliary instruction acquisition module, configured to acquire a voice auxiliary starting instruction; and a voice auxiliary module, configured to acquire, based on the voice auxiliary starting instruction, a menu voice packet corresponding to the target menu, wherein the menu voice packet comprises voice files respectively corresponding to the plurality of operation steps of the target menu, and further configured to repeatedly acquire a user voice instruction, broadcast at least one voice file based on the user voice instruction, and/or control the at least one intelligent kitchen device to complete a target operation, until the user finishes cooking.
In some embodiments, the device binding module is further configured to: judge, based on the device binding instruction, whether the user terminal is bound to at least one intelligent kitchen device; if the user terminal is bound to at least one intelligent kitchen device, acquire and display device information of the bound intelligent kitchen device, issue a newly-added binding prompt, judge whether at least one newly-added intelligent kitchen device needs to be bound based on the user's feedback on the newly-added binding prompt, and if so, establish a communication connection with the at least one newly-added intelligent kitchen device via Bluetooth based on that feedback; if the user terminal is not bound to at least one intelligent kitchen device, issue a binding prompt, judge whether at least one intelligent kitchen device needs to be bound based on the user's feedback on the binding prompt, and if so, establish a communication connection with the at least one intelligent kitchen device via Bluetooth based on that feedback.
In some embodiments, the voice assistance module is further to: playing the voice file corresponding to the current operation step again or playing the voice file corresponding to the next operation step based on the voice of the user indication voice
In some embodiments, the voice assistance module is further to: the target operation includes at least one of controlling the kitchen scale to zero, controlling the kitchen scale switching unit, controlling a processing temperature of the kitchen processing equipment, and controlling a working time of the kitchen processing equipment.
In some embodiments, the system further comprises a progress display module, configured to obtain the completed operation steps of the user, generate a progress bar based on the completed operation steps, and display the progress bar.
Drawings
The present application will be further illustrated by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are not limiting; in the figures, like numerals represent like structures.
FIG. 1 is a schematic illustration of an application scenario of a speech recognition based auxiliary cooking system according to some embodiments of the present application;
FIG. 2 is an exemplary block diagram of a computing device shown in accordance with some embodiments of the present application;
fig. 3 is an exemplary flow chart of a speech recognition based assisted cooking method according to some embodiments of the present application.
In the figures: 100, application scenario; 110, processing device; 120, network; 130, user terminal; 140, storage device; 210, processor; 220, read-only memory; 230, random access memory; 240, communication port; 250, input/output interface; 260, hard disk.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present application, and it is obvious to those skilled in the art that the present application may be applied to other similar situations according to the drawings without inventive effort. It should be understood that these exemplary embodiments are presented merely to enable those skilled in the relevant art to better understand and practice the invention and are not intended to limit the scope of the invention in any way. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that "system," "apparatus," "unit" and/or "module" as used herein is one method for distinguishing between different components, elements, parts, portions or assemblies at different levels. However, if other words can achieve the same purpose, the words can be replaced by other expressions.
As used in this application and in the claims, the terms "a," "an," "the," and/or "said" are not specific to the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate the inclusion of explicitly identified steps and elements; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
Although the present application makes various references to certain modules or units in a system according to embodiments of the present application, any number of different modules or units may be used and run on clients and/or servers. The modules are merely illustrative, and different aspects of the systems and methods may use different modules.
Flowcharts are used in this application to describe the operations performed by systems according to embodiments of the present application. It should be appreciated that the preceding or following operations are not necessarily performed in order precisely. Rather, the steps may be processed in reverse order or simultaneously. Also, other operations may be added to or removed from these processes.
Fig. 1 is a schematic diagram of an application scenario 100 of a speech recognition-based auxiliary cooking system according to some embodiments of the present application.
As shown in fig. 1, the application scenario 100 may include a processing device 110, a network 120, a user terminal 130, and a storage device 140.
In some embodiments, the processing device 110 may be used to process information and/or data related to auxiliary cooking. For example, the processing device 110 may be used to recognize user voice instructions. In some embodiments, the processing device 110 may be local or remote. For example, the processing device 110 may access information and/or data stored in the user terminal 130 and the storage device 140 via the network 120. In some embodiments, the processing device 110 may be directly connected to the user terminal 130 and the storage device 140 to access information and/or data stored therein. In some embodiments, the processing device 110 may run on a cloud platform. In some embodiments, the processing device 110 may include a processor 210, and the processor 210 may include one or more sub-processors (e.g., a single-core processing device or a multi-core processing device). By way of example only, the processor 210 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), the like, or any combination thereof.
The network 120 may facilitate the exchange of data and/or information in the application scenario 100. In some embodiments, one or more components in the application scenario 100 (e.g., the processing device 110, the user terminal 130, and the storage device 140) may send data and/or information to other components in the application scenario 100 over the network 120. For example, the processing device 110 may send the target recipe to the user terminal 130 via the network 120. In some embodiments, the network 120 may be any type of wired or wireless network. For example, the network 120 may include a cable network, a wired network, a fiber-optic network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a Bluetooth network, a ZigBee network, a near-field communication (NFC) network, and the like, or any combination thereof.
The user terminal 130 is a terminal device used by a user. In some embodiments, the user terminal 130 may obtain information or data related to the auxiliary cooking. For example, the user terminal 130 may obtain a user recipe selection instruction; the target recipe is obtained from the processing device 110 via the network 120 based on the user recipe selection instruction and displayed. For another example, the user terminal 130 may obtain a device binding instruction; and establishing communication connection with at least one intelligent kitchen device based on the device binding instruction. In some embodiments, the user terminal 130 may include one or any combination of a mobile device, a tablet, a notebook, etc.
In some embodiments, the user terminal 130 may be configured with a voice recognition based auxiliary cooking system for assisting the user in cooking, and the voice recognition based auxiliary cooking system may be an application software (APP) on the user terminal 130.
In some embodiments, the speech recognition based auxiliary cooking system may include a recipe instruction acquisition module, a target recipe acquisition module, a binding instruction acquisition module, a device binding module, an auxiliary instruction acquisition module, and a speech auxiliary module.
The menu instruction acquisition module may be configured to acquire a user menu selection instruction.
The target menu obtaining module may be configured to obtain a target menu based on a user menu selection instruction and display the target menu, the target menu including a plurality of operation steps.
The binding instruction acquisition module may be configured to acquire a device binding instruction.
The device binding module may be configured to establish a communication connection with at least one intelligent kitchen device based on the device binding instructions.
The auxiliary instruction acquisition module may be configured to acquire a voice auxiliary start instruction.
The voice assistance module may be configured to acquire, based on the voice auxiliary starting instruction, a menu voice packet corresponding to the target menu, wherein the menu voice packet comprises voice files respectively corresponding to the plurality of operation steps of the target menu; it may be further configured to repeatedly acquire a user voice instruction, broadcast at least one voice file based on the user voice instruction, and/or control the at least one intelligent kitchen device to complete a target operation, until the user finishes cooking.
Further description of the menu instruction acquisition module, the target menu acquisition module, the binding instruction acquisition module, the device binding module, the auxiliary instruction acquisition module, and the voice auxiliary module may be referred to in fig. 3 and related description thereof, and will not be repeated here.
In some embodiments, the storage device 140 may be connected to the network 120 to communicate with one or more components of the application scenario 100 (e.g., the processing device 110, the user terminal 130, etc.). One or more components of the application scenario 100 (e.g., the processing device 110, the user terminal 130, etc.) may access data or instructions stored in the storage device 140 via the network 120. In some embodiments, the storage device 140 may be directly connected to or in communication with one or more components of the application scenario 100 (e.g., the processing device 110, the user terminal 130). In some embodiments, the storage device 140 may be part of the processing device 110. In some embodiments, the processing device 110 may also be located in the user terminal 130.
It should be noted that the foregoing description is provided for illustrative purposes only and is not intended to limit the scope of the present application. Many variations and modifications will be apparent to those of ordinary skill in the art, given the benefit of this disclosure. The features, structures, methods, and other features of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, the storage device 140 may be a data storage device including a cloud computing platform, such as a public cloud, a private cloud, a community, a hybrid cloud, and the like. However, such changes and modifications do not depart from the scope of the present application.
FIG. 2 is an exemplary block diagram of a computing device, shown in accordance with some embodiments of the present application.
In some embodiments, the processing device 110 and/or the user terminal 130 may be implemented on the computing device 200. For example, the processing device 110 may implement and execute the processing tasks disclosed herein on the computing device 200.
As shown in fig. 2, computing device 200 may include a processor 210, a read-only memory 220, a random access memory 230, a communication port 240, an input/output interface 250, and a hard disk 260.
The processor 210 may execute computing instructions (program code) and perform the functions of the processing device 110 described herein. The computing instructions may include programs, objects, components, data structures, procedures, modules, and functions (which perform the particular functions described herein). For example, the processor 210 may recognize a user voice instruction and, based on it, play a voice file and/or control an intelligent kitchen device to complete a target operation. In some embodiments, the processor 210 may include a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field-programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device, any circuit or processor capable of executing one or more functions, or the like, or any combination thereof. For illustration only, the computing device 200 in fig. 2 depicts only one processor, but it should be noted that the computing device 200 in the present application may also include multiple processors.
The memory of the computing device 200 (e.g., read-only memory (ROM) 220, random access memory (RAM) 230, hard disk 260, etc.) may store data/information retrieved from any other component of the application scenario 100, for example, a target recipe retrieved from the storage device 140. Exemplary ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), digital versatile disc ROM, and the like. Exemplary RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), and the like.
The input/output interface 250 may be used to input or output signals, data, or information. In some embodiments, the input/output interface 250 may enable a user to interact with the computing device 200. For example, a user may input a user recipe selection instruction to the computing device 200 via the input/output interface 250. In some embodiments, the input/output interface 250 may include an input device and an output device. Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone, and the like, or any combination thereof. Exemplary output devices may include a display device, a speaker, a printer, a projector, and the like, or any combination thereof. Exemplary display devices may include a liquid crystal display (LCD), a light-emitting diode (LED) based display, a flat panel display, a curved display, a television device, a cathode ray tube (CRT), and the like, or any combination thereof. The communication port 240 may be connected to a network for data communication. The connection may be a wired connection, a wireless connection, or a combination of both. The wired connection may include an electrical cable, an optical cable, a telephone line, or the like, or any combination thereof. The wireless connection may include Bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, a mobile network (e.g., 3G, 4G, 5G, etc.), or the like, or any combination thereof. In some embodiments, the communication port 240 may be a standardized port, such as RS232, RS485, and the like. In some embodiments, the communication port 240 may be a specially designed port.
For purposes of illustration only, the computing device 200 depicts only one central processor and/or processor. However, it should be noted that the computing device 200 in this application may include multiple central processors and/or processors; thus, operations and/or methods described in this application as being implemented by one central processor and/or processor may also be implemented by multiple central processors and/or processors, either collectively or independently. For example, the central processor and/or processor of the computing device 200 may perform steps A and B. In another example, steps A and B may also be performed by two different central processors and/or processors in the computing device 200, jointly or separately (e.g., a first processor performing step A and a second processor performing step B, or the first and second processors jointly performing steps A and B).
Fig. 3 is an exemplary flow chart of a method of assisted cooking based on speech recognition according to some embodiments of the present application. As shown in fig. 3, the auxiliary cooking method based on voice recognition may include the following steps. In some embodiments, the speech recognition based assisted cooking method may be performed by the user terminal 130 or the computing device 200.
Step 310, a user menu selection instruction is obtained. In some embodiments, step 310 may be performed by a recipe instruction fetch module.
The user recipe selection instruction is used to characterize the target recipe selected by the user. In some embodiments, the user may input the user recipe selection instruction to the recipe instruction acquisition module by voice input, tapping the screen, or the like. For example, icons of various recipes may be displayed on the display screen of the user terminal 130 or the computing device 200; the user may tap the icon of a certain recipe, and the user terminal 130 or the computing device 200 may automatically acquire the user recipe selection instruction. Illustratively, the user opens the APP, browses the day's recipes, selects a favorite recipe as the target recipe, enters its tutorial card, and begins preparing to cook.
Step 320, obtaining a target menu based on the user menu selection instruction, and displaying the target menu. In some embodiments, step 320 may be performed by the target recipe acquisition module.
In some embodiments, the target recipe acquisition module may acquire the target recipe from the processing device 110, the user terminal 130, the storage device 140, or an external data source based on the user recipe selection instruction, wherein the target recipe includes a plurality of operational steps. After the target recipe is acquired, the target recipe acquisition module may display the target recipe on a display screen of the user terminal 130 or the computing device 200.
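The multi-source lookup described above might be sketched as follows; the ordering of sources and the dict-like interface are illustrative assumptions:

```python
def fetch_target_recipe(recipe_id, sources):
    """Return the target recipe from the first source that has it.

    `sources` is an ordered list of dict-like stores standing in for the
    processing device, the terminal's local cache, the storage device, and
    any external data source (an illustrative assumption).
    """
    for source in sources:
        recipe = source.get(recipe_id)
        if recipe is not None:
            return recipe                      # found: a dict of operation steps
    raise KeyError("recipe %r not found in any source" % recipe_id)
```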
Step 330, obtain the device binding instruction. In some embodiments, step 330 may be performed by a bind instruction fetch module.
The device binding instruction is used to indicate that the user needs the user terminal 130 or the computing device 200 to establish a communication connection with at least one intelligent kitchen device. The intelligent kitchen device may be, for example, a kitchen scale or kitchen processing equipment, where the kitchen scale may be used to weigh food materials and the kitchen processing equipment may be used to process food materials. Kitchen processing equipment may include ovens, cookers, rice cookers, and the like. In some embodiments, the user may input the device binding instruction to the binding instruction acquisition module by voice input, tapping the screen, or the like.
Step 340, establishing a communication connection with at least one intelligent kitchen device based on the device binding instruction. In some embodiments, step 340 may be performed by the device binding module.
In some embodiments, the device binding module establishes a communication connection with at least one intelligent kitchen device based on the device binding instruction, and may include:
judging, based on the device binding instruction, whether the user terminal is bound to at least one intelligent kitchen device;
if the user terminal is bound to at least one intelligent kitchen device, acquiring and displaying device information of the bound intelligent kitchen device and issuing a newly-added binding prompt, which may be a voice prompt or an identifier or text displayed on the display screen; judging whether at least one newly-added intelligent kitchen device needs to be bound based on the user's feedback on the newly-added binding prompt, and if so, establishing a communication connection with the at least one newly-added intelligent kitchen device via Bluetooth based on that feedback; in some embodiments, the user may input feedback on the newly-added binding prompt to the device binding module by voice input, tapping the screen, or the like;
if the user terminal is not bound to at least one intelligent kitchen device, issuing a binding prompt, which may be a voice prompt or an identifier or text displayed on the display screen; judging whether at least one intelligent kitchen device needs to be bound based on the user's feedback on the binding prompt, and if so, establishing a communication connection with the at least one intelligent kitchen device via Bluetooth based on that feedback; in some embodiments, the user may input feedback on the binding prompt to the device binding module by voice input, tapping the screen, or the like.
The device binding module detects whether an intelligent kitchen device is bound. If not, the user taps the plus sign in the APP tutorial to add a device. The module then detects whether Bluetooth is enabled and, if not, prompts the user to enable Bluetooth. After Bluetooth is enabled, the module detects whether a device is connected and, if not, prompts the user to power on the device. Once the device is powered on, the APP searches via Bluetooth and connects to the intelligent kitchen device automatically; when the tutorial interface displays the device's icon, the connection has succeeded.
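The detection-and-prompt sequence above can be sketched as a small ordered state check. The following Python is a minimal illustration only; the state names and function signature are assumptions, not part of the patent:

```python
from enum import Enum, auto

class BindState(Enum):
    BLUETOOTH_OFF = auto()  # user must enable Bluetooth
    DEVICE_OFF = auto()     # user must power on the kitchen device
    CONNECTED = auto()      # icon shown, connection succeeded

def bind_device(already_bound: bool, bluetooth_on: bool,
                device_powered: bool) -> BindState:
    """Run the binding checks in the order the paragraph describes."""
    if already_bound:
        return BindState.CONNECTED
    if not bluetooth_on:
        # The APP would prompt the user to enable Bluetooth here.
        return BindState.BLUETOOTH_OFF
    if not device_powered:
        # The APP would prompt the user to power on the device here.
        return BindState.DEVICE_OFF
    # A real APP would scan over Bluetooth and auto-connect at this point.
    return BindState.CONNECTED
```

Each prompt maps to one early return, so the checks fire in the same order as the description.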
Step 350, obtaining a voice assistance activation instruction. In some embodiments, step 350 may be performed by an auxiliary instruction acquisition module.
The voice assistance activation instruction indicates that the user requires voice assistance from the user terminal 130 or the computing device 200 during the cooking process. In some embodiments, the user may input the voice assistance activation instruction to the auxiliary instruction acquisition module by voice input, clicking the screen, or the like.
In some embodiments, after obtaining the voice assistance activation instruction, the user terminal 130 or the computing device 200 may perform operations based on the user's voice instructions.
Step 360, acquiring a menu voice packet corresponding to the target menu based on the voice assistance activation instruction, wherein the menu voice packet comprises voice files corresponding to a plurality of operation steps of the target menu. In some embodiments, step 360 may be performed by a voice assistance module.
In some embodiments, the voice assistance module may obtain a recipe voice package corresponding to the target recipe from the processing device 110, the user terminal 130, the storage device 140, or an external data source based on the voice assistance activation instruction.
Step 370, based on the voice assistance activation instruction, repeatedly executing the following until the user completes cooking: obtaining a user voice instruction, and, based on the user voice instruction, voice-broadcasting at least one voice file and/or controlling at least one intelligent kitchen device to complete a target operation. In some embodiments, step 370 may be performed by the voice assistance module.
In some embodiments, the user terminal 130 or the computing device 200 may include a microphone used to collect the user's voice instructions. The voice assistance module may perform voice recognition on the user voice instruction. For example, on iOS the voice assistance module may rely on the microphone and the Siri native library for speech listening and recognition. After voice recognition, the user terminal 130 or the computing device 200 may extract keywords from the recognition result and match the operation corresponding to the user voice instruction based on the keywords.
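The keyword-matching step can be illustrated as a simple lookup over the recognition transcript. The keyword table and handler names below are hypothetical placeholders for illustration; the patent does not specify the actual mapping:

```python
from typing import Optional

# Hypothetical keyword table mapping recognized phrases to handler names.
COMMAND_KEYWORDS = {
    "rebroadcast": "replay_current_step",
    "switch next": "advance_to_next_step",
    "zero": "tare_kitchen_scale",
    "weighing unit": "switch_scale_unit",
}

def match_command(transcript: str) -> Optional[str]:
    """Return the handler name for the first keyword found in the transcript."""
    text = transcript.lower()
    for keyword, handler in COMMAND_KEYWORDS.items():
        if keyword in text:
            return handler
    return None  # no keyword matched; ignore the utterance
```

A production system would likely use intent classification rather than substring matching, but the table makes the keyword-to-operation mapping concrete.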
In some embodiments, the voice assistance module voice broadcasts the at least one voice file based on the user voice indication may include:
and playing again the voice file corresponding to the current operation step, or playing the voice file corresponding to the next operation step, based on the user's voice instruction.
For example, before each step starts, the voice assistance module broadcasts the voice file corresponding to the current step. If the user says "rebroadcast once again", the voice assistance module broadcasts it again; if the user says "switch next", the voice assistance module automatically switches to the next operation step and broadcasts the corresponding voice file.
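The rebroadcast/switch-next behavior amounts to a small step-index update, which might look like the following (the function name and the clamping at the final step are illustrative assumptions):

```python
def next_step_index(command: str, current_step: int, total_steps: int) -> int:
    """Map a playback command to the index of the step to broadcast.

    "switch next" advances, clamped so we never run past the final step;
    "rebroadcast" (or any unrecognized command) repeats the current step.
    """
    if command == "switch next":
        return min(current_step + 1, total_steps - 1)
    return current_step
```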
In some embodiments, the target operations that the voice assistance module controls the at least one intelligent kitchen device to perform, based on the user voice instruction, include at least one of: zeroing the kitchen scale, switching the kitchen scale's weighing unit, controlling the processing temperature of the kitchen processing device, and controlling the working time of the kitchen processing device.
For example, the user says "switch next", and the voice assistance module automatically switches to the next operation step, broadcasts the corresponding voice file, and zeroes the kitchen scale. For another example, the user says "switch the weighing unit to kg", and the voice assistance module switches the kitchen scale's weighing unit to kg.
In some embodiments, for the current operation step, the voice assistance module identifies keywords of the current operation step in the target recipe, converts the numerical values in those keywords into data, and transmits the data to the kitchen processing equipment, so that the equipment automatically sets the values required for cooking (e.g., at least one of the processing temperature and the working time of the kitchen processing equipment).
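One plausible way to convert numerical keywords into device settings is a regular-expression scan of the step text. The patterns and field names below are assumptions for illustration; the patent only states that numbers in the step's keywords are converted to data and sent to the kitchen processing equipment:

```python
import re

def extract_settings(step_text: str) -> dict:
    """Extract a temperature and/or working time from a recipe step's text.

    The regex patterns are illustrative; a real recipe parser would need
    to handle the recipe corpus's actual phrasing and units.
    """
    settings = {}
    temperature = re.search(r"(\d+)\s*(?:°C|degrees)", step_text)
    if temperature:
        settings["temperature_c"] = int(temperature.group(1))
    minutes = re.search(r"(\d+)\s*minutes?", step_text)
    if minutes:
        settings["work_minutes"] = int(minutes.group(1))
    return settings
```

A step with no recognizable numbers yields an empty dict, leaving the equipment's current settings untouched.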
In some embodiments, the progress display module may further obtain the user's completed operation steps, generate a progress bar based on the completed operation steps, and display the progress bar. For example, during cooking the user follows the prompts, and the kitchen scale and kitchen processing equipment transmit the user's operation data to the user terminal 130 or the computing device 200, which converts the operation data into a progress bar showing the user's progress. When an operation is completed, the progress bar finishes loading, the step is displayed as completed, and the user terminal 130 or the computing device 200 voice-broadcasts the next step. Each step is broadcast in pace with the user's progress until cooking is complete, which effectively reduces erroneous operations caused by the user's subjective judgment.
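The progress-bar conversion can be sketched as mapping completed steps to a filled fraction; the textual rendering below merely stands in for the on-screen bar (the width and characters are arbitrary choices, not from the patent):

```python
def render_progress_bar(completed_steps: int, total_steps: int,
                        width: int = 20) -> str:
    """Render a textual stand-in for the on-screen progress bar."""
    if total_steps <= 0:
        fraction = 0.0
    else:
        # Clamp so extra operation data never overfills the bar.
        fraction = min(completed_steps, total_steps) / total_steps
    filled = round(fraction * width)
    return "[" + "#" * filled + "-" * (width - filled) + "]"
```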
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly described herein, various modifications, improvements, and adaptations of the present application may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested by this application, and are therefore within the spirit and scope of the exemplary embodiments of this application.
Meanwhile, the present application uses specific words to describe embodiments of the present application. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is described in connection with at least one embodiment of the present application. Thus, it should be emphasized and appreciated that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" in various positions in this specification do not necessarily refer to the same embodiment. Furthermore, particular features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Furthermore, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in a number of patentable categories or contexts, including any novel and useful process, machine, product, or composition of matter, or any novel and useful improvement thereof. Accordingly, aspects of the present application may be performed entirely by hardware, entirely by software (including firmware, resident software, micro-code, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present application may take the form of a computer product, comprising computer-readable program code, embodied in one or more computer-readable media.
The computer storage medium may contain a propagated data signal with the computer program code embodied therein, for example, on a baseband or as part of a carrier wave. The propagated signal may take on a variety of forms, including electro-magnetic, optical, etc., or any suitable combination thereof. A computer storage medium may be any computer readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated through any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or a combination of any of the foregoing.
The computer program code necessary for the operation of portions of the present application may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or services such as software as a service (SaaS) in a cloud computing environment may be used.
Furthermore, the order in which elements and sequences are processed, the use of numbers or letters, or the use of other designations in this application is not intended to limit the order of the processes and methods of this application unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments; on the contrary, they are intended to cover all modifications and equivalent arrangements within the spirit and scope of the embodiments of the present application. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that, in order to simplify the presentation of the disclosure and thereby aid in the understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are recited in the claims. Indeed, the claimed subject matter may lie in less than all features of a single disclosed embodiment.
In some embodiments, numbers describing quantities of components and attributes are used. It should be understood that such numbers used in the description of the embodiments are qualified in some examples by the terms "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows for a variation of 20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties sought by the individual embodiments. In some embodiments, numerical parameters should take into account the specified significant digits and employ a general method of retaining digits. Although the numerical ranges and parameters used in some embodiments of the present application to confirm the breadth of their ranges are approximations, in specific embodiments such numerical values are set as precisely as practicable.
Each patent, patent application publication, and other material cited in this application, such as articles, books, specifications, publications, and documents, is hereby incorporated by reference in its entirety, except for application history documents that are inconsistent with or conflict with the content of this application, and except for documents (whether currently or later appended to this application) that would limit the broadest scope of the claims of this application. It is noted that if the descriptions, definitions, and/or use of terms in materials accompanying this application are inconsistent with or conflict with the content of this application, the descriptions, definitions, and/or use of terms in this application shall prevail.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of this application. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present application may be considered in keeping with the teachings of the present application. Accordingly, embodiments of the present application are not limited to only the embodiments explicitly described and depicted herein.

Claims (6)

1. An assisted cooking method based on speech recognition, for execution on a user terminal, comprising:
acquiring a menu selection instruction of a user;
acquiring a target menu based on the user menu selection instruction, and displaying the target menu, wherein the target menu comprises a plurality of operation steps;
acquiring a device binding instruction;
establishing communication connection with at least one intelligent kitchen device based on the device binding instruction, wherein the at least one intelligent kitchen device at least comprises a kitchen scale and a kitchen processing device;
acquiring a voice auxiliary starting instruction;
acquiring a menu voice packet corresponding to the target menu based on the voice auxiliary starting instruction, wherein the menu voice packet comprises voice files respectively corresponding to the plurality of operation steps of the target menu;
repeatedly executing to obtain user voice instructions, voice broadcasting at least one voice file based on the user voice instructions and controlling the at least one intelligent kitchen equipment to finish target operation until the user finishes cooking;
the repeatedly executing to obtain the user voice instruction, voice broadcasting at least one voice file based on the user voice instruction and controlling the at least one intelligent kitchen device to complete target operation until the user completes cooking comprises the following steps:
playing a voice file corresponding to the current operation step;
identifying keywords of the current operation step in the target menu, converting the numerical values in the keywords into data and transmitting the data to the kitchen processing equipment, controlling the kitchen processing equipment to automatically set the numerical values required by the cooking, controlling the kitchen scale to be zero, controlling the kitchen scale switching unit, controlling the processing temperature of the kitchen processing equipment and/or controlling the working time of the kitchen processing equipment;
and playing the voice file corresponding to the current operation step or playing the voice file corresponding to the next operation step again based on the voice instruction of the user.
2. The speech recognition-based auxiliary cooking method according to claim 1, wherein the establishing a communication connection with at least one intelligent kitchen device based on the device binding instruction comprises:
based on the device binding instruction, judging whether the user terminal is bound with at least one intelligent kitchen device;
if the user terminal is bound with at least one intelligent kitchen device, acquiring and displaying device information of the bound intelligent kitchen device, sending a newly-added binding prompt, judging whether the at least one newly-added intelligent kitchen device needs to be bound based on feedback of the user on the newly-added binding prompt, and if the at least one newly-added intelligent kitchen device needs to be bound, establishing communication connection with the at least one newly-added intelligent kitchen device through Bluetooth based on feedback of the user on the newly-added binding prompt;
if the user terminal is not bound with at least one intelligent kitchen device, a binding prompt is sent out, whether the at least one intelligent kitchen device needs to be bound or not is judged based on feedback of the binding prompt of the user, and if the at least one intelligent kitchen device needs to be bound, communication connection is established with the at least one intelligent kitchen device through Bluetooth based on feedback of the binding prompt of the user.
3. An auxiliary cooking method based on speech recognition according to claim 1 or 2, further comprising:
acquiring the completed operation steps of the user, and generating a progress bar based on the completed operation steps;
and displaying the progress bar.
4. An auxiliary cooking system based on voice recognition for execution on a user terminal, comprising:
the menu instruction acquisition module is used for acquiring a menu selection instruction of a user;
the target menu obtaining module is used for obtaining a target menu based on the user menu selection instruction and displaying the target menu, and the target menu comprises a plurality of operation steps;
the binding instruction acquisition module is used for acquiring the equipment binding instruction;
the device binding module is used for establishing communication connection with at least one intelligent kitchen device based on the device binding instruction, wherein the at least one intelligent kitchen device at least comprises a kitchen scale and a kitchen processing device;
the auxiliary instruction acquisition module is used for acquiring a voice auxiliary starting instruction;
the voice auxiliary module is used for acquiring a menu voice packet corresponding to the target menu based on the voice auxiliary starting instruction, wherein the menu voice packet comprises voice files respectively corresponding to the plurality of operation steps of the target menu; the voice auxiliary module is further used for repeatedly executing, based on the voice auxiliary starting instruction, acquiring a user voice instruction, voice-broadcasting at least one voice file based on the user voice instruction and/or controlling the at least one intelligent kitchen device to complete a target operation, until the user finishes cooking;
the voice assistance module is further configured to:
playing a voice file corresponding to the current operation step;
identifying keywords of the current operation step in the target menu, converting the numerical values in the keywords into data and transmitting the data to the kitchen processing equipment, controlling the kitchen processing equipment to automatically set the numerical values required by the cooking, controlling the kitchen scale to be zero, controlling the kitchen scale switching unit, controlling the processing temperature of the kitchen processing equipment and/or controlling the working time of the kitchen processing equipment;
and playing the voice file corresponding to the current operation step or playing the voice file corresponding to the next operation step again based on the voice instruction of the user.
5. The speech recognition based auxiliary cooking system of claim 4 wherein the device binding module is further configured to:
based on the device binding instruction, judging whether the user terminal is bound with at least one intelligent kitchen device;
if the user terminal is bound with at least one intelligent kitchen device, acquiring and displaying device information of the bound intelligent kitchen device, sending a newly-added binding prompt, judging whether the at least one newly-added intelligent kitchen device needs to be bound based on feedback of the user on the newly-added binding prompt, and if the at least one newly-added intelligent kitchen device needs to be bound, establishing communication connection with the at least one newly-added intelligent kitchen device through Bluetooth based on feedback of the user on the newly-added binding prompt;
if the user terminal is not bound with at least one intelligent kitchen device, a binding prompt is sent out, whether the at least one intelligent kitchen device needs to be bound or not is judged based on feedback of the binding prompt of the user, and if the at least one intelligent kitchen device needs to be bound, communication connection is established with the at least one intelligent kitchen device through Bluetooth based on feedback of the binding prompt of the user.
6. The speech recognition based auxiliary cooking system according to claim 4 or 5, further comprising a progress display module for obtaining completed operation steps of the user, generating a progress bar based on the completed operation steps; and the progress bar is also used for displaying the progress bar.
CN202210040707.XA 2022-01-14 2022-01-14 Auxiliary cooking system and method based on voice recognition Active CN114343437B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210040707.XA CN114343437B (en) 2022-01-14 2022-01-14 Auxiliary cooking system and method based on voice recognition


Publications (2)

Publication Number Publication Date
CN114343437A CN114343437A (en) 2022-04-15
CN114343437B true CN114343437B (en) 2023-08-04

Family

ID=81108831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210040707.XA Active CN114343437B (en) 2022-01-14 2022-01-14 Auxiliary cooking system and method based on voice recognition

Country Status (1)

Country Link
CN (1) CN114343437B (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010073929A (en) * 2000-01-22 2001-08-03 윤종용 Apparatus For Cooking Control With Voice Recognition Of a Microwave Oven And Method The Same
JP2001218678A (en) * 2000-02-10 2001-08-14 Tiger Vacuum Bottle Co Ltd Electric rice cooker
JP2003108183A (en) * 2001-09-28 2003-04-11 Toshiba Corp Voice controller and heating cooker
CN101411592A (en) * 2007-10-19 2009-04-22 深圳市壹声通语音科技有限公司 Intelligent cooking apparatus with sound control recognition function
WO2017166317A1 (en) * 2016-04-01 2017-10-05 深圳市赛亿科技开发有限公司 Smart cooking device, and smart cooking system and assistant method thereof
CN108903587A (en) * 2018-05-29 2018-11-30 福来宝电子(深圳)有限公司 Voice auxiliary cooking method, speech processing device and computer readable storage medium
CN109067997A (en) * 2018-08-31 2018-12-21 上海与德通讯技术有限公司 The method and mobile terminal of voice guidance culinary art
CN109215644A (en) * 2017-07-07 2019-01-15 佛山市顺德区美的电热电器制造有限公司 A kind of control method and device
JP2019074215A (en) * 2017-10-12 2019-05-16 株式会社パロマ Cooking stove
KR20190118385A (en) * 2018-04-10 2019-10-18 (주)소닉더치코리아 Cold Brew Coffee Extractor Adjusting Sonic Vibration Based on Voice Recognizing Order
CN110522289A (en) * 2018-05-25 2019-12-03 潘通 A kind of control method of intelligent electric cooker, device and intelligent electric cooker
CN110786757A (en) * 2019-09-16 2020-02-14 上海纯米电子科技有限公司 Kitchen appliance with intelligent voice control and broadcast functions and control method thereof
CN112128810A (en) * 2020-09-18 2020-12-25 珠海格力电器股份有限公司 Cooking appliance control method and device, cooking appliance and storage medium
CN112256230A (en) * 2020-10-16 2021-01-22 广东美的厨房电器制造有限公司 Menu interaction method and system and storage medium
CN113475943A (en) * 2021-07-09 2021-10-08 海信家电集团股份有限公司 Menu execution method and device
CN114568948A (en) * 2020-11-30 2022-06-03 浙江绍兴苏泊尔生活电器有限公司 Cooking control method, device and system


Also Published As

Publication number Publication date
CN114343437A (en) 2022-04-15

Similar Documents

Publication Publication Date Title
KR102505597B1 (en) Voice user interface shortcuts for an assistant application
CN106681160A (en) Method and device for controlling intelligent equipment
KR102148064B1 (en) Voice displaying
WO2016180101A1 (en) Smart networking method and smart device
CN108683574A (en) A kind of apparatus control method, server and intelligent domestic system
CN103731815B (en) Method and device for achieving mobile phone client software upgrading
US11749278B2 (en) Recommending automated assistant action for inclusion in automated assistant routine
US10599402B2 (en) Techniques to configure a web-based application for bot configuration
US11947984B2 (en) Systems, methods, and apparatus that provide multi-functional links for interacting with an assistant agent
TWI798652B (en) A method and system for automatically generating a data collection module
WO2018121206A1 (en) Verification code data processing method, apparatus and storage medium
CN109447851A (en) One kind prepares for a meal, vegetable providing method, device and equipment
US8543406B2 (en) Method and system for communicating with an interactive voice response (IVR) system
US20240086631A1 (en) Table processing method and apparatus, electronic device, medium and program product
CN110680201A (en) Network system, server, and information processing method
CN110968367A (en) E-commerce commodity field configuration method, device, server and storage medium
CN114343437B (en) Auxiliary cooking system and method based on voice recognition
CN110364155A (en) Voice control error-reporting method, electric appliance and computer readable storage medium
CN106850838A (en) The control method and system of mobile terminal cloud application
EP3848801B1 (en) Speech interaction method and apparatuses
US11625545B2 (en) Systems and methods for improved conversation translation
CN114280950A (en) Control method and device of automatic cooking machine capable of editing recipes
CN110875040B (en) Household appliance control method and system based on product skills
CN114745573A (en) Video control method, client, server and system
CN111159554A (en) Menu generation method and system based on big data, terminal device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant