CN114710697B - Recording method and device for intelligent menu - Google Patents

Recording method and device for intelligent menu

Info

Publication number
CN114710697B
CN114710697B (application number CN202210260843.XA)
Authority
CN
China
Prior art keywords
intelligent
video
intelligent menu
menu
user
Prior art date
Legal status
Active
Application number
CN202210260843.XA
Other languages
Chinese (zh)
Other versions
CN114710697A (en)
Inventor
秦文东
Current Assignee
Hisense Home Appliances Group Co Ltd
Hisense Shandong Kitchen and Bathroom Co Ltd
Original Assignee
Hisense Home Appliances Group Co Ltd
Hisense Shandong Kitchen and Bathroom Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Home Appliances Group Co Ltd, Hisense Shandong Kitchen and Bathroom Co Ltd
Priority to CN202210260843.XA
Publication of CN114710697A
Application granted
Publication of CN114710697B
Status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/414: Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/4302: Content synchronisation processes, e.g. decoder synchronisation
    • H04N 21/4307: Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N 21/43072: Synchronising the rendering of multiple content streams or additional data on the same device
    • H04N 21/433: Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N 21/4334: Recording operations

Landscapes

  • Engineering & Computer Science
  • Multimedia
  • Signal Processing
  • User Interface Of Digital Computer

Abstract

The embodiments of the present application provide a recording method and device for an intelligent menu, relating to the technical field of intelligent household appliances. They are intended to simplify the operations a user must perform when recording and playing a video related to an intelligent menu, and to improve the user's experience with the intelligent menu. The recording method of the intelligent menu comprises: receiving an intelligent menu recording start instruction from a user; in response to the intelligent menu recording start instruction, starting to record the intelligent menu and starting to record a video related to the intelligent menu; receiving an intelligent menu recording end instruction from the user; and in response to the intelligent menu recording end instruction, ending recording of the intelligent menu and ending recording of the video related to the intelligent menu.

Description

Recording method and device for intelligent menu
Technical Field
The application relates to the technical field of intelligent household appliances, in particular to a recording method and device of an intelligent menu.
Background
With the development of science and technology, more and more intelligent cooking equipment is entering people's daily lives. Using the various intelligent menus preset by the manufacturer and stored in the intelligent cooking equipment, a user can call up a ready-made menu at any time to cook delicious dishes, which reduces the labor of kitchen work and greatly improves people's standard of living.
Because some intelligent recipes stored in advance in the intelligent cooking equipment provide only text or voice guidance, when a user wants to reproduce such a recipe, the user cannot clearly understand the preparation process from the text or voice guidance alone, and reproducing the recipe is likely to fail.
Disclosure of Invention
The embodiments of the present application provide a recording method and device for an intelligent menu, which are used to simplify the operations a user must perform when recording and playing a video related to the intelligent menu, and to improve the user's experience of the intelligent menu.
In a first aspect, an intelligent cooking apparatus is provided, comprising:
a camera assembly, configured to record videos related to an intelligent menu; and
a controller, coupled to the camera assembly, the controller being configured to:
receiving an intelligent menu recording start instruction from a user;
responding to the intelligent menu recording start instruction, starting to record the intelligent menu, and controlling the camera assembly to start recording a video related to the intelligent menu;
receiving an intelligent menu recording end instruction from the user;
and responding to the intelligent menu recording end instruction, ending recording of the intelligent menu, and controlling the camera assembly to end recording of the video related to the intelligent menu.
The technical scheme provided by the embodiment of the application at least brings the following beneficial effects: when the intelligent cooking equipment starts to record the intelligent menu, synchronously recording videos related to the intelligent menu; and then, synchronously finishing recording the video by the intelligent cooking equipment when finishing recording the intelligent menu. The whole video recording process does not need manual operation of a user, and is beneficial to simplifying the operation of the user. In addition, the starting time point of the recorded video is the same as the starting time point of the intelligent menu, and the ending time point of the video is the same as the ending time point of the intelligent menu, so that a user does not need to edit the video (such as cutting and the like). Thus, the intelligent menu has matched videos, so that a user can learn the manufacturing process by watching the videos when using the intelligent menu, and the usability of the intelligent menu is improved.
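Purely as an illustration of the first aspect, the following Python sketch shows one way a controller of this kind could tie menu recording and video recording to the same start and end points; the class and method names (CameraAssembly, IntelligentCookingController, on_menu_record_start) are hypothetical and are not taken from the patent.

```python
import time

class CameraAssembly:
    """Hypothetical camera component that records the video related to the menu."""
    def __init__(self):
        self.start_time = 0.0

    def start_recording(self):
        self.start_time = time.time()
        print("camera: video recording started")

    def stop_recording(self):
        print(f"camera: video recording stopped after {time.time() - self.start_time:.1f}s")

class IntelligentCookingController:
    """Sketch of the controller of the first aspect: video recording starts and
    ends together with intelligent menu recording, so both share the same start
    and end points and the user does not need to edit the video afterwards."""
    def __init__(self, camera: CameraAssembly):
        self.camera = camera
        self.recording = False
        self.recipe_steps = []

    def on_menu_record_start(self):
        # Respond to the user's intelligent menu recording start instruction.
        self.recording = True
        self.recipe_steps.clear()
        self.camera.start_recording()   # video starts with the menu

    def log_step(self, parameter: str, value):
        # Record cooking parameters (e.g. temperature, heat level) during recording.
        if self.recording:
            self.recipe_steps.append((time.time(), parameter, value))

    def on_menu_record_end(self):
        # Respond to the user's intelligent menu recording end instruction.
        self.recording = False
        self.camera.stop_recording()    # video ends with the menu
```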
In some embodiments, the intelligent cooking apparatus further comprises a communicator, configured to establish a communication connection with a server, wherein the communicator is connected to the controller.
A controller, further configured to:
after finishing recording the intelligent menu, acquiring identification information of the intelligent menu;
and sending the identification information of the intelligent menu and the video related to the intelligent menu to a server.
In this way, after the server receives the identification information of the intelligent menu and the video related to the intelligent menu, it binds them together; a user then only needs to search for the identification information of the intelligent menu to find the corresponding video, which improves the user experience.
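As a rough sketch of this upload and binding step, assuming an HTTP interface that the patent does not specify (the endpoint path, field names, and use of the requests library are all assumptions):

```python
# Illustrative only: the endpoint path, field names, and use of the requests
# library are assumptions, not part of the patent.
import requests

def upload_menu_video(server_url: str, menu_id: str, video_path: str) -> None:
    """Send the identification information of the intelligent menu together
    with the recorded video to the server."""
    with open(video_path, "rb") as f:
        requests.post(
            f"{server_url}/menus/{menu_id}/video",   # hypothetical endpoint
            data={"menu_id": menu_id},               # identification information
            files={"video": f},
            timeout=60,
        )

# Server-side binding sketch: keep the video keyed by the menu's identification
# information so that searching the menu also finds its video.
video_index: dict[str, str] = {}

def bind_video(menu_id: str, video_location: str) -> None:
    video_index[menu_id] = video_location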
In some embodiments, the intelligent cooking apparatus further comprises a display, wherein the display is connected to the controller.
A controller, further configured to:
receiving an execution instruction of an intelligent menu;
responding to an execution instruction of the intelligent menu, executing the intelligent menu, and controlling a display to play a video related to the intelligent menu;
and when the intelligent menu execution is finished, controlling the display to finish playing the video related to the intelligent menu.
In this way, the intelligent cooking device automatically plays the video related to the intelligent menu when execution of the intelligent menu starts, and automatically stops playing it when execution ends; the user does not need to manually control playback, which simplifies the user's operations.
In some embodiments, the controller is further configured to:
in the execution process of the intelligent menu, the playing progress of the video related to the intelligent menu is adjusted according to the execution progress of the intelligent menu, so that the playing progress of the video related to the intelligent menu is synchronous with the execution progress of the intelligent menu.
This ensures that the execution progress of the intelligent menu stays synchronized with the playing progress of the video, so the user can conveniently watch the video and prepare the dish corresponding to the intelligent menu under its guidance.
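A minimal sketch of this synchronization, assuming the video and the intelligent menu both start from time zero and that the player exposes hypothetical get_position/seek calls:

```python
def sync_playback(player, menu_elapsed_s: float, tolerance_s: float = 1.0) -> None:
    """Keep the playing progress of the video aligned with the execution
    progress of the intelligent menu (both assumed to start at time zero).
    player.get_position() and player.seek() are hypothetical calls."""
    drift = abs(player.get_position() - menu_elapsed_s)
    if drift > tolerance_s:
        player.seek(menu_elapsed_s)   # re-align only when drift becomes noticeable
```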
In some embodiments, the controller is further configured to:
receiving an adjustment operation of a user on the playing progress of the video;
responding to the adjustment operation of the user on the playing progress of the video, and adjusting the playing progress of the video to be a first playing progress;
receiving synchronous progress operation of a user;
and responding to the synchronous progress operation of the user, adjusting the playing progress of the video to be a second playing progress, and synchronizing the second playing progress with the execution progress of the intelligent menu at the current moment.
In this way, the user can review or preview the relevant video content by adjusting the video playing progress. Afterwards, with a single synchronization operation, the user can return the video to the playing progress that is synchronized with the execution progress of the intelligent menu at the current moment, which improves the experience of watching the video related to the intelligent menu.
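This interaction can be illustrated with the following hypothetical sketch, in which on_user_seek models the adjustment operation (first playing progress) and on_sync_progress models the synchronous progress operation (second playing progress); the names are assumptions made for the example.

```python
class MenuVideoPlayer:
    """Hypothetical player used only to illustrate the interaction."""
    def __init__(self):
        self.position_s = 0.0        # current playing progress of the video
        self.menu_elapsed_s = 0.0    # execution progress of the intelligent menu

    def on_user_seek(self, first_progress_s: float) -> None:
        # Adjustment operation: jump to the first playing progress chosen by the user.
        self.position_s = first_progress_s

    def on_sync_progress(self) -> None:
        # Synchronous progress operation: jump to the second playing progress,
        # i.e. the progress synchronized with the menu at the current moment.
        self.position_s = self.menu_elapsed_s
```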
In a second aspect, a recording method of an intelligent menu is provided, the method includes:
receiving an intelligent menu recording start instruction of a user;
responding to an intelligent menu recording start instruction, starting to record an intelligent menu, and starting to record videos related to the intelligent menu;
receiving an intelligent menu recording ending instruction of a user;
and responding to an intelligent menu recording ending instruction of the user, ending recording the intelligent menu and ending recording the video related to the intelligent menu.
In some embodiments, the method further comprises:
after finishing recording the intelligent menu, acquiring identification information of the intelligent menu;
and sending the identification information of the intelligent menu and the video related to the intelligent menu to a server.
In some embodiments, the method further comprises:
receiving an execution instruction of an intelligent menu;
responding to an execution instruction of the intelligent menu, executing the intelligent menu, and playing a video related to the intelligent menu;
and when the intelligent menu execution is finished, finishing playing the video related to the intelligent menu.
In some embodiments, the method further comprises:
in the execution process of the intelligent menu, the playing progress of the video related to the intelligent menu is adjusted according to the execution progress of the intelligent menu, so that the playing progress of the video related to the intelligent menu is synchronous with the execution progress of the intelligent menu.
In some embodiments, the method further comprises:
receiving an adjustment operation of a user on the playing progress of the video;
responding to the adjustment operation of the user on the playing progress of the video, and adjusting the playing progress of the video to be a first playing progress;
receiving synchronous progress operation of a user;
and responding to the synchronous progress operation of the user, adjusting the playing progress of the video to be a second playing progress, and synchronizing the second playing progress with the execution progress of the intelligent menu at the current moment.
In a third aspect, a computer readable storage medium is provided, the computer readable storage medium including computer instructions that, when run on a computer, cause the computer to perform a recording method of an intelligent recipe provided in an embodiment of the present application.
In a fourth aspect, a computer program product is provided that contains computer instructions that, when run on a computer, cause the computer to perform the recording method of the intelligent recipe provided in the embodiments of the present application.
It should be noted that the above-mentioned computer instructions may be stored in whole or in part on a computer-readable storage medium. The computer readable storage medium may be packaged together with the processor of the controller or may be packaged separately from the processor of the controller, which is not limited in this application.
For the beneficial effects described in the first aspect to the fourth aspect of the present application, reference may be made to the analysis of the beneficial effects of the first aspect, which is not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of an intelligent kitchen system according to an embodiment of the present application;
fig. 2 is a hardware configuration diagram of a display device according to an embodiment of the present application;
fig. 3 is a hardware configuration diagram of an intelligent cooking apparatus according to an embodiment of the present application;
fig. 4 is a hardware configuration diagram of a video recording device according to an embodiment of the present application;
fig. 5 is a flow chart of a recording method of an intelligent menu according to an embodiment of the present application;
fig. 6 is a schematic view of a video recording scene according to an embodiment of the present application;
fig. 7 is a flowchart of another recording method of an intelligent menu according to an embodiment of the present application;
fig. 8 is a flow chart of an execution method of an intelligent menu according to an embodiment of the present application;
fig. 9 is a flow chart of another implementation method of an intelligent menu according to an embodiment of the present application;
fig. 10 is a first schematic diagram of a video playing scene provided in an embodiment of the present application;
fig. 11 is a second schematic diagram of a video playing scene provided in an embodiment of the present application;
fig. 12 is a third schematic diagram of a video playing scene provided in an embodiment of the present application;
fig. 13 is a fourth schematic diagram of a video playing scene provided in an embodiment of the present application;
fig. 14 is a fifth schematic diagram of a video playing scene provided in an embodiment of the present application;
fig. 15 is a flowchart of another recording method of an intelligent menu according to an embodiment of the present application;
fig. 16 is a flow chart of another implementation method of an intelligent menu according to an embodiment of the present application;
fig. 17 is a flowchart of another implementation method of an intelligent menu according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
In the description of the present application, it should be noted that, unless explicitly stated and limited otherwise, the terms "connected" and "coupled" are to be construed broadly; for example, a connection may be a fixed connection, a detachable connection, or an integral connection. The specific meaning of these terms in this application will be understood by those of ordinary skill in the art in a specific context. In addition, when describing a pipeline, "connected" as used herein means in fluid communication. The specific meaning is to be understood in conjunction with the context.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
Fig. 1 is an intelligent kitchen system 10 to which the methods provided in some embodiments of the present application are applicable. As shown in fig. 1, the intelligent kitchen system 10 may include a first intelligent cooking appliance 11, a video recording appliance 12, a first terminal appliance 13, and a server 14.
In some embodiments, the first intelligent cooking apparatus 11 is an apparatus for automatically cooking food. For example, the intelligent cooking appliance may be an intelligent gas cooker, an intelligent oven, an intelligent electric cooker, an intelligent pot, or the like.
In some embodiments, the video recording device 12 may be a stand-alone video camera, a camera with video recording function, or a terminal with video recording function, such as a mobile phone, a tablet computer, a notebook computer, etc.
In some embodiments, a communication connection may be established between the first intelligent cooking apparatus 11 and the video recording apparatus 12 by WIFI, bluetooth, or the like. The first intelligent cooking device 11 may establish a communication connection (only one shown in fig. 1) with one or more video recording devices 12.
In some embodiments, based on the communication connection between first intelligent cooking device 11 and video recording device 12, first intelligent cooking device 11 may send control instructions to video recording device 12 to control video recording device 12 to start or stop recording video. For example, the first intelligent cooking apparatus 11 transmits a video recording start instruction to the video recording apparatus 12 when it starts recording the intelligent recipe, so that the video recorded by the video recording apparatus 12 starts in synchronization with the intelligent recipe recorded by the first intelligent cooking apparatus 11.
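For illustration only, a control instruction of this kind might be delivered as follows; the patent does not specify a transport or message format, so the TCP/JSON scheme, address, and field names below are assumptions.

```python
# The transport (plain TCP), message format (JSON), address, and field names
# are assumptions made for illustration; the patent does not specify them.
import json
import socket

def send_video_command(recorder_addr: tuple, command: str, menu_id: str) -> None:
    """Send 'start' when menu recording starts and 'stop' when it ends, so the
    video recorded by the video recording device begins and ends in step with
    the intelligent menu recorded by the cooking device."""
    msg = json.dumps({"cmd": command, "menu_id": menu_id}).encode("utf-8")
    with socket.create_connection(recorder_addr, timeout=5) as conn:
        conn.sendall(msg)

# Example usage at the two moments described above:
# send_video_command(("192.168.1.20", 9000), "start", "menu-001")
# send_video_command(("192.168.1.20", 9000), "stop", "menu-001")
```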
In some embodiments, the first intelligent cooking appliance 11 and the video recording appliance 12 may be two appliances that are independent of each other. Alternatively, video recording device 12 may be integrated on first intelligent cooking device 11.
In some embodiments, the first terminal device 13 is a device that provides voice and/or data connectivity to a user. For example, the terminal device may be: a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a mobile internet device (MID), a wearable device, a virtual reality (VR) device, or an augmented reality (AR) device.
In some embodiments, an application for controlling the first intelligent cooking apparatus 11 may be installed on the first terminal apparatus 13. The user may control the first intelligent cooking apparatus 11 by using the application on the first terminal apparatus 13. For example, on the first terminal device 13, the user may issue a smart recipe recording start instruction to the first smart cooking device 11 by operating the application, so that the first smart cooking device 11 starts recording the smart recipe.
In some embodiments, server 14 may be a device having data processing capabilities as well as data storage capabilities. For example, it may be a server, or a server cluster formed by a plurality of servers, or a cloud computing service center, which is not limited thereto.
In some embodiments, server 14 may establish a communication connection with first intelligent cooking device 11, video recording device 12, and first terminal device 13, respectively. It should be appreciated that the connection may be a wireless connection, such as a Bluetooth connection, wi-Fi connection, etc.; alternatively, the connection may be a wired connection, for example, an optical fiber connection, which is not limited thereto.
In some embodiments, data transfer between server 14 and first intelligent cooking device 11, video recording device 12, and first terminal device 13 may be based on a communication connection between server 14 and first intelligent cooking device 11, video recording device 12, and first terminal device 13. For example, the first intelligent cooking apparatus 11 may transmit a recipe original file generated in the intelligent recipe recording process to the server 14, so that the server 14 generates an intelligent recipe from the recipe original file. For another example, after receiving the video upload instruction from the first intelligent cooking apparatus 11, the video recording apparatus 12 transmits the video related to the intelligent recipe and the identification information of the intelligent recipe to the server 14.
In some embodiments, referring to fig. 1, the intelligent kitchen system 10 may include a second intelligent cooking device 15, a display device 16, a second terminal device 17.
In some embodiments, the second intelligent cooking apparatus 15 is an apparatus for automatically cooking food. For example, the intelligent cooking device may be an intelligent gas cooker, an intelligent oven, an intelligent electric cooker, or the like.
In some embodiments, the display device 16 may be a separate video playing apparatus, or may be other home appliances with a video playing function, for example, a refrigerator configured with a display screen, a range hood, or the like, or may be a terminal device capable of playing video, for example, a mobile phone, a tablet computer, a notebook computer, or the like.
In some embodiments, a communication connection may be established between the second smart cooking device 15 and the display device 16 by WIFI, bluetooth, or the like. A second intelligent cooking device 15 may establish a communication connection with one or more display devices 16 (only one shown in fig. 1).
In some embodiments, based on the communication connection between the second intelligent cooking device 15 and the display device 16, the second intelligent cooking device 15 may send control instructions to the display device 16 to control the display device 16 to begin playing video. For example, the second intelligent cooking appliance 15 transmits a video play instruction to the display device 16 when it starts executing the intelligent recipe, so that the display device 16 plays a video related to the intelligent recipe.
In some embodiments, the second intelligent cooking device 15 and the display device 16 may be two devices that are independent from each other; alternatively, the display device 16 may be integrated on the second intelligent cooking device 15.
In some embodiments, the second terminal device 17 is a device that provides voice and/or data connectivity to a user. For example, the terminal device may be: a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a mobile internet device (MID), a wearable device, a virtual reality (VR) device, or an augmented reality (AR) device.
In some embodiments, an application for controlling the second intelligent cooking apparatus 15 may be installed on the second terminal apparatus 17. The user may control the second intelligent cooking device 15 by using the application on the second terminal device 17. For example, on the second terminal device 17, the user may issue a smart recipe execution instruction to the second smart cooking device 15 by operating the application, so that the second smart cooking device 15 starts executing the smart recipe.
In some embodiments, the server 14 may establish a communication connection with the second intelligent cooking device 15, the display device 16, and the second terminal device 17, respectively. It should be appreciated that the connection may be a wireless connection, such as a Bluetooth connection, wi-Fi connection, etc.; alternatively, the connection may be a wired connection, for example, an optical fiber connection, which is not limited thereto.
A hardware configuration diagram of the display device 16 in the embodiment is exemplarily shown in fig. 2.
In some embodiments, at least one of controller 250, modem 210, communicator 220, detector 230, input/output interface 255, display 275, audio output interface 285, memory 260, power supply 290, user interface 265, or external device interface 240 is included in display apparatus 16.
In some embodiments, the display 275 is configured to receive image signals from the output of the processor 254, and to display video content and images and components of the menu manipulation interface.
In some embodiments, display 275 includes a display screen assembly for presenting pictures, and a drive assembly for driving the display of images.
In some embodiments, the displayed video content may come from broadcast television content, or from various broadcast signals that can be received via wired or wireless communication protocols. Alternatively, the displayed video content may be image content received from a network server.
In some embodiments, display 275 is used to present a user manipulation interface generated in display device 16 and used to control display device 16.
In some embodiments, display 275 is a projection display and may further include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wi-Fi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver.
In some embodiments, the display device 16 may transmit control signals and data signals to and from a control device (e.g., a remote control) or a content providing device through the communicator 220.
In some embodiments, the detector 230 is a component used by the display device 16 to collect signals from the external environment or to interact with the external environment.
In some embodiments, the detector 230 may also include a sound collector 231, such as a microphone, which may be used to receive the user's voice. Illustratively, it may receive a voice signal containing a control instruction by which the user controls the display device 16, or collect ambient sound in order to identify the type of ambient scene so that the display device 16 can adapt to ambient noise.
In some embodiments, the detector 230 may also include an image collector 232, such as a camera, webcam, etc., that may be used to collect external environmental scenes, as well as to collect attributes of the user or to interact with the user's gestures. For example, display device 16 may adaptively change display parameters based on the external environmental scene captured by image capture 232. For another example, the display device 16 may recognize the operation intention of the user according to the gesture of the user acquired by the image acquirer 232 to implement the function of interaction with the user.
In some embodiments, the detector 230 includes a light receiver for collecting the ambient light intensity. Alternatively, the display device 16 may adaptively display parameter changes, etc., depending on the intensity of ambient light.
In some embodiments, the detector 230 may also include a temperature sensor or the like for sensing the ambient temperature. The display device 16 may then adaptively adjust the display color temperature of the image according to the ambient temperature, for example displaying the image with a cooler color temperature when the ambient temperature is high and a warmer color temperature when it is low.
In some embodiments, the input/output interface 255 is used for data transmission between the controller 250 and an external other device or other controllers, such as receiving video signal data and audio signal data, or instruction data, etc. of the external device.
In some embodiments, the external device interface 240 may include: a high definition multimedia interface (high definition multimedia interface, HDMI) interface, an analog or data high definition component input interface, a composite video input interface, a universal serial bus (universal serial bus, USB) input interface, an RGB port, or the like. The external device interface 240 may also include the above-described multiple interfaces forming a composite input/output interface.
In some embodiments, the modem 210 is configured to receive the broadcast television signal by a wired or wireless receiving manner, and may perform modulation and demodulation processes such as amplifying, mixing, and resonating, and demodulate the audio/video signal from the plurality of wireless or wired broadcast television signals, where the audio/video signal may include a television audio/video signal carried in a television channel frequency selected by a user, and an EPG data signal.
In some embodiments, the frequency point demodulated by the modem 210 is controlled by the controller 250, and the controller 250 may send a control signal according to the user selection, so that the modem responds to the television signal frequency selected by the user and modulates and demodulates the television signal carried by the frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to a broadcasting system of the television signal. Alternatively, the broadcast television signal may be distinguished as a digital modulation signal, an analog modulation signal, or the like, depending on the modulation type. Alternatively, the broadcast television signal is classified into a digital signal, an analog signal, and the like according to the kind of signal.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like. Thus, the set top box outputs the television audio/video signal modulated and demodulated by the received broadcast television signal to the main body equipment.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored on the memory. The controller 250 may control the overall operation of the display device 16. For example: in response to receiving a user command to select to display a UI object on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation of connecting to a hyperlink page, a document, an image, or the like, or executing an operation of a program corresponding to the icon. The user command for selecting the UI object may be an input command through various input means (e.g., mouse, keyboard, touch pad, etc.) connected to the display device 16 or a voice command corresponding to a voice uttered by the user.
As shown in fig. 2, the controller 250 may include at least one of a random access Memory 251 (random access Memory, RAM), a Read-Only Memory 252 (ROM), a video processor 270, an audio processor 280, other processors (e.g., a graphics processor 253 (Graphics Processing Unit, GPU), a central processor 254 (Central Processing Unit, CPU), a communication interface (Communication Interface), and a communication bus). Wherein the communication bus connects the various components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other running programs.
In some embodiments, ROM 252 is used to store instructions for various system boots.
In some embodiments, ROM 252 is used to store a basic input/output system (BIOS), which is responsible for power-on self-test of the system, initialization of each functional module in the system, basic input/output of the system, and booting the operating system.
In some embodiments, upon receipt of a power-on signal, the display device 16 starts to boot: the CPU runs the system boot instructions in ROM 252 and copies temporary data of the operating system stored in memory into RAM 251 in order to start or run the operating system. After the operating system has started, the CPU copies temporary data of the various applications in memory into RAM 251 in order to start or run those applications.
In some embodiments, CPU processor 254 is used to execute operating system and application program instructions stored in memory. And executing various application programs, data and contents according to various interactive instructions received from the outside, so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. A main processor for performing some operations of the display device 16 in a pre-power-up mode and/or displaying pictures in a normal mode. One or more sub-processors for one operation in a standby mode or the like.
In some embodiments, the graphics processor 253 is configured to generate various graphical objects, such as: icons, operation menus, user input instruction display graphics, and the like. Optionally, the graphic processor 253 includes an operator for performing an operation by receiving user input of various interactive instructions, and displaying various objects according to display attributes. Further, the graphic processor 253 includes a renderer that renders various objects obtained by the operator, and the rendered objects are displayed on a display.
In some embodiments, video processor 270 is configured to receive external video signals, perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, image composition, etc., according to standard codec protocols for input signals, and may result in signals that may be directly displayed or played by display device 16.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module is used for demultiplexing the input audio/video data stream, such as the input MPEG-2, and demultiplexes the input audio/video data stream into video signals, audio signals and the like. And the video decoding module is used for processing the demultiplexed video signals, including decoding, scaling and the like. And an image synthesis module, such as an image synthesizer, for performing superposition mixing processing on the graphic generator and the video image after the scaling processing according to the GUI signal input by the user or generated by the graphic generator, so as to generate an image signal for display. The frame rate conversion module is configured to convert the input video frame rate, for example, converting the 60Hz frame rate into the 120Hz frame rate or the 240Hz frame rate, and the common format is implemented in an inserting frame manner. The display format module is used for converting the received frame rate into a video output signal, and changing the video output signal to a signal conforming to the display format, such as outputting an RGB data signal.
In some embodiments, video processor 270 may include one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, the graphics processor 253 and the video processor 270 may be integrally configured, or may be separately configured, where the processing of graphics signals output to the display may be performed during the integrally configured process, and where different functions, such as gpu+frame rate conversion (frame rate conversion, FRC) architecture, may be performed during the separately configured process.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the audio signal according to a standard codec protocol of an input signal, and perform noise reduction, digital-to-analog conversion, and amplification processing, so as to obtain a sound signal that can be played in a speaker.
In some embodiments, video processor 270 and audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
In some embodiments, the audio output interface 285 is configured to receive the sound signal output by the audio processor 280 under the control of the controller 250. By way of example, the audio output interface 285 may include a speaker 286. In addition to the speakers carried by the display device 16 itself, the audio output interface 285 may also include an external audio output terminal 287 that may be output to a generating device of an external device, such as: external sound interface or earphone interface, etc. The audio output interface 285 may also include a near field communication module in a communication interface, such as: and the Bluetooth module is used for outputting sound of the Bluetooth loudspeaker.
In some embodiments, power supply 290 provides power supply support for display device 16 from power input by an external power source under the control of controller 250. The power supply 290 may include a built-in power supply circuit installed inside the display device 16, or may be an external power supply installed in the display device 16, and a power supply interface providing an external power supply in the display device 16.
In some embodiments, the user interface 265 is configured to receive a user input signal and then send the received user input signal to the controller 250. The user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
In some embodiments, a user inputs a user command through the control apparatus 100 or the terminal device 300, and the display device 16 responds to the user input via the user interface 265 and the controller 250.
In some embodiments, a user may input user commands through a graphical user interface displayed on the display 275, and the user input interface receives the user input commands through the graphical user interface. Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
In some embodiments, a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget (Widget), etc.
In some embodiments, memory 260 includes storage of various software modules for driving display device 16. Such as: various software modules stored in the first memory, including: at least one of a base module, a detection module, a voice recognition module, a communication module, a display control module, a browser module, and various service modules.
The base module is the underlying software module for signal communication between the various hardware in the display device 16 and for sending processing and control signals to the upper layer modules. The detection module is a management module for collecting various information from various sensors or user input interfaces, and performing digital-to-analog conversion and analysis management. The voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is a module for controlling the display to display the image content, and can be used for playing the information such as the multimedia image content, the UI interface and the like. The communication module is a module for performing control and data communication with an external device. The browser module is a module for performing data communication between the browsing servers. The service module is a module for providing various services and various applications. Meanwhile, the memory 260 also stores received external data and user data, images of various items in various user interfaces, visual effect maps of focus objects, and the like.
A hardware configuration diagram of the first intelligent cooking apparatus 11 in the embodiment is exemplarily shown in fig. 3. The hardware structure of the second intelligent cooking apparatus 15 may refer to the first intelligent cooking apparatus 11.
In some embodiments, one or more of the controller 310, the communicator 320, the display 330, the user interface 340, the human-machine interaction device 350, the camera assembly 360, or the audio-video transmission device 370 are included in the intelligent cooking appliance 11. The communicator 320, the display 330, the user interface 340, the man-machine interaction device 350, the camera assembly 360 and the audio/video transmission device 370 are all connected to the controller 310.
In some embodiments, the controller 310 is used to control the operation of the intelligent cooking appliance and respond to user operations. The controller 310 may control the overall operation of the intelligent cooking apparatus. For example, in response to the intelligent recipe recording start instruction, the controller controls the intelligent cooking appliance to start recording the intelligent recipe.
In some embodiments, communicator 320 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wi-Fi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver. The intelligent cooking appliance may transmit control signals and data signals between the video recording device, display device, terminal device, or server through the communicator 320.
In some embodiments, the display 330 may be a liquid crystal display, an organic light-emitting diode (OLED) display. The particular type, size, resolution, etc. of the display are not limited, and those skilled in the art will appreciate that the display may be modified in performance and configuration as desired. The display can be used for a control interface of the intelligent cooking equipment, and a user can adjust control parameters in the cooking process through the control interface displayed by the display.
In some embodiments, the user interface 340 is configured to receive user input signals and then send the received user input signals to the controller 310. The user may input a user command through the display or the terminal device, and after the user interface receives the user input, the intelligent cooking appliance responds to the user input through the controller.
In some embodiments, the human-computer interaction device 350 is configured to implement interaction between a user and the intelligent cooking apparatus. The human-machine interaction device 350 may include one or more of physical keys or a touch-sensitive display panel. For example, the user can select intelligent menu cooking dishes through the man-machine interaction device, and can also send an intelligent menu recording start instruction to the intelligent cooking equipment through the man-machine interaction device.
In some embodiments, the camera assembly 360 is used to collect audio and video data during the dish preparation process. The camera assembly converts the collected audio and video into editable digital signals, so that the audio/video transmission device 370 can conveniently transmit the audio and video to other external devices.
In some embodiments, the audio-video transmission device 370 is used to transmit audio and video data to an external other device or server. For example, after the controller acquires the identification information of the intelligent menu, the audio/video transmission device transmits the video related to the intelligent menu and the identification information of the intelligent menu to the server.
A hardware configuration diagram of recording device 12 in an embodiment is schematically shown in fig. 4.
In some embodiments, one or more of controller 410, communicator 420, camera assembly 430, audio and video transmission device 440, or user interface 450 are included in recording apparatus 12. The communicator 420, the camera assembly 430, the audio/video transmission device 440, and the user interface 450 are all connected to the controller 410.
In some embodiments, the controller 410 is used to control the operation of the video recording device and respond to user operations. The controller may control the overall operation of the video recording apparatus. For example, in response to a video recording start instruction, the controller controls the video recording device to start recording a video associated with the smart recipe.
In some embodiments, communicator 420 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wi-Fi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver. The video recording device 12 may transmit control signals and data signals to and from external devices through the communicator 420.
In some embodiments, the camera assembly 430 is used to collect audio and video data during the dish preparation process. The camera assembly converts the collected audio and video into editable digital signals, so that the audio/video transmission device 440 can conveniently transmit the audio and video to other external devices.
In some embodiments, the audio and video transmission device 440 is used to transmit audio and video data to an external other device or server. For example, after the communicator receives the video uploading instruction from the intelligent cooking apparatus, the audio/video transmission device transmits the video related to the intelligent menu and the identification information of the intelligent menu to the server.
In some embodiments, the user interface 450 is configured to receive user input signals and then send the received user input signals to the controller 410. The user may input a user command through a display or a terminal device, and after the user interface receives the user input, the video recording device responds to the user input through the controller.
An intelligent menu that has only text or voice guidance is difficult to follow: when a user wants to reproduce the menu, the user cannot clearly understand the preparation process from the text or voice guidance alone, and the reproduction is likely to fail. In this regard, embodiments of the present application provide a new intelligent menu that also includes video. However, if the video included in the intelligent menu were recorded by the user controlling a video recording device, the user would, on the one hand, have to perform more operations during video recording; on the other hand, the recorded video might not be synchronized in time with the intelligent menu, so the user would also need to edit the video.
In view of this, as shown in fig. 5, an embodiment of the present application provides a recording method of an intelligent menu, which includes the following steps:
s101, the first intelligent cooking equipment receives an intelligent menu recording start instruction.
When the user wants to record the intelligent menu, the user can operate a man-machine interaction device (such as a display panel, keys and a voice recognition device) on the first intelligent cooking equipment to give an intelligent menu recording start instruction to the first intelligent cooking equipment.
Alternatively, when the user wants to record an intelligent menu, the user may open an APP for controlling the first intelligent cooking device, such as a smart home APP, on the terminal device. The user then operates in the smart home APP to issue an intelligent menu recording start instruction to the first intelligent cooking device. For example, as shown in fig. 6, the terminal device displays a function selection interface of the intelligent menu, which may include a "start recording intelligent menu" control. After the user selects the "start recording intelligent menu" control, the terminal device sends an intelligent menu recording start instruction to the first intelligent cooking device.
S102, responding to an intelligent menu recording start instruction, and starting to record the intelligent menu by the first intelligent cooking equipment.
In some embodiments, after the user instructs to begin recording the intelligent recipe, the user may use the first intelligent cooking device to cook the dishes. In the dish cooking process, the first intelligent cooking device starts the intelligent menu recording function, so that the first intelligent cooking device can record relevant information for generating the intelligent menu. It should be appreciated that the information recorded by the first intelligent cooking appliance during the intelligent recipe recording process is related to the type of the first intelligent cooking appliance.
For example, taking a first intelligent cooking device as an intelligent kitchen range as an example, the intelligent kitchen range can record temperature values of various time points in the cooking process of dishes, firepower gears used at various time points, and the like.
For another example, taking the first intelligent cooking apparatus as an oven, the oven may record temperature values, operation modes, etc. at various time points in the recipe cooking process.
In some embodiments, if the first intelligent cooking apparatus has a recording function, the first intelligent cooking apparatus may further record voice information of the user during cooking of the dishes.
Table 1 illustrates exemplary information about a first intelligent cooking appliance recorded during a recording of an intelligent recipe.
TABLE 1
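The content of Table 1 is not reproduced here. Purely as an invented illustration (the values below are not from the patent), the information recorded for an intelligent kitchen range might take a form such as:

```python
# Invented example values, for illustration only (not the patent's Table 1):
# each entry is (elapsed seconds, temperature in deg C, heat level, voice note).
recorded_menu_points = [
    (0,   25, 1, "pour oil into the pan"),
    (60, 180, 4, "add the vegetables"),
    (180, 160, 3, None),
    (300, 120, 2, "simmer until done"),
]
```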
S103, the first intelligent cooking equipment sends a video recording start instruction to the video recording equipment when the intelligent menu starts to be recorded; correspondingly, the video recording equipment receives a video recording start instruction from the first intelligent cooking equipment.
The video recording start instruction is used for indicating the video recording equipment to start recording videos related to the intelligent menu.
In some embodiments, if the first intelligent cooking appliance is communicatively coupled to only one video recording appliance, the first intelligent cooking appliance directly sends a video recording start instruction to the video recording appliance.
In some embodiments, if the first intelligent cooking appliance is communicatively coupled to the plurality of video recording appliances, the first intelligent cooking appliance may send a video recording start instruction to one or more target video recording appliances of the plurality of video recording appliances. Optionally, the target video recording device may be a video recording device selected by a user from a plurality of video recording devices, or may be a video recording device set by default.
In some embodiments, prior to recording the intelligent recipe, the first intelligent cooking appliance may obtain the video configuration information in response to a user's configuration operation of the video recording appliance. Thus, the first intelligent cooking device can instruct the video recording device to record video according to the video recording configuration information. The video configuration information may include: video mode, video frame rate, video resolution, and video format.
The video recording mode may include a plurality of modes, for example, the video recording mode may include a constant speed mode for recording a regular constant speed video and an overspeed mode for recording a high speed moving object video.
The video frame rate indicates how many frames (bitmap images) are recorded per unit time; a higher frame rate yields a smoother video picture.
Video resolution is typically expressed as the number of pixels in each direction, e.g., 640 x 480, 1600 x 1280, etc.
A video format is the identifier that video playback software assigns to a video file so that the file can be played. Common video formats include Audio Video Interleaved (AVI), Moving Picture Experts Group (MPEG), Advanced Streaming Format (ASF), and Flash Video (FLV).
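For illustration, a minimal sketch of how the video configuration information and the video recording start instruction might be assembled follows; the message layout, field names, and default values are assumptions made for this sketch and are not specified by this application.

```python
# Sketch only: field names, defaults, and the JSON layout are assumptions.
import json
import time

def build_video_config(mode="constant_speed", frame_rate=30,
                       resolution=(1600, 1280), fmt="MPEG"):
    """Assemble the video configuration information described above."""
    return {
        "video_mode": mode,             # e.g. constant-speed or high-speed mode
        "frame_rate": frame_rate,       # frames recorded per second
        "resolution": list(resolution), # pixels in each direction
        "format": fmt,                  # e.g. AVI, MPEG, ASF, FLV
    }

def build_recording_start_instruction(session_id, config):
    """Video recording start instruction sent to the target recording device(s)."""
    return json.dumps({
        "type": "video_record_start",
        "session": session_id,
        "timestamp": time.time(),
        "config": config,
    })

instruction = build_recording_start_instruction("session-001", build_video_config())
```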
S104, in response to the video recording start instruction, the video recording equipment starts recording the video related to the intelligent menu.
It should be understood that the video related to the intelligent menu is a video obtained by recording the user's process of making the dish corresponding to the intelligent menu.
In some embodiments, the recording start time may be recorded when the recording device begins recording video.
In some embodiments, during the recording of the intelligent recipe, the first intelligent cooking apparatus may periodically send recording time information, where the recording time information characterizes the recording duration of the intelligent recipe at the current moment. Thus, if another video recording device detects the recording time information during the recording process, it can automatically start recording video as well.
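A minimal sketch of this periodic recording time information, and of how a late-joining video recording device might react to it, is given below; the message type name and the dictionary-based device state are assumptions made for illustration.

```python
# Sketch only: message fields and the recorder's state dictionary are assumptions.
import json
import time

def recording_time_message(recording_start: float) -> str:
    """Periodic message carrying the recording duration at the current moment."""
    return json.dumps({
        "type": "recording_time",
        "elapsed_s": round(time.time() - recording_start, 1),
    })

def on_message(raw: str, recorder_state: dict) -> None:
    """A video recording device that detects the message starts recording."""
    msg = json.loads(raw)
    if msg.get("type") == "recording_time" and not recorder_state.get("recording"):
        recorder_state["recording"] = True
        recorder_state["joined_at_elapsed_s"] = msg["elapsed_s"]

state = {}
on_message(recording_time_message(time.time() - 42.0), state)
# state -> {'recording': True, 'joined_at_elapsed_s': 42.0}
```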
S105, the first intelligent cooking equipment receives an intelligent menu recording ending instruction.
When the dish is made and the recording of the intelligent menu can be finished, the user can send an instruction for finishing the recording of the intelligent menu to the intelligent cooking equipment in a direct or indirect mode.
For example, the user directly issues an instruction for ending recording of the intelligent menu to the intelligent cooking apparatus through a man-machine interaction device such as a display panel, a key, a voice recognition device and the like included in the intelligent cooking apparatus.
For another example, the user may indirectly issue an intelligent recipe recording end instruction to the intelligent cooking apparatus through an application installed on the mobile terminal apparatus by operating in the application.
S106, in response to the intelligent menu recording ending instruction, the intelligent cooking equipment ends recording of the intelligent menu.
S107, the intelligent cooking equipment sends a video recording ending instruction to the video recording equipment; correspondingly, the video recording equipment receives a video recording ending instruction from the intelligent cooking equipment.
The video recording ending instruction is used to instruct the video recording equipment to end recording the video related to the intelligent menu.
S108, in response to the video recording ending instruction, the video recording equipment ends recording the video related to the intelligent menu.
The technical solution shown in fig. 5 brings at least the following advantages: when the first intelligent cooking equipment starts recording an intelligent menu, it synchronously instructs the video recording equipment to start recording video; and when the first intelligent cooking equipment finishes recording the intelligent menu, it synchronously instructs the video recording equipment to finish recording the video. The whole video recording process requires no user control of the video recording equipment, which helps simplify the user's operation. In addition, the start time point of the recorded video is the same as the start time point of the intelligent menu, and the end time point of the video is the same as the end time point of the intelligent menu, so the user does not need to edit the video.
In some embodiments, as shown in fig. 7, the recording method of the intelligent menu provided in the embodiments of the present application may further include the following steps:
S201, after the intelligent menu is recorded, the first intelligent cooking equipment acquires identification information of the intelligent menu.
Wherein the identification information of the intelligent recipe can be used to uniquely identify the intelligent recipe. For example, the identification information of the smart recipe may be implemented as a string of digits and/or characters.
In some embodiments, the first intelligent cooking appliance may itself generate the identification information of the intelligent recipe. And the first intelligent cooking equipment can send the identification information of the intelligent menu and the information recorded in the intelligent menu recording process to the server.
In other embodiments, the first intelligent cooking appliance may send information recorded during the intelligent recipe recording process to the server. The server generates an intelligent menu according to the information recorded in the intelligent menu recording process, and distributes identification information of the intelligent menu. And then, the server sends the identification information of the intelligent menu to the first intelligent cooking equipment.
The generation manner of the intelligent menu may refer to related technologies, and will not be described herein.
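The second variant above, in which the server allocates the identification information, can be pictured with the following sketch; the storage layout and identifier format are assumptions, and the identifier is simply a random string of digits and characters.

```python
# Sketch only: the server's storage layout and identifier format are assumptions.
import uuid

def server_create_recipe(recorded_info: dict, store: dict) -> str:
    """Store the recorded information as an intelligent menu entry and allocate
    its identification information (actual menu generation is out of scope here)."""
    recipe_id = uuid.uuid4().hex[:12]            # identification information
    store[recipe_id] = {"recipe": recorded_info}
    return recipe_id                             # sent back to the first cooking device

store = {}
recipe_id = server_create_recipe({"device_type": "oven", "points": []}, store)
```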
S202, a first intelligent cooking device sends a video uploading instruction to a video recording device; correspondingly, the video recording device receives a video uploading instruction from the first intelligent cooking device.
The video uploading instruction includes the identification information of the intelligent menu and is used to instruct the video recording equipment to upload the video related to the intelligent menu to the server.
S203, in response to the video uploading instruction, the video recording equipment sends the video related to the intelligent menu and the identification information of the intelligent menu to the server.
After receiving the video related to the intelligent menu and the identification information of the intelligent menu sent by the video recording equipment, the server binds them together and stores them. Thus, a user can find the video related to the intelligent menu simply by querying the identification information of the intelligent menu.
In some embodiments, a user may download, share, edit, or use the smart recipe from the server by querying the identification information of the smart recipe. When other users want to make the dish, they can also query the server for the identification information of the intelligent menu corresponding to the dish and thereby find the video related to the intelligent menu; watching the video makes the dish-making process clearer to those users.
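A corresponding sketch of how the server might bind the uploaded video to the identification information and answer later queries follows; the in-memory dictionary stands in for whatever storage the server actually uses.

```python
# Sketch only: an in-memory dictionary stands in for the server's real storage.
def server_bind_video(recipe_id: str, video_path: str, store: dict) -> None:
    """Bind the uploaded intelligent-menu video to the stored menu entry."""
    store.setdefault(recipe_id, {})["video"] = video_path

def server_query(recipe_id: str, store: dict):
    """Find the menu and its video using only the identification information."""
    return store.get(recipe_id)

store = {"a1b2c3d4e5f6": {"recipe": {"device_type": "oven"}}}
server_bind_video("a1b2c3d4e5f6", "videos/session-001.mpeg", store)
print(server_query("a1b2c3d4e5f6", store))
# {'recipe': {'device_type': 'oven'}, 'video': 'videos/session-001.mpeg'}
```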
For an intelligent menu with video, if a user plays the video manually in the use process of the intelligent menu, the playing progress of the video may not be synchronous with the execution progress of the intelligent menu, so that the experience of the user using the intelligent menu is affected.
In view of this, as shown in fig. 8, an embodiment of the present application provides a method for executing an intelligent recipe, which includes the following steps S301-S306.
S301, the second intelligent cooking equipment receives an intelligent menu execution instruction.
When the user wants to reproduce the dish, the user can directly or indirectly issue an intelligent menu execution instruction to the second intelligent cooking apparatus.
For example, the second intelligent cooking apparatus includes a man-machine interaction device such as a display panel, a key, a voice recognition device, and the user can directly issue an intelligent menu execution instruction to the second intelligent cooking apparatus through the man-machine interaction device.
For another example, an application capable of controlling the intelligent cooking apparatus is installed on the terminal apparatus used by the user, and the user can indirectly issue an intelligent recipe execution instruction to the second intelligent cooking apparatus through the terminal apparatus operating in the application.
It should be understood that the second intelligent cooking apparatus may be the same intelligent cooking apparatus as the first intelligent cooking apparatus, or may be a different intelligent cooking apparatus, which is not limited thereto.
S302, in response to the intelligent menu execution instruction, the second intelligent cooking equipment executes the intelligent menu.
S303, the second intelligent cooking equipment sends a video playing instruction to the display equipment; correspondingly, the display device receives a video playing instruction from the second intelligent cooking device.
The video playing instruction is used for instructing the display device to play videos related to the intelligent menu.
In some embodiments, if the second intelligent cooking appliance is communicatively coupled to only one display device, the second intelligent cooking appliance sends a video play instruction directly to the display device.
In some embodiments, if the second intelligent cooking appliance is communicatively coupled to the plurality of display devices, the second intelligent cooking appliance may send video playback instructions to one or more target display devices of the plurality of display devices. Optionally, the target display device may be a display device selected by a user from among the multiple display devices, or may be a display device set by default.
In some embodiments, before the user instructs execution of the intelligent recipe, the second intelligent cooking appliance may obtain play configuration information in response to the user's configuration operation on the display device. Thus, after the user instructs execution of the intelligent menu, the second intelligent cooking apparatus may instruct the display apparatus to play according to the play configuration information. The play configuration information may include: play frame size, video play definition, and the like. Common choices for the play frame size are 16:9, 4:3, 1:1, etc.; common choices for the video play definition are 480P, 720P, 1080P, etc.
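For illustration, the play configuration information and the video playing instruction might look like the sketch below; carrying the menu's identification information in the instruction is an assumption made so the display device can locate the video, and all field names are hypothetical.

```python
# Sketch only: field names are hypothetical; carrying the menu identifier in the
# play instruction is an assumption, not something stated by this application.
play_config = {
    "aspect_ratio": "16:9",   # play frame size: 16:9, 4:3, 1:1, ...
    "definition": "1080P",    # video play definition: 480P, 720P, 1080P, ...
}

def build_play_instruction(recipe_id: str, config: dict) -> dict:
    """Video playing instruction sent by the second cooking device to a display device."""
    return {"type": "video_play", "recipe_id": recipe_id, "config": config}

instruction = build_play_instruction("a1b2c3d4e5f6", play_config)
```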
S304, in response to the video playing instruction, the display equipment plays the video related to the intelligent menu.
In some embodiments, the second intelligent cooking apparatus may periodically send recipe execution progress information for synchronizing a playback progress of the video related to the intelligent recipe with an execution progress of the intelligent recipe during execution of the intelligent recipe.
In some embodiments, the user may also select a new display device to play the video during execution of the smart recipe. The newly added display device may adjust the video to a playing progress synchronized with the execution progress of the intelligent recipe according to the recipe execution progress information most recently sent by the second intelligent cooking device.
In some embodiments, the recording process of the intelligent menu may differ from its execution process in, for example, the equipment used, the operator, and the amount of ingredients, so the execution process of the intelligent menu cannot exactly reproduce the recording process. During video playing, the display device can therefore adjust the playing progress of the video related to the intelligent menu according to the menu execution progress information.
For example, based on the menu execution progress information sent by the second intelligent cooking apparatus, while playing the video corresponding to a certain stage of the intelligent menu, the display apparatus may predict that, at the current playing speed, the video for that stage will finish while the stage is still executing; the display apparatus then slows down the playing speed of the video for that stage, so that the video playing progress stays synchronous with the menu execution progress.
For another example, when the second intelligent cooking apparatus performs a certain stage in the intelligent recipe and the video corresponding to the stage has been played, the display apparatus may pause playing the video corresponding to the stage, or the display apparatus repeatedly plays the video corresponding to the stage.
For example, because different users use different intelligent cooking devices, the time required for the individual steps of making the dish may also differ. Suppose the second intelligent cooking device performs the stewing action from second 1321 to second 3120 of execution, while the video corresponding to this stage ends at second 2400 of playback; when playback reaches second 2400, the display device pauses the video related to the intelligent menu.
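The slow-down/pause behaviour described in the last few paragraphs can be sketched as follows; the stage boundaries, the fractional-progress comparison, and the 0.75x rate are illustrative assumptions rather than values taken from this application.

```python
# Sketch only: thresholds, the 0.75x rate, and the stage bookkeeping are assumptions.
def sync_playback(exec_elapsed_s, stage_exec_start, stage_exec_end,
                  video_pos_s, stage_video_start, stage_video_end):
    """Compare stage execution progress with its video's playing progress.

    Returns ('pause', None) when the stage video has finished but the stage is
    still executing, ('rate', 0.75) when the video is running ahead of the
    stage, and ('rate', 1.0) otherwise.
    """
    exec_frac = (exec_elapsed_s - stage_exec_start) / (stage_exec_end - stage_exec_start)
    video_frac = (video_pos_s - stage_video_start) / (stage_video_end - stage_video_start)
    if video_frac >= 1.0 and exec_frac < 1.0:
        return ("pause", None)
    if video_frac > exec_frac:
        return ("rate", 0.75)
    return ("rate", 1.0)

# Stewing stage: executed from second 1321 to 3120, its video segment ends at
# playback second 2400; once playback reaches 2400 the video is paused.
print(sync_playback(2000, 1321, 3120, 2400, 1321, 2400))   # ('pause', None)
```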
S305, when the intelligent menu execution is finished, the second intelligent cooking equipment sends a play finishing instruction to the display equipment; correspondingly, the display device receives a play ending instruction sent by the second intelligent cooking device.
The play ending instruction is used to instruct the display device to end playing the video related to the intelligent menu.
S306, in response to the play ending instruction, the display equipment ends playing the video related to the intelligent menu.
The technical solution shown in fig. 8 brings at least the following advantages: the second intelligent cooking device can automatically control the display device to play the video related to the intelligent menu when execution of the intelligent menu starts, and automatically control the display device to end playing the video when execution of the intelligent menu ends. Therefore, the user does not need to manually control the playing and ending of the video related to the intelligent menu, which simplifies the user's operation. In addition, the linkage between the second intelligent cooking equipment and the display equipment keeps the execution progress of the intelligent menu synchronous with the playing progress of the video, so the user can conveniently watch the video and make the dish corresponding to the intelligent menu by following it. Therefore, the technical scheme helps improve the user's experience of using the intelligent menu.
Optionally, in the process of video playing, as shown in fig. 9, an embodiment of the present application further provides a method for executing an intelligent menu, where the method includes the following steps:
S401, the display equipment receives a user operation adjusting the playing progress of the video related to the intelligent menu.
The adjustment operation of the playing progress of the video may be a fast forward/fast backward operation, a drag operation of the video progress bar, or the like, which is not limited thereto.
S402, in response to the user's adjustment of the playing progress of the video, the display device adjusts the playing progress of the video to a first playing progress.
After the user adjusts the playing progress of the video, the execution progress of the current intelligent menu is different from the playing progress of the video, and if the user wants the playing progress of the video to be synchronized with the execution progress of the current intelligent menu again, a synchronization instruction can be issued to the display device. The synchronization instruction may be issued by the user by clicking a synchronization button on the display device, or may be issued by the user to the display device by voice.
S403, the display device receives the synchronous progress operation of the user.
S404, in response to the user's synchronization progress operation, the display equipment adjusts the playing progress of the video to a second playing progress, wherein the second playing progress is synchronous with the execution progress of the intelligent menu at the current moment.
By way of example, take a dish of stewed beef with potatoes: when a user wants to make the dish, the user can search for the intelligent menu corresponding to the dish through an application installed on a mobile phone. As shown in fig. 10, the user selects the intelligent recipe and issues an intelligent recipe execution instruction to the intelligent cooking apparatus. The user has pre-selected the range hood and the mobile phone as the display equipment for playing the video related to the intelligent menu, and after receiving the video playing instruction sent by the intelligent cooking equipment, the range hood and the mobile phone start playing the video for making the stewed beef with potatoes. In this process, the user can drag the progress bar to rewind or fast-forward the video. As shown in fig. 11, the video has played to the 60th second. The user drags the progress bar, and video playback starts from the 180th second, as shown in fig. 12. Afterwards, the user wants the playing progress of the video related to the smart menu on the mobile phone to be synchronized again with the current execution progress of the smart menu, and only needs to click the synchronization key on the display screen, as shown in fig. 13. The mobile phone determines, according to the menu execution progress information continuously sent by the intelligent cooking equipment, that the intelligent menu has been executed for 270 seconds, and adjusts the playing progress of the video related to the intelligent menu to start playing from the 270th second, as shown in fig. 14.
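A minimal sketch of the synchronization operation in this example is shown below; the player object, its seek() method, and the way the latest menu execution progress is obtained are all hypothetical.

```python
# Sketch only: the player class and seek() API are hypothetical stand-ins.
class VideoPlayer:
    def __init__(self):
        self.position_s = 0.0

    def seek(self, position_s: float) -> None:
        """Jump playback to the given position (seconds)."""
        self.position_s = position_s

def on_sync_pressed(player: VideoPlayer, latest_menu_progress_s: float) -> None:
    """Adjust the video to the second playing progress: in sync with the menu."""
    player.seek(latest_menu_progress_s)

player = VideoPlayer()
player.seek(180)               # first playing progress: where the user dragged to
on_sync_pressed(player, 270)   # second playing progress: back in sync at 270 s
```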
The technical solution shown in fig. 9 brings at least the following advantages: the user can watch or preview the relevant video content back and forth by adjusting the video playing progress. And after the related video content is reviewed or previewed, the user can adjust the video to the playing progress synchronous with the execution progress of the intelligent menu at the current moment by only one synchronous operation, so that the experience of the user for watching the video related to the intelligent menu is improved.
In the methods shown in fig. 5 and fig. 7, when the video recording device is integrated in the intelligent cooking device, the signaling exchanged between the intelligent cooking device and the video recording device may be regarded as signaling exchanged between the controller and the camera assembly inside the intelligent cooking device.
The following describes an exemplary recording method of the intelligent menu provided in the embodiment of the present application with respect to a scenario in which a video recording device is integrated on an intelligent cooking device.
As shown in fig. 15, the recording method of the intelligent menu includes the following steps:
S501, the controller receives an intelligent menu recording start instruction of a user.
When the user wants to record the intelligent menu, the user can operate a man-machine interaction device (such as a display panel, keys and a voice recognition device) on the intelligent cooking equipment so as to give an intelligent menu recording start instruction to the intelligent cooking equipment.
Alternatively, when the user wants to record a smart recipe, the user can open an APP for controlling the smart cooking device, such as a smart home APP, on the terminal device. The user then operates in the smart home APP to issue an intelligent menu recording start instruction to the intelligent cooking equipment.
S502, in response to the intelligent menu recording start instruction, the controller starts recording the intelligent menu and controls the camera assembly to record the video related to the intelligent menu.
S503, the controller receives an intelligent menu recording ending instruction of the user.
When the dish is made and the recording of the intelligent menu can be finished, the user can send an instruction for finishing the recording of the intelligent menu to the intelligent cooking equipment in a direct or indirect mode.
For example, the user directly issues an instruction for ending recording of the intelligent menu to the intelligent cooking apparatus through a man-machine interaction device such as a display panel, a key, a voice recognition device and the like included in the intelligent cooking apparatus.
For another example, the user may indirectly issue an intelligent recipe recording end instruction to the intelligent cooking apparatus through an application installed on the mobile terminal apparatus by operating in the application.
S504, in response to the user's intelligent menu recording ending instruction, the controller ends recording the intelligent menu and controls the camera assembly to end recording the video related to the intelligent menu.
In some embodiments, after finishing recording the intelligent recipe, the controller obtains identification information of the intelligent recipe; and the controller sends the identification information of the intelligent menu and the video related to the intelligent menu to the server.
The technical solution shown in fig. 15 brings at least the following advantages: when the intelligent cooking equipment starts to record the intelligent menu, synchronously recording videos related to the intelligent menu; and then, synchronously finishing recording the video related to the intelligent menu when the intelligent cooking equipment finishes recording the intelligent menu. The whole video recording process does not need manual operation of a user, and is beneficial to simplifying the operation of the user. In addition, the starting time point of the recorded video is the same as the starting time point of the intelligent menu, and the ending time point of the video is the same as the ending time point of the intelligent menu, so that a user does not need to edit the video.
In the methods shown in fig. 8 and fig. 9, when the display device is integrated in the intelligent cooking device, the signaling exchanged between the intelligent cooking device and the display device may be regarded as signaling exchanged between the controller and the display inside the intelligent cooking device.
The following describes an exemplary implementation method of the intelligent menu provided in the embodiment of the present application with respect to a scenario in which a display device is integrated on an intelligent cooking device.
As shown in fig. 16, the method further comprises the steps of:
S601, the controller receives an execution instruction of the intelligent menu.
When the user wants to reproduce the dish, the intelligent recipe execution instruction can be issued to the intelligent cooking device in a direct or indirect way.
For example, the intelligent cooking apparatus includes a man-machine interaction device such as a display panel, a key, a voice recognition device, and the like, and a user can directly issue an intelligent menu execution instruction to the intelligent cooking apparatus through the man-machine interaction device.
For another example, an application capable of controlling the intelligent cooking apparatus is installed on the terminal apparatus used by the user, and the user can indirectly issue an intelligent menu execution instruction to the intelligent cooking apparatus through the terminal apparatus operating in the application.
S602, in response to the execution instruction of the intelligent menu, the controller executes the intelligent menu and controls the display to play the video related to the intelligent menu.
In some embodiments, in the process of executing the intelligent menu, the controller may periodically adjust the playing progress of the video related to the intelligent menu according to the execution progress of the intelligent menu, so that the playing progress of the video related to the intelligent menu is synchronous with the execution progress of the intelligent menu.
In some embodiments, the recording process of the intelligent menu may differ from its execution process in, for example, the equipment used, the operator, and the amount of ingredients, so the execution process of the intelligent menu cannot exactly reproduce the recording process. During video playing, the intelligent cooking equipment can therefore adjust the playing progress of the video related to the intelligent menu according to the execution progress of the intelligent menu.
For example, according to the execution progress of the intelligent menu, while playing the video corresponding to a certain stage of the intelligent menu, the intelligent cooking apparatus may predict that, at the current playing speed, the video for that stage will finish while the stage is still executing; the intelligent cooking apparatus then slows down the playing speed of the video for that stage, so that the playing progress of the video stays synchronous with the execution progress of the intelligent menu.
For another example, when the intelligent cooking apparatus performs a certain stage in the intelligent recipe and the video corresponding to the stage has been played, the intelligent cooking apparatus may pause playing the video corresponding to the stage, or the intelligent cooking apparatus may repeatedly play the video corresponding to the stage.
For example, because of the differences between different intelligent cooking devices, the time required for the individual steps of making the dish may also differ. Suppose the intelligent cooking device performs the stewing action from second 1321 to second 3120 of execution, while the video corresponding to this stage ends at second 2400 of playback; when playback reaches second 2400, the intelligent cooking device pauses the video related to the intelligent menu.
And S603, when the execution of the intelligent menu is finished, the controller controls the display to finish playing the video related to the intelligent menu.
The technical solution shown in fig. 16 brings at least the following advantages: the intelligent cooking device can automatically play the video related to the intelligent menu when execution of the intelligent menu starts and automatically end playing the video when execution of the intelligent menu ends. Therefore, the user does not need to manually control the playing and ending of the video related to the intelligent menu, which simplifies the user's operation. In addition, the intelligent cooking equipment adjusts the playing progress of the video according to the execution progress of the intelligent menu, so that the execution progress of the intelligent menu and the playing progress of the video stay synchronous; this makes it convenient for the user to watch the video and make the dish corresponding to the intelligent menu by following it. Therefore, the technical scheme helps improve the user's experience of using the intelligent menu.
In some embodiments, during the process of playing the intelligent recipe-related video by the intelligent cooking appliance, as shown in fig. 17, the method further comprises the steps of:
S604, the controller receives a user operation adjusting the playing progress of the video.
The adjustment operation of the playing progress of the video may be a fast forward/fast backward operation, a drag operation of the video progress bar, or the like, which is not limited thereto.
S605, in response to the user's adjustment of the playing progress of the video, the controller adjusts the playing progress of the video to a first playing progress.
After the user adjusts the playing progress of the video, the execution progress of the current intelligent menu is different from the playing progress of the video, and if the user wants the playing progress of the video to be synchronized with the execution progress of the current intelligent menu again, a synchronization instruction can be issued to the intelligent cooking equipment. The synchronous instruction can be issued by the user by clicking a synchronous button on the intelligent cooking equipment, or issued by the user by voice.
S606, the controller receives the synchronous progress operation of the user.
S607, in response to the user's synchronization progress operation, the controller adjusts the playing progress of the video to a second playing progress, wherein the second playing progress is synchronous with the execution progress of the intelligent menu at the current moment.
The technical solution shown in fig. 17 brings at least the following advantages: the user can watch or preview the relevant video content back and forth by adjusting the video playing progress. And after the related video content is reviewed or previewed, the user can adjust the playing progress of the video to be synchronous with the execution progress of the intelligent menu at the current moment by only one synchronous operation, so that the experience of the user for watching the related video of the intelligent menu is improved.
The foregoing description of the solution provided in this application has been presented primarily from a method perspective. It will be appreciated that each node, e.g. terminal device, for implementing the above-mentioned functions, comprises corresponding hardware structures and/or software modules for performing each function. Those of skill in the art will readily appreciate that the various illustrative algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The embodiment of the application also provides a computer-readable storage medium including computer-executable instructions which, when run on a computer, cause the computer to execute any of the video recording and playing methods provided in the above embodiments.
The embodiment of the application also provides a computer program product containing computer execution instructions, which when run on a computer, cause the computer to execute any one of the video recording and playing methods provided in the above embodiment.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented using a software program, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer-executable instructions. When the computer-executable instructions are loaded and executed on a computer, the processes or functions in accordance with embodiments of the present application are fully or partially produced. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer-executable instructions may be stored in or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center by wired (e.g., coaxial cable, fiber optic, digital subscriber line (digital subscriber line, DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). Computer readable storage media can be any available media that can be accessed by a computer or data storage devices including one or more servers, data centers, etc. that can be integrated with the media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), or the like.
Although the present application has been described herein in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a review of the figures, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Although the present application has been described in connection with specific features and embodiments thereof, it will be apparent that various modifications and combinations can be made without departing from the spirit and scope of the application. Accordingly, the specification and drawings are merely exemplary illustrations of the present application as defined in the appended claims and are considered to cover any and all modifications, variations, combinations, or equivalents that fall within the scope of the present application. It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the spirit or scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. An intelligent cooking appliance, comprising:
a camera assembly;
a display;
a controller coupled to the camera assembly and the display, the controller configured to:
receiving an intelligent menu recording start instruction of a user;
responding to the intelligent menu recording start instruction, starting to record an intelligent menu, and controlling the camera component to start to record videos related to the intelligent menu;
receiving an intelligent menu recording ending instruction of the user;
responding to an intelligent menu recording ending instruction of the user, ending recording the intelligent menu, and controlling the camera shooting assembly to end recording videos related to the intelligent menu;
receiving an execution instruction of an intelligent menu;
responding to the execution instruction of the intelligent menu, executing the intelligent menu, and controlling the display to play the video related to the intelligent menu;
receiving an adjustment operation of a user on the playing progress of the video;
responding to the adjustment operation of the user on the playing progress of the video, and adjusting the playing progress of the video to be a first playing progress;
receiving the synchronous progress operation of the user;
and responding to the synchronous progress operation of the user, adjusting the playing progress of the video to be a second playing progress, wherein the second playing progress is synchronous with the execution progress of the intelligent menu at the current moment.
2. The intelligent cooking apparatus of claim 1, wherein the intelligent cooking apparatus further comprises:
the communicator is connected with the controller and is used for establishing communication connection with the server;
the controller is further configured to:
after finishing recording the intelligent menu, acquiring identification information of the intelligent menu;
and sending the identification information of the intelligent menu and the video related to the intelligent menu to the server.
3. The intelligent cooking appliance of claim 1, wherein the controller is further configured to:
and when the intelligent menu execution is finished, controlling the display to finish playing the video related to the intelligent menu.
4. The intelligent cooking appliance of claim 3, wherein the controller is further configured to:
in the execution process of the intelligent menu, according to the execution progress of the intelligent menu, the playing progress of the video related to the intelligent menu is adjusted, so that the playing progress of the video related to the intelligent menu is synchronous with the execution progress of the intelligent menu.
5. A recording method of an intelligent menu, which is applied to an intelligent cooking device, the method comprising:
receiving an intelligent menu recording start instruction of a user;
responding to the intelligent menu recording start instruction, starting to record an intelligent menu, and starting to record videos related to the intelligent menu;
receiving an intelligent menu recording ending instruction of the user;
responding to an intelligent menu recording ending instruction of the user, ending recording the intelligent menu, and ending recording videos related to the intelligent menu;
receiving an execution instruction of an intelligent menu;
responding to the execution instruction of the intelligent menu, executing the intelligent menu, and playing the video related to the intelligent menu;
receiving an adjustment operation of the user on the playing progress of the video;
responding to the adjustment operation of the user on the playing progress of the video, and adjusting the playing progress of the video to be a first playing progress;
receiving the synchronous progress operation of the user;
and responding to the synchronous progress operation of the user, adjusting the playing progress of the video to be a second playing progress, wherein the second playing progress is synchronous with the execution progress of the intelligent menu at the current moment.
6. The method of claim 5, wherein the method further comprises:
after finishing recording the intelligent menu, acquiring identification information of the intelligent menu;
and sending the video related to the intelligent menu and the identification information of the intelligent menu to a server.
7. The method of claim 5, wherein the method further comprises:
and ending playing the video related to the intelligent menu when the intelligent menu is executed.
8. The method of claim 7, wherein the method further comprises:
in the execution process of the intelligent menu, according to the execution progress of the intelligent menu, the playing progress of the video related to the intelligent menu is adjusted, so that the playing progress of the video related to the intelligent menu is synchronous with the execution progress of the intelligent menu.
CN202210260843.XA 2022-03-16 2022-03-16 Recording method and device for intelligent menu Active CN114710697B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210260843.XA CN114710697B (en) 2022-03-16 2022-03-16 Recording method and device for intelligent menu

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210260843.XA CN114710697B (en) 2022-03-16 2022-03-16 Recording method and device for intelligent menu

Publications (2)

Publication Number Publication Date
CN114710697A CN114710697A (en) 2022-07-05
CN114710697B true CN114710697B (en) 2024-04-02

Family

ID=82169530

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210260843.XA Active CN114710697B (en) 2022-03-16 2022-03-16 Recording method and device for intelligent menu

Country Status (1)

Country Link
CN (1) CN114710697B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100936048B1 (en) * 2009-03-26 2010-01-08 현대통신 주식회사 Tv for providing a cooking information in kitchen
CN104914898A (en) * 2015-04-17 2015-09-16 珠海优特电力科技股份有限公司 Digital menu generating method and system
CN107844142A (en) * 2016-09-18 2018-03-27 王强 Cooking system, mobile terminal and electronic cookbook generation, auxiliary cooking method
CN109188943A (en) * 2018-09-26 2019-01-11 上海金晋智能科技有限公司 The wisdom kitchen system of cooking is shared based on artificial intelligence
CN111131855A (en) * 2019-12-30 2020-05-08 上海纯米电子科技有限公司 Cooking process sharing method and device


Also Published As

Publication number Publication date
CN114710697A (en) 2022-07-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant