CN111124238A - Storage medium, intelligent panel and timing interaction method thereof - Google Patents

Storage medium, intelligent panel and timing interaction method thereof

Info

Publication number
CN111124238A
CN111124238A (application CN201911194090.1A)
Authority
CN
China
Prior art keywords
scene
intelligent panel
user
intelligent
interactive interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911194090.1A
Other languages
Chinese (zh)
Inventor
赵振宇
李伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xingluo Intelligent Technology Co Ltd
Original Assignee
Xingluo Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xingluo Intelligent Technology Co Ltd filed Critical Xingluo Intelligent Technology Co Ltd
Priority to CN201911194090.1A priority Critical patent/CN111124238A/en
Publication of CN111124238A publication Critical patent/CN111124238A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803 Home automation networks
    • H04L12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • H04L12/282 Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home

Abstract

A storage medium, an intelligent panel, and a timing interaction method thereof. The application discloses a timing interaction method of an intelligent panel, comprising the following steps: the intelligent panel acquires a current scene; according to the scene, the intelligent panel provides the user with an interactive interface corresponding to the scene and starts timing; the intelligent panel determines whether a cancel instruction entered by the user at the interactive interface is received within a preset time period; when no cancel instruction is received within the preset time period, the intelligent panel executes the operation corresponding to the scene; and when a cancel instruction is received within the preset time period, the operation corresponding to the scene is not executed. In this way, disturbance to the user can be reduced, the complexity of user operation can be reduced, and the human-computer interaction experience can be improved.

Description

Storage medium, intelligent panel and timing interaction method thereof
Technical Field
The invention relates to the technical field of smart homes, and in particular to a storage medium, an intelligent panel, and a timing interaction method of the intelligent panel.
Background
The smart home is an embodiment of the Internet of Things in home life. A smart home connects various devices in the home (such as audio and video equipment, lighting systems, curtain controls, air-conditioning controls, security systems, digital cinema systems, audio and video servers, video cabinet systems, networked home appliances, and the like) through Internet of Things technology, and provides functions and means such as home appliance control, lighting control, remote control by telephone, indoor and outdoor remote control, burglar alarm, environmental monitoring, heating and ventilation control, infrared forwarding, and programmable timing control. Compared with an ordinary home, a smart home not only retains the traditional living functions, but also integrates building, network communication, information appliance, and equipment automation functions, provides all-round information interaction, and can even save money on various energy costs.
The human-computer interaction mode in a smart home scenario is an important part directly related to user experience. Traditional smart homes use two interaction modes. In the first, the user interacts through an APP on a mobile terminal (for example, a smartphone); this is cumbersome, because the corresponding home device must be found layer by layer before it can be controlled. In the second, the central control device of the smart home automatically controls the home devices according to a scene detection result; this seriously disturbs the user, for example when a curtain the user did not want opened at a certain time is opened anyway because of scene misdetection or because the user's habits vary from day to day. In general, both interaction modes give users a poor home experience and adversely affect the popularization and use of smart home products.
Disclosure of Invention
The technical problem mainly solved by the application is to provide the storage medium, the intelligent panel and the timing interaction method thereof, which can reduce disturbance to a user, reduce complexity of user operation and improve human-computer interaction experience.
In order to solve the above technical problem, one technical solution adopted in the embodiments of the present application is to provide a timing interaction method for an intelligent panel, comprising the following steps: the intelligent panel acquires a current scene; according to the scene, the intelligent panel provides the user with an interactive interface corresponding to the scene and starts timing; the intelligent panel determines whether a cancel instruction entered by the user at the interactive interface is received within a preset time period; when the intelligent panel does not receive a cancel instruction entered by the user at the interactive interface within the preset time period, the intelligent panel executes the operation corresponding to the scene; and when the intelligent panel receives a cancel instruction entered by the user at the interactive interface within the preset time period, the operation corresponding to the scene is not executed.
The timing interaction method further comprises: when the intelligent panel receives a cancel instruction entered by the user at the interactive interface within the preset time period, the intelligent panel stops providing the interactive interface to the user.
The timing interaction method further comprises: when the intelligent panel does not receive a cancel instruction entered by the user at the interactive interface within the preset time period, the intelligent panel stops providing the interactive interface to the user at the moment the preset time period ends.
The step in which the intelligent panel provides the user with an interactive interface corresponding to the scene according to the scene and starts timing comprises: the intelligent panel acquires the degree of urgency of the scene according to the scene; and the intelligent panel sets the duration of the preset time period according to the degree of urgency, wherein the higher the degree of urgency, the shorter the duration.
The step of obtaining the current scene comprises the following steps: the intelligent panel acquires the current time; and the intelligent panel searches the corresponding scene in a pre-stored corresponding relation table of time and scene according to the current time.
The step of obtaining the current scene comprises the following steps: the intelligent panel acquires at least one of data of indoor intelligent household electrical appliances and data of indoor sensing equipment through a communicator of the intelligent panel; the intelligent panel analyzes the current scene according to at least one of the data of the indoor intelligent household electrical appliance and the data of the indoor sensing equipment.
Wherein, the step of providing the interactive interface corresponding to the scene to the user comprises: the smart panel controls a display of the smart panel to display at least one touch button corresponding to the scene according to the scene to allow a user to input a cancel instruction by touching the touch button.
Wherein, the step of providing the interactive interface corresponding to the scene to the user comprises: the intelligent panel controls a loudspeaker of the intelligent panel to play inquiry voice corresponding to the scene according to the scene, and controls a sound pick-up of the intelligent panel to be in a state of collecting surrounding sound so as to allow a user to input a voice cancelling instruction through the sound pick-up.
In order to solve the above technical problem, another technical solution adopted in the embodiment of the present application is: there is provided a smart panel comprising a processor and a memory electrically connected to the processor, the memory for storing a computer program, the processor for invoking the computer program to perform the above method.
In order to solve the above technical problem, another technical solution adopted in the embodiments of the present application is: a storage medium is provided which stores a computer program executable by a processor to implement the above-described method.
In the above solution, the intelligent panel acquires a current scene; provides the user with an interactive interface corresponding to the scene according to the scene and starts timing; determines whether a cancel instruction entered by the user at the interactive interface is received within a preset time period; executes the operation corresponding to the scene when no cancel instruction is received within the preset time period; and does not execute the operation corresponding to the scene when a cancel instruction is received within the preset time period. In this way, disturbance to the user can be reduced, the complexity of user operation can be reduced, and the human-computer interaction experience can be improved.
Drawings
Fig. 1 is a schematic hardware structure diagram of an intelligent home control system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a hardware structure of an intelligent panel according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a timing interaction method of a smart panel according to a first embodiment of the present application;
FIG. 4 is a partial flow chart of a timing interaction method of a smart panel according to a second embodiment of the present application;
FIG. 5 is a partial flow chart of a timing interaction method of a smart panel according to a third embodiment of the present application;
FIG. 6 is a schematic flow chart diagram illustrating one embodiment of obtaining a current scene according to the present application;
FIG. 7 is a schematic flow chart diagram illustrating another embodiment of the present application for obtaining a current scene;
FIG. 8 is a schematic flowchart of one embodiment of step S32 in FIG. 3;
fig. 9 is a schematic diagram of a hardware structure of an intelligent panel according to another embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some of the structures related to the present application are shown in the drawings, not all of the structures. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", etc. in this application are used to distinguish between different objects and not to describe a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The intelligent panel is an important component of a smart home control system. Its main control objects are intelligent household appliances, and it highly integrates facilities related to home life by using integrated wiring technology, network communication technology, security technology, automatic control technology, and audio and video technology.
The intelligent panel is a central control system integrating multiple subsystems such as lighting, audio, curtains, thermostats, and sensors. Through various intelligent control modes such as remote control, mobile phone control, touch interaction, and voice interaction, it can intelligently control and manage residential lighting, electric curtains, temperature and humidity, household appliances, and the like, providing an intelligent, comfortable, high-quality life.
The intelligent panel may be embedded in an indoor wall, or it may be a free-standing panel placed on a desktop.
Referring to fig. 1, fig. 1 is a schematic diagram of a hardware structure of an intelligent home control system according to an embodiment of the present application.
In this embodiment, the smart home control system 10 may include a smart panel 11, a smart home device 12, an indoor sensing device 13, and a smart home cloud server 14.
The intelligent panel 11 is in communication connection with the intelligent household appliance 12, the indoor sensor device 13 and the intelligent home cloud server 14, and the connection mode can be a wireless or wired mode, which is not limited in the embodiment of the application. For example, the wireless communication connection may be WIFI, mobile internet, Zigbee, or the like, and the wired communication connection may be RJ45 network cable, USB data cable, or the like.
The intelligent household appliance 12 may be a smart television, a smart curtain, a smart window, a smart lock, a smart refrigerator, a smart air purifier, a smart air conditioner, and the like, which is not limited in this embodiment of the present application.
The indoor sensing device 13 may be a visual sensor, such as a light sensor or a camera; a position-detection sensor, such as a radar, a proximity sensor, or a Hall element; a tactile sensor, such as a humidity sensor or a temperature sensor; or an olfactory sensor, such as an odor sensor, a smoke sensor, an air quality sensor, or a detector for a specific toxic gas (for example, a carbon monoxide detection sensor), which is not limited in this embodiment of the present application. The indoor sensing device 13 is used to collect real-time data indoors.
Referring to fig. 2, fig. 2 is a schematic diagram of a hardware structure of an intelligent panel according to an embodiment of the present application.
In this embodiment, the smart panel 11 may include a processor 111, a memory 112 electrically connected to the processor 111, a display 115, a sound pickup (microphone) 116, a speaker 117, and a communicator 118.
Referring to fig. 3, fig. 3 is a flowchart illustrating a timing interaction method of an intelligent panel according to a first embodiment of the present application.
In this embodiment, the timing interaction method of the smart panel may include the following steps:
step S31: the intelligent panel acquires a current scene.
Step S32: and the intelligent panel provides an interactive interface corresponding to the scene for the user according to the scene and starts timing.
Step S33: the intelligent panel judges whether a cancel instruction input by a user at the interactive interface is received within a preset time period.
Step S34: and when the intelligent panel does not receive a cancel instruction input by the user at the interactive interface within a preset time period, executing operation corresponding to the scene.
Step S35: and when the intelligent panel receives a cancel instruction input by the user at the interactive interface within a preset time period, the operation corresponding to the scene is not executed.
In one embodiment, the step of providing an interactive interface corresponding to a scene to a user comprises: the smart panel controls a display of the smart panel to display at least one touch button corresponding to the scene according to the scene to allow a user to input a cancel instruction by touching the touch button.
In another embodiment, the step of providing an interactive interface corresponding to a scene to a user includes: the intelligent panel controls a loudspeaker of the intelligent panel to play inquiry voice corresponding to the scene according to the scene, and controls a sound pick-up of the intelligent panel to be in a state of collecting surrounding sound so as to allow a user to input a voice cancelling instruction through the sound pick-up.
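To make the flow of steps S31 to S35 concrete, the following is a minimal Python sketch of the timed, cancelable interaction described above. All class and function names are hypothetical illustrations and not part of the disclosed implementation; the panel's actual interface display, cancel detection, and device control are not specified by the application.

    import threading

    class TimedInteraction:
        # Minimal sketch of steps S31-S35: show an interface for the detected scene,
        # then execute the scene's operation only if no cancel instruction arrives
        # within the preset time period.
        def __init__(self, timeout_s):
            self.timeout_s = timeout_s
            self._cancel = threading.Event()

        def on_cancel(self):
            # Hypothetical hook called by the touch-button handler or the voice recognizer.
            self._cancel.set()

        def run(self, scene, show_interface, execute_operation, hide_interface):
            show_interface(scene)                          # S32: present the interface, start timing
            cancelled = self._cancel.wait(self.timeout_s)  # S33: wait for a cancel within the window
            hide_interface()                               # stop providing the interactive interface
            if not cancelled:
                execute_operation(scene)                   # S34: no cancel received, execute the operation
            # S35: cancel received, the operation is skipped

    # Usage sketch (all names hypothetical):
    interaction = TimedInteraction(timeout_s=10)
    interaction.run(
        "turn_on_air_purifier",
        show_interface=lambda s: print("About to run '%s'; cancel within 10 s if not needed" % s),
        execute_operation=lambda s: print("Executing: %s" % s),
        hide_interface=lambda: print("Interface dismissed"),
    )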
Referring to fig. 4, fig. 4 is a partial flowchart illustrating a timing interaction method for an intelligent panel according to a second embodiment of the present application.
In this embodiment, the timing interaction method of the smart panel may further include the following steps:
step S41: and when the intelligent panel receives a cancel instruction input by the user at the interactive interface within a preset time period, stopping providing the interactive interface for the user.
Referring to fig. 5, fig. 5 is a partial flowchart illustrating a timing interaction method for an intelligent panel according to a third embodiment of the present application.
Step S51: the intelligent panel does not receive a cancel instruction input by the user at the interactive interface within the preset time period and stops providing the interactive interface for the user at the moment when the preset time period is over.
Referring to fig. 6, fig. 6 is a schematic flowchart illustrating an embodiment of obtaining a current scene according to the present application.
In this embodiment, the obtaining of the current scene specifically may include the following steps:
step S61: the processor of the smart panel obtains the current time.
The current time may be obtained over the mobile Internet, for example from the smart home cloud server 14, or from a local timer, for example from a local crystal oscillator.
Step S62: and the processor of the intelligent panel searches the corresponding scene in a pre-stored corresponding relation table of time and scene according to the current time.
Optionally, the pre-stored correspondence table of time and scene may be constructed from the user's habits as collected over time. For example, each time the user operates the smart panel 11, the panel records the operation content (for example, the type of intelligent household appliance that is controlled) and the corresponding operation time, and sends these records to the smart home cloud server 14. The smart home cloud server 14 then counts, for each time period of the day, at least one operation content that is most frequently performed, defines each time period as a scene, and stores the scene in association with that operation content, where each operation content corresponds to at least one interactive interface. This forms a correspondence table of time, scene, operation content, and interactive interface. According to the obtained current time, the smart panel 11 can look up the corresponding scene and its interactive interface in the correspondence table, as sketched below.
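As an illustration only, the following sketch shows one way such a time-scene correspondence table could be built from logged operations and queried by the current time (steps S61 and S62). The log format, the hour-long slot size, and the names are assumptions for this example, not the disclosed data format.

    from collections import Counter, defaultdict
    from datetime import datetime

    # Hypothetical operation log: (operation content, time of operation) pairs,
    # of the kind the panel is described as uploading to the cloud server.
    operation_log = [
        ("open_curtains", "07:05"),
        ("open_curtains", "07:10"),
        ("open_curtains", "07:02"),
        ("turn_on_air_conditioner", "19:30"),
    ]

    def build_time_scene_table(log):
        # Count operations per hour-long time period and treat each period as a scene
        # whose associated operation content is the most frequent one in that period.
        per_hour = defaultdict(Counter)
        for operation, hhmm in log:
            hour = datetime.strptime(hhmm, "%H:%M").hour
            per_hour[hour][operation] += 1
        return {hour: counts.most_common(1)[0][0] for hour, counts in per_hour.items()}

    def lookup_scene(table, now):
        # Step S62: find the scene corresponding to the current time, if any.
        return table.get(now.hour)

    table = build_time_scene_table(operation_log)
    print(lookup_scene(table, datetime.strptime("07:20", "%H:%M")))  # -> open_curtains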
Referring to fig. 7, fig. 7 is a schematic flowchart illustrating another embodiment of the present application for acquiring a current scene.
In this embodiment, the obtaining of the current scene specifically may include the following steps:
step S71: and the processor of the intelligent panel acquires at least one of the data of the indoor intelligent household electrical appliance equipment and the data of the indoor sensing equipment through the communicator of the intelligent panel.
Step S72: and the processor of the intelligent panel analyzes the current scene according to at least one of the data of the indoor intelligent household electrical appliance and the data of the indoor sensing equipment.
For example, suppose the indoor sensing device 13 includes a smoke sensor and the indoor intelligent household appliance 12 includes an air purifier. If the smoke sensor detects that the amount of smoke exceeds a set threshold while the air purifier is off, the current scene is analyzed as "the air purifier needs to be turned on". The smart panel 11 then displays a prompt such as "the air purifier will be turned on in 10 s; tap Cancel if this is not needed" (10 s being the preset time period), or outputs an equivalent voice prompt and listens for a spoken cancel instruction. If the smart panel 11 does not receive a cancel instruction from the user within 10 s, it sends a control signal to the appliance to turn the air purifier on; a minimal sketch of this scene analysis follows below.
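The sketch below illustrates steps S71 and S72 using the smoke-sensor and air-purifier example above; the data field names and the threshold value are assumptions for illustration, not values taken from the application.

    # Scene analysis from appliance and sensor data (steps S71-S72).
    SMOKE_THRESHOLD = 0.3  # the "set threshold" of the embodiment; the number is illustrative

    def analyse_scene(appliance_data, sensor_data):
        # Return the current scene inferred from the collected data, or None.
        smoke_level = sensor_data.get("smoke_level", 0.0)
        purifier_on = appliance_data.get("air_purifier_on", False)
        if smoke_level > SMOKE_THRESHOLD and not purifier_on:
            # Scene "the air purifier needs to be turned on": the panel would then show
            # a cancelable prompt for the preset time period and, absent a cancel
            # instruction, send the turn-on control signal to the purifier.
            return "turn_on_air_purifier"
        return None

    print(analyse_scene({"air_purifier_on": False}, {"smoke_level": 0.5}))  # -> turn_on_air_purifier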
Referring to fig. 8, fig. 8 is a schematic flowchart illustrating an embodiment of step S32 in fig. 3.
In this embodiment, the step of providing, by the smart panel, an interactive interface corresponding to a scene to a user according to the scene and starting timing includes:
step S81: and the intelligent panel acquires the emergency degree of the scene according to the scene.
A correspondence table of scenes and degrees of urgency is pre-stored in the intelligent panel. For example, in one scene an indoor camera detects that an elderly person or a child has fallen; this scene is "an emergency call to 120 is needed", and its degree of urgency is 2. In another scene, the detected amount of smoke exceeds a set threshold and the air purifier needs to be turned on; its degree of urgency is 1. The larger the urgency value, the more urgent the scene.
Step S82: the intelligent panel sets the duration of a preset time period according to the degree of urgency, wherein the higher the degree of urgency, the shorter the duration.
For example, when the smart panel acquires the scene "an emergency call to 120 is needed", it provides an interactive interface to the user, such as a voice inquiry asking whether 120 should be called, and controls the sound pickup to be in a state in which it can collect sound. If no voice or touch cancel instruction is received from the user within 5 s, the smart panel calls 120 and reports the home address and contact number to the emergency center.
When the smart panel acquires the scene "the air purifier needs to be turned on", it asks by voice whether the air purifier should be turned on and controls the sound pickup to be in a state in which it can collect sound. If no voice or touch cancel instruction is received from the user within 10 s, the smart panel sends a turn-on control signal to the air purifier.
That is, the "call 120" scene and the "turn on the air purifier" scene have different degrees of urgency, so the durations of their preset time periods also differ: 5 s and 10 s respectively, as the sketch below illustrates.
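The following sketch illustrates steps S81 and S82 with the two example scenes above. The scene names, urgency values, and durations mirror the examples in the description, but the exact data structures and the mapping function are assumptions for illustration.

    # Pre-stored correspondence of scenes to degrees of urgency (step S81).
    SCENE_URGENCY = {
        "call_120": 2,               # a person has fallen: urgency 2 (most urgent)
        "turn_on_air_purifier": 1,   # smoke above threshold: urgency 1
    }

    # Higher urgency maps to a shorter preset time period (step S82).
    URGENCY_TO_DURATION_S = {2: 5, 1: 10}

    def preset_period_seconds(scene, default_s=15):
        urgency = SCENE_URGENCY.get(scene, 0)
        return URGENCY_TO_DURATION_S.get(urgency, default_s)

    print(preset_period_seconds("call_120"))              # -> 5
    print(preset_period_seconds("turn_on_air_purifier"))  # -> 10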
Referring to fig. 9, fig. 9 is a schematic diagram of a hardware structure of an intelligent panel according to another embodiment of the present application.
In the present embodiment, the smart panel 90 includes a processor 91 and a memory 92 electrically connected to the processor 91, the memory 92 is used for storing a computer program, and the processor 91 is used for calling the computer program to execute the method described in any one of the above embodiments.
The embodiment of the present application also provides a storage medium, which stores a computer program, and the computer program can implement the method of any one of the above embodiments when executed by a processor.
The computer program may be stored in the storage medium in the form of a software product, and includes several instructions for causing a device or a processor to execute all or part of the steps of the method according to the embodiments of the present application.
A storage medium is a medium in computer memory for storing some discrete physical quantity. And the aforementioned storage medium may be: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of modules or units is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
In summary, the intelligent panel acquires a current scene; provides the user with an interactive interface corresponding to the scene according to the scene and starts timing; determines whether a cancel instruction entered by the user at the interactive interface is received within a preset time period; executes the operation corresponding to the scene when no cancel instruction is received within the preset time period; and does not execute the operation corresponding to the scene when a cancel instruction is received within the preset time period. This reduces disturbance to the user, reduces the complexity of user operation, and improves the human-computer interaction experience.
The above embodiments are merely examples and are not intended to limit the scope of the present disclosure, and all modifications, equivalents, and flow charts using the contents of the specification and drawings of the present disclosure or those directly or indirectly applied to other related technical fields are intended to be included in the scope of the present disclosure.

Claims (10)

1. A timing interaction method of an intelligent panel is characterized by comprising the following steps:
the intelligent panel acquires a current scene;
the intelligent panel provides an interactive interface corresponding to the scene for a user according to the scene and starts timing;
the intelligent panel judges whether a cancel instruction input by a user at the interactive interface is received within a preset time period;
when the intelligent panel does not receive a cancel instruction input by the user at the interactive interface within the preset time period, executing operation corresponding to the scene;
and when the intelligent panel receives a cancel instruction input by the user at the interactive interface within the preset time period, the operation corresponding to the scene is not executed.
2. The timed interaction method according to claim 1, characterized in that it further comprises:
and when the intelligent panel receives a cancel instruction input by the user at the interactive interface within the preset time period, stopping providing the interactive interface for the user.
3. The timed interaction method according to claim 1, characterized in that it further comprises:
when the intelligent panel does not receive a cancel instruction input by the user at the interactive interface within the preset time period, the intelligent panel stops providing the interactive interface to the user at the moment when the preset time period ends.
4. The timed interaction method according to claim 1, wherein the step of providing an interaction interface corresponding to the scene to the user according to the scene and starting timing by the intelligent panel comprises:
the intelligent panel acquires the emergency degree of the scene according to the scene;
and the intelligent panel sets the duration of the preset time period according to the emergency degree, wherein the duration is shorter as the emergency degree is higher.
5. The timed interaction method according to claim 1, wherein said step of acquiring a current scene comprises:
the intelligent panel acquires the current time;
and the intelligent panel searches a corresponding scene in a pre-stored corresponding relation table of time and scene according to the current time.
6. The timed interaction method according to claim 1, wherein said step of acquiring a current scene comprises:
the intelligent panel acquires at least one of data of indoor intelligent household electrical appliances and data of indoor sensing equipment through a communicator of the intelligent panel;
and the intelligent panel analyzes the current scene according to at least one of the data of the indoor intelligent household electrical appliance and the data of the indoor sensing equipment.
7. The timed interaction method according to claim 1, wherein said step of providing said user with an interaction interface corresponding to said scene comprises:
the intelligent panel controls a display of the intelligent panel to display at least one touch button corresponding to the scene according to the scene, so that the user is allowed to input a cancel instruction by touching the touch button.
8. The timed interaction method according to claim 1, wherein said step of providing said user with an interaction interface corresponding to said scene comprises:
the intelligent panel controls a loudspeaker of the intelligent panel to play inquiry voice corresponding to the scene according to the scene, and controls a sound pick-up of the intelligent panel to be in a state capable of collecting ambient sound, so that the user is allowed to input a voice cancelling instruction through the sound pick-up.
9. A smart panel comprising a processor and a memory electrically connected to the processor, the memory for storing a computer program, the processor for invoking the computer program to perform the method of any one of claims 1-8.
10. A storage medium, characterized in that the storage medium stores a computer program executable by a processor to implement the method of any one of claims 1-8.
CN201911194090.1A 2019-11-28 2019-11-28 Storage medium, intelligent panel and timing interaction method thereof Pending CN111124238A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911194090.1A CN111124238A (en) 2019-11-28 2019-11-28 Storage medium, intelligent panel and timing interaction method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911194090.1A CN111124238A (en) 2019-11-28 2019-11-28 Storage medium, intelligent panel and timing interaction method thereof

Publications (1)

Publication Number Publication Date
CN111124238A true CN111124238A (en) 2020-05-08

Family

ID=70497051

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911194090.1A Pending CN111124238A (en) 2019-11-28 2019-11-28 Storage medium, intelligent panel and timing interaction method thereof

Country Status (1)

Country Link
CN (1) CN111124238A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105204357A (en) * 2015-09-18 2015-12-30 小米科技有限责任公司 Contextual model regulating method and device for intelligent household equipment
CN108768805A (en) * 2018-05-29 2018-11-06 四川斐讯信息技术有限公司 A kind of smart home mode switching method and system
US20190286083A1 (en) * 2015-10-19 2019-09-19 Ademco Inc. Method of smart scene management using big data pattern analysis


Similar Documents

Publication Publication Date Title
CN109240111B (en) Intelligent home control method, device and system and intelligent gateway
JP6030808B2 (en) Intelligent remote control method, router, terminal, device, program, and recording medium
CN106603350B (en) Information display method and device
CN113268005A (en) Intelligent scene control method, system, device and storage medium
CN111487884A (en) Storage medium, and intelligent household scene generation device and method
CN110762799B (en) Storage medium, intelligent panel and air conditioner control method based on intelligent panel
CN111123723A (en) Grouping interaction method, electronic device and storage medium
CN110950204A (en) Call calling method based on intelligent panel, intelligent panel and storage medium
CN111025930A (en) Intelligent home control method, intelligent home control equipment and storage medium
CN111240221A (en) Storage medium, intelligent panel and equipment control method based on intelligent panel
CN110647050B (en) Storage medium, intelligent panel and multi-level interaction method thereof
CN111061160A (en) Storage medium, intelligent household control equipment and control method
CN107368044A (en) A kind of real-time control method of intelligent electric appliance, system
CN113589699A (en) Intelligent household scene control method and device
CN110941198A (en) Storage medium, smart panel and power-saving booting method thereof
CN110995551A (en) Storage medium, intelligent panel and interaction method thereof
CN111147935A (en) Control method of television, intelligent household control equipment and storage medium
CN111124238A (en) Storage medium, intelligent panel and timing interaction method thereof
CN111126163A (en) Intelligent panel, interaction method based on face angle detection and storage medium
CN110941196A (en) Intelligent panel, multi-level interaction method based on angle detection and storage medium
CN112306011A (en) Equipment control method and device
CN111918108A (en) Linkage control method and system, computer equipment and readable storage medium
CN111009305A (en) Storage medium, intelligent panel and food material recommendation method thereof
CN111182349A (en) Storage medium, interactive device and video playing method thereof
CN111093112A (en) Storage medium, interaction device and video shadow rendering and playing method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200508