WO2023221995A1 - Intelligent device control method and electronic device - Google Patents

Intelligent device control method and electronic device

Info

Publication number
WO2023221995A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
target
intention
control
user
Prior art date
Application number
PCT/CN2023/094602
Other languages
English (en)
Chinese (zh)
Inventor
曾立
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2023221995A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00Systems controlled by a computer
    • G05B15/02Systems controlled by a computer electric
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/26Pc applications
    • G05B2219/2642Domotique, domestic, home control, automation, smart house
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • This application relates to the field of terminal technology, and in particular to intelligent device control methods and electronic devices.
  • embodiments of the present application provide an intelligent device control method and electronic device.
  • the technical solution provided by the embodiments of this application allows scenes to be edited at the granularity of intentions, without setting the multiple devices corresponding to an intention one by one, which reduces the complexity of scene editing and improves the efficiency of device control.
  • an intelligent device control method is provided, which is applied to a first electronic device or a component (such as a chip system) that supports the first electronic device to implement related functions.
  • the method includes: displaying a first interface, the first interface including identifications of one or more intentions, at least one of which corresponds to multiple electronic devices; in response to the user's operation of selecting a target intention from the one or more intentions, adding the target intention to a target scene; and, when the trigger condition of the target scene is met, controlling one or more target electronic devices to execute the target intention.
  • the one or more target electronic devices are electronic devices corresponding to the target intention.
  • the smart panel displays an interface 404 (the first interface).
  • the interface 404 includes identifications of multiple intentions (such as the intention cards "Lights Fully On", "Lights Fully Off", and "Combined Light (brightness 60%)" in area 404a).
  • the smart panel adds the target intention to the viewing mode scene (i.e., the target scene).
  • the smart panel controls one or more target electronic devices to execute the target intention.
  • the target electronic device includes a second electronic device and a third electronic device, and controlling one or more target electronic devices to execute the target intention includes:
  • the smart panel sends the first control instruction to the light 1 corresponding to the combined light intention, and sends the second control instruction to the light 2 corresponding to the combined light intention.
  • lamp 1 and lamp 2 jointly perform the combined light intention (brightness 60%), so that the total brightness of lamp 1 and lamp 2 is 60%.
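The joint execution above can be sketched in code. The following is a minimal, illustrative Python sketch, not the patent's implementation; the names `ControlInstruction` and `split_intention` are invented here. One intention's total brightness is divided into per-device control instructions whose shares sum to the target value.

```python
from dataclasses import dataclass

@dataclass
class ControlInstruction:
    device: str
    brightness: int  # percentage contributed by this device

def split_intention(devices, total_brightness):
    """Divide a combined-light intention's total brightness across devices."""
    share, remainder = divmod(total_brightness, len(devices))
    instructions = []
    for i, device in enumerate(devices):
        # Hand the remainder to the first devices so the shares sum exactly.
        extra = 1 if i < remainder else 0
        instructions.append(ControlInstruction(device, share + extra))
    return instructions

# Combined light intention (brightness 60%) executed jointly by two lamps.
instructions = split_intention(["lamp 1", "lamp 2"], 60)
```

The split policy (an even division) is an assumption; the patent only requires that the devices jointly realize the intention.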
  • the target intention includes a first target intention and a second target intention;
  • the target electronic device includes a second electronic device and a third electronic device;
  • Controlling one or more target electronic devices to perform the target intent includes:
  • the viewing mode scene includes a combined light intention (first target intention) and a temperature intention (second target intention).
  • the smart panel sends a first control instruction to the lamp (second electronic device) in the living room.
  • the first control instruction is used to control the lamp in the living room to turn on and adjust the brightness to 60%.
  • the smart panel also sends a second control instruction to, for example, the air conditioner (third electronic device) in the living room.
  • the second control instruction is used to control the air conditioner to perform the temperature adjustment intention.
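A scene carrying two intentions can be dispatched as one control instruction per intention. Below is a hedged Python sketch; the `INTENT_TO_DEVICE` mapping and the device names are illustrative assumptions, not part of the patent:

```python
# Illustrative mapping from each intention to the device that executes it.
INTENT_TO_DEVICE = {
    "combined light (60%)": "living room lamp",
    "temperature": "living room air conditioner",
}

def dispatch(scene_intentions):
    """Return one (device, intention) control instruction per intention."""
    return [(INTENT_TO_DEVICE[i], i) for i in scene_intentions]

# Viewing mode scene: a first and a second target intention.
instructions = dispatch(["combined light (60%)", "temperature"])
```

Each pair corresponds to one control instruction: the first goes to the lamp, the second to the air conditioner.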
  • the first interface also includes identifications of multiple spaces in the whole house, including a first space and a second space; the identification of the first space is selected; and the one or more intentions include intentions that are executable by electronic devices in the first space.
  • the interface 404 may also include identifications of multiple spaces in the whole house (such as the whole house, the living room (an example of the first space), and the dining room (an example of the second space)). Among them, the identification of the living room is selected (for example, displayed with a black fill).
  • the plurality of intentions contained in area 404a of interface 404 include intentions that are executable by electronic devices in the living room.
  • the method further includes:
  • the smart panel receives the user's operation on the identification of the dining room and displays the interface 1101 (i.e., the second interface).
  • the interface 1101 includes the identifications of the intentions that can be executed by the electronic devices in the dining room.
  • the identification of the second space is selected; the one or more intentions are multiple intentions, and the multiple intentions further include intentions executable by electronic devices in the second space.
  • the identifications of the dining room and the living room are both selected.
  • the multiple intentions in area 404a of the interface 1203 include intentions that can be executed by the electronic devices in the dining room and intentions that can be executed by the electronic devices in the living room (such as the humidity (mid-range) intention card).
  • there is no need to edit and set up multiple devices one by one.
  • the identification of the intention executable by the electronic device in the first space and the identification of the intention executable by the electronic device in the second space have different user interface (UI) effects.
  • the identifications of intentions executable by electronic devices in the dining room and the identifications of intentions executable by electronic devices in the living room have different user interface (UI) effects.
  • the user can clearly identify the intentions belonging to different spaces, so that the user can quickly select the required intention in the required space and add the required intention to the target scene, which improves the efficiency of scene editing.
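The space-based filtering described above can be sketched as follows. This is an illustrative Python sketch (the function and variable names are invented): intentions are collected per selected space, and each is tagged with its space so the interface can render different UI effects per space.

```python
def intentions_for_spaces(intentions_by_space, selected_spaces):
    """Collect (space, intention) pairs for every selected space.

    Tagging each intention with its space lets the interface render
    a distinct UI effect per space.
    """
    pairs = []
    for space in selected_spaces:
        for intention in intentions_by_space.get(space, []):
            pairs.append((space, intention))
    return pairs

# Example data mirroring the living room / dining room walkthrough.
intentions_by_space = {
    "living room": ["lights fully on", "combined light (60%)"],
    "dining room": ["humidity (mid-range)"],
}
shown = intentions_for_spaces(intentions_by_space, ["living room", "dining room"])
```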
  • the control parameter corresponding to the target intention is a first control parameter
  • the method further includes: receiving a second operation of user input;
  • the second control parameter input by the user through the first control is received, so that the control parameter corresponding to the target intention is adjusted to the second control parameter.
  • the target intention is a combination light intention
  • the initial brightness (first control parameter) corresponding to the combination light intention is 30%
  • the smart panel receives the user's operation of long pressing the combination light intention card 404k (second operation).
  • the smart panel displays a brightness adjustment bar 404m (first control).
  • the smart panel receives the latest brightness (second control parameter) input by the user through the brightness adjustment bar 404m, so that the brightness corresponding to the combination lamp intention is adjusted to the latest brightness (for example, 60%).
  • the user can adjust the control parameters corresponding to the target intention in the target scene, which can meet the user's device control needs and improve the flexibility of device control.
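The parameter-adjustment flow above (long press, slider, updated value) can be sketched as a small data model. This is a hedged illustration; the `Scene` class and its method names are assumptions, not the patent's API:

```python
class Scene:
    """A scene stores one control parameter per intention."""

    def __init__(self, name):
        self.name = name
        self.params = {}  # intention -> control parameter

    def add_intention(self, intention, param):
        self.params[intention] = param

    def adjust(self, intention, new_param):
        # Invoked when the user confirms a new value on the slider control.
        self.params[intention] = new_param

scene = Scene("viewing mode")
scene.add_intention("combined light", 30)  # first control parameter: 30%
scene.adjust("combined light", 60)         # user slides to 60%
```

When the scene later triggers, the stored 60% is the value dispatched to the devices behind the intention.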
  • the one or more intentions belong to one or more subsystems.
  • the one or more intentions may correspond to the lighting subsystem, heating and cooling fresh air subsystem, sunshade subsystem, etc.
  • the one or more intentions include all lights on, all lights off, and combination lights on (brightness parameter is 60%)
  • the corresponding subsystem is the lighting subsystem.
  • the one or more intentions include lights fully on, lights fully off, turning on the combination lights (brightness parameter 60%), a constant temperature intention, and a constant humidity intention
  • the corresponding subsystems are the lighting subsystem and the heating and cooling fresh air subsystem.
  • an intelligent device control method applied to a first electronic device, and the method includes:
  • a fourth interface is displayed, where the fourth interface includes the target intention and identifications of one or more electronic devices in a first group; the first group is associated with user information; the one or more electronic devices are used to execute the target intention, and the one or more electronic devices include the target electronic device;
  • the target electronic device is instructed to execute the target intention, and the user information includes any one or more of the following information: the user's location, the user's behavior, and the current time.
  • the smart panel displays an interface 1603 (the third interface), and the interface 1603 includes identification of multiple intentions.
  • the smart panel can jump to the interface 1607 (the fourth interface) corresponding to the music playback intention as shown in Figure 16B.
  • the interface 1607 includes the music playback intention (the target intention) and the identifications of one or more electronic devices in the living room group (such as the identification of speaker 1 and the identification of speaker 2).
  • the living room group is determined based on user information.
  • the smart panel receives the user's operation on the identification of the target electronic device (such as speaker 1) in the living room group.
  • the smart panel instructs the speaker 1 to perform the music playback intention.
  • the target electronic device group that the user wants to control can be determined based on the user information, and the identification of the target electronic device group and the identifications of the electronic devices in the group can then be displayed.
  • in this way, the user can browse the identifications of the electronic devices in the target electronic device group, quickly find the target electronic device to be controlled, and then control it, which helps improve the efficiency of device control.
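Selecting the group to surface first from the user information can be sketched as follows. The selection rule here (prefer the group matching the user's location, otherwise fall back to the first group) is an invented example of the kind of rule the patent allows; the names are illustrative:

```python
def pick_group(user_info, groups):
    """Prefer the group matching the user's location; otherwise the first group."""
    location = user_info.get("location")
    if location in groups:
        return location
    return next(iter(groups))

# Group membership mirroring the speaker example; contents are assumed.
groups = {
    "living room": ["speaker 1", "speaker 2"],
    "study": ["speaker 3"],
}
first_group = pick_group({"location": "living room", "time": "20:00"}, groups)
```

The user's behavior and the current time could feed the same function as further signals; only location is used in this sketch.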
  • receiving the user's operation on the identification of the target electronic device includes:
  • the fourth interface further includes an identification of a second group, and the second group includes other electronic devices that can execute the target intention.
  • the interface 1607 not only includes the identifications of the electronic devices in the living room (the first group), but also includes the identifications of second groups such as the master bedroom, the secondary bedroom, and the study. In this way, it is convenient for users to find and understand the device status of each space in the whole house.
  • the distance between the identifier of the second group and the identifier of the target intention is greater than the distance between the identifier of the first group and the identifier of the target intention.
  • the distance between the identification of a second group such as the study and the identification of the central intention (the music playback intention) is greater than the distance between the identification of the living room (i.e., the first group) and the identification of the central intention.
  • because the distance between the identification of the living room and the identification of the central intention is small, the user can quickly operate the identifications of the electronic devices in the living room to control the corresponding electronic devices.
  • the fourth interface further includes a third control parameter of the target electronic device, and the third control parameter is determined based on the user information.
  • the interface 1601 also includes third control parameters such as recommended brightness and recommended color temperature.
  • the method further includes:
  • the fourth control parameter corresponding to the target electronic device input by the user is received, so that the control parameter of the target electronic device is adjusted to the fourth control parameter.
  • the smart panel can also receive a color temperature (such as 5500K) input by the user. Subsequently, the smart panel can control the lights in the study to turn on and adjust the color temperature to 5500K.
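The relationship between the recommended (third) parameter and the user-entered (fourth) parameter reduces to a simple override rule. A hedged sketch, with an invented function name:

```python
def effective_color_temperature(recommended_k, user_k=None):
    """A user-entered (fourth) parameter overrides the recommended (third) one."""
    return user_k if user_k is not None else recommended_k

# Recommendation derived from user information stands until the user types a value.
applied = effective_color_temperature(4000)        # recommendation applies
overridden = effective_color_temperature(4000, 5500)  # user enters 5500 K
```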
  • the one or more intentions belong to one or more subsystems.
  • such as the lighting subsystem and the audio and video subsystem.
  • an intelligent device control method including:
  • the first electronic device displays a first interface, where the first interface includes identifications of one or more intentions, and at least one of the one or more intentions corresponds to multiple electronic devices;
  • the first electronic device adds the target intention to the target scene in response to the user's operation of selecting a target intention from the one or more intentions;
  • when the trigger condition of the target scene is met, the first electronic device sends a control instruction to one or more target electronic devices, where the one or more target electronic devices are electronic devices corresponding to the target intention;
  • the target electronic device receives the control instruction from the first electronic device and executes the target intention according to the control instruction.
  • the target electronic device includes a second electronic device and a third electronic device
  • the first electronic device sends a control instruction to one or more target electronic devices, including: sending a first control instruction to the second electronic device and sending a second control instruction to the third electronic device, so that the second electronic device and the third electronic device jointly execute the target intention;
  • the target electronic device receives the control instruction from the first electronic device and executes the target intention according to the control instruction, including: the second electronic device is configured to receive the first control instruction from the first electronic device and execute the target intention according to the first control instruction; the third electronic device is configured to receive the second control instruction from the first electronic device and execute the target intention according to the second control instruction.
  • the target intention includes a first target intention and a second target intention;
  • the target electronic device includes a second electronic device and a third electronic device;
  • the first electronic device sends control instructions to one or more target electronic devices, including:
  • the target electronic device receives the control instruction from the first electronic device and executes the target intention according to the control instruction, including: the second electronic device is configured to receive the first control instruction from the first electronic device and execute the first target intention according to the first control instruction; the third electronic device is configured to receive the second control instruction from the first electronic device and execute the second target intention according to the second control instruction.
  • the first interface also includes identifications of multiple spaces in the whole house, including a first space and a second space; the identification of the first space is selected; and the one or more intentions include intentions that are executable by electronic devices in the first space.
  • the method further includes: the first electronic device receiving the user's operation on the identification of the second space and displaying a second interface, the second interface including identifications of the intentions executable by electronic devices in the second space.
  • the identification of the second space is selected; the one or more intentions are multiple intentions, and the multiple intentions further include intentions executable by electronic devices in the second space.
  • the identification of the intention executable by the electronic device in the first space and the identification of the intention executable by the electronic device in the second space have different user interface (UI) effects.
  • the control parameter corresponding to the target intention is a first control parameter
  • the method further includes: the first electronic device receiving a second operation input by the user; in response to the second operation, displaying a first control, the first control being used to input a second control parameter corresponding to the target intention; and receiving the second control parameter input by the user through the first control, so that the control parameter corresponding to the target intention is adjusted to the second control parameter.
  • the one or more intentions belong to one or more subsystems.
  • an intelligent device control method including:
  • the first electronic device displays a third interface, the third interface includes identification of one or more intentions, and at least one intention among the one or more intentions corresponds to multiple electronic devices;
  • the first electronic device displays a fourth interface in response to the user's operation of selecting a target intention from the one or more intentions, the fourth interface including the target intention and an identification of one or more electronic devices in the first group,
  • the first group is associated with the user information; the one or more electronic devices are used to execute the target intention, and the one or more electronic devices include a target electronic device;
  • the first electronic device receives a user's operation on the identification of the target electronic device
  • the first electronic device sends a control instruction to the target electronic device.
  • the control instruction is used to instruct the target electronic device to execute the target intention.
  • the user information includes any one or more of the following information: the user's location, the user's behavior, and the current time.
  • the target electronic device receives the control instruction from the first electronic device and executes the target intention according to the control instruction.
  • receiving the user's operation on the identification of the target electronic device includes:
  • the fourth interface further includes an identification of a second group, and the second group includes other electronic devices that can execute the target intention.
  • the distance between the identifier of the second group and the identifier of the target intention is greater than the distance between the identifier of the first group and the identifier of the target intention.
  • the fourth interface further includes a third control parameter of the target electronic device, and the third control parameter is determined based on the user information.
  • the method further includes:
  • the fourth control parameter corresponding to the target electronic device input by the user is received, so that the control parameter of the target electronic device is adjusted to the fourth control parameter.
  • an intelligent equipment control system including:
  • a first electronic device configured to display a first interface, where the first interface includes identification of one or more intentions; at least one of the one or more intentions corresponds to multiple electronic devices;
  • the first electronic device is further configured to send a control instruction to one or more target electronic devices when the trigger condition of the target scene is met, wherein the one or more target electronic devices are electronic devices corresponding to the target intention.
  • the target electronic device is configured to receive the control instruction from the first electronic device and execute the target intention according to the control instruction.
  • the target electronic device includes a second electronic device and a third electronic device
  • the first electronic device is used to send a control instruction to one or more target electronic devices, including: sending a first control instruction to the second electronic device and sending a second control instruction to the third electronic device, so that the second electronic device and the third electronic device jointly execute the target intention;
  • the target electronic device is configured to receive the control instruction from the first electronic device and execute the target intention according to the control instruction; specifically, the second electronic device is configured to receive the first control instruction from the first electronic device and execute the target intention according to the first control instruction, and the third electronic device is configured to receive the second control instruction from the first electronic device and execute the target intention according to the second control instruction.
  • the target intention includes a first target intention and a second target intention;
  • the target electronic device includes a second electronic device and a third electronic device;
  • the first electronic device is used to send control instructions to one or more target electronic devices, including:
  • the target electronic device is configured to receive the control instruction from the first electronic device and execute the target intention according to the control instruction; specifically, the second electronic device is configured to receive the first control instruction from the first electronic device and execute the first target intention according to the first control instruction, and the third electronic device is configured to receive the second control instruction from the first electronic device and execute the second target intention according to the second control instruction.
  • the first interface also includes identifications of multiple spaces in the whole house, including a first space and a second space; the identification of the first space is selected; and the one or more intentions include intentions that are executable by electronic devices in the first space.
  • the first electronic device is also used for:
  • the identification of the second space is selected; the one or more intentions are multiple intentions, and the multiple intentions further include intentions executable by electronic devices in the second space.
  • the identification of the intention executable by the electronic device in the first space and the identification of the intention executable by the electronic device in the second space have different user interface (UI) effects.
  • the control parameter corresponding to the target intention is a first control parameter
  • the first electronic device is also used to perform the following operations:
  • the second control parameter input by the user through the first control is received, so that the control parameter corresponding to the target intention is adjusted to the second control parameter.
  • the one or more intentions belong to one or more subsystems.
  • an intelligent equipment control system including:
  • a first electronic device configured to display a third interface, where the third interface includes identification of one or more intentions, at least one of the one or more intentions corresponding to multiple electronic devices;
  • the first electronic device is further configured to, in response to the user's operation of selecting a target intention from the one or more intentions, display a fourth interface, where the fourth interface includes the target intention and identifications of one or more electronic devices in a first group; the first group is associated with user information; the one or more electronic devices are used to execute the target intention, and the one or more electronic devices include the target electronic device;
  • the first electronic device is also configured to receive a user's operation on the identification of the target electronic device
  • the first electronic device is further configured to send a control instruction to the target electronic device in response to the operation.
  • the control instruction is used to instruct the target electronic device to execute the target intention.
  • the user information includes any one or more of the following information: the user's location, the user's behavior, and the current time.
  • the target electronic device is configured to receive a control instruction from the first electronic device and execute the target intention according to the control instruction.
  • receiving the user's operation on the identification of the target electronic device includes:
  • the fourth interface further includes an identification of a second group, and the second group includes other electronic devices that can execute the target intention.
  • the distance between the identifier of the second group and the identifier of the target intention is greater than the distance between the identifier of the first group and the identifier of the target intention.
  • the fourth interface further includes a third control parameter of the target electronic device, and the third control parameter is determined based on the user information.
  • the first electronic device is also used for:
  • the fourth control parameter corresponding to the target electronic device input by the user is received, so that the control parameter of the target electronic device is adjusted to the fourth control parameter.
  • the one or more intentions belong to one or more subsystems.
  • embodiments of the present application provide an electronic device that has the function of implementing the smart device control method described in any of the above aspects and any of the possible implementations.
  • This function can be implemented by hardware, or can be implemented by hardware and corresponding software.
  • the hardware or software includes one or more modules corresponding to the above functions.
  • a computer-readable storage medium stores a computer program (which may also be referred to as instructions or code).
  • when the computer program is executed by an electronic device, it causes the electronic device to perform the method of any aspect or of any implementation in any aspect.
  • embodiments of the present application provide a computer program product which, when run on an electronic device, causes the electronic device to execute the method of any aspect or of any implementation in any aspect.
  • embodiments of the present application provide a circuit system.
  • the circuit system includes a processing circuit, and the processing circuit is configured to perform any aspect or the method of any implementation in any aspect.
  • embodiments of the present application provide a chip system, including at least one processor and at least one interface circuit.
  • the at least one interface circuit is used to perform transceiver functions and send instructions to the at least one processor; when the processor executes the instructions, the at least one processor performs the method of any aspect or of any implementation in any aspect.
  • Figure 1 is a schematic diagram of a scene editing method in related technologies
  • FIG. 2 is a schematic diagram of the system architecture provided by the embodiment of this application.
  • Figure 3A is a schematic diagram of a whole-house division subsystem provided by an embodiment of the present application.
  • Figure 3B is a schematic diagram of the subsystems and atomic capabilities corresponding to the leaving home scenario provided by the embodiment of the present application;
  • Figure 4 is a schematic diagram of the hardware structure of the first electronic device provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of the software structure of the first electronic device provided by the embodiment of the present application.
  • FIG. 6 is a schematic diagram of the system architecture provided by the embodiment of this application.
  • Figures 7-9 are schematic interface diagrams provided by embodiments of the present application.
  • Figure 10 is a schematic diagram of the correlation of scenarios, equipment, and subsystems provided by the embodiment of this application;
  • Figure 11 is a schematic diagram of an interface provided by an embodiment of the present application.
  • Figure 12A is a schematic diagram of the interface provided by the embodiment of the present application.
  • Figure 12B is a schematic diagram of the interface provided by the embodiment of the present application.
  • Figure 12C is a schematic diagram of the interface provided by the embodiment of the present application.
  • Figure 13 is a schematic diagram of a movie viewing mode scene provided by an embodiment of the present application.
  • Figures 14 and 15 are schematic interface diagrams provided by embodiments of the present application.
  • Figure 16A is a schematic diagram of the interface provided by the embodiment of the present application.
  • Figure 16B is a schematic diagram of the interface provided by the embodiment of the present application.
  • Figures 17-22 are schematic interface diagrams provided by embodiments of the present application.
  • Figure 23 is a schematic structural diagram of a first electronic device provided by an embodiment of the present application.
  • smart home devices are intelligent devices, including audio and video equipment (such as large-screen devices, Bluetooth speakers, etc.), lighting equipment (such as ceiling lamps, table lamps, spotlights, etc.), environmental control equipment (such as air conditioners, air purifiers, etc.), anti-theft alarm equipment (such as human body sensors, cameras, etc.), etc.
  • the air conditioner can receive control commands sent by the user through the mobile phone.
  • the air conditioner receives the "turn on" command input by the user through the mobile phone and starts automatically.
  • the air conditioner receives the command "Adjust the temperature to 26°C" input by the user through the mobile phone, and can automatically adjust the temperature to 26°C.
  • a smart home application (such as a smart life application) is installed in the electronic device (such as a mobile phone).
  • the electronic device can be paired with the smart home device through the smart home application to manage and control the smart home device.
  • the electronic device needs to establish a connection with the smart home device in advance and configure the smart home device.
  • a smart life application is installed in the mobile phone
  • the smart life application is launched on the mobile phone, and the scene interface 201 shown in (a) in Figure 1 is displayed.
  • the interface 202 shown in (b) in Figure 1 is displayed.
  • the mobile phone receives the user's operation to create the scene, such as receiving the user's operation of adding conditions for controlling smart home devices and tasks to be performed by the smart home devices.
  • the user can click the add condition control 26 and add a condition for controlling the smart home device, such as adding a trigger condition "when the scene card is clicked" as shown in (f) of Figure 1 .
  • when the mobile phone detects that the user clicks on the scene card of the scene, it can control the smart home device to perform the corresponding task.
  • the interface 203 shown in (c) of Figure 1 is displayed.
  • when the mobile phone detects the user's click on the smart device control 23, it determines that the user needs to add a task to control the smart device, and displays the interface 204 shown in (d) of Figure 1.
  • the interface 204 includes controllable smart devices.
  • the mobile phone can jump to the interface 205 shown in (e) of Figure 1.
  • Interface 205 includes information that controls executable operations.
  • the mobile phone can jump to the interface 206 shown in (f) of Figure 1.
  • the interface 206 includes the control 24. After detecting the user's click on the control 24, the mobile phone can jump to the interface 203 shown in (c) of Figure 1. After that, the user can click the "smart device" option 23, and the mobile phone jumps to the interface 204 shown in (d) of Figure 1, where other devices can be selected as the devices to be controlled in the scene to be created.
  • FIG. 2 is a schematic diagram of an intelligent device control system to which this method is applicable.
  • the smart device control system can manage and control smart devices on a home basis.
  • a family can also be called a whole house, and the whole house can be divided into different spaces.
  • the whole house includes the entrance hallway, kitchen, dining room, living room, balcony, master bedroom, secondary bedroom, bathroom, etc.
  • the whole-house system may include a first electronic device 100 that may be used to control a second electronic device 200 (such as an Internet of things (IoT) device).
  • the first electronic device 100 includes but is not limited to a mobile phone, a PC, a tablet computer, a smart home control panel (which may be referred to as a smart panel), etc.
  • the first electronic device 100 may be installed with an application for controlling the second electronic device 200.
  • the application can be a system pre-installed application or a non-pre-installed application (such as an application downloaded from the application market).
  • the system pre-installed application includes a part of the system application (such as a service, component, or plug-in in the system application), or an independent application pre-installed in the first electronic device 100 .
  • independent applications have independent application icons.
  • the application may be a smart life application.
  • the first electronic device 100 can also control the second electronic device 200 through the control center.
  • the control center may be a shortcut control page displayed by the first electronic device 100 in response to the user's downward sliding operation from the upper right corner or the top of the screen.
  • the first electronic device 100 can also control the second electronic device 200 through the corresponding function menu on the negative screen.
  • the negative screen may be the system service capability entry page displayed by the first electronic device 100 in response to the user's right swipe operation on the leftmost main interface.
  • the embodiments of the present application do not limit the specific manner in which the first electronic device 100 controls the second electronic device 200 .
  • a second electronic device 200 (for example, an IoT device) is also provided throughout the house.
  • the second electronic device 200 may also be called a controlled device, and the second electronic device 200 may be controlled by the first electronic device 100 .
  • the kitchen is equipped with rice cookers or electric pressure cookers, gas equipment, etc.;
  • the living room is equipped with speakers (such as smart speakers), TVs (such as smart TVs, also called smart screens, large screens, etc.), routing equipment, etc.
  • the second electronic device 200 includes but is not limited to smart home devices such as smart TVs, smart speakers, body fat scales, lamps (such as ceiling lamps, smart desk lamps, and aromatherapy lamps), sweeping robots, smart clothes drying racks, smart rice cookers, air purifiers, humidifiers, desktop computers, routing equipment, smart sockets, water dispensers, smart refrigerators, smart air conditioners, smart switches, and smart door locks.
  • the second electronic device 200 may not be a smart home device, but may be other types of devices, such as a personal computer (PC), tablet, mobile phone, smart remote control, etc.
  • the embodiment of the present application does not limit the specific form of the second electronic device 200.
  • a certain device among the second electronic devices 200 may serve as a master device for controlling other second electronic devices 200 .
  • there are many types of second electronic devices 200, and subsystems can be divided according to the functions of the second electronic devices 200, such as a lighting subsystem, an environment subsystem, a security subsystem, etc.
  • Each subsystem can correspond to one or more intents.
  • Each subsystem includes one or more devices.
  • an intention is used to express user expectations, which may include, for example: turning on/off lights, playing music, turning off purification, turning on purification, fully opening curtains, fully closing curtains, constant temperature, constant humidity, etc.
  • the lighting subsystem includes various types of lights (including but not limited to ambient lights), and the corresponding intentions of the lighting subsystem may include: turning all lights on, turning all lights off, and turning on a combination light (with a brightness parameter of 60%).
  • the heating and cooling fresh air subsystem includes various types of equipment that can adjust temperature and humidity.
  • the corresponding intentions of the heating and cooling fresh air subsystem include: a constant temperature intention, a constant humidity intention, a constant purification intention, etc.
  • the sunshade subsystem includes various devices that can achieve sunshade.
  • the intentions that the sunshade subsystem can achieve include: fully opening the curtains and fully closing the curtains.
  • each intention can be implemented by one or more devices, or that each intention corresponds to one or more devices.
  • Each device can implement one or more intents.
  • the intention to adjust the light can be implemented by lamps and/or curtains.
  • as for curtains, they can not only adjust light, but also provide sunshade and sun protection.
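The many-to-many relationship between intentions and devices described above can be sketched as a simple lookup table. This is an illustrative sketch only; the intention and device names below are hypothetical, not taken from the embodiments.

```python
# Illustrative sketch of the many-to-many mapping between intentions and
# devices; all names here are hypothetical examples.
INTENT_DEVICES = {
    "adjust_light": ["ceiling_lamp", "curtain"],
    "sunshade": ["curtain"],
    "play_music": ["smart_speaker"],
}

def devices_for_intent(intent):
    """Return the devices that can implement a given intention."""
    return INTENT_DEVICES.get(intent, [])

def intents_for_device(device):
    """Return every intention a given device can help implement."""
    return sorted(i for i, devs in INTENT_DEVICES.items() if device in devs)
```

Here the curtain appears under both "adjust_light" and "sunshade", mirroring the point that one device can serve several intentions.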
  • the first electronic device 100 can instruct the second electronic device 200 to execute the user's intention.
  • multiple devices in one or more subsystems can be freely combined and integrated into a super terminal.
  • Each device can become a functional module for each other, realizing mutual assistance and resource sharing.
  • the system may also include a central device 300.
  • the central device 300 is also called a hub, a central control system, a host, etc.
  • the central device 300 can be used to divide the equipment in the whole house into multiple subsystems, abstract the capabilities of the devices in the subsystems according to the subsystem granularity, and form the atomic capabilities of the subsystems.
  • the atomic capabilities can adapt to different types of devices.
  • the hub device 300 can also generate a configuration file of the subsystem according to the atomic capabilities of the subsystem.
  • Figure 3A shows the subsystems included in the whole house, including but not limited to the following subsystems: security, lighting, network, heating and cooling fresh air, audio and video entertainment, furniture, water use, energy consumption, home appliances, and sunshade.
  • the atomic capabilities of different subsystems need to be called to meet the user's device control needs.
  • the security subsystem, lighting subsystem, and heating and cooling fresh air subsystem need to be called in the leaving home scenario.
  • the central device 300 abstracts the capabilities of each device in the lighting subsystem (including but not limited to the living room lamp, floor lamp, and curtain shown in Figure 3B) to form one or more atomic capabilities of the lighting subsystem (such as lighting mode, switch control, precise dimming, etc.).
  • Table 1 below shows a possible example of a configuration file for a lighting subsystem. As shown in Table 1, the lighting subsystem can be abstracted into atomic capabilities such as lighting mode, switch control, and precise dimming.
  • the subsystem configuration file also includes the identification of the devices included in the subsystem.
  • the atomic capabilities of a subsystem may be obtained by abstracting the capabilities of some devices in the subsystem, or may be obtained by abstracting the capabilities of all devices in the subsystem.
  • An atomic capability of a subsystem corresponds to one or more devices in the subsystem.
  • the atomic capability of ambient light can be realized by multiple smart lights, that is, multiple smart lights can be used to jointly create light and shadow effects with a specific atmosphere.
  • the atomic capability of adjusting brightness can be realized by multiple smart lamps. For example, the brightness of smart lamp A is adjusted to brightness A, the brightness of smart lamp B is adjusted to brightness B, and the brightness of smart lamp C is adjusted to brightness C; smart lamps A, B, and C jointly create a light and shadow scene whose overall brightness is the target brightness.
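One way to realize the joint-brightness behavior above is to split a target brightness across lamps in proportion to assigned weights. This is a hedged sketch of one possible policy, not the method defined by the embodiments; the proportional-split rule and the assumption that per-lamp values sum to the overall scene brightness are simplifications for illustration.

```python
def plan_brightness(target, lamp_weights):
    """Split a target brightness across lamps proportionally to weights.

    Assumption (for illustration only): the per-lamp brightness values
    add up to the overall scene brightness.
    """
    total = sum(lamp_weights.values())
    return {lamp: target * weight / total
            for lamp, weight in lamp_weights.items()}
```

For instance, `plan_brightness(60, {"A": 1, "B": 1, "C": 2})` assigns 15 to lamps A and B and 30 to lamp C.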
  • one intention can correspond to one or more atomic capabilities
  • one atomic capability can also correspond to one or more intentions.
  • the embodiments of this application do not limit the specific correspondence between intentions and atomic capabilities.
  • intentions can include: turning on/off lights, music playing, constant temperature, constant humidity, purification off, curtains fully open, curtains fully closed, etc.
  • the intention to turn on the light may correspond to atomic capabilities such as the sleep light, reading light, ambient light, partition lighting, all lights on, and main light shown in Table 1.
  • the first electronic device 100 can use the executable intentions of the subsystems as the granularity of scene arrangement or editing, instead of using the device as the granularity of scene arrangement.
  • an intention can be executed by multiple smart home devices. Therefore, in some cases, one scene orchestration performed by the user at intention granularity is equivalent to multiple scene orchestrations performed at device granularity, which greatly simplifies the user's scene orchestration operations and improves orchestration efficiency. Alternatively, in other designs, users can add multiple intentions to the scene to be created, each of which can be executed by one or more smart home devices.
  • the atomic capabilities of a subsystem can be updated periodically or according to preset conditions.
  • the lighting subsystem includes reading lights, wall and door light strips, ceiling light strips, etc. Later, if the reading light is damaged and it is detected that the reading light is not online for a long time, the relevant atomic capabilities of the reading light can be removed from the lighting subsystem and the configuration file of the lighting subsystem can be updated.
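The update step above (dropping a long-offline device such as the damaged reading light and refreshing the subsystem configuration file) could look like the following sketch. The configuration-file layout, the timeout policy, and all names are assumptions for illustration, not the actual file format.

```python
def prune_offline(config, last_seen, now, timeout):
    """Drop devices not seen within `timeout` seconds, then remove any
    atomic capability that no remaining device can provide.

    `config` is an assumed layout: {"devices": [...],
    "atomic_capabilities": {capability: [device, ...]}}.
    """
    online = [d for d in config["devices"]
              if now - last_seen.get(d, float("-inf")) <= timeout]
    config["devices"] = online
    config["atomic_capabilities"] = {
        cap: kept
        for cap, devs in config["atomic_capabilities"].items()
        if (kept := [d for d in devs if d in online])
    }
    return config
```

A capability provided only by the offline reading light disappears from the configuration, while shared capabilities keep their remaining providers.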
  • the central device of each room or each area or the central device of the whole house can exist alone, or can be integrated with the second electronic device or the first electronic device into one device, or it can be a device in other forms. This application does not limit this.
  • the system may also include a server.
  • the server can be used to maintain subsystem configuration files.
  • the system also includes a routing device (such as a router).
  • the routing device is used to connect to a local area network or the Internet, using a specific protocol to select and set the path for sending signals.
  • the second electronic device 200 is connected to the router and transmits data with devices in the local area network or devices in the Internet through the Wi-Fi channel established by the router.
  • the hub device 300 can be integrated with the routing device into one device.
  • the hub device 300 and the routing device are integrated into the routing device, that is, the routing device has the function of the hub device 300 .
  • the routing device can be one or more routing devices in a parent-child routing network, or an independent routing device.
  • the above content is only an example of a system to which the device control method is applicable.
  • the system may also include more or less devices, or different device layout locations, etc.
  • the first electronic device 100, the second electronic device 200, the server, and the central device 300 in the embodiment of the present application can be implemented by different devices.
  • the server and central device in the embodiment of the present application can be implemented by the device in Figure 4.
  • Figure 4 shows a schematic diagram of the hardware structure of the device provided by the embodiment of the present application.
  • the device includes at least one processor 501, communication line 502, memory 503 and at least one communication interface 504.
  • the memory 503 may also be included in the processor 501.
  • the structures illustrated in the embodiments of the present application do not constitute specific limitations on the electronic equipment.
  • the electronic device may include more or less components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 501 can be a general central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits used to control the execution of the program of the present application.
  • Communication line 502 may include a path that carries information between the above-mentioned components.
  • the communication interface may be a module, circuit, bus, interface, transceiver, or other device that enables communication functions, and is used to communicate with other devices.
  • the transceiver can be an independently configured transmitter, which can be used to send information to other devices.
  • the transceiver can also be an independently configured receiver, which can be used to receive information from other devices.
  • the transceiver may also be a component that integrates the functions of sending and receiving information. The embodiments of this application do not limit the specific implementation of the transceiver.
  • the memory 503 may be a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compressed optical discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and can be accessed by a computer, but is not limited thereto.
  • the memory may exist independently and be connected to the processor through the communication line 502, or may be integrated with the processor.
  • the memory 503 is used to store computer execution instructions for implementing the solution of the present application, and is controlled by the processor 501 for execution.
  • the processor 501 is used to execute computer execution instructions stored in the memory 503, thereby implementing the methods provided by the following embodiments of the application.
  • the computer execution instructions in the embodiments of the present application may also be called application codes, instructions, computer programs or other names, which are not specifically limited in the embodiments of the present application.
  • the processor 501 may include one or more CPUs, such as CPU0 and CPU1 in FIG. 4 .
  • the electronic device may include multiple processors, such as the processor 501 and the processor 507 in FIG. 4 .
  • processors may be a single-CPU processor or a multi-CPU processor.
  • a processor here may refer to one or more devices, circuits, and/or processing cores for processing data (eg, computer program instructions).
  • the structure illustrated in Figure 4 does not constitute a specific limitation on the electronic device.
  • the electronic device may include more or less components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the device shown in Figure 4 may be a general device or a special device, and the embodiment of this application does not limit the type of device.
  • the device is a smart panel, which is a dedicated device used to control smart home devices.
  • the device is a mobile phone, which is a general device that can control smart home devices.
  • the software system of the electronic device may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present application takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device.
  • FIG. 5 is a software structure block diagram of the first electronic device 100 according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the Android system is divided into four layers, from top to bottom: application layer, application framework layer, Android runtime (Android runtime) and system libraries, and kernel layer.
  • the application layer can include a series of applications.
  • applications can include camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and other applications.
  • the application program also includes smart home management applications and basic services.
  • basic services open the management capabilities of smart devices to the system.
  • the smart home management application can call the basic service to query the smart home device to be controlled, and/or call the basic service to control the smart home device.
  • the smart home management application can be a smart life application.
  • Smart home applications can also be other applications with similar functions.
  • the smart home management application may be an original system application or a third-party application.
  • the embodiments of this application do not limit the types of smart home management applications.
  • smart life applications are mainly used as examples.
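The division of labor described above, where a management application calls a basic service to query and control smart home devices, can be sketched as below. This is not the actual HiLink or smart life interface; the class and method names are invented for illustration.

```python
class BasicService:
    """Toy stand-in for a basic service that opens device-management
    capabilities to applications; not the real HiLink API."""

    def __init__(self):
        self._devices = {}  # device_id -> current state

    def register(self, device_id, state="off"):
        """Make a device known to the service."""
        self._devices[device_id] = state

    def query_devices(self):
        """What a management app would call to list controllable devices."""
        return sorted(self._devices)

    def control(self, device_id, command):
        """What a management app would call to control one device."""
        self._devices[device_id] = command
        return self._devices[device_id]
```

A management application would hold a reference to such a service and never touch device state directly, matching the query/control split described in the text.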
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make this data accessible to applications. Said data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, etc.
  • a view system can be used to build applications.
  • the display interface can be composed of one or more views. For example, a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the first electronic device 100 . For example, call status management (including connected, hung up, etc.).
  • the resource manager provides various resources to applications, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications for applications running in the background, or notifications that appear on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is emitted, the terminal vibrates, the indicator light flashes, etc.
  • Android Runtime includes a core library and a virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
  • the core library contains two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and application framework layer run in virtual machines.
  • the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules. For example: surface manager (surface manager), media libraries (Media Libraries), 3D graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • the first electronic device 100 includes smart life applications and smart life basic services (such as HiLinkSVC).
  • Smart life applications provide a smart user interface (UI) for the whole house.
  • smart life applications call the smart life basic service through the interface provided by a software development kit (SDK) (such as the HiLink Kit SDK), so that various UIs can be presented through the smart life application.
  • the UI includes a UI for intent management or a UI for scene management, etc.
  • the UI may include an interface in the form of a card.
  • the smart life application can provide interface 402 as shown in Figure 7 and so on. Users can control smart home devices through the interface provided by the smart life application.
  • the smart life basic service can store the configuration files of each subsystem.
  • the configuration file of the subsystem can be obtained from the first server.
  • Smart life basic services can include the following modules:
  • the intent management module is used to manage the intent of the subsystem according to the configuration file of the subsystem.
  • the user can combine certain intentions to form a new intention.
  • users can combine lights, curtains, fresh air equipment and other intentions to generate a new "comfortable environment" intention.
  • the device control parameters corresponding to the intention can be adjusted.
  • intents may be deleted or added.
  • new intentions such as birthday party, warm homecoming, whole-house dehumidification, and whole-house ventilation can be generated.
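Combining existing intentions into a new one, as in the "comfortable environment" example above, can be sketched as concatenating their action lists. The registry layout and the intention/action names are assumptions for illustration.

```python
def combine_intents(new_name, registry, parts):
    """Create a new intention whose actions are those of `parts`, in order.

    `registry` is an assumed structure: {intention_name: [action, ...]}.
    """
    actions = []
    for part in parts:
        actions.extend(registry[part])
    registry[new_name] = actions
    return actions
```

Combining a lights intention, a curtains intention, and a fresh-air intention this way yields a single new intention that the user can add to a scene as one unit.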
  • the hub device 300 includes a device list management module, a blacklist management module, a fine-tuning data management module, and a batch management module.
  • the device list management module is used to obtain any one or more of the following information: the subsystem list under Home/Region/Room, the configuration file of each subsystem, the list of instantiated devices in each subsystem, and the list of intentions that each subsystem can implement.
  • the blacklist management module is used to add/delete/modify/query blacklist devices in each subsystem.
  • the combination light can be one or more lights.
  • the batch management module is used to divide the equipment in the whole house into multiple subsystems and abstract the capabilities of the subsystems to obtain one or more atomic capabilities that the subsystems can implement.
  • the hub device 300 stores a list of all devices, a list of blacklisted devices, and intention fine-tuning parameters.
  • the hub device 300 may obtain the device list, the list of blacklisted devices, and the intention fine-tuning parameters from the server.
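The batch management module's division of whole-house devices into subsystems (described earlier) could be driven by a function-to-subsystem table like the sketch below; the table entries and device names are illustrative assumptions, not the method fixed by the embodiments.

```python
# Hypothetical mapping from a device's function to its subsystem.
FUNCTION_TO_SUBSYSTEM = {
    "lamp": "lighting",
    "curtain": "sunshade",
    "air_conditioner": "heating_cooling_fresh_air",
    "camera": "security",
}

def divide_into_subsystems(devices):
    """Group devices (device_id -> function) into subsystems by function."""
    subsystems = {}
    for device_id, function in devices.items():
        name = FUNCTION_TO_SUBSYSTEM.get(function, "other")
        subsystems.setdefault(name, []).append(device_id)
    return subsystems
```

Each resulting group could then be abstracted into the subsystem's atomic capabilities, as the text describes.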
  • each electronic device may include more or fewer modules than shown in the figure, or combine certain modules, or split certain modules, or have different module layouts.
  • the illustrated modules may be implemented in hardware, software, or a combination of software and hardware.
  • the following embodiments take the first electronic device being a smart device control panel (which may be referred to as a smart panel), and the application for managing smart home devices being a smart life application, as an example for description.
  • the electronic device after the electronic device is connected to the local area network, it can discover other electronic devices that are connected to the same local area network and/or logged in to the same account. For example, a user registers for a smart life application and obtains the username and password of the account. During the subsequent network configuration process of new electronic equipment, users can log in to the account through other electronic devices that have been configured to the network (such as mobile phones) to assist in the network configuration of the new electronic equipment. Then, the server divides the electronic devices under the same account into the same home to implement home-based electronic device management. Optionally, the server manages one or more homes, which include one or more subsystems. Each subsystem includes one or more electronic devices.
  • the smart panel After logging in to the smart life application, the smart panel detects the user's operation of adding an electronic device and sends the device information of the newly added electronic device to the server.
  • the server determines the identifier of the electronic device and divides the electronic device into the home corresponding to the account currently logged in on the smart panel, thereby completing the network configuration of the electronic device.
  • the smart panel sends the device information of the newly added electronic device to the server, and the server determines the electronic device ID and divides the electronic device into the home corresponding to the smart panel.
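The server-side flow just described (assign an identifier to a newly added device and place it in the home of the currently logged-in account) might look like the following sketch; the identifier scheme and data layout are assumptions, not the actual server implementation.

```python
import itertools

class HomeServer:
    """Sketch of account-based home grouping; not the actual server logic."""

    def __init__(self):
        self._next_id = itertools.count(1)  # assumed sequential id scheme
        self.homes = {}  # account -> list of device ids in that home

    def add_device(self, account, device_info):
        """Assign an id to the reported device and place it in the
        account's home. `device_info` stands in for whatever the smart
        panel reports; it is not inspected in this sketch."""
        device_id = next(self._next_id)
        self.homes.setdefault(account, []).append(device_id)
        return device_id
```

All devices added under the same account end up grouped in one home, enabling the home-based management the text describes.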
  • the user can edit the scene through the electronic device.
  • the first electronic device displays an interface 401, and the interface 401 includes scene options 401a.
  • the electronic device can jump to interface 402 as shown in (b) of Figure 7 .
  • the interface 402 includes cards corresponding to the whole house and cards corresponding to each space in the whole house (such as the card 402a corresponding to the living room).
  • the card corresponding to each space also includes controls for setting the corresponding space scene. Taking the card 402a corresponding to the living room as an example, the card 402a is provided with a control 402b for setting the living room scene.
  • the same intention may correspond to different presentation forms, for example, it may be displayed in different forms of intention cards in the interface.
  • taking the intention to turn on the lights as an example, as a possible implementation, in some situations (such as certain fine lighting or ambient lighting scenes) or in certain spaces, when the user adds the turn-on-lights intention to a target scene, the user may want all the lights in a given space to turn on in that scene; the smart panel then displays the turn-on-lights intention as an "all lights on" intention card.
  • in other situations, the smart panel can display intention cards for specific lights in the whole-house space, for example, specific intention cards such as the wall and door light strip and the ceiling light strip.
  • the smart panel displays the intention to turn on the lights in the restaurant in the form of a "lights fully on" intention card. The user can add the "lights fully on" intent to the meal scene. Later, when the trigger conditions of the meal scene are met, the smart panel controls the lights in the restaurant to turn on fully, meeting the user's lighting needs in the meal scene.
  • the smart panel displays the "ceiling light strip" intention card.
  • the smart panel can also display intent cards for other lights in the living room. The user can add the intent cards of multiple lights in the living room to the target scene; subsequently, when the trigger conditions of the target scene are met, the smart panel controls those multiple lights in the living room to turn on.
  • the electronic device can jump to the interface 403 shown in (c) of Figure 7.
  • the interface 403 includes one or more scenes corresponding to the living room. For example, entering the living room scene, leaving the living room scene, reading scene, party mode scene.
  • interface 403 also includes scene creation controls 403a.
  • the electronic device may provide an interface through which the user can input information such as the name and trigger condition of the scene to be created.
  • the electronic device can jump to interface 404 as shown in (d) of Figure 7 , and the user can edit the scene to be created through interface 404.
  • the trigger condition for the movie viewing mode scene to be created is: the user clicks on the card.
  • when providing the scene editing function to the user, it can be provided at the granularity of intentions.
  • the user no longer needs to select the devices to be controlled in the scene to be created one by one. Instead, the user selects an intention, and the system accurately calculates the best device or devices to fulfill that intention.
  • the interface 404 includes an area 404a and an area 404b.
  • Area 404b may be called a scene editing area.
  • Area 404a may be called an intention display area, and area 404a includes identification of one or more intentions corresponding to multiple subsystems.
  • Each intent can be implemented by one or more devices.
  • the identifier of the intention can be presented in the form of a card, and the card used to express the intention can be called an intention card.
  • the smart panel can query the configuration files of each subsystem and determine the atomic capabilities of each subsystem and the corresponding intentions of each atomic capability based on the configuration files. For example, the smart panel can query the configuration file of the lighting subsystem shown in Table 1, and based on the configuration file, determine the atomic capabilities of the lighting subsystem and the intentions corresponding to each atomic capability, and can display the corresponding intentions of the lighting subsystem.
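The config-file lookup above can be sketched as follows. The structure of the configuration (a subsystem name plus a map from atomic capabilities to intentions) is an assumption for illustration; the patent's Table 1 is not reproduced here.

```python
# Hypothetical subsystem configuration; field names are assumptions.
lighting_config = {
    "subsystem": "lighting",
    "atomic_capabilities": {
        "switch": ["lights fully on", "lights fully off"],
        "brightness": ["combination light on (brightness 60%)"],
    },
}

def intention_cards(config: dict) -> list:
    """Flatten a subsystem configuration into the intention cards to display."""
    cards = []
    for capability, intentions in config["atomic_capabilities"].items():
        cards.extend(intentions)
    return cards

print(intention_cards(lighting_config))
# -> ['lights fully on', 'lights fully off', 'combination light on (brightness 60%)']
```

The panel would run this per subsystem (lighting, sunshade, heating/cooling fresh air, etc.) and render one card per returned intention.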
  • the following intention cards are presented in area 404a of interface 404: for the lighting subsystem, lights fully on, lights fully off, and combination light on (brightness parameter 60%); for the heating, cooling, and fresh air subsystem, constant temperature, constant humidity, and purification off; for the sunshade subsystem, curtains fully open and curtains fully closed.
  • the electronic device may add the intention card selected by the user to the scene to be created. For example, as shown in (d) of Figure 7, in response to the user's operation of dragging the temperature adjustment card 404d from area 404a to area 404b, the electronic device adds the temperature adjustment card, that is, the temperature adjustment intention, to the scene to be created.
  • the user's operation of dragging the temperature adjustment card 404d from the area 404a to the area 404b can also be the user's click operation on the temperature adjustment card 404d, which is not limited in this application.
  • in response to the user's operation of dragging the combination light (brightness 60%) card and the curtains-fully-closed card from area 404a to area 404b, the electronic device adds the combination light (brightness 60%) intention and the curtains-fully-closed intention to the scene to be created.
  • the viewing mode scene to be created includes the intention to turn on the combination light (brightness parameter 60%), the intention to adjust the temperature (adjust the temperature to a constant 24 degrees), and the intention to fully close the curtains.
  • the user can drag other intention cards, and the electronic device can add the corresponding intention to the scene to be created based on the user's drag operation.
  • the interface 404 also includes a control for displaying hidden intention cards in the corresponding subsystem.
  • the interface 404 includes a control 404f.
  • the electronic device can display the interface 405 as shown in (e) of Figure 7; area 404a of interface 405 includes other intention cards in the lighting subsystem, such as the intention card for turning on the combination light (color temperature 5500 Kelvin (K)).
  • the user can drag the intention card that turns on the combination light (color temperature 5500K) from area 404a to area 404b, thereby adding the intention to turn on the combination light (color temperature 5500K) to the viewing mode scene to be created.
  • the first electronic device may display the hidden intention cards in the corresponding subsystem based on other operations of the user. For example, as shown in (a) of Figure 8, if the first electronic device detects that the user performs a left-slide operation near the label 404g of the lighting subsystem on interface 404, then as shown in (b) of Figure 8, the electronic device may display the hidden intention cards in the lighting subsystem: the intention card 404h for turning off the reading light and the intention card 404i for turning off the wall door light strip.
  • the user can drag the turn-off-reading-light intention card 404h and the turn-off-wall-door-light-strip intention card 404i from area 404a to area 404b, thereby adding those two intentions to the scene to be created. In this solution, the user can create the target scene with a simple selection on the current interface (such as dragging an intention card to the scene editing area 404b), avoiding the repeated switching between multiple interfaces shown in Figure 1, which makes scene editing inefficient.
  • the user when creating a scene, the user can combine multiple intentions to generate a scene that can meet the device control needs. For example, combining the intentions of turning on warm lights, curtains, and fresh air equipment to generate a "comfortable environment" scene.
  • the first electronic device may receive the user's modification operation on the intention and modify the intention.
  • in response to the user's operation on intention cards in the intention display area 404a (such as the intention card 404h to turn off the reading light and the intention card 404k to turn on the combination light and adjust its brightness), the smart panel adds these intentions to the viewing mode scene.
  • the electronic device can display the brightness adjustment bar 404m on the intention card 404k.
  • the user can drag the brightness adjustment bar 404m to adjust the brightness of the combination light (for example, adjust the brightness of the combination light to 30%).
  • the electronic device can receive the user's operations on other intention cards and adjust the control parameters of the electronic device that can achieve the corresponding intention.
  • Figure 10 shows the relationship between scenarios and devices or subsystems.
  • a scene created by a user is usually associated with multiple devices (for example, each scene is associated with an average of 7 devices). Therefore, in the scenario editing scheme with device granularity, the user needs to switch to the interfaces of multiple devices (such as the interface of the air conditioner and the interfaces of other devices shown in Figure 1), and add the corresponding devices to a scene to be created.
  • the user needs to adjust the brightness of multiple lights in the living room when editing a scene, then using the scene editing method of related technologies, the user needs to arrange the multiple lights one by one.
  • each scene is associated with 1.2 subsystems (or other number of subsystems).
  • each scene is associated with 1.2 subsystems (or another number of subsystems). Each subsystem consists of one or more devices, and scene editing is performed at the granularity of the subsystem's intentions. An intention can usually be executed by multiple devices; therefore, in some cases, one scene edit at intent granularity is equivalent to multiple scene edits at device granularity. This enables unified control of devices across multiple spaces without orchestrating each device individually through multiple interfaces.
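The equivalence described above can be shown with a minimal sketch: a single intention expands to every device able to execute it, so adding one intent card corresponds to several device-granularity edits. The intent-to-device mapping and device names below are invented for illustration.

```python
# Hypothetical mapping from an intention to the devices that can execute it.
INTENT_TO_DEVICES = {
    "lights fully on": ["ceiling lamp", "spotlight", "floor lamp"],
    "curtains fully closed": ["curtain 1", "curtain 2"],
}

def expand_scene(intents: list) -> list:
    """Expand a list of intentions into the full list of devices to control."""
    devices = []
    for intent in intents:
        devices.extend(INTENT_TO_DEVICES.get(intent, []))
    return devices

# Dragging two intent cards covers five devices in one editing pass.
print(expand_scene(["lights fully on", "curtains fully closed"]))
```

In a device-granularity editor the user would have visited each of these five devices' interfaces separately.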
  • the user when editing a scene, the user only needs to drag the combination light intention (brightness parameter 60%) logo of the lighting subsystem to the scene editing area through the current interface to complete the scene editing. In this way, the user's operations when arranging scenes are simplified, and the efficiency of scene arranging can be improved.
  • the first electronic device can save the viewing mode scene created this time.
  • the electronic device displays an interface 501.
  • the interface 501 includes a card 501a of the viewing mode scene created through the above process.
  • a control 501b for modifying the scene in the viewing mode can be set on the card 501a.
  • the scene can be executed when the trigger condition is met. For example, as shown in (a) of Figure 11, if the user's operation of clicking card 501a is detected, the electronic device can execute each intention corresponding to the viewing mode scene created above: the electronic device instructs one or more living room lights that can achieve 60% brightness to turn on; instructs the reading light, wall door light strip, and ceiling light strip to turn off; instructs all curtains in the living room to close; and instructs a device that can achieve the temperature adjustment intention (such as air conditioner A in the living room) to adjust the temperature to 24 degrees.
  • the electronic device after detecting the user's click on the control 501b on the card 501a, the electronic device can pop up a pop-up window 502 as shown in Figure 11 (b).
  • the pop-up window 502 includes each intention card included in the movie viewing mode scene.
  • the electronic device may display the brightness adjustment bar 502b on the logo 502a. The user can drag the brightness adjustment bar 502b to adjust the brightness of the combination light (for example, adjust the brightness of the combination light to 30%).
  • the electronic device can receive the user's operations on other intention cards and adjust the control parameters of the electronic device that can achieve the corresponding intention.
  • users can also add intents to or delete intents from created scenes.
  • the pop-up window 502 also includes an intention display area 502d of the living room-related subsystem.
  • the user can perform a sliding operation in the intention display area 502d to display the intention cards of each subsystem.
  • the intention display area 502d displays the intention cards of the lighting subsystem (turning on the combination light, the reading light, and the wall door light strip).
  • the smart panel in response to the user performing a swipe-up operation in the intention display area 502d, the smart panel displays the intention cards (temperature, humidity, purification) of the heating and cooling fresh air subsystem in the intention display area 502d.
  • the scene editing interface 404 may include a space identification bar 1104.
  • the space identification column 1104 includes identification of multiple spaces throughout the house. For example, space signs for the whole house, living room, dining room, etc.
  • the space identification bar 1104 may be displayed when the smart panel opens the scene editing interface 404 . Or, in other examples, after the smart panel displays the scene editing interface 404, the space identification bar 1104 is displayed in response to a specific operation by the user. For example, after the smart panel displays the scene editing interface 404, in response to the user's right sliding operation from the left edge of the screen, the smart panel displays the space identification bar 1104 on the scene editing interface 404.
  • the first electronic device may switch to the scene editing interface of the target space. For example, as shown in (a) of Figure 12A, the smart panel displays the scene editing interface 404 of the living room. Then, as shown in (b) of Figure 12A, after detecting the user's operation of clicking the restaurant space label 1102, the smart panel displays the scene editing interface 1101 of the restaurant. By operating the space labels on the current interface, the user can control the smart panel to switch between spaces, easily and quickly entering the scene editing interface of another space, so scene editing efficiency is high.
  • the space identification bar 1104 may also include a control 1103.
  • Control 1103 is used to add space labels (such as study room, bathroom, etc. space labels).
  • FIG. 12A mainly takes an example of an interface presenting an intention identifier corresponding to a certain space.
  • an interface may present an intention identifier corresponding to multiple spaces.
  • the smart panel displays an interface 1203.
  • the interface 1203 includes labels for multiple spaces (such as a restaurant label 1203a and a living room label 1203b).
  • the smart panel displays the intention card corresponding to the restaurant and the intention card corresponding to the living room in the intention display area 404b.
  • the UI effect of the intention card corresponding to the restaurant can be different from the UI effect of the intention card corresponding to the living room, so that the user can distinguish the intention cards of different spaces.
  • the UI effect of the intent card corresponding to the restaurant is similar to the UI effect of the space label 1203a of the restaurant (for example, if the background color of the space label 1203a corresponding to the restaurant is red, then the background color of the intent card corresponding to the restaurant is also set to red), the UI effect of the intention card corresponding to the living room is similar to the UI effect of the space label 1203b of the living room.
  • the smart panel displays an interface 1201 , and the interface 1201 includes a space identification bar 1204 .
  • the UI effect of each space identifier in the space identifier bar 1204 may differ from that in the space identifier bar 1104 shown in (a) of Figure 12A. For example, in one bar the shape of a space identifier such as the living room's is circular, while in the other the living room's space identifier is rectangular.
  • the space identifier may be displayed together with the interface 1201.
  • the smart panel may display the space identification bar 1204 on the interface 1201 in response to the user's specific operation (such as a right swipe operation on the left edge of the screen).
  • the space identification bar includes control 1201a.
  • the smart panel may pop up a pop-up window 1201b.
  • the smart panel may display the space label 1201c of the restaurant in the interface 1201 .
  • the smart panel can display the intention card corresponding to the restaurant (such as card 1201d) and the intention card corresponding to the living room in the interface 1201.
  • the UI effects of the intention identifiers corresponding to different spaces can be different, so that the user can identify the space to which the corresponding intention belongs.
  • the frame of the intention card corresponding to the living room is displayed normally, and the frame of the intention card corresponding to the restaurant is displayed in bold.
  • the user can adjust parameters of the viewing mode scene. For example, after receiving the user's operation to adjust scene parameters, the smart panel can pop up a pop-up window 1202 as shown in (c) of Figure 12C, and the user can adjust scene parameters (such as the brightness of the combination light) through the pop-up window 1202.
  • the first electronic device detects whether a trigger condition for executing the target scene is met, and when the trigger condition is met, the first electronic device executes the target scene. For example, taking the trigger condition for executing the viewing mode scene as clicking the card corresponding to the viewing mode scene, when the smart panel detects that the user clicks card 501a as shown in (a) of Figure 11, the smart panel may instruct the devices in one or more spaces associated with the viewing mode scene to perform corresponding operations.
  • the smart panel obtains the intentions corresponding to the viewing mode scene. For example, when the scene was created, the cards for combination light brightness 60%, reading light off, wall door light strip off, ceiling light strip off, temperature 24 degrees, and curtains fully closed were dragged from the intention display area 404a to the scene editing area 404b.
  • the smart panel determines that the corresponding intention of the viewing mode scene is: turn on the combination light in the living room and adjust the brightness of the combination light to 60%, turn off the reading light in the living room, turn off the wall door light strip in the living room, turn off the ceiling light strip in the living room, Adjust the temperature in the living room to 24 degrees (target temperature) and close all curtains in the living room. Then, the smart panel can determine the target device that can realize the one or more target intentions based on one or more target intentions corresponding to the movie viewing mode scene, and instruct the target device to perform corresponding operations.
  • the smart panel can obtain the parameters of multiple lights in the living room and determine one or more target lights that can achieve 60% brightness based on the parameters of the multiple lights.
  • the target lamp may be one or more lamps that can achieve 60% brightness and have the best energy-saving effect.
  • the target light can be one or more lights that can achieve 60% brightness and have a color temperature that is optimal for movie viewing.
  • the target light can also be other types of lights.
  • the smart panel determines that the target lights that need to be controlled in the viewing mode scene are Lamp 1 - Lamp 3.
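The target-lamp selection described above (choose, among the lamps that can reach the requested brightness, those with the best energy-saving effect) can be sketched as follows. The lamp attributes (`max_brightness`, `watts`) and values are assumptions for illustration, not actual device parameters.

```python
# Hypothetical lamp parameters queried by the smart panel.
lamps = [
    {"name": "lamp 1", "max_brightness": 100, "watts": 9},
    {"name": "lamp 2", "max_brightness": 50, "watts": 5},
    {"name": "lamp 3", "max_brightness": 80, "watts": 7},
]

def pick_target_lamps(lamps, target_brightness, count):
    """Select up to `count` lamps that can reach the target brightness,
    preferring the lowest power draw (best energy saving)."""
    capable = [l for l in lamps if l["max_brightness"] >= target_brightness]
    capable.sort(key=lambda l: l["watts"])  # most energy-efficient first
    return [l["name"] for l in capable[:count]]

print(pick_target_lamps(lamps, 60, 2))  # -> ['lamp 3', 'lamp 1']
```

A variant for the color-temperature rule would simply sort `capable` by distance from an ideal viewing color temperature instead of by watts.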
  • users can also combine multiple intentions to create new scenarios such as birthday parties, warm home, whole-house dehumidification, and whole-house ventilation.
  • the smart panel can obtain the parameters of a device in the living room that can implement temperature control functions (which can be referred to as a temperature control device for short), and determine the target temperature control device for adjusting the temperature of the living room to the target temperature from multiple temperature control devices.
  • the target temperature control device can be an electric heater or air conditioner or other temperature control device.
  • smart panels can also identify other target devices to use to achieve the target intent.
  • the above only lists exemplary methods for determining the target device. It should be understood that the first electronic device (such as a smart panel) can also use other methods to determine the target device that can achieve the target intention.
  • Figure 14 shows another example of the scene editing interface provided by the embodiment of the present application.
  • users can switch to the scene editing interface of the corresponding space through the space label.
  • the smart panel displays a scene editing interface 1401 for the whole house.
  • the scene editing interface 1401 includes labels 1401a for each scene (which may be referred to as scene labels).
  • scene tags include but are not limited to tags for leaving home scenes, tags for returning home scenes, tags for sleeping scenes, tags for reading scenes, etc. Users can select the label of a scene to edit the scene.
  • the smart panel can display the intent near the center of the display screen.
  • the intent displayed near the center of the display screen can be called a central intent.
  • the smart panel displays the central intention (leaving home) near the center of the display screen, and the user can edit the leaving home scene through the interface 1401.
  • the interface 1401 also includes a control 1401b, which is used to add scene tags.
  • interface 1401 also includes identification of various subsystems of the whole house. For example, it includes the identification of the security subsystem 1401c. For another example, it also includes the logo of the audio and video subsystem, the logo of the lighting subsystem, and the logo of the heating and cooling fresh air subsystem.
  • the user can select the identifier of the target subsystem from the identifiers of the subsystems, and the smart panel can add the intentions achievable by the target subsystem to the away-from-home scene. For example, as shown in (a) of Figure 14, the user drags the identifier 1401c of the security subsystem toward the central intention (leaving home). As shown in (b) of Figure 14, when it is detected that the identifier 1401c of the security subsystem collides with the central intention (leaving home) (called a collision operation) or comes close to it (for example, the distance is less than a preset threshold), the smart panel determines that the user wants to add the intentions achievable by the security subsystem to the away-from-home scene.
  • in a possible implementation, the smart panel adds the intentions achievable by the security subsystem to the away-from-home scene by default.
  • smart panels can provide an interface through which users can adjust the intent that the security subsystem can achieve.
  • the smart panel pops up a pop-up window 1401d.
  • the pop-up window 1401d includes cards with one or more intentions that can be realized by the security subsystem (such as smart door lock closing cards, infrared curtain fully opening cards).
  • the user can add the corresponding intent to the target scene by selecting one or more intent cards.
  • one or more intent cards are activated by default, that is, if the user does not remove the intent, the intent is added to the target scene by default.
  • the user can delete some intentions, or the user can make certain intentions invalid in the away-from-home scenario through certain operations.
  • the intention 1401e will be deactivated, that is, the intention 1401e will not be added to the away-from-home scene, but the card of intention 1401e can be retained in the pop-up window 1401d.
  • the UI effect of the deactivated intention is different from the UI effect of the activated intention.
  • the intention 1401e of the deactivation state is displayed with a UI effect with a bold border, or the intention of the deactivation state is displayed in gray (not shown in the figure).
  • the smart panel displays a prompt box 1401g asking the user whether to remove the selected intent.
  • the smart panel removes the intent 1401f from the pop-up window 1401d, that is, it no longer displays the intent 1401f on the pop-up window 1401d.
  • the pop-up window 1401d also includes a control 1401h, which is used to add intent to the target scene (such as leaving home scene). Similar to the solution in the previous embodiment, in the solution corresponding to Figure 14, the intended parameters can also be adjusted. For the specific method of adjusting the intention parameters, please refer to the relevant descriptions of the previous embodiments, and will not be described again here.
  • the smart panel can recommend to the user intents to be added to the scene for each scene.
  • the smart panel can determine, based on big data analysis or other methods, the common intentions contained in the viewing mode scene and, as shown in Figure 15, display cards of those common intentions in area 404a of the scene editing interface 406.
  • cards with commonly used intentions in the viewing mode scene include: the combination light (brightness 60%) intention card, the temperature intention card, and the curtains-fully-closed intention card.
  • the scene editing interface 406 may also include a control 404f.
  • the smart panel may display other intention cards corresponding to the lighting subsystem. Users can select some intent cards from other intent cards and add the selected intent cards to the scene to be created.
  • Embodiments of the present application also provide a smart device control method.
  • the first electronic device can determine, based on user information and the user's intention, a group of second electronic devices that the user desires to control (which may be called the target electronic device group), and can display the identifier of the target electronic device group and the identifiers of the electronic devices in that group that can achieve the target intention (such as the identifiers of the electronic devices that can achieve the intention of turning on the lights).
  • the target electronic device group may be an electronic device group in the living room, an electronic device group in the master bedroom, etc.
  • the identification of the target electronic device group can be living room, master bedroom, etc.
  • the smart panel displays an interface 401, and the interface 401 includes a whole-house control option 401b.
  • the smart panel can jump to the interface 1602 shown in (a) of Figure 16A.
  • the intention identification column 161 is displayed in the interface 1602 , and the intention identification column 161 is used to display the intention identification of the intention.
  • the interface 1602 includes the identification of the electronic devices in the master bedroom that can realize the intention of turning on the lights (for example, the identification of the ceiling lamp, spotlight, table lamp, and floor lamp in the interface 1602 as shown in (a) of FIG. 16A ).
  • Users can operate the signs of electronic devices in the master bedroom to control the corresponding electronic devices. For example, the user moves the sign of the spotlight to collide with the sign of the central intention (turning on the light).
  • the smart panel controls the spotlight to turn on.
  • the user moves the logo of the spotlight and the logo of the floor lamp toward the center intention.
  • the smart panel controls the spotlight and the floor lamp to turn on.
  • the smart panel determines the target electronic device in the master bedroom that the user wants to control, and controls the target electronic device to execute the intention of turning on the light.
  • if the smart panel determines that the user wants to control all the devices in the master bedroom that can achieve the intention of turning on the lights, then the smart panel controls all such devices in the master bedroom (such as the ceiling light, spotlight, table lamp, and floor lamp) to execute the intention of turning on the lights.
  • the smart panel determines, based on the user information, that the current time is 8:00 pm (usually the time when the user gets off work and returns home), and therefore determines that the user wants to control the turning on of the ceiling light in the master bedroom.
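The inference described above can be sketched as a simple rule: location (when known) determines the target group, otherwise the time of day does. The thresholds and room mappings below are invented for illustration and are not the patent's actual decision logic.

```python
def infer_target_group(hour, user_room=None):
    """Guess which electronic device group the user wants to control,
    from the user's location (if known) or the current hour."""
    if user_room is not None:       # location, when available, wins
        return user_room
    if 19 <= hour <= 23:            # evening: likely home / heading to bed
        return "master bedroom"
    return "living room"            # default group otherwise

print(infer_target_group(20))                 # -> master bedroom
print(infer_target_group(20, "living room"))  # -> living room
```

A real system would fold in more user information (habits learned from history, which devices are already on, and so on), but the shape of the decision is the same.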
  • the interface 1602 may also include identification of other electronic device groups that can realize the intention of turning on the lights (such as identification of the living room, second bedroom, and study room).
  • the distance between the identifiers of the other electronic device groups that can realize the turn-on-lights intention and the central intention is a first distance, and the distance between the identifier of the electronic device group that the user wants to control (which may be called the target electronic device group) and the central intention is a second distance, where the first distance is greater than the second distance. That is, the identifier of the target electronic device group is displayed closer to the central intention, so the user can drag the identifier of a target electronic device in the target electronic device group until it collides with the central intention, which facilitates control of the target electronic device. For example, the identifier of the master bedroom is closer to the central intention, which shortens the drag distance of the user's collision operation, reducing device control time and improving device control efficiency.
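The layout rule above (target group's identifier placed at the smaller second distance, all other groups at the larger first distance) can be sketched as follows; the distance values are arbitrary layout units chosen for the illustration.

```python
def layout_groups(groups, target, near=1.0, far=3.0):
    """Return the distance from each group identifier to the central intention:
    the target group gets the smaller (second) distance, others the first."""
    return {g: (near if g == target else far) for g in groups}

d = layout_groups(["master bedroom", "living room", "study"], "master bedroom")
assert d["master bedroom"] < d["living room"]  # second distance < first distance
print(d)
```

The renderer would then position each identifier on a circle of the given radius around the central intention, so the drag path to a collision is shortest for the inferred target group.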
  • the user is in the living room, and when the user opens the control interface corresponding to the intention to turn on the lights, the smart panel can determine, based on the user's location, that the user wants to control the electronic devices in the living room; the interface 1603 displayed by the smart panel includes identifiers of the living room electronic devices that can realize the intention of turning on the lights, such as the identifiers of the ceiling light, spotlight, desk lamp, and floor lamp in interface 1603 as shown in (b) of Figure 16A.
  • the smart panel can determine, based on the current time, that the user wants to control the electronic devices in the master bedroom, and then display on the control interface the identifiers of the curtains in the master bedroom that can realize the curtain-closing intention.
  • the user can conveniently control and close the curtains in the master bedroom by operating the logos of the curtains in the master bedroom.
  • the smart panel can also display on the control interface the identification of other device groups that can realize the curtain closing intention.
  • the user can switch intents. For example, as shown in (b) of Figure 16A, the smart panel displays the interface 1603 corresponding to the turn-on-lights intention 162. Subsequently, as shown in Figure 16B, after detecting that the user clicks the identifier of the music playback intention in the intention identifier bar, the smart panel displays the interface 1607 corresponding to the music playback intention.
  • in addition to the target electronic device group determined by the smart panel based on user information, the user may also want to control devices in other electronic device groups.
  • the smart panel can identify, according to the user's operation, the other electronic devices that the user wants to control.
  • in response to the user's operation on the logo of an electronic device group, the smart panel can display the logos of the electronic devices in that group, and the user can control an electronic device through its logo.
  • the user can drag the logo of a corresponding electronic device in the master bedroom (such as the logo of the ceiling light or spotlight) until it collides with the central intention (turn on the lights), or drag the logo to a position whose distance from the central intention is less than a certain threshold, to trigger the smart panel to control the corresponding electronic device to execute the light-on intention.
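The two trigger conditions just described — collision with the central intention, or distance below a threshold — can be sketched as a single hit test on circle centers. The radii and threshold values below are illustrative assumptions.

```python
import math

def drag_should_trigger(logo_center, intention_center,
                        logo_radius=30.0, intention_radius=40.0,
                        near_threshold=15.0):
    """Return True when a dragged device logo should trigger the intention.

    Either the logo circle overlaps the central-intention circle (collision),
    or the remaining gap between the two circles is below a small threshold.
    """
    distance = math.dist(logo_center, intention_center)
    collided = distance <= logo_radius + intention_radius
    close_enough = distance - (logo_radius + intention_radius) < near_threshold
    return collided or close_enough

# A spotlight logo dragged onto the intention triggers; a distant one does not.
print(drag_should_trigger((50.0, 0.0), (0.0, 0.0)))   # True
print(drag_should_trigger((200.0, 0.0), (0.0, 0.0)))  # False
```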
  • the smart panel can display the pop-up window 170 shown in Figure 17.
  • the pop-up window 170 includes the identification of each electronic device in the second bedroom that can realize the intention of turning on the light (such as the identification of a desk lamp and a ceiling light).
  • the smart panel controls the corresponding electronic device to execute the intention of turning on the light. For example, in response to the user clicking on the logo of the desk lamp and ceiling light in the second bedroom, the smart panel controls the desk lamp and ceiling light in the second bedroom to turn on.
  • the smart panel may display the logo of the target electronic device group without displaying the logos of the individual electronic devices in the target electronic device group.
  • the user is located in the master bedroom.
  • the smart panel can determine, based on the user's location, that the user wants to control the electronic devices in the master bedroom, and then displays an interface 1605.
  • the interface 1605 includes a logo 164 of the master bedroom.
  • the interface 1605 may also include: identification of other electronic device groups that can realize the intention of turning on the lights, such as identification of the living room, second bedroom, and study room.
  • the distance between the master bedroom logo and the central intention is smaller than the distance between the other group logos and the central intention. In this way, it is convenient for the user to drag the master bedroom logo to collide with the central intention.
  • the smart panel determines on its own the target electronic device in the master bedroom that the user wants to control that can achieve the intention of turning on the lights. For example, the smart panel determines that the user wants to control all the devices in the master bedroom that can turn on the lights. For another example, based on user information, the smart panel determines that the device in the master bedroom that the user wants to control that can achieve the intention of turning on the lights is a spotlight.
  • the smart panel pops up a pop-up window that includes devices in the master bedroom that can realize the intention of turning on the lights.
  • the smart panel determines the target electronic device that the user wants to control.
  • the smart panel may employ other methods to determine the target electronic device that the user wants to control.
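One way to sketch the determination described above is a capability filter over a device list, optionally narrowed by a preference inferred from user information. The device records and the `preferred` parameter below are illustrative assumptions, not details from this application.

```python
def select_target_devices(devices, intention, room, preferred=None):
    """Return the devices the user likely wants to control for an intention.

    Keeps the devices in the inferred room whose capabilities include the
    intention; if user information suggests a single preferred device
    (e.g. the spotlight), narrow the result to it when present.
    """
    candidates = [d for d in devices
                  if d["room"] == room and intention in d["capabilities"]]
    if preferred is not None:
        narrowed = [d for d in candidates if d["name"] == preferred]
        if narrowed:
            return narrowed
    return candidates

devices = [
    {"name": "spotlight", "room": "master bedroom", "capabilities": {"turn on lights"}},
    {"name": "ceiling light", "room": "master bedroom", "capabilities": {"turn on lights"}},
    {"name": "speaker", "room": "master bedroom", "capabilities": {"play music"}},
]

# All capable devices in the room, or just the preferred one.
all_lights = select_target_devices(devices, "turn on lights", "master bedroom")
only_spot = select_target_devices(devices, "turn on lights", "master bedroom",
                                  preferred="spotlight")
```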
  • the user is located in the living room, and when the user opens the control interface corresponding to the light-on intention, the smart panel can determine, based on the user's location, that the user wants to control the electronic devices in the living room. The smart panel then displays an interface 1606, and the interface 1606 includes a logo 172 of the living room.
  • the distance between the living room logo and the central intention is smaller than the distance between the other group logos and the central intention. In this way, it is convenient for the user to drag the living room logo to collide with the central intention.
  • a smart panel can determine control parameters of a target electronic device in a group of electronic devices.
  • the smart panel displays an interface 401, and the interface 401 includes a whole-house control option 401b. After receiving the user's click on the whole-house control option 401b, the smart panel can jump to the interface 1601 shown in (b) of Figure 19.
  • the interface 1601 displays an intention identification bar 161, and the intention identification bar 161 is used to display the logos of intentions.
  • the light-on intention indicator 162 displayed in the intention indicator column 161 is highlighted to indicate that the interface 1601 currently displayed on the smart panel is the control interface for the light-on intention corresponding to the light-on intention indicator 162 .
  • the central intention currently displayed by the smart panel is the intention to turn on the light.
  • the user can click the logo 162 of the light-on intention, and in response to the user's operation, the smart panel can display the light-on intention near the center of the screen (called the central intention 163).
  • the smart panel can display a group of electronic devices that can realize the intention to turn on the lights around the central intention (the intention to turn on the lights).
  • the master bedroom contains electronic devices that can realize the intention of turning on the lights. Therefore, the smart panel displays the identification of the master bedroom group (such as the circle indicated by reference numeral 164 displayed on the interface 1601).
  • the smart panel can also display on the interface 1601 the logos of other electronic device groups that can realize the light-on intention (such as the electronic device groups of the living room, second bedroom, and study).
  • the smart panel can obtain authorized user information, determine the control parameters of the devices in the corresponding group based on the user information, and prompt the user with the corresponding control parameters.
  • user information includes but is not limited to one or more of the following information: user location, time, user behavior.
  • the smart panel learns, from historically obtained user information, that the user usually reads or studies in the study from 8:30 to 10:00 pm. Then, if the current time is 9:00 pm when the user opens the interface 1601, the smart panel can display on the interface 1601 the recommended control parameters of the devices in each corresponding group.
  • the recommended control parameters are suitable for reading or learning scenarios.
  • as shown in (b) of Figure 19, the smart panel displays on the interface 1601 a study logo indicated by reference numeral 165, a recommended reading brightness of 30% indicated by reference numeral 166, and a recommended reading color temperature of 5000K indicated by reference numeral 167.
  • users can adjust the recommended control parameters displayed on the smart panel. For example, as shown in (c) of Figure 19, the user can long-press the recommended color temperature card indicated by reference numeral 167 to adjust it to a new color temperature of 5500K. After that, as shown in (d) of Figure 19, the user can drag the study logo 165 toward the central intention (turn on the lights). After detecting that the study logo 165 collides with the central intention, the smart panel can send a control instruction (brightness 30%, color temperature 5500K) to the target devices in the study that can realize the light-on intention, and the target devices execute the light-on intention. There can be one or more target devices.
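The time-based recommendation in this example can be sketched as a lookup against habit windows learned from authorized user information. The window table below hard-codes the 8:30–10:00 pm reading habit with its 30% brightness and 5000 K color temperature purely for illustration.

```python
from datetime import time

# Learned habit windows: (start, end, scene, recommended control parameters).
HABIT_WINDOWS = [
    (time(20, 30), time(22, 0), "reading", {"brightness_pct": 30, "color_temp_k": 5000}),
]

def recommend_parameters(now):
    """Return (scene, parameters) recommended for the current time.

    At 9:00 pm the time falls inside the learned reading window, so the
    panel would show 30% brightness and a 5000 K color temperature.
    """
    for start, end, scene, params in HABIT_WINDOWS:
        if start <= now <= end:
            return scene, dict(params)
    return None, {}

scene, params = recommend_parameters(time(21, 0))
```

The user remains free to adjust the recommendation afterwards, as with the long-press that changes 5000 K to 5500 K above.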
  • the target devices are lamp 1 and lamp 2, the combined brightness of lamp 1 and lamp 2 is 30%, and the combined color temperature is 5500K.
  • the target device is lamp 3, the brightness of lamp 3 is the recommended brightness of 30%, and the color temperature is the recommended color temperature of 5500K.
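The text leaves open how a recommended value maps onto several target devices; an even split is one simple reading of "combined brightness", used here only as an illustrative assumption.

```python
def assign_brightness(target_names, recommended_pct):
    """Split a recommended brightness evenly across the target devices.

    With lamp 1 and lamp 2 as targets and a 30% recommendation, each lamp
    gets 15%, so their combined brightness matches the recommendation;
    with a single lamp 3, it simply gets the recommended 30%.
    """
    if not target_names:
        return {}
    share = recommended_pct / len(target_names)
    return {name: share for name in target_names}

print(assign_brightness(["lamp 1", "lamp 2"], 30))  # {'lamp 1': 15.0, 'lamp 2': 15.0}
print(assign_brightness(["lamp 3"], 30))            # {'lamp 3': 30.0}
```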
  • the smart panel can recommend to the user device control parameters that are suitable for the current scene (such as reading or learning scenes) and can achieve the target intention based on user information. For example, it can recommend to users the brightness parameters and color temperature parameters of devices that are suitable for reading scenes and can achieve the intention of turning on the lights. It can be seen that the smart panel can provide users with personalized guidance related to device control, so as to effectively guide users to control smart devices.
  • the smart panel displays an interface 1604.
  • the interface 1604 includes the electronic devices that can realize the light-on intention (such as the spotlight and ceiling light) and the recommended brightness (30%) and recommended color temperature (5500K) corresponding to the light-on intention.
  • in response to the user's operation of dragging the spotlight logo toward the center, the smart panel controls the spotlight in the master bedroom to turn on.
  • the brightness of the spotlight is 30% and the color temperature is 5500K.
  • the recommended control parameters corresponding to different electronic device groups can be presented as different UI effects.
  • a card 166 with the recommended brightness and a card 167 with the recommended color temperature for the electronic devices in the study that can turn on lights are displayed with the first UI effect (such as a slash pattern from upper right to lower left).
  • the user can drag the sign of the study room in the direction of the central intention (turn on the lights).
  • after detecting that the study logo collides with the central intention, the smart panel can send control instructions (brightness 30%, color temperature 5000K) to the target devices in the study that can realize the light-on intention, and the target devices execute the light-on intention. For another example, the user can drag the logo of the living room toward the central intention (turn on the lights). After detecting that the living room logo collides with the central intention, the smart panel can send control instructions (brightness 35%, color temperature 5500K) to the target devices in the living room that can realize the light-on intention, and the target devices execute the light-on intention.
  • the above example uses the smart panel to determine the recommended control parameters of the corresponding electronic device based on user information.
  • the smart panel can also determine the recommended control parameters of the corresponding electronic device based on other information.
  • the user can also select multiple intentions, and the smart panel can combine multiple intentions. For example, as shown in Figure 22, after detecting the user's operation of selecting the light-on intention 162 and the music playback intention from the intention identification bar 161, the smart panel displays the central intention (turn on the lights, play music) near the center of the screen. In response to the user dragging the master bedroom logo until it collides with the central intention, or dragging it to less than a certain distance from the central intention, the smart panel can control the lights in the master bedroom to turn on and control the speaker and other devices in the master bedroom to play music.
  • the smart panel can also determine, based on user information, the device control parameters corresponding to the light-on intention and the music playback intention. For example, if the current time is 10:00 pm, the smart panel determines that the recommended volume corresponding to the music playback intention is xx decibels (dB), and displays a card with the recommended volume, indicated by reference numeral 168. In this way, the user can be effectively guided in controlling the devices related to the music playback intention, so as to avoid disturbing nearby neighbors during music playback.
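Combining intentions, as in the lights-plus-music example, amounts to dispatching each selected intention to every capable device in the chosen group. The device records and command tuples below are illustrative assumptions.

```python
def execute_intentions(intentions, devices, room):
    """Return the (device, intention) commands for a combined intention.

    Each selected intention is sent to every device in the group that can
    realize it, so the lights turn on and the speaker starts playback together.
    """
    return [(device["name"], intention)
            for intention in intentions
            for device in devices
            if device["room"] == room and intention in device["capabilities"]]

devices = [
    {"name": "ceiling light", "room": "master bedroom", "capabilities": {"turn on lights"}},
    {"name": "speaker", "room": "master bedroom", "capabilities": {"play music"}},
]
commands = execute_intentions(["turn on lights", "play music"], devices, "master bedroom")
```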
  • the layout of the scene editing interface can also be to display the trigger conditions of the scene in the right column of the scene editing interface, and to display the intention cards of the subsystems in the lower area of the scene editing interface.
  • the operations on the controls in the interface are only examples.
  • the user can trigger the first electronic device to perform the target operation by performing a certain operation on the target control in the interface.
  • Other operations may also be defined, and when it is detected that the user performs the other operations on the target control, the first electronic device performs the target operation.
  • groups may also be divided according to other dividing standards.
  • groups can be divided according to the distance between the device and the user. Devices whose distance from the user is less than a first threshold can be divided into the first group.
  • the smart panel can display the identification of each device in the first group, so that the user can control the nearby devices.
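The distance-based grouping mentioned above can be sketched directly: devices within a first threshold of the user form the first group. The positions and threshold value below are illustrative assumptions.

```python
import math

def group_by_distance(device_positions, user_position, first_threshold=3.0):
    """Split devices into a near group and a far group by distance to the user.

    Devices whose distance to the user is below the first threshold form the
    first group; the smart panel can then display their logos so the user
    can control nearby devices.
    """
    near, far = [], []
    for name, position in device_positions.items():
        group = near if math.dist(position, user_position) < first_threshold else far
        group.append(name)
    return near, far

near, far = group_by_distance(
    {"desk lamp": (1.0, 1.0), "floor lamp": (5.0, 4.0)}, user_position=(0.0, 0.0))
```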
  • the smart panel can determine the electronic device group that the user wants to control based on the user information, and can determine recommended control parameters for the target electronic device in the electronic device group based on the user information.
  • the devices in the embodiments of the present application include hardware structures and/or software modules corresponding to each function.
  • the embodiments of this application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is performed by hardware or computer software driving the hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art can use different methods to implement the described functions for each specific application, but such implementation should not be considered to be beyond the scope of the technical solutions of the embodiments of the present application.
  • Embodiments of the present application can divide the electronic device into functional units according to the above method examples.
  • each functional unit can be divided corresponding to each function, or two or more functions can be integrated into one processing unit.
  • the above integrated units can be implemented in the form of hardware or software functional units. It should be noted that the division of units in the embodiment of the present application is schematic and is only a logical function division. In actual implementation, there may be other division methods.
  • Figure 23 shows a schematic block diagram of the first electronic device provided in an embodiment of the present application.
  • the information transmission device may be the above-mentioned receiving device or sending device.
  • the first electronic device 3300 may exist in the form of software, or may be a chip that can be used in the device.
  • the first electronic device 3300 includes: a processing unit 3303, a transceiver unit 3302, and a display unit 3301.
  • the transceiver unit 3302 can also be divided into a sending unit (not shown in Figure 23) and a receiving unit (not shown in Figure 23).
  • the sending unit is used to support the first electronic device 3300 in sending information to other devices.
  • the receiving unit is used to support the first electronic device 3300 to receive information from other devices.
  • Display unit 3301 is used to support display content.
  • the first electronic device 3300 may also include a storage unit 1701 (not shown in the figure) for storing program codes and data of the first electronic device 3300.
  • the data may include but is not limited to original data or intermediate data.
  • the processing unit 3303 may be a controller or the processor 501 and/or the processor 507 shown in Figure 4; for example, it may be a central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure.
  • the processor can also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of DSP and microprocessors, and so on.
  • the transceiver unit 3302 may be the communication interface 504 shown in Figure 4, or may be a transceiver circuit, transceiver, radio frequency device, etc.
  • the storage unit 1701 may be the memory 503 shown in FIG. 4 .
  • the display unit 3301 may include a display screen.
  • An embodiment of the present application also provides an electronic device, including one or more processors and one or more memories.
  • the one or more memories are coupled to one or more processors.
  • the one or more memories are used to store computer program codes.
  • the computer program codes include computer instructions.
  • when the computer instructions are executed by the one or more processors, the electronic device executes the above related method steps to implement the information transmission method in the above embodiments.
  • An embodiment of the present application also provides a chip system, including a processor coupled to a memory, the memory being used to store programs or instructions. When the programs or instructions are executed by the processor, the chip system implements the method in any of the above method embodiments.
  • processors in the chip system there may be one or more processors in the chip system.
  • the processor can be implemented in hardware or software.
  • the processor may be a logic circuit, an integrated circuit, or the like.
  • the processor may be a general-purpose processor implemented by reading software code stored in memory.
  • the memory may be integrated with the processor or may be provided separately from the processor, which is not limited by this application.
  • the memory can be a non-transitory memory, such as a read-only memory (ROM), which can be integrated on the same chip as the processor or provided separately on different chips.
  • this application does not specifically limit the type of the memory or the manner in which the memory and the processor are configured.
  • the chip system can be a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processing circuit (DSP), a microcontroller unit (MCU), a programmable logic device (PLD), or another integrated chip.
  • each step in the above method embodiment can be completed by an integrated logic circuit of hardware in the processor or instructions in the form of software.
  • the method steps disclosed in conjunction with the embodiments of this application can be directly implemented by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • Embodiments of the present application also provide a computer-readable storage medium.
  • Computer instructions are stored in the computer-readable storage medium.
  • when the computer instructions run on an electronic device, the electronic device executes the above related method steps to implement the information transmission method in the above embodiments.
  • An embodiment of the present application also provides a computer program product.
  • when the computer program product runs on a computer, it causes the computer to perform the above related steps to implement the information transmission method in the above embodiments.
  • Embodiments of the present application also provide a device.
  • the device may be a component or module.
  • the device may include a connected processor and a memory.
  • the memory is used to store computer execution instructions. When the device runs, the processor executes the computer execution instructions stored in the memory, causing the device to execute the information transmission method in each of the above method embodiments.
  • the electronic devices, computer-readable storage media, computer program products, or chips provided by the embodiments of the present application are all used to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above, which will not be described again here.
  • This embodiment can divide the electronic device into functional modules according to the above method examples.
  • each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module.
  • the above integrated modules can be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic and is only a logical function division. In actual implementation, there may be other division methods.
  • the disclosed method can be implemented in other ways.
  • the terminal device embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division.
  • there may be other division methods; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, indirect coupling or communication connection of modules or units, which may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
  • the above integrated units can be implemented in the form of hardware or software functional units.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of this application.
  • the aforementioned storage media include various media that can store program instructions, such as flash memory, removable hard disk, read-only memory, random access memory, magnetic disk, or optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Smart device control method and electronic device. A scene can be modified with an intention as the granularity, without having to set, one by one, the plurality of devices corresponding to the intention, so that the complexity of scene modification can be reduced and device control efficiency can be improved. The method can be applied to a first electronic device, or to a component (such as a chip system) that supports the first electronic device in implementing related functions. The method comprises: displaying a first interface, the first interface comprising an identification of one or more intentions, at least one of the one or more intentions corresponding to a plurality of electronic devices; in response to an operation of a user selecting a target intention from among the one or more intentions, adding the target intention to a target scene; and when a trigger condition of the target scene is satisfied, controlling one or more target electronic devices to execute the target intention, the one or more target electronic devices being electronic devices corresponding to the target intention.
PCT/CN2023/094602 2022-05-19 2023-05-16 Procédé de commande de dispositif intelligent et dispositif électronique WO2023221995A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210546887.9A CN117130284A (zh) 2022-05-19 2022-05-19 智能设备控制方法及电子设备
CN202210546887.9 2022-05-19

Publications (1)

Publication Number Publication Date
WO2023221995A1 true WO2023221995A1 (fr) 2023-11-23

Family

ID=88834670

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/094602 WO2023221995A1 (fr) 2022-05-19 2023-05-16 Procédé de commande de dispositif intelligent et dispositif électronique

Country Status (2)

Country Link
CN (1) CN117130284A (fr)
WO (1) WO2023221995A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104460328A (zh) * 2014-10-29 2015-03-25 小米科技有限责任公司 基于设定场景模式的智能设备控制方法和装置
CN105652671A (zh) * 2015-12-25 2016-06-08 小米科技有限责任公司 智能设备工作模式的设置方法和装置
CN106873551A (zh) * 2016-11-30 2017-06-20 芜湖美智空调设备有限公司 一种不同家电间的联动方法及系统
WO2017141219A1 (fr) * 2016-02-18 2017-08-24 Tekoia Ltd. Architecture de commande à distance de dispositifs d'ido (internet des objets)
CN111176133A (zh) * 2020-02-11 2020-05-19 青岛海信智慧家居系统股份有限公司 一种智能家居场景的确定方法及装置
CN111176517A (zh) * 2019-12-31 2020-05-19 青岛海尔科技有限公司 用于场景设置的方法、装置及手机
CN112180754A (zh) * 2020-10-20 2021-01-05 珠海格力电器股份有限公司 智能控制场景的设定方法、设备控制系统
CN113114779A (zh) * 2021-04-23 2021-07-13 杭州萤石软件有限公司 物联网设备联动的配置方法、终端、系统
CN113412457A (zh) * 2019-05-16 2021-09-17 深圳市欢太科技有限公司 场景推送方法、装置、系统、电子设备以及存储介质

Also Published As

Publication number Publication date
CN117130284A (zh) 2023-11-28

Similar Documents

Publication Publication Date Title
JP7254894B2 (ja) 接続式照明システム
US11985716B2 (en) Discovery of connected devices to determine control capabilities and meta-information
CN108111948B (zh) 在语音接口设备处的服务器提供的视觉输出
US10158536B2 (en) Systems and methods for interaction with an IoT device
US9306763B2 (en) Providing a user interface for devices of a home automation system
US7047092B2 (en) Home automation contextual user interface
CN108431765B (zh) 设备应用的生成
JP2022547319A (ja) ホームオートメーションシステムのための3次元仮想ルームに基づくユーザーインターフェイス
US20150128050A1 (en) User interface for internet of everything environment
CN107948231B (zh) 基于场景的服务提供方法、系统和操作系统
EP2744152A2 (fr) Appareil de terminal d'utilisateur, dispositif de réseau et procédé de commande de celui-ci
KR102628856B1 (ko) 전자 장치 간 콘텐츠 공유 시스템 및 전자 장치의 콘텐츠 공유 방법
JP2013539329A (ja) モバイル電話機によってホストされる会議コントロール
CN109240098B (zh) 设备配置方法、装置、终端设备及存储介质
US20180198872A1 (en) Method, system and device for providing service
WO2024045985A1 (fr) Procédé de commande d'écran, appareil de commande d'écran, dispositif électronique, programme et support
TWI738832B (zh) 基於場景的應用操作方法、裝置、終端設備和操作系統
WO2016123847A1 (fr) Procédé et dispositif de commande d'application
WO2023221995A1 (fr) Procédé de commande de dispositif intelligent et dispositif électronique
CN116301501A (zh) 区域设置方法、装置、电子设备及存储介质
WO2019027829A2 (fr) Synchronisation entre plusieurs dispositifs pour des expériences immersives
CN114967485A (zh) 设备控制方法、装置、电子设备及存储介质
JP7312162B2 (ja) 小画面の仮想部屋ベースのユーザインタフェイス
WO2024067308A1 (fr) Procédé de commande de dispositif intelligent, dispositif électronique, et système
WO2024131708A1 (fr) Procédé de configuration d'environnement et dispositif

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23806946

Country of ref document: EP

Kind code of ref document: A1