WO2023174429A1 - Intelligent device control method and electronic device - Google Patents

Intelligent device control method and electronic device

Info

Publication number
WO2023174429A1
WO2023174429A1 · PCT/CN2023/082333 · CN2023082333W
Authority
WO
WIPO (PCT)
Prior art keywords
user
information
electronic device
interface
electronic devices
Prior art date
Application number
PCT/CN2023/082333
Other languages
English (en)
French (fr)
Inventor
高晓强
李乐
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2023174429A1 publication Critical patent/WO2023174429A1/zh


Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 - Systems controlled by a computer
    • G05B15/02 - Systems controlled by a computer electric
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/20 - Pc systems
    • G05B2219/26 - Pc applications
    • G05B2219/2642 - Domotique, domestic, home control, automation, smart house

Definitions

  • Embodiments of the present application relate to the field of communication technology, and in particular to intelligent device control methods and electronic devices.
  • embodiments of the present application provide an intelligent device control method and an electronic device.
  • the technical solution provided by the embodiments of this application can automatically switch the device control interface used to control smart devices for the user, thereby simplifying the device control process and improving the user experience.
  • an intelligent device control method is provided, which is applied to a first electronic device.
  • the method includes:
  • first user information that has been authorized by the user is obtained at a first moment, and a first interface is displayed according to the first user information;
  • the first interface includes information on m second electronic devices associated with the first user information, where m is an integer greater than 1;
  • a first operation is detected that acts on a control on the first interface for controlling n electronic devices among the m second electronic devices to perform a first target operation, where n is an integer not less than 1 and not greater than m;
  • in response to the first operation, the n electronic devices are controlled to perform the first target operation;
  • second user information that has been authorized by the user is obtained at a second moment, and a second interface is displayed according to the second user information;
  • the second interface includes information on k second electronic devices associated with the second user information, where k is an integer greater than 1;
  • a second operation is detected that acts on a control on the second interface for controlling j electronic devices among the k second electronic devices to perform a second target operation, where j is an integer not less than 1 and not greater than k;
  • in response to the second operation, the j electronic devices are controlled to perform the second target operation.
  • the first user information and the second user information include any one or more of the following: location information, time information, and behavior information.
  • the device control method provided by this application does not rely on the user's manual search operation; instead, it can automatically and intelligently switch to the corresponding device control interface and recommend, through that interface, the second electronic devices associated with the user's information, thereby simplifying user operations, improving device control efficiency, and improving the user's interactive experience.
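The claimed flow (obtain authorized user information, display the associated device-control interface, detect a control operation, control the selected devices) can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; all names (`UserInfo`, `DEVICE_REGISTRY`, the function names) are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UserInfo:
    """Authorized user information; the patent names location, time,
    and behavior information as possible contents (hypothetical type)."""
    location: Optional[str] = None
    time: Optional[str] = None
    behavior: Optional[str] = None

# Hypothetical mapping from user context to the associated second
# electronic devices shown on the control interface.
DEVICE_REGISTRY = {
    "first home": ["smart TV", "smart speaker", "floor lamp"],
    "second home": ["desk lamp", "sweeping robot"],
}

def build_control_interface(user_info):
    """Display step: pick the m devices associated with the current
    authorized user information; 'automatically switching' the interface
    amounts to re-running this whenever the user information changes."""
    return DEVICE_REGISTRY.get(user_info.location, [])

def perform_operation(devices, selected, target_op):
    """Control step: in response to the detected operation, control the
    n selected devices among the m displayed ones."""
    return ["%s: %s" % (devices[i], target_op) for i in selected]
```

At the second moment, calling `build_control_interface` with the new user information yields the second interface's k devices, and `perform_operation` plays the role of the second operation.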
  • the m second electronic devices include electronic devices in the first home where the user is located at the first moment;
  • the k second electronic devices include electronic devices in the second home where the user is located at the second moment.
  • the first electronic device can display, according to the user's location information, the device control interface associated with the user's location, so that the user can conveniently control the second electronic devices (such as smart home devices) near his or her location through the device control interface.
  • the m second electronic devices include all electronic devices in the first space, and the first interface does not include information about electronic devices other than the m second electronic devices;
  • the second user information also includes information about the second space where the user is located in the second home at the second moment;
  • the k second electronic devices include the electronic devices in the second space;
  • the second interface does not include information about electronic devices other than the k second electronic devices.
  • the first user information also includes information about the first space where the user is located in the first home at the first moment;
  • the m second electronic devices include the electronic devices in the first space where the user is located at the first moment;
  • the identification information of the electronic devices in the first space where the user is located at the first moment is highlighted on the first interface with a preset user interface (UI) effect, and/or is ranked before the identification information of other electronic devices on the first interface;
  • the k second electronic devices include the electronic devices in the second space where the user is located at the second moment;
  • the identification information of the electronic devices in the second space where the user is located at the second moment is highlighted on the second interface with a preset UI effect, and/or is ranked before the identification information of other electronic devices on the second interface.
  • the m second electronic devices include m electronic devices that are close to the user at the first moment;
  • the k second electronic devices include k electronic devices that are close to the user at the second moment.
  • the information of the m second electronic devices is displayed in the first interface in the form of a pop-up window, and/or the information of the k second electronic devices is displayed in the second interface in the form of a pop-up window;
  • the method also includes: stopping displaying the pop-up window after a preset period of time has elapsed since the pop-up window was displayed.
  • the information of the m second electronic devices is displayed in the first interface with a preset UI effect, and/or the information of the k second electronic devices is displayed in the second interface with a preset UI effect.
  • the electronic device can prompt the user to pay attention to important information in the first interface and/or the second interface, thereby helping the user to more quickly find the second electronic device he wants to control from the interface.
  • when the user performs the first behavior at the first moment, the m second electronic devices are the m electronic devices that the user will control;
  • when the user performs the second behavior, the k second electronic devices are the k electronic devices that the user will control.
  • the m second electronic devices are the m electronic devices that the user will control at the first time
  • the k second electronic devices are the k electronic devices that the user will control at the second time.
  • the method further includes:
  • a third interface is displayed according to the first user information, and the third interface is used to recommend a target execution scenario to the user.
  • the method further includes: receiving a third operation input by the user on the third interface; in response to the third operation, adding the target execution scenario; and executing the target execution scenario when a trigger condition of the target execution scenario is met.
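The recommend-confirm-trigger loop for target execution scenarios can be sketched as follows. The scene contents, trigger condition, and function names are all hypothetical; the patent describes the behavior, not an implementation.

```python
# Illustrative sketch of the third-interface scenario recommendation
# and trigger-based execution.
scenes = []  # target execution scenarios the user has added

def recommend_scene(user_info):
    """Third interface: recommend a target execution scenario based on
    the authorized first user information (behavior-driven here)."""
    if user_info.get("behavior") == "going to bed":
        return {
            "name": "good-night",
            "trigger": lambda ctx: ctx["lights_on"] and ctx["time"] >= "22:00",
            "actions": ["turn off lights", "lower blinds"],
        }
    return {"name": "default", "trigger": lambda ctx: False, "actions": []}

def add_scene(scene):
    """Third operation: the user confirms the recommendation and the
    scenario is added."""
    scenes.append(scene)

def tick(ctx):
    """Execute every added scenario whose trigger condition is met."""
    executed = []
    for scene in scenes:
        if scene["trigger"](ctx):
            executed.extend(scene["actions"])
    return executed
```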
  • an intelligent device control apparatus is provided, which may be the first electronic device or a component capable of implementing the functions of the first electronic device (or supporting the first electronic device in implementing the corresponding functions, such as a chip system).
  • the device includes:
  • a processing unit configured to obtain the first user information that has been authorized by the user at the first moment
  • a display unit configured to display a first interface according to the first user information, where the first interface includes information on m second electronic devices associated with the first user information, where m is an integer greater than 1;
  • an input unit configured to detect a first operation acting on a control on the first interface for controlling n electronic devices among the m second electronic devices to perform a first target operation, where n is an integer not less than 1 and not greater than m;
  • the processing unit is also configured to control the n electronic devices to perform the first target operation in response to the first operation;
  • the processing unit is also configured to obtain the second user information that has been authorized by the user at the second moment;
  • the display unit is further configured to display a second interface according to the second user information, where the second interface includes information on k second electronic devices associated with the second user information, where k is an integer greater than 1;
  • the input unit is also used to detect a second operation acting on a control on the second interface for controlling j electronic devices among the k second electronic devices to perform a second target operation, where j is an integer not less than 1 and not greater than k;
  • the processing unit is also configured to control the j electronic devices to perform the second target operation in response to the second operation.
  • the first user information and the second user information include any one or more of the following: location information, time information, and behavior information.
  • the m second electronic devices include electronic devices in the first home where the user is located at the first moment;
  • the k second electronic devices include electronic devices in the second home where the user is located at the second moment.
  • the m second electronic devices include all electronic devices in the first space, and the first interface does not include information about electronic devices other than the m second electronic devices;
  • the second user information also includes information about the second space where the user is located in the second home at the second moment;
  • the k second electronic devices include the electronic devices in the second space;
  • the second interface does not include information about electronic devices other than the k second electronic devices.
  • the first user information also includes information about the first space where the user is located in the first home at the first moment;
  • the m second electronic devices include the electronic devices in the first space where the user is located at the first moment;
  • the identification information of the electronic devices in the first space where the user is located at the first moment is highlighted on the first interface with a preset user interface (UI) effect, and/or is ranked before the identification information of other electronic devices on the first interface;
  • the k second electronic devices include the electronic devices in the second space where the user is located at the second moment;
  • the identification information of the electronic devices in the second space where the user is located at the second moment is highlighted on the second interface with a preset UI effect, and/or is ranked before the identification information of other electronic devices on the second interface.
  • the m second electronic devices include m electronic devices that are close to the user at the first moment;
  • the k second electronic devices include k electronic devices that are close to the user at the second moment.
  • the information of the m second electronic devices is displayed in the first interface in the form of a pop-up window, and/or the information of the k second electronic devices is displayed in the second interface in the form of a pop-up window;
  • the display unit is also configured to stop displaying the pop-up window after a preset period of time has elapsed since the pop-up window was displayed.
  • the information of the m second electronic devices is displayed in the first interface with a preset UI effect, and/or the information of the k second electronic devices is displayed in the second interface with a preset UI effect.
  • when the user performs the first behavior at the first moment, the m second electronic devices are the m electronic devices that the user will control;
  • when the user performs the second behavior, the k second electronic devices are the k electronic devices that the user will control.
  • the m second electronic devices are the m electronic devices that the user will control at the first time
  • the k second electronic devices are the k electronic devices that the user will control at the second time.
  • the display unit is further configured to display a third interface according to the first user information, and the third interface is used to recommend target execution scenarios to the user.
  • the input unit is also used to receive a third operation input by the user on the third interface
  • the processing unit is further configured to add the target execution scenario in response to the third operation; and execute the target execution scenario when a trigger condition of the target execution scenario is met.
  • embodiments of the present application provide an electronic device that has the function of implementing the smart device control method described in the above first aspect and any possible implementation manner.
  • This function can be implemented by hardware, or can be implemented by hardware and corresponding software.
  • the hardware or software includes one or more modules corresponding to the above functions.
  • a computer-readable storage medium is provided that stores a computer program (which may also be referred to as instructions or code).
  • when the computer program is executed by an electronic device, it causes the electronic device to perform the method of the first aspect or any one of the implementations of the first aspect.
  • embodiments of the present application provide a computer program product, which when the computer program product is run on an electronic device, causes the electronic device to execute the method of the first aspect or any one of the implementation modes of the first aspect.
  • embodiments of the present application provide a circuit system.
  • the circuit system includes a processing circuit, and the processing circuit is configured to execute the method of the first aspect or any one of the implementation modes of the first aspect.
  • embodiments of the present application provide a chip system, including at least one processor and at least one interface circuit.
  • the at least one interface circuit is used to perform transceiver functions and send instructions to at least one processor.
  • when the at least one processor executes the instructions, the at least one processor performs the method of the first aspect or any one of the implementations of the first aspect.
  • Figure 1A is a schematic diagram of a family scene provided by an embodiment of the present application.
  • Figure 1B is a schematic diagram of coordinate transformation provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of the hardware structure of the first electronic device provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of the software structure of the electronic device provided by the embodiment of the present application.
  • Figure 4 is a schematic diagram of the interface provided by the embodiment of the present application.
  • Figure 5 is a schematic diagram of the interface provided by the embodiment of the present application.
  • Figure 6A is a schematic flowchart of an intelligent device control method provided by an embodiment of the present application.
  • Figure 6B is a schematic flowchart of an intelligent device control method provided by an embodiment of the present application.
  • Figure 7 is a schematic diagram of the positioning method provided by the embodiment of the present application.
  • Figures 8-11 are schematic interface diagrams provided by embodiments of the present application.
  • Figure 12 is a schematic flowchart of an intelligent device control method provided by an embodiment of the present application.
  • Figures 13-16 are schematic interface diagrams provided by embodiments of the present application.
  • Figure 17 is a schematic flowchart of an intelligent device control method provided by an embodiment of the present application.
  • Figure 18 is a schematic diagram of the interface provided by the embodiment of the present application.
  • Figure 19 is a schematic flowchart of an intelligent device control method provided by an embodiment of the present application.
  • Figure 20 is a schematic structural diagram of an intelligent device control device provided by an embodiment of the present application.
  • FIG. 1A is a schematic diagram of an intelligent device control system to which this method is applicable.
  • the smart device control system can manage and control smart devices on a household basis.
  • a family can also be called a whole house, and the whole house can be divided into different spaces.
  • the whole house includes the entrance hallway, kitchen, dining room, living room, balcony, master bedroom, secondary bedroom, bathroom, etc.
  • the whole-house system may include a first electronic device 100 for controlling a second electronic device 200 (such as an Internet of things (IoT) device).
  • the first electronic device 100 includes but is not limited to a mobile phone, a PC, a tablet computer, etc.
  • the first electronic device 100 may be installed with an application program for controlling the second electronic device 200 .
  • the application can be a system pre-installed application or a non-pre-installed application (such as an application downloaded from the application market). It should be understood that a system pre-installed application may be part of a system application (such as a service, component, or plug-in in a system application), or an independent application pre-installed in the first electronic device that has an independent application icon.
  • the application program can be a smart life application.
  • the first electronic device 100 can also control the second electronic device 200 through the control center.
  • the control center may be a shortcut control page displayed by the first electronic device 100 in response to the user's sliding operation from the upper right corner or top of the screen.
  • the first electronic device 100 can also control the second electronic device 200 through the corresponding function menu on the negative screen.
  • the negative screen may be the system service capability entry page displayed by the first electronic device 100 in response to the user's right swipe operation on the leftmost main interface.
  • At least one third electronic device 300 is provided throughout the house.
  • each room or area includes at least one third electronic device 300 .
  • the third electronic device 300 is used to locate the second electronic device 200, and/or the first electronic device 100, and/or the user, and to report the location information of the second electronic device 200, and/or the location information of the first electronic device 100, and/or the location information of the user to the central device 400.
  • the third electronic device 300 may include a sensor, which is responsible for collecting the user's spatial location information.
  • the third electronic device 300 may be a camera, which collects image information of other devices and/or users, and determines the location information of each device and/or user based thereon.
  • the third electronic device 300 detects through a sensor that the space where the user is located is the master bedroom.
  • the third electronic device 300 can collect the user's spatial location information in real time or periodically or according to other strategies.
  • the third electronic device 300 includes an ultra-wideband (UWB) module and/or a millimeter-wave radar module.
  • the third electronic device 300 locates the second electronic device 200 and/or the first electronic device 100 through the UWB module.
  • the third electronic device 300 locates the second electronic device 200 and/or the first electronic device 100 through the millimeter wave radar module.
  • the third electronic device 300 includes a wireless fidelity (Wi-Fi) module, and the third electronic device 300 locates the second electronic device 200 and/or the first electronic device 100 through the Wi-Fi module.
  • the third electronic device 300 can perform joint positioning through the above-mentioned multiple modules.
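The patent does not specify how the joint positioning across UWB, millimeter-wave radar, and Wi-Fi modules is computed. As one hedged illustration, a confidence-weighted average of the per-module position estimates could serve as a simple fusion strategy:

```python
def fuse_positions(estimates):
    """Fuse (x, y, z) position estimates from multiple positioning
    modules (e.g. UWB, millimeter-wave radar, Wi-Fi) into one joint
    estimate using a confidence-weighted average.

    `estimates` is a list of ((x, y, z), weight) pairs. This is an
    illustrative strategy, not the patent's actual algorithm."""
    total_w = sum(w for _, w in estimates)
    return tuple(
        sum(p[i] * w for p, w in estimates) / total_w
        for i in range(3)
    )
```

In practice the weights might reflect each module's known accuracy (UWB typically more precise than Wi-Fi), but that weighting is an assumption here.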
  • the third electronic device 300 can also detect at least one of the user's physiological characteristics, identity category, body posture and other information, and upload it to the central device 400 in a wired or wireless manner.
  • a second electronic device 200 (for example, an IoT device) is also provided throughout the house.
  • the second electronic device 200 may also be called a controlled device, and the second electronic device 200 may be controlled by the first electronic device 100 .
  • the kitchen is equipped with a rice cooker or electric pressure cooker, gas equipment, etc.
  • the living room is equipped with speakers (such as smart speakers), TVs (such as smart TVs, also called smart screens, large screens, etc.), routing equipment, etc.
  • the balcony is equipped with a clothes drying rack (for example, a smart clothes drying rack);
  • the dining room is equipped with a sweeping robot, etc.;
  • the master bedroom is equipped with a TV (for example, smart TV), speakers (for example, smart speakers), floor lamps (for example, smart floor lamps), routing equipment, etc.;
  • the second bedroom is equipped with a desk lamp (for example, a smart desk lamp), a speaker (for example, a smart speaker), etc.;
  • the bathroom is equipped with a body fat scale, etc.
  • the second electronic device 200 includes but is not limited to smart home devices such as smart TVs, smart speakers, smart lamps (such as ceiling lamps, smart desk lamps, aromatherapy lamps, etc.), sweeping robots, body fat scales, smart clothes drying racks, smart rice cookers, air purifiers, humidifiers, desktop computers, routing equipment, smart sockets, water dispensers, smart refrigerators, smart air conditioners, smart switches, and smart door locks.
  • the second electronic device 200 may also not be a smart home device but a portable device, such as a personal computer (PC), a tablet, a mobile phone, or a smart remote control. The embodiments of the present application do not limit the specific form of the second electronic device 200.
  • the system may also include a central device 400.
  • the central device 400 is also called a hub, a central control system, a host, etc.
  • the hub device 400 may be used to receive information (such as positioning information) sent by the third electronic device 300 .
  • the hub device 400 can determine the space where the user and/or the first electronic device 100 is located based on the positioning information and the house layout information, and determine the specific location (such as coordinates) of the user and/or the first electronic device 100 in that space.
  • the central device 400 also notifies or controls the second electronic device 200 based on the received information (including but not limited to positioning information).
  • the central device 400 when a user wakes up a smart speaker through voice, the central device 400 notifies or controls one or more smart speakers closest to the user to wake up based on the locations of multiple smart speakers in the whole house. For example, when the user moves from one room to another in the house, the central device 400 controls the smart speakers in the room where the user left to stop playing audio, and controls the smart speakers in the room where the user entered to start playing (for example, continuing to play) the audio.
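The audio hand-off behavior described above can be sketched as follows. The function, speaker registry, and playback-state shape are hypothetical, since the patent describes only the behavior, not the hub's implementation.

```python
def handle_room_change(speakers, old_room, new_room, playback_state):
    """When the user moves between rooms, stop playback on speakers in
    the room the user left and continue it on speakers in the room the
    user entered.

    `speakers` maps speaker id -> room name; `playback_state` carries
    whatever is needed to continue playback (illustrative shape)."""
    commands = []
    for spk, room in speakers.items():
        if room == old_room:
            commands.append((spk, "stop"))
        elif room == new_room:
            commands.append((spk, "play", playback_state))
    return commands
```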
  • the central device 400 can also be used to construct a whole-house map based on the floor plan of the house, establish a whole-house coordinate system, and convert the location information obtained by each third electronic device 300 into the whole-house coordinate system.
  • the location information of the second electronic device 200, and/or the first electronic device 100, and/or the user detected by each third electronic device 300 can be converted into the whole-house coordinate system, so that the specific location of the second electronic device 200 or the user in the whole house can be determined.
  • the third electronic device establishes a coordinate system as shown in (a) of FIG. 1B (called the first coordinate system).
  • Oe is the origin
  • Xe is the X-axis
  • Ye is the Y-axis
  • Ze is the Z-axis.
  • the central device 400 establishes a whole-house coordinate system as shown in (b) of Figure 1B based on the whole-house floor plan.
  • Oh is the origin
  • Xh is the X axis
  • Yh is the Y axis
  • Zh is the Z axis.
  • the first coordinate system can be converted into the whole-house coordinate system, and the coordinates of points in the first coordinate system can be converted into coordinates in the whole-house coordinate system.
  • the central device 400 can obtain the coordinates of point Ob', the point corresponding to Ob in the whole-house coordinate system, by means of vectors.
  • the distance between two points is the same in different coordinate systems, but the direction of the vector formed by the two points may be represented differently in different coordinate systems. For example, to convert the coordinates of point Ob in the first coordinate system into the coordinates of the corresponding point Ob' in the whole-house coordinate system, vectors can be used.
  • the length of the vector OeOb in the first coordinate system and the length of the vector OhOb' in the whole-house coordinate system are the same (both L), but the direction of the vector OeOb expressed in the first coordinate system differs from the direction of the vector OhOb' expressed in the whole-house coordinate system.
  • from the direction of the vector OeOb in the first coordinate system, the direction of the vector OhOb' in the whole-house coordinate system can be obtained; combined with the coordinates of points Oe and Ob in the first coordinate system and the coordinates of point Oh in the whole-house coordinate system, the coordinates of the point Ob' corresponding to Ob in the whole-house coordinate system can be obtained.
  • the central device 400 can convert the coordinate information obtained by the third electronic device 300 into coordinates in the whole-house coordinate system.
  • the central device 400 can also convert the position information obtained by other electronic devices into the whole-house coordinate system.
  • the conversion method can be referred to the above method, which will not be described again.
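Under the common assumption that the two coordinate systems differ by a rotation about the vertical axis plus a translation (the patent describes the conversion only in terms of vectors of equal length but different direction), the conversion is a standard rigid-body transform and can be sketched as:

```python
import math

def to_whole_house(point, origin_in_house, yaw_deg):
    """Convert a point from a third electronic device's local coordinate
    system into the whole-house coordinate system.

    Assumptions (not stated in the patent): the two systems differ only
    by a yaw rotation `yaw_deg` about the vertical axis and a
    translation `origin_in_house`, the local origin's position in
    whole-house coordinates. Distances are preserved, as the patent
    notes; only the direction representation changes."""
    x, y, z = point
    th = math.radians(yaw_deg)
    # Rotate about the Z axis, then translate by the local origin.
    xr = x * math.cos(th) - y * math.sin(th)
    yr = x * math.sin(th) + y * math.cos(th)
    ox, oy, oz = origin_in_house
    return (xr + ox, yr + oy, z + oz)
```

For example, a point 1 m along the local X axis of a device whose origin sits at (10, 5, 0) in house coordinates, rotated 90 degrees, lands 1 m along the house Y axis from that origin.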
  • the third electronic device 300 communicates with the second electronic device 200 through wired or wireless means.
  • the second electronic device 200 and the third electronic device 300 can be connected to the central device 400 through wired (such as power line communication (PLC)) and/or wireless (such as Wi-Fi, Bluetooth, etc.) means.
  • it can be understood that the ways in which the second electronic device 200 and the third electronic device 300 are connected to the central device 400 may be the same or different.
  • both the second electronic device 200 and the third electronic device 300 are connected to the central device 400 through wireless means.
  • the second electronic device 200 is connected to the central device 400 in a wireless manner
  • the third electronic device 300 is connected to the central device 400 in a wired manner.
  • the smart speakers, smart TVs, body fat scales, sweeping robots, and other devices among the second electronic devices 200 are connected to the central device 400 through wireless means (such as Wi-Fi).
  • devices such as smart door locks are connected to the central device 400 through wired means (such as PLC).
  • the central device in each room or each area, or the central device for the whole house, can exist alone, or can be integrated with the third electronic device or the first electronic device into one device, or can be integrated with both the third electronic device and the first electronic device into one device. This application does not limit this.
  • the system also includes a routing device (such as a router).
  • the routing device is used to connect to a local area network or the Internet, and uses a specific protocol to select and set the path for sending signals.
  • one or more routers are deployed throughout the house to form a local area network, or to access the local area network or the Internet.
  • the second electronic device 200 and/or the third electronic device 300 are connected to the router, and perform data transmission with devices in the local area network or devices in the Internet through the Wi-Fi channel established by the router.
  • the hub device 400 can be integrated with the routing device into one device.
  • for example, the hub device 400 is integrated into the routing device; that is, the routing device has the functions of the hub device 400.
  • the routing device can be one or more routing devices in a parent-child routing system, or it can be an independent routing device.
  • the above content is only an example of a system to which the device control method is applicable.
  • the system may also include more or less devices, or different device layout locations, etc.
  • FIG. 2 shows a schematic structural diagram of a first electronic device 100.
  • the first electronic device 100 may include a processor 310, a memory 320, a universal serial bus (USB) interface 330, a power module 340, a UWB module 350, a wireless communication module 360, etc.
  • the first electronic device 100 may also include an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, a headphone interface 370D, a display screen 380, and the like.
  • the first electronic device 100 may also include a sensor module 390 and the like.
  • the structure illustrated in FIG. 2 does not constitute a specific limitation on the first electronic device 100 .
  • the first electronic device 100 may include more or less components than shown in the figures, or combine some components, or split some components, or arrange different components.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the interface connection relationship between the modules illustrated in FIG. 2 is only a schematic illustration and does not constitute a structural limitation of the first electronic device 100 .
  • the first electronic device 100 may also adopt an interface connection method different from that shown in FIG. 2 , or a combination of multiple interface connection methods.
  • the processor 310 may include one or more processing units, and different processing units may be independent devices or integrated into one or more processors.
  • the processor 310 is a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), or one or more integrated circuits configured to implement the embodiments of the present application.
  • Memory 320 may be used to store computer executable program code, which includes instructions.
  • the memory 320 may also store data processed by the processor 310.
  • the memory 320 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
  • the processor 310 executes various functional applications and data processing of the first electronic device 100 by executing instructions stored in the memory 320 and/or instructions stored in the memory provided in the processor.
  • the wireless communication module 360 can provide solutions for wireless communication applied on the first electronic device 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared technology (infrared, IR), and ZigBee.
  • Wireless communication module 360 may be one or more devices including at least one communication processing module.
  • the wireless communication module 360 receives electromagnetic waves through the antenna, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 310 .
  • the wireless communication module 360 can also receive the signal to be sent from the processor 310, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna for radiation.
  • the number of antennas of the wireless communication module 360, the UWB module 350 and the millimeter wave radar module 160 in Figure 2 is only an exemplary illustration. It can be understood that the wireless communication module 360, the UWB module 350 and the millimeter wave radar module 160 may include more or fewer antennas, which is not limited in the embodiment of the present application.
  • the UWB module 350 may provide a wireless communication solution based on UWB technology applied on the first electronic device 100 .
  • the time of flight of the UWB signal in the air can be calculated by detecting the UWB signal and combining it with certain positioning algorithms. Multiplying this duration by the propagation speed of the UWB signal in the air (such as the speed of light) yields the distance between the first electronic device 100 and the second electronic device 200 (such as an IoT device).
  • the first electronic device 100 can also determine the direction of the second electronic device 200 relative to the first electronic device 100 based on the phase difference between the UWB signals sent by the second electronic device 200 when they reach different antennas of the first electronic device 100 (that is, the direction from which the UWB signal comes).
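The time-of-flight ranging and phase-difference direction finding described above can be sketched as follows. This is a minimal illustration assuming single-sided two-way ranging and a two-antenna array; the function names and parameters are hypothetical and not part of this application:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s, propagation speed of the UWB signal in air


def tof_distance(round_trip_s: float, reply_delay_s: float) -> float:
    """Single-sided two-way ranging: the one-way time of flight is half of the
    round-trip time minus the responder's fixed reply delay; distance is
    time of flight multiplied by the propagation speed."""
    tof = (round_trip_s - reply_delay_s) / 2.0
    return tof * SPEED_OF_LIGHT


def aoa_from_phase(phase_diff_rad: float, antenna_spacing_m: float, wavelength_m: float) -> float:
    """Estimate the angle of arrival (radians from broadside) from the phase
    difference of the same UWB signal measured at two antennas spaced
    antenna_spacing_m apart: phase_diff = 2*pi*d*sin(theta)/lambda."""
    s = phase_diff_rad * wavelength_m / (2.0 * math.pi * antenna_spacing_m)
    return math.asin(max(-1.0, min(1.0, s)))  # clamp against measurement noise
```

For example, a round trip of `2 * 10 m / c` with zero reply delay yields a distance of 10 m, and a phase difference of pi/2 at half-wavelength spacing corresponds to an arrival angle of 30 degrees.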
  • the USB interface 330 is an interface that complies with the USB standard specifications. Specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 330 can be used to connect a charger to charge the first electronic device 100, and can also be used to transmit data between the first electronic device 100 and peripheral devices.
  • the power module 340 is used to power various components of the first electronic device 100, such as the processor 310, the memory 320, etc.
  • the first electronic device 100 can implement audio functions through the audio module 370, the speaker 370A, the receiver 370B, the microphone 370C, the headphone interface 370D, and the application processor. Such as audio playback, recording, etc.
  • the audio module 370 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 370 may also be used to encode and decode audio signals. In some embodiments, the audio module 370 may be provided in the processor 310 , or some functional modules of the audio module 370 may be provided in the processor 310 .
  • The speaker 370A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the user can listen to audio through the speaker 370A of the first electronic device 100.
  • The receiver 370B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • The microphone 370C, also called a "mic", is used to convert sound signals into electrical signals. The user can input a sound signal into the microphone 370C by speaking close to it.
  • the headphone interface 370D is used to connect wired headphones.
  • the headphone interface 370D can be a USB interface 330, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the display screen 380 is used to display images, videos, etc.
  • Display 380 includes a display panel.
  • the sensor module 390 includes an inertial measurement unit (inertial measurement unit, IMU) module and the like.
  • IMU modules can include gyroscopes, accelerometers, etc.
  • a gyroscope and an accelerometer may be used to determine the movement posture of the first electronic device 100 .
  • the first electronic device 100 further includes a filter (such as a Kalman filter).
  • the output of the IMU module and the output of the UWB module 350 can be superimposed, and the superimposed signal can be input to a Kalman filter for filtering, thereby reducing errors.
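The fusion of the IMU output and the UWB output through a Kalman filter can be illustrated with a minimal one-dimensional sketch. The class name, noise parameters, and scalar state are hypothetical simplifications; a real implementation would use a multi-dimensional state with tuned covariance matrices:

```python
class Kalman1D:
    """Minimal 1-D Kalman filter: the IMU output drives the prediction step
    and the UWB position measurement drives the correction step, reducing
    the error of either sensor used alone."""

    def __init__(self, x0: float, p0: float, q: float, r: float):
        self.x, self.p = x0, p0   # state estimate and its variance
        self.q, self.r = q, r     # process and measurement noise variances

    def predict(self, imu_velocity: float, dt: float) -> None:
        # Propagate the position using the IMU-derived velocity.
        self.x += imu_velocity * dt
        self.p += self.q

    def update(self, uwb_position: float) -> float:
        # Blend in the UWB measurement, weighted by the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (uwb_position - self.x)
        self.p *= (1.0 - k)
        return self.x
```

After one predict/update cycle the estimate lies between the IMU-propagated position and the UWB measurement, weighted by their respective uncertainties.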
  • the structures of the second electronic device 200, the third electronic device 300, and the central device 400 can refer to the structure of the device shown in Figure 2.
  • the device has more or fewer components than that shown in Figure 2, or some components are combined, some components are separated, or some components are arranged differently.
  • the software system of the electronic device can adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present invention takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device.
  • FIG. 3 is a software structure block diagram of the first electronic device 100 according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the Android system is divided into four layers, from top to bottom: application layer, application framework layer, Android runtime and system libraries, and kernel layer.
  • the application layer can include a series of applications.
  • applications can include camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and other applications.
  • the application program also includes smart home management applications and basic services.
  • basic services open the management capabilities of smart devices to the system.
  • the smart home management application can call the basic service to query the smart home device to be controlled, and/or call the basic service to control the smart home device.
  • the smart home management application can be a smart life application.
  • Smart home applications can also be other applications with similar functions.
  • the smart home management application may be an original system application or a third-party application.
  • the embodiments of this application do not limit the types of smart home management applications.
  • smart life applications are mainly used as examples.
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include a window manager, content providers, a view system, a phone manager, a resource manager, a notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make this data accessible to applications. The data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, etc.
  • a view system can be used to build applications.
  • the display interface can be composed of one or more views. For example, a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the first electronic device 100 . For example, call status management (including connected, hung up, etc.).
  • the resource manager provides various resources to applications, such as localized strings, icons, pictures, layout files, video files, etc.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of charts or scroll-bar text, such as notifications of applications running in the background, or display notifications on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is played, the terminal vibrates, or the indicator light blinks.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library contains two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and application framework layer run in virtual machines.
  • the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules. For example: surface manager (surface manager), media libraries (Media Libraries), 3D graphics processing libraries (for example: OpenGL ES), 2D graphics engines (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • the software architecture shown in Figure 3 is only a software architecture applicable to electronic devices, and the embodiments of the present application do not limit the software architecture of electronic devices.
  • some functional modules may be located at a different level than shown in Figure 3.
  • the basic service can also be provided in the framework layer, and the embodiments of the present application are not limited to this.
  • the mobile phone can obtain user information that has been authorized by the user in real time, and based on the user information, present the corresponding device control interface to the user to meet the user's current device control needs.
  • user information includes but is not limited to the following information: location information, time information, and behavior information.
  • Case 1 The user information is location information.
  • the mobile phone obtains the user's location information and presents the corresponding device control interface to the user based on the user's location information.
  • the user Jack returns home (Jack's home) and opens the smart life application.
  • the mobile phone detects that the user is currently at Jack's home, and displays the device control interface 401 corresponding to "Jack's home" in the smart life application, as shown in Figure 4.
  • Figure 6A shows the flow of a device control method according to an embodiment of the present application.
  • the method includes:
  • the first electronic device 100 establishes a connection with the routing device.
  • the first electronic device 100 may be a mobile phone or other device used to control smart home and other devices.
  • the first electronic device 100 can control the smart home device in different ways.
  • the first electronic device 100 controls the smart home device through a smart life application as an example.
  • routing device (such as a router) can be integrated with the hub device 400 or can be an independent device.
  • routing device and the hub device 400 are independent devices as an example.
  • the routing device reports the network information of the first electronic device 100 to the hub device 400.
  • After the routing device establishes a connection with the first electronic device 100 (such as a mobile phone), it can report the network information of the first electronic device 100 to the central device 400.
  • Network information includes but is not limited to the name, logo, etc. of the network.
  • the routing device reports to the hub device 400 the name of the Wi-Fi network to which the first electronic device 100 is connected.
  • the central device 400 determines the information of the home where the first electronic device 100 is located based on the network information of the first electronic device 100.
  • the routing device reports the name of the Wi-Fi network to which the first electronic device 100 is connected to the hub device 400. Since the Wi-Fi network names corresponding to different homes are usually different, the hub device 400 can determine, based on this, in which home the first electronic device 100 is currently located.
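The mapping from the reported Wi-Fi network name to a home can be sketched as a simple lookup. The network names, the table, and the `home_for_network` helper are hypothetical illustrations, assuming the hub device 400 maintains such an association:

```python
from typing import Optional

# Hypothetical mapping maintained by the hub device: Wi-Fi network name -> home.
WIFI_TO_HOME = {
    "Jack_Home_WiFi": "Jack's home",
    "Jack_Office_WiFi": "Jack's office",
}


def home_for_network(wifi_name: str) -> Optional[str]:
    """Return the home associated with the Wi-Fi network the first electronic
    device joined, or None when the network is unknown to the hub."""
    return WIFI_TO_HOME.get(wifi_name)
```

Because each home's Wi-Fi name is usually unique, a single lookup resolves the current home; an unknown name simply yields no match.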
  • the third electronic device 300 measures the distance from the first electronic device 100.
  • the third electronic device 300 may include a sensor, which is responsible for collecting the user's spatial location information.
  • the third electronic device 300 may include communication modules such as UWB and Wi-Fi, and implement the positioning function for the first electronic device 100 through these communication modules. In this embodiment, positioning achieved through the UWB module is used as an example for explanation.
  • each third electronic device 300 sends a UWB signal to the first electronic device 100 (such as a mobile phone) through its UWB module, and waits to receive the UWB signal fed back by the first electronic device 100. Based on the fed-back UWB signal, the third electronic device 300 can detect the time of flight of the UWB signal, and calculate the distance r from itself to the first electronic device 100 based on the time of flight.
  • the third electronic device 300 reports the distance between the third electronic device 300 and the first electronic device 100 to the central device 400.
  • each third electronic device 300 shown in FIG. 7 reports its distance r from the first electronic device 100 to the central device 400.
  • the central device 400 determines information about the space where the first electronic device 100 is located based on the distance information between the third electronic device 300 and the first electronic device 100 and the floor plan.
  • the central device 400 has a coordinate system as shown in Figure 7, in which the coordinate origin is O.
  • Based on the distance r3 between the third electronic device 300 in the upper left corner and the first electronic device 100, the central device 400 determines a sphere centered on that third electronic device 300 with r3 as the radius; based on the distance r1 between another third electronic device 300 and the first electronic device 100, it determines a sphere centered on that third electronic device 300 with r1 as the radius; and based on the distance r2 between the third electronic device 300 in the middle and the first electronic device 100, it determines a sphere centered on that third electronic device 300 with r2 as the radius.
  • the central device 400 takes the intersection point A of the three spheres as the position of the first electronic device 100 .
  • the location of the first electronic device 100 may be considered the location of the user.
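The intersection of the three spheres can be computed, for the planar case, by linearizing the circle equations and solving the resulting 2x2 system. This is a sketch under the assumption that the third electronic devices 300 and the first electronic device 100 lie roughly in one plane; the `trilaterate` helper is a hypothetical illustration:

```python
def trilaterate(anchors, dists):
    """Planar trilateration: intersect three circles centered on the third
    electronic devices 300 with the measured distances r1, r2, r3 as radii.
    Subtracting the circle equations pairwise cancels the quadratic terms
    and leaves a 2x2 linear system in (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = dists
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        raise ValueError("anchors are collinear; position is ambiguous")
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

With anchors at (0, 0), (4, 0) and (0, 4) and distances measured from the point (1, 1), the solver recovers (1, 1), corresponding to intersection point A in Figure 7.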
  • the first electronic device 100 detects the user's operation of opening the smart life application.
  • When the user opens the smart life application, the user may open a smart life application running in the background, or a smart life application running in the foreground.
  • the user may open the smart life application through the operation interface. For example, the user clicks on the icon of the smart life application on the desktop to trigger the first electronic device 100 to open the smart life application. Or, users can open smart life applications through voice commands.
  • the first electronic device 100 sends a location query request to the central device 400. This location query request is used to query the user's location.
  • the user's location includes multiple dimensions.
  • the user's location may include the home where the user is located, and/or the space where the user is located in the home.
  • the first electronic device 100 calls the basic service through the smart life application, and the basic service calls the communication module through the corresponding driver to send a location query request to the central device 400 .
  • step S108 is an optional step.
  • the central device 400 can actively send the location information to the first electronic device 100 periodically or according to other strategies.
  • the first electronic device 100 can store the location information of the home and space where the user is located, and can update the location information of the home and space where the user is located.
  • S110 may be executed.
  • the central device 400 feeds back the user's location information to the first electronic device 100.
  • the central device 400 feeds back to the first electronic device 100 information about the home where the user is located and/or information about the space where the user is located in the home.
  • the first electronic device 100 displays the first interface of the smart life application according to the user's location information.
  • the first interface is a control interface of a second electronic device (such as an IoT device) corresponding to the user's current home.
  • the user Jack returns home (Jack's home) and opens the smart life application.
  • the mobile phone detects that the user is currently at Jack's home, and automatically displays the device control interface 401 (first interface) corresponding to "Jack's Home" in the smart life application, as shown in Figure 4.
  • the user can control the devices in his current home through the device control interface 401.
  • With such a device control method, users are not required to perform multiple operations in the interface, which can reduce the complexity of user operations and improve device management efficiency.
  • users can control smart home devices at a "spatial" granularity. For example, assume that the last time the user used the smart life application, he switched to the device control interface under the "Space” tab. In response to the user's operation of opening the smart life application this time, as shown in (a) of Figure 8 , the mobile phone displays the device control interface 701 under the "Space” label 703 based on the information of the user's current space (located in the master bedroom).
  • the device control interface 701 includes a device control card 702 corresponding to the master bedroom. In this way, the user can control the smart home devices in the master bedroom through the device control card 702 corresponding to the master bedroom.
  • the card 702 may include equipment in the master bedroom, such as lights, curtains, etc.
  • the card 702 may also include tasks corresponding to the equipment in the master bedroom. For example, temperature control tasks, purification tasks, etc.
  • the cards include as many devices as possible in the master bedroom and their corresponding tasks. In this way, users can preview most of the devices and corresponding tasks in the master bedroom through cards, so that they can select the device they want to control and the tasks they need to perform from these devices and tasks.
  • the device control interface 701 may also include a button 706.
  • the user can click the button 706 to display space cards for the entire house, for example, display cards corresponding to the bathroom, second bedroom, balcony and other spaces.
  • the mobile phone obtains that the user is in the bathroom. Then, when the user opens the smart life application in the bathroom, the mobile phone can display the device control interface 704 as shown in (b) of Figure 8 , the device control interface 704 includes a card 705 corresponding to the bathroom.
  • the card 705 includes as many devices in the bathroom and tasks corresponding to each device as possible, so that the user can operate and control the smart home devices in the bathroom.
  • the equipment control interface 701 includes an equipment control card 702 corresponding to the master bedroom. The user can control the smart home devices in the master bedroom through the device control card 702 corresponding to the master bedroom.
  • the mobile phone in the scenario where the user controls the smart home device through the smart life application of the mobile phone, the mobile phone can display the device control interface of the device in the space to the user according to the user's spatial information, so that the user can You can easily control the smart home devices in the space through the device control interface of the devices in the space.
  • the device control interface under the "Space” label only includes cards for the space where the user is currently located.
  • Alternatively, the device control interface under the "Space" label may include the card for the space where the user is currently located, as well as cards for other spaces.
  • the mobile phone can display the card in the space where the user is currently located in a specific way to remind the user to pay attention to the card.
  • the mobile phone can display the card corresponding to the current space at the top of the device control interface, and/or the mobile phone can also display the card with a specific user interface (UI) effect.
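The card ordering and highlighting described above can be sketched as follows. The `order_cards` helper and the card records are hypothetical illustrations, assuming each card carries the name of its space:

```python
def order_cards(cards, current_space):
    """Hypothetical sketch: move the card for the space the user is currently
    in to the top of the device control interface, and flag it for a special
    UI effect (such as border flashing or a color change)."""
    # False sorts before True, so the current space's card comes first;
    # the sort is stable, preserving the original order of the other cards.
    ordered = sorted(cards, key=lambda c: c["space"] != current_space)
    for card in ordered:
        card["highlight"] = card["space"] == current_space
    return ordered
```

For example, when the user is in the master bedroom, the master bedroom card is moved ahead of the bathroom card and marked for highlighting.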
  • the mobile phone can display the device control interface 801 as shown in (a) of Figure 9 .
  • In the interface 801, the card 802 for the master bedroom is displayed at the top of the interface 801, so that the user notices the card and can control the smart home devices in the master bedroom through operations on the card.
  • the mobile phone obtains that the user is in the bathroom. Then, when the user opens the smart life application in the bathroom, the mobile phone can display the device control interface 803 as shown in (b) of Figure 9. In one example, in the interface 803, a bathroom-specific card 804 is displayed at the top of the interface 803 to facilitate the user to operate and control the smart home devices in the bathroom.
  • the mobile phone can display the device control interface 901 as shown in (a) of Figure 10 .
  • the cards 902 for the master bedroom are displayed with specific UI effects (such as border flashing, color changes, etc.) to facilitate the user to operate and control the smart home devices in the master bedroom.
  • the mobile phone obtains that the user is in the bathroom. Then, when the user opens the smart life application in the bathroom, the mobile phone can display the device control interface 903 as shown in (b) of Figure 10. In one example, in the interface 903, the bathroom card 904 is displayed with a specific UI effect.
  • FIG. 6B shows another process of the device control method according to the embodiment of the present application. The method includes the following steps:
  • the first electronic device 100 establishes a connection with the routing device.
  • For steps S201-S206, please refer to the relevant description of the embodiment corresponding to FIG. 6A.
  • the routing device reports the network information of the first electronic device 100 to the hub device 400.
  • the central device 400 determines the information of the home where the first electronic device 100 is located based on the network information of the first electronic device 100.
  • the third electronic device 300 measures the distance from the first electronic device 100.
  • the third electronic device 300 reports the distance between the third electronic device 300 and the first electronic device 100 to the central device 400.
  • the first electronic device 100 detects the user's operation of opening the smart life application.
  • the first electronic device 100 sends a location query request to the central device 400.
  • This location query request is used to query the user's location information.
  • the central device 400 feeds back to the first electronic device 100 the information about the user's home, the distance information between the first electronic device 100 and the third electronic device 300, and the information about the floor plan.
  • the central device 400 feeds back to the first electronic device 100 that the user's current home is "Jack's home", the distances r1, r2, and r3 between the first electronic device 100 and the third electronic devices 300 as shown in Figure 7, and the floor plan information.
  • the hub device 400 can send together the information about the home where the user is located and the information used to determine the space where the user is located (such as the above distances r1, r2, r3, and the floor plan information).
  • the hub device 400 may also receive different requests from the first electronic device 100 and feed back different information based on the different requests.
  • the hub device 400 receives a home query request from the first electronic device 100, and feeds back information about the user's home to the first electronic device 100 based on the home query request.
  • the hub device 400 receives a space query request from the first electronic device 100, and, based on the space query request, feeds back to the first electronic device 100 the information used to determine the space where the user is located (such as the above-mentioned distances r1, r2, r3, and the floor plan information). Alternatively, the central device 400 actively sends to the first electronic device 100 the information about the home where the user is located, and/or the information used to determine the space where the user is located.
  • the first electronic device 100 determines the information of the space where the first electronic device 100 is located based on the distance information between the third electronic device 300 and the first electronic device 100 and the floor plan.
  • the first electronic device 100 determines the intersection point A of the three spheres as the location of the first electronic device 100 based on r1, r2, r3 as shown in FIG. 7 and the floor plan.
  • the first electronic device 100 displays the first interface of the smart life application based on the information of the user's home and the information of the space where the user is located.
  • the first interface includes information about the second electronic device (such as an IoT device) in the user's space.
  • the first interface 701 is the equipment control interface corresponding to the master bedroom of Jack's house.
  • the equipment control interface includes a card 702 corresponding to the master bedroom, which makes it convenient for the user to control the smart home devices in the current master bedroom through the card 702. That is to say, the user does not need to search and select among the cards 702 of multiple spaces, but can easily and intuitively obtain the device control card corresponding to the current space, which can simplify the user's operation.
  • Embodiments of the present application also provide a device control method.
  • the first electronic device 100 can automatically and intelligently recommend smart home devices to be controlled to the user.
  • the user clicks on the card 902 corresponding to the master bedroom, and the mobile phone can jump to the device details interface 1101 corresponding to the master bedroom as shown in Figure 11.
  • the equipment details interface 1101 includes all equipment in the master bedroom.
  • the mobile phone can also display a pop-up window 1102 in the interface 1101.
  • the pop-up window 1102 is used to recommend smart home devices to be controlled to the user. For example, the mobile phone recommends the speakers and TVs closest to the user through the pop-up window 1102.
  • the smart device is recommended to the user in the form of a pop-up window 1102.
  • the pop-up window 1102 may be displayed in the interface 1101 as shown in Figure 11, or the pop-up window 1102 may be displayed in any other possible interface.
  • the mobile phone can display the pop-up window 1102 in the interface 401 shown in Figure 4.
  • Figure 12 shows the method flow corresponding to the device control scenario shown in Figure 11. As shown in Figure 12, the method includes:
  • the first electronic device 100 detects the user's operation of opening the smart life application.
  • the first electronic device 100 sends a space query request to the central device 400.
  • the central device 400 feeds back to the first electronic device 100 the information about the space where the first electronic device 100 is located and the information about the space where the second electronic device 200 is located.
  • the second electronic device 200 may be a smart home device to be controlled.
  • the solution corresponding to Figure 12 mainly takes the first electronic device 100 querying the central device 400 for the space where the first electronic device 100 is located as an example.
  • the first electronic device 100 can also determine on its own the space where the first electronic device 100 is located, and/or the first electronic device 100 can determine the space where the second electronic device 200 is located.
  • the first electronic device 100 sends the UWB signal to the second electronic device 200.
  • the first electronic device 100 can measure the distance between the first electronic device 100 and the second electronic device 200 in various ways.
  • the first electronic device 100 uses the UWB module to send and receive UWB signals to implement ranging and positioning as an example.
  • the second electronic device 200 sends the UWB signal to the first electronic device 100.
  • the second electronic device 200 needs to feed back the UWB signal to the first electronic device 100, so that the first electronic device 100 determines, through the fed-back UWB signal, the distance between the second electronic device 200 and the first electronic device 100.
  • the first electronic device 100 determines the distance between the second electronic device 200 and the first electronic device 100 according to the UWB signal.
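As an illustrative sketch of how a distance can be derived from UWB signal timing, the following assumes simple single-sided two-way ranging; the function name and the timing values are hypothetical, and the patent does not mandate this exact scheme:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def uwb_twr_distance(t_round, t_reply):
    """Single-sided two-way ranging: the initiator measures the total
    round-trip time t_round (seconds); the responder reports its
    processing delay t_reply. The one-way time of flight is half the
    difference, and distance = speed of light * time of flight."""
    time_of_flight = (t_round - t_reply) / 2.0
    return SPEED_OF_LIGHT * time_of_flight
```

For example, a measured round trip of 1.02 µs against a 1.00 µs reply delay corresponds to a one-way flight time of 10 ns, i.e. roughly 3 m.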
  • the first electronic device 100 displays the interface of the smart life application based on the distance between the second electronic device 200 and the first electronic device 100, the information of the space where the first electronic device 100 is located, and the information of the space where the second electronic device 200 is located.
  • the first electronic device 100, based on the distance between the second electronic device 200 and the first electronic device 100, the information of the space where the first electronic device 100 is located, and the information of the space where the second electronic device 200 is located, identifies the N (N is a positive integer) second electronic devices (such as IoT devices) closest to the user's position in the space where the user is currently located. For example, as shown in (a) of FIG. 10, the user is currently in the master bedroom, and the first electronic device 100 recognizes that the two second electronic devices currently closest to the user are a speaker and a television. Then, as shown in FIG. 11, the first electronic device 100 can display the interface 1101 of the smart life application.
  • the interface 1101 includes a pop-up window 1102 to recommend to the user the speakers and TVs closest to the user in the master bedroom.
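The recommendation step above amounts to filtering devices by the user's current space and sorting by distance. A minimal sketch follows; the tuple layout and names are assumptions for illustration, not the patent's data model:

```python
def recommend_nearest(devices, user_space, n=2):
    """Return the names of the n IoT devices closest to the user among
    the devices located in the space the user currently occupies.

    `devices` is a list of (name, space, distance_to_user_m) tuples."""
    in_space = [d for d in devices if d[1] == user_space]
    in_space.sort(key=lambda d: d[2])  # nearest first
    return [name for name, _, _ in in_space[:n]]
```

With the master-bedroom example above, the two nearest devices returned would be the speaker and the television.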
  • the above mainly takes the first electronic device 100 as an example to recommend the device to be controlled to the user through an independent pop-up window.
  • the first electronic device 100 can also display the device to be controlled with a specific UI effect in the interface, so that the user Pay more attention to the equipment to be controlled.
  • the first electronic device 100 displays the cards corresponding to the speaker and the TV with a specific UI effect, so that the user can more quickly notice the cards of the speaker and the TV, and control the speaker and the TV through the corresponding cards.
  • the above description mainly takes the example of the first electronic device 100 recommending the smart device closest to the user in the "Master Bedroom” interface (such as the interface 1103 shown in Figure 13).
  • in addition to recommending the smart device closest to the user in this way, the first electronic device 100 can also recommend smart devices to users in other forms through other interfaces, and the embodiments of the present application do not limit this.
  • User information is user behavior information.
  • the first electronic device 100 can obtain user behavior information, and if a preset type of user behavior is detected, smart home devices related to the user behavior can be recommended to the user.
  • the first electronic device 100 collects statistics on user behavior information that has been authorized by the user, and learns that when playing games, the user usually operates through a large-screen device such as a television. Then, as shown in (a) of Figure 14, when it is detected that the user opens the game application, the first electronic device 100 can pop up the pop-up window 1402 shown in (b) of Figure 14 while loading the game interface 1401, to suggest to the user large-screen devices that offer a better gaming experience. As shown in (b) of Figure 14, when it is detected that the user clicks the "Yes" option, the first electronic device 100 controls the television to display the game interface, and the user can perform game operations through the television.
  • the first electronic device 100 may recommend to the user one device related to the user's behavior, or may recommend to the user multiple devices related to the user's behavior. For example, if the first electronic device 100 (mobile phone) detects that a TV or a laptop is currently connected, then after detecting that the user opens the game application, the mobile phone can recommend the TV or laptop to the user in the form of a pop-up window. If the user selects a laptop, the mobile phone controls the laptop to display the game-related interface, and the user can play the game through the laptop.
  • the pop-up window 1402 shown in (b) of Figure 14 can also be replaced by controls such as a floating window.
  • the first electronic device 100 stops displaying the floating window after the duration of displaying the floating window reaches a preset duration.
  • in addition to recommending/prompting the user with devices associated with the user's behavior through the interface, the first electronic device 100 can also provide prompts by voice or other methods. The embodiments of this application do not limit the specific device recommendation method.
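One way to realize the behavior-based recommendation described above is to count which devices the user controls after each behavior and surface the most frequent ones. A minimal sketch (the class and method names are hypothetical):

```python
from collections import Counter

class BehaviorRecommender:
    """Learn, from user-authorized statistics, which devices the user
    tends to control after a given behavior (e.g. opening a game app),
    and recommend the most frequent ones."""

    def __init__(self):
        self._stats = {}  # behavior -> Counter of controlled devices

    def record(self, behavior, device):
        """Record one observed 'behavior followed by device control' event."""
        self._stats.setdefault(behavior, Counter())[device] += 1

    def recommend(self, behavior, top=1):
        """Return up to `top` devices most often controlled after `behavior`."""
        counter = self._stats.get(behavior)
        if not counter:
            return []
        return [device for device, _ in counter.most_common(top)]
```

In the game example, three recorded TV casts against one laptop cast would make the television the first recommendation, with the laptop offered as an alternative.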
  • User information is time information.
  • the first electronic device 100 can obtain time information, and if the current time is within the target time period, it can recommend the target device associated with the target time period to the user.
  • the first electronic device 100 can count the devices controlled by the user at various time points (or multiple time periods), and can report the statistical results to the central device 400, and the central device 400 can perform data analysis based on the statistical results.
  • the central device 400 can perform data analysis based on the statistical results.
  • the first electronic device 100 ranks frequently operated devices first in the displayed interface based on the data analysis results.
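The time-based statistics could, for example, bucket control events into coarse time periods and rank devices by frequency within the period containing the current time. The bucketing below is purely illustrative, not the embodiment's policy:

```python
from collections import Counter

def rank_devices_by_period(events, hour):
    """Rank devices by how often the user controlled them during the
    time period that contains `hour`, most used first.

    `events` is a list of (hour_of_day, device) records, e.g. as
    reported to the hub device for data analysis."""
    def period(h):
        if 6 <= h < 12:
            return "morning"
        if 12 <= h < 18:
            return "afternoon"
        return "evening"

    target = period(hour)
    counts = Counter(device for h, device in events if period(h) == target)
    return [device for device, _ in counts.most_common()]
```

The first electronic device can then display the head of this ranking in front when rendering the device control interface.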
  • the first electronic device 100 may determine the device to be controlled associated with the one or more pieces of information based on the user information, and prompt/recommend the device to be controlled to the user.
  • Embodiments of the present application also provide a device control method.
  • the first electronic device 100 can obtain user information and recommend intelligent scenarios (which may also be called execution scenarios) to the user based on the user information.
  • a smart life application is installed in the mobile phone
  • the mobile phone starts the smart life application, and displays the scene interface 1501 as shown in (a) of Figure 15 .
  • the interface 1503 shown in (b) of Figure 15 is displayed.
  • the mobile phone displays the interface 1507 as shown in (c) of Figure 15.
  • when the mobile phone detects the user's click on the smart device control 1508, it determines that the user needs to add a task to control a smart device, and can display controllable smart devices for the user to select. Assuming that the smart device selected by the user is an air conditioner and the selected execution task is "off", then, as shown by reference numeral 1512 in the interface 1509 shown in (d) of Figure 15, the mobile phone can display the execution task "off" corresponding to the air conditioner selected by the user.
  • the mobile phone can set a trigger condition according to the user's operation.
  • the set trigger condition is "8 o'clock in the morning".
  • the interface 1503 may also include a card 1504 of recommended scenes.
  • the recommended scenario is determined based on one or more pieces of information in the user information. For example, through statistics of user information authorized by the user, the mobile phone determines that when the user opens a game application on the mobile phone at home, the game will usually be projected to the TV for operation. Then, when the user adds a smart scene, the mobile phone can recommend the smart scene "when opening the game application on the mobile phone at home, turn on the TV" to the user. For example, the smart scene is recommended to the user through the card 1504 in the scene creation interface 1503. When detecting that the user clicks the "Confirm Add" option in card 1504, the mobile phone adds the smart scene to the smart scene to be executed.
  • the mobile phone can display an interface 1513 as shown in (e) of Figure 15 .
  • the interface 1513 includes a card 1514 for the "turn off the air conditioner in the morning" scenario manually created by the user and a card 1515 for the "turn on the TV” scenario recommended by the mobile phone.
  • the first electronic device recommends smart scenes to the user based on user information, and adds the smart scene according to the user's instruction. In this way, the user does not need to manually input the execution tasks and trigger conditions of the smart scene, and can conveniently and quickly add smart scenes to the first electronic device, improving the user's interactive experience during device control.
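A smart scene of this kind pairs a trigger condition with one or more execution tasks. The following sketch shows one possible representation; the dataclass and helper are assumptions for illustration, not the patent's data model:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class SmartScene:
    """An execution scenario: a trigger predicate over the current
    user context plus the device tasks to run when it fires."""
    name: str
    trigger: Callable[[dict], bool]
    tasks: List[str] = field(default_factory=list)

def run_scenes(scenes, context, execute):
    """Execute every added scene whose trigger condition is met,
    returning the names of the scenes that fired."""
    fired = []
    for scene in scenes:
        if scene.trigger(context):
            for task in scene.tasks:
                execute(task)
            fired.append(scene.name)
    return fired
```

For the recommended scene above, the trigger would test "at home and opening the game application on the mobile phone" and the task would be "turn on the TV".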
  • the first electronic device 100 can set device control-related functions.
  • FIG. 16 shows an exemplary setting interface of the first electronic device 100.
  • the setting interface may include a switch 1601; when the switch 1601 is turned on, the first electronic device 100 can perform the above device control method, for example, it can intelligently recommend the nearest device to the user.
  • the first electronic device 100 can also set a time period, usage scenario, etc. for turning on the intelligent recommendation device function, which is not limited in the embodiment of the present application.
  • Figure 17 shows another method flow according to the embodiment of the present application.
  • the method is applied to the first electronic device, and the method includes:
  • the first interface includes information on m second electronic devices associated with the first user information, where m is an integer greater than 1.
  • n is an integer not less than 1 and not greater than m.
  • the second interface includes information on k second electronic devices associated with the second user information; k is an integer greater than 1.
  • the first user information and the second user information include any one or more of the following information: location information, time information, and behavior information.
  • the first user information includes any one or more of the following information: information about the first behavior performed by the user at the first moment, information about the first moment, information about the home where the user is at the first moment, information about the first space where the user is at the first moment, and distance information between the user and the second electronic device at the first moment;
  • the second user information includes any one or more of the following information: information about the second behavior performed by the user at the second moment, information about the second moment, information about the home where the user is at the second moment, information about the second space where the user is at the second moment, and distance information between the user and the second electronic device at the second moment.
  • the first electronic device displays the first interface, and controls the second electronic device to perform the first target operation according to the first operation input by the user on the first interface.
  • the first electronic device can obtain new user information (i.e., second user information), and according to the second user information, automatically and intelligently switch the displayed interface from the first interface to the second interface, so that the user can control the second electronic device associated with the second user information through the second interface.
  • the first electronic device can obtain the latest user information (second user information), and automatically switch to the interface associated with the second user information based on the second user information to satisfy the user's device requirements. Control needs. In this process, the user does not need to perform cumbersome interface switching operations, which reduces the user's operational complexity, thereby shortening the time required to switch interfaces and improving the efficiency of device control.
  • if the first user information includes information about the first home where the user is at the first moment, then the m second electronic devices include electronic devices in the first home; and/or, if the second user information includes information about the second home where the user is at the second moment, then the k second electronic devices include electronic devices in the second home.
  • the first family and the second family are the same or different.
  • if the first user information also includes information about the first space where the user is in the first home at the first moment, then the m second electronic devices include electronic devices in the first space.
  • the first interface does not include information about other electronic devices except the m second electronic devices.
  • if the second user information also includes information about the second space where the user is in the second home at the second moment, then the k second electronic devices include electronic devices in the second space; the second interface does not include information about other electronic devices except the k second electronic devices.
  • the first family and the second family are the same or different.
  • the user is in the master bedroom (first space) at the first moment, and the mobile phone can display interface 701 (first interface).
  • the interface 701 includes information of the second electronic devices associated with the first user information (the electronic devices in the master bedroom, the first space). Moreover, the interface 701 does not include information about electronic devices in other spaces except the master bedroom. If an operation (i.e., the first operation) acting on the interface 701 to control turning on the light in the master bedroom (i.e., the first target operation) is detected, the mobile phone controls the light to turn on.
  • the above mainly takes the user inputting the first operation through the first interface as an example.
  • the first operation may also be input through other interfaces.
  • the user can click on a blank area of the card 702 to trigger the mobile phone to jump to the details interface of the card 702 .
  • the user can input an operation (first operation) for controlling the light on in the details interface.
  • the mobile phone can display interface 704 (second interface).
  • the interface 704 includes information of the second electronic devices associated with the second user information (the electronic devices in the bathroom). Moreover, the interface 704 does not include information about electronic devices in other spaces except the bathroom.
  • if the first user information also includes information about the first space where the user is at the first moment, the m second electronic devices include the electronic devices in the first space.
  • the first interface also includes information about other electronic devices except the m second electronic devices; the identification information of the electronic devices in the first space where the user is located at the first moment is highlighted in the first interface with a preset user interface (UI) effect, and/or the identification information of the electronic devices in the first space where the user is at the first moment is ranked in front of other electronic devices on the first interface.
  • if the second user information also includes information about the second space where the user is in the second home at the second moment, the k second electronic devices include the electronic devices in the second space.
  • the second interface also includes information about other electronic devices other than the k second electronic devices; the identification information of the electronic devices in the second space where the user is located at the second moment is highlighted in the second interface with a preset UI effect, and/or the identification information of the electronic devices in the second space where the user is at the second moment is ranked in front of other electronic devices on the second interface.
  • preset UI effects include but are not limited to one or more of the following effects: color effects, animation effects.
  • the user is in the master bedroom (first space) at the first moment, and the mobile phone can display interface 801 (first interface).
  • the interface 801 includes information of the second electronic devices associated with the first user information (the electronic devices in the master bedroom, the first space).
  • the interface 801 also includes information about electronic devices in other spaces (such as dining rooms and living rooms) besides the master bedroom.
  • the card corresponding to the master bedroom is displayed in front of the cards in other spaces, or in other words, the information of the electronic devices in the master bedroom is displayed in front of the electronic devices in other spaces.
  • the mobile phone can display interface 803 (second interface).
  • the interface 803 includes information of the second electronic devices associated with the second user information (the electronic devices in the bathroom).
  • the interface 803 also includes information about electronic devices in other spaces except the bathroom.
  • the card corresponding to the bathroom is displayed in front of the cards in other spaces, or in other words, the information of the electronic devices in the bathroom is displayed in front of the electronic devices in other spaces.
  • the user is in the master bedroom (first space) at the first moment, and the mobile phone can display interface 801 (first interface).
  • the interface 801 includes information of the second electronic devices associated with the first user information (the electronic devices in the master bedroom, the first space).
  • the interface 801 also includes information about electronic devices in other spaces (such as dining rooms and living rooms) besides the master bedroom.
  • the card corresponding to the master bedroom is displayed according to the preset UI effect (such as bold), or in other words, the information of the second electronic device in the master bedroom is displayed according to the preset UI effect.
  • the mobile phone can display interface 803 (second interface).
  • the interface 803 includes information of the second electronic devices associated with the second user information (the electronic devices in the bathroom).
  • the interface 803 also includes information about electronic devices in other spaces except the bathroom.
  • the card corresponding to the bathroom is displayed according to the preset UI effect (such as bold), or in other words, the information of the second electronic device in the bathroom is displayed according to the preset UI effect.
  • if the first user information includes distance information between the user and the second electronic device at the first moment, the m second electronic devices include the m electronic devices that are relatively close to the user at the first moment.
  • the second user information includes distance information between the user and the second electronic device at the second time
  • the k second electronic devices include the k electronic devices that are relatively close to the user at the second moment.
  • the m electronic devices that are relatively close to the user may be m devices that are at equal or different distances from the user.
  • the electronic device that is relatively close to the user may be a device that is in the same space as the user.
  • the mobile phone can display the interface 1101 (first interface), used to recommend TV sets and speakers to users.
  • the mobile phone can automatically switch the displayed interface to an interface including the body fat scale, in order to recommend the body fat scale to the user.
  • the information of the m second electronic devices is displayed in the first interface in the form of a pop-up window, and/or the information of the k second electronic devices is displayed in the second interface in the form of a pop-up window.
  • the information about the TV and speakers is displayed in the interface 1101 in the form of a pop-up window 1102.
  • the preset duration can be set flexibly. For example, still as shown in Figure 11, when 5 seconds have elapsed from the time when the pop-up window 1102 was displayed, the pop-up window 1102 stops being displayed.
  • the information of the m second electronic devices is displayed in the first interface with a preset UI effect, or the information of the m second electronic devices is displayed before the information of other electronic devices.
  • the information of the k second electronic devices is displayed in the second interface with a preset UI effect.
  • recommended devices are displayed in the interface 1103 with preset UI effects.
  • the m second electronic devices are the m electronic devices that the user will control when the user performs the first behavior.
  • if the second user information includes information about the second behavior performed by the user at the second moment, then the k second electronic devices are the k electronic devices that the user will control when the user performs the second behavior.
  • the mobile phone learns that the user has opened the game application (the first behavior), and the mobile phone displays the interface 1401 (the first interface).
  • the interface 1401 includes a pop-up window 1402, and the pop-up window 1402 includes information about the TV set that the user intends to control when opening the game application.
  • the mobile phone learns that the user performs the second behavior (assuming that the electronic device associated with the second behavior is a sweeping robot), and the mobile phone automatically switches the interface to an interface including the sweeping robot in order to recommend the sweeping robot to the user.
  • the m second electronic devices are the m electronic devices that the user will control at the first time.
  • the k second electronic devices are the k electronic devices that the user will control at the second time.
  • the method further includes: displaying a third interface according to the first user information, the third interface being used to recommend target execution scenarios to the user; receiving a third operation input by the user on the third interface; adding the target execution scenario in response to the third operation; and executing the target execution scenario when a trigger condition of the target execution scenario is met.
  • the mobile phone can display the interface 1503 (third interface) as shown in (b) of Figure 15, used to recommend scenarios to users.
  • the mobile phone adds the recommended scene.
  • the triggering conditions of the recommended scene are met, that is, when the user opens the game application on the mobile phone at home, the mobile phone executes the recommended scene and controls the TV to be turned on.
  • the first electronic device can also determine the second electronic device that the user wants to control based on the above multiple user information, and display an interface containing the second electronic device information.
  • the first electronic device determines the second electronic device that the user wants to control based on the user's location information, behavior information, and time information. For example, at time A, when it is detected that the user is opening a game application on the mobile phone in the living room, the mobile phone can recommend to the user to turn on the TV in the living room so that the user can cast the game to the TV for operation. At time B, when it is detected that the user is opening a game application on the mobile phone in the bedroom, the mobile phone can recommend to the user to open the computer in the bedroom so that the user can perform game operations on the computer.
  • the first electronic device determines the second electronic device that the user wants to control based on the user's location information and time information. For example, when it is detected that the user opens the smart life application in the master bedroom at time A, the mobile phone displays the interface 1801 shown in (a) of Figure 18.
  • the interface 1801 includes a card 1802 corresponding to the master bedroom.
  • Card 1802 may include information about some or all of the electronic devices in the master bedroom.
  • the interface 1801 may also include a device card 1803 associated with the current moment A.
  • Card 1803 includes information about some commonly used devices at current moment A. Users can quickly and easily find the electronic device they want to control through card 1802 and card 1803.
  • when it is detected that the user opens the smart life application in the bathroom at time B, the displayed interface is the interface 1804.
  • the interface 1804 includes a card 1805 corresponding to the bathroom.
  • Card 1805 may include information about some or all of the electronic devices in the bathroom.
  • the interface 1804 may also include a device card 1806 associated with the current time B.
  • Card 1806 includes information about some commonly used devices at current time B. Users can quickly and easily find the electronic device they want to control through card 1805 and card 1806.
  • the first electronic device determines the displayed interface based on multiple pieces of user information (the displayed interface may include the electronic devices respectively associated with the multiple pieces of user information).
  • the first electronic device may also determine the displayed interface based on multiple pieces of user information and other policies, which is not limited by the embodiments of this application. For example, in other embodiments, priorities may be set for user information. When it is detected that different user information is associated with different devices, the second electronic device associated with the user information with higher priority is preferentially recommended to the user. Alternatively, electronic devices associated with high-priority user information are displayed in front of other electronic devices, or electronic devices associated with high-priority user information are displayed according to a preset UI effect.
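The priority policy mentioned above can be sketched as a simple lookup in priority order; the dictionary keys and the priority list below are illustrative assumptions, not values from the embodiment:

```python
def pick_by_priority(candidates, priority):
    """When different pieces of user information (location, behavior,
    time, ...) point at different devices, recommend the device
    associated with the highest-priority information present.

    `candidates` maps user-info type -> associated device;
    `priority` lists info types from highest to lowest priority."""
    for info_type in priority:
        if info_type in candidates:
            return candidates[info_type]
    return None
```

For instance, with behavior ranked above time, a device suggested by the user's behavior would win over one suggested by the time of day.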
  • An embodiment of the present application also provides a device control method, which method is applied to the first electronic device, as shown in Figure 19.
  • the method includes:
  • the third interface is used to recommend target execution scenarios to the user.
  • after the mobile phone obtains the first user information and learns that when the user opens a game application on the mobile phone at home, the user usually casts the game to the TV for operation, the mobile phone can display the interface 1503 (third interface) shown in (b) of Figure 15 to recommend to the user the target execution scenario of "turn on the TV when opening the game application on the mobile phone at home".
  • after adding the above target execution scenario, if it is detected that the trigger condition of the above target execution scenario is met, that is, it is detected that the user opens a game application on the mobile phone at home, the mobile phone executes the target execution scenario and controls the turning on of the TV.
  • the first electronic device can also automatically add a target execution scenario associated with the first user information after detecting the first user information.
  • multiple embodiments of the present application can be combined and the combined solution can be implemented.
  • some operations in the processes of each method embodiment are optionally combined, and/or the order of some operations is optionally changed.
  • the execution order between the steps of each process is only exemplary and does not constitute a limitation on the execution order between the steps. Other execution orders are possible between the steps. It is not intended that the order of execution described is the only order in which these operations may be performed.
  • One of ordinary skill in the art will recognize various ways to reorder the operations described herein.
  • the process details involved in a certain embodiment herein are also applicable to other embodiments in a similar manner, or different embodiments can be used in combination.
  • each method embodiment can be implemented individually or in combination.
  • the electronic device in the embodiment of the present application includes a corresponding hardware structure and/or software module to perform each function.
  • the embodiments of this application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is performed by hardware or computer software driving the hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art can use different methods to implement the described functions for each specific application, but such implementation should not be considered to be beyond the scope of the technical solutions of the embodiments of the present application.
  • Embodiments of the present application can divide the electronic device into functional units according to the above method examples.
  • each functional unit can be divided corresponding to each function, or two or more functions can be integrated into one processing unit.
  • the above integrated units can be implemented in the form of hardware or software functional units. It should be noted that the division of units in the embodiment of the present application is schematic and is only a logical function division. In actual implementation, there may be other division methods.
  • Figure 20 shows a schematic block diagram of an intelligent device control device provided in an embodiment of the present application.
  • the device may be the above-mentioned first electronic device or a component with corresponding functions.
  • the device 1700 may exist in the form of software, or may be a chip that can be used in a device.
  • the device 1700 includes: a processing unit 1702 and a communication unit 1703.
  • the communication unit 1703 can also be divided into a sending unit (not shown in Figure 20) and a receiving unit (not shown in Figure 20).
  • the sending unit is used to support the device 1700 in sending information to other electronic devices.
  • the receiving unit is used to support the device 1700 to receive information from other electronic devices.
  • the device 1700 may also include a storage unit 1701 for storing program codes and data of the device 1700.
  • the data may include but is not limited to original data or intermediate data.
  • the processing unit 1702 may be used to support the receiving device to perform S501, etc. in FIG. 19, and/or other processes for the solutions described herein.
  • the communication unit 1703 is used to support communication between the device 1700 and other electronic devices (such as the above-mentioned second electronic device, etc.), for example, to support the execution of S304 in FIG. 12 and so on.
  • the device 1700 may also include an input unit (not shown in Figure 20) for receiving user input information, such as receiving a first operation, a second operation, etc. input by the user.
  • the device 1700 may also include a display unit (not shown in Figure 20) for displaying interfaces and/or other information.
  • the processing unit 1702 can be a controller or the processor 310 shown in Figure 2, for example, it can be a central processing unit (Central Processing Unit, CPU), a general-purpose processor, a digital signal processing (Digital Signal Processing, DSP), application specific integrated circuit (Application Specific Integrated Circuit, ASIC), field programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic devices, transistor logic devices, hardware components or any combination thereof. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure.
  • the processor can also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of DSP and microprocessors, and so on.
  • the communication unit 1703 may include the wireless communication module 360 shown in Figure 2, and may also include a transceiver circuit, a transceiver, a radio frequency device, etc.
  • the storage unit 1701 may be the memory 320 shown in FIG. 2 .
  • An embodiment of the present application also provides an electronic device, including one or more processors and one or more memories.
  • the one or more memories are coupled to one or more processors.
  • the one or more memories are used to store computer program codes.
  • the computer program codes include computer instructions.
  • An embodiment of the present application also provides a chip system, including: a processor coupled to a memory, the memory being used to store programs or instructions. When the programs or instructions are executed by the processor, the chip system implements the method in any of the above method embodiments.
  • there may be one or more processors in the chip system.
  • the processor can be implemented in hardware or software.
  • the processor may be a logic circuit, an integrated circuit, or the like.
  • the processor may be a general-purpose processor implemented by reading software code stored in memory.
  • the memory may be integrated with the processor or may be provided separately from the processor, which is not limited by this application.
  • the memory can be a non-transient memory, such as a read-only memory (ROM), which can be integrated on the same chip as the processor or provided separately on different chips.
  • this application does not specifically limit the type of memory or the manner in which the memory and the processor are arranged.
  • the chip system can be a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processing circuit (DSP), a microcontroller unit (MCU), a programmable logic device (PLD), or another integrated chip.
  • each step in the above method embodiment can be completed by an integrated logic circuit of hardware in the processor or instructions in the form of software.
  • the method steps disclosed in conjunction with the embodiments of this application can be directly implemented by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • Embodiments of the present application also provide a computer-readable storage medium.
  • Computer instructions are stored in the computer-readable storage medium.
  • when the computer instructions are run on an electronic device, the electronic device is caused to execute the above related method steps to implement the intelligent device control method in the above embodiments.
  • An embodiment of the present application also provides a computer program product.
  • when the computer program product is run on a computer, it causes the computer to perform the above related steps to implement the intelligent device control method in the above embodiment.
  • embodiments of the present application also provide a device.
  • the device may be a component or module.
  • the device may include a connected processor and a memory.
  • the memory is used to store computer execution instructions.
  • when the device runs, the processor can execute the computer execution instructions stored in the memory, causing the device to execute the intelligent device control method in each of the above method embodiments.
  • the electronic devices, computer-readable storage media, computer program products, or chips provided by the embodiments of the present application are all used to execute the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, which will not be described again here.
  • the electronic device includes corresponding hardware and/or software modules that perform each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is performed by hardware or computer software driving the hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art can use different methods to implement the described functions in conjunction with the embodiments for each specific application, but such implementations should not be considered to be beyond the scope of this application.
  • This embodiment can divide the electronic device into functional modules according to the above method examples.
  • each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module.
  • the above integrated modules can be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic and is only a logical function division. In actual implementation, there may be other division methods.
  • the disclosed method can be implemented in other ways.
  • the terminal device embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division.
  • there may be other division methods; for example, multiple units or components can be combined or integrated into another system, or some features can be ignored or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, indirect coupling or communication connection of modules or units, which may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
  • the above integrated units can be implemented in the form of hardware or software functional units.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part contributing to the existing technology, or all or part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of this application.
  • the aforementioned storage media include: flash memory, removable hard disks, read-only memory, random access memory, magnetic disks, optical disks, and other media that can store program instructions.

Abstract

This application provides an intelligent device control method and an electronic device, relating to the field of terminal technology. The application can automatically switch, for the user, the device control interface used to control intelligent devices, thereby simplifying the device control process and improving the user experience. The method is applied to a first electronic device and includes: at a first moment, obtaining first user information authorized by the user, and displaying a first interface according to the first user information, the first interface including information about m second electronic devices associated with the first user information; detecting a first operation; in response to the first operation, controlling n electronic devices to perform a first target operation; at a second moment, obtaining second user information authorized by the user, and displaying a second interface according to the second user information, the second interface including information about k second electronic devices associated with the second user information; detecting a second operation; and in response to the second operation, controlling j electronic devices to perform a second target operation.

Description

Intelligent device control method and electronic device
This application claims priority to the Chinese patent application filed with the China National Intellectual Property Administration on March 18, 2022, with application number 202210270161.7 and titled "Intelligent device control method and electronic device", the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of this application relate to the field of communication technology, and in particular to an intelligent device control method and an electronic device.
Background
With the development of technology, users own more and more devices. In the home scenario shown in Figure 1A, various devices in the home (such as audio and video devices, lighting devices, environment control devices, security devices, etc.) are connected together through Internet of Things technology to form a smart home system, which realizes centralized control of the devices and provides users with functions such as home appliance control, lighting control, and burglar alarm.
However, because of the large number of devices, when a user wants to operate a certain device, the user needs to switch through multiple interfaces in the smart home application to find the device to be controlled; the operation is cumbersome and time-consuming.
Summary
To solve the above technical problem, the embodiments of this application provide an intelligent device control method and an electronic device. The technical solutions provided by the embodiments of this application can automatically switch, for the user, the device control interface used to control intelligent devices, thereby simplifying the device control process and improving the user experience.
To achieve the above technical purpose, the embodiments of this application provide the following technical solutions:
In a first aspect, an intelligent device control method is provided, applied to a first electronic device. The method includes:
at a first moment, obtaining first user information authorized by the user, and displaying a first interface according to the first user information, where the first interface includes information about m second electronic devices associated with the first user information, m being an integer greater than 1;
detecting a first operation acting on a control on the first interface for controlling n of the m second electronic devices to perform a first target operation, n being an integer not less than 1 and not greater than m;
in response to the first operation, controlling the n electronic devices to perform the first target operation;
at a second moment, obtaining second user information authorized by the user, and displaying a second interface according to the second user information, where the second interface includes information about k second electronic devices associated with the second user information, k being an integer greater than 1;
detecting a second operation acting on a control on the second interface for controlling j of the k second electronic devices to perform a second target operation, j being an integer not less than 1 and not greater than k;
in response to the second operation, controlling the j electronic devices to perform the second target operation.
It should be understood that the first user information and the second user information include any one or more of the following: location information, time information, behavior information.
Compared with the prior art, in which the user must perform a series of interface operations to switch to the device control interface corresponding to the target home and target space, so that the device control process is not intelligent enough, the device control method provided by this application does not rely on the user's manual search operations; it can automatically and intelligently switch to the corresponding device control interface and, through that interface, recommend to the user the second electronic devices associated with the user information, thereby simplifying user operations, improving device control efficiency, and improving the user's interactive experience.
In a possible design, if the first user information includes information about the first home where the user is located at the first moment, the m second electronic devices include electronic devices in the first home where the user is located at the first moment;
and/or, if the second user information includes information about the second home where the user is located at the second moment, the k second electronic devices include electronic devices in the second home where the user is located at the second moment.
It can be seen that, in the embodiments of this application, the first electronic device can, according to the user's location information, present to the user a device control interface associated with the user's location, so that the user can conveniently control, through this device control interface, the second electronic devices near the user's location (such as smart home devices).
In a possible design, if the first user information further includes information about the first space in the first home where the user is located at the first moment, the m second electronic devices include the electronic devices in the first space, and the first interface does not include information about electronic devices other than the m second electronic devices;
and/or, if the second user information further includes information about the second space in the second home where the user is located at the second moment, the k second electronic devices include the electronic devices in the second space, and the second interface does not include information about electronic devices other than the k second electronic devices.
In a possible design, if the first user information further includes information about the first space in the first home where the user is located at the first moment, the m second electronic devices include the electronic devices in the first space where the user is located at the first moment; the identification information of the electronic devices in the first space where the user is located at the first moment is highlighted on the first interface with a preset user interface (UI) effect, and/or is ranked ahead of other electronic devices on the first interface;
and/or, if the second user information further includes information about the second space in the second home where the user is located at the second moment, the k second electronic devices include the electronic devices in the second space where the user is located at the second moment; the identification information of the electronic devices in the second space where the user is located at the second moment is highlighted on the second interface with a preset UI effect, and/or is ranked ahead of other electronic devices on the second interface.
In a possible design, if the first user information includes information about the distances between the user and second electronic devices at the first moment, the m second electronic devices include the m electronic devices that are relatively close to the user at the first moment;
and/or, if the second user information includes information about the distances between the user and second electronic devices at the second moment, the k second electronic devices include the k electronic devices that are relatively close to the user at the second moment.
In a possible design, the information of the m second electronic devices is displayed in the first interface in the form of a pop-up window, and/or the information of the k second electronic devices is displayed in the second interface in the form of a pop-up window;
the method further includes: stopping displaying the pop-up window after a preset duration from when the pop-up window starts to be displayed.
In a possible design, the information of the m second electronic devices is displayed in the first interface with a preset UI effect, and/or the information of the k second electronic devices is displayed in the second interface with a preset UI effect.
In this way, the electronic device can prompt the user to notice important information in the first interface and/or the second interface, helping the user find the second electronic device to be controlled more quickly.
In a possible design, if the first user information includes information about a first behavior performed by the user at the first moment, the m second electronic devices are the m electronic devices that the user is going to control when performing the first behavior;
and/or, if the second user information includes information about a second behavior performed by the user at the second moment, the k second electronic devices are the k electronic devices that the user is going to control when performing the second behavior.
In a possible design, if the first user information includes information about the first moment, the m second electronic devices are the m electronic devices that the user is going to control at the first moment;
and/or, if the second user information includes information about the second moment, the k second electronic devices are the k electronic devices that the user is going to control at the second moment.
In a possible design, after obtaining the first user information authorized by the user, the method further includes:
displaying a third interface according to the first user information, where the third interface is used to recommend a target execution scenario to the user.
In a possible design, after displaying the third interface, the method further includes:
receiving a third operation input by the user on the third interface, and adding the target execution scenario in response to the third operation;
executing the target execution scenario when the trigger condition of the target execution scenario is met.
In a second aspect, an intelligent device control apparatus is provided. The apparatus may be the first electronic device or a component capable of implementing the functions of the first electronic device (or a component supporting the first electronic device in implementing the corresponding functions, for example, a chip system). The apparatus includes:
a processing unit, configured to obtain, at a first moment, first user information authorized by the user;
a display unit, configured to display a first interface according to the first user information, where the first interface includes information about m second electronic devices associated with the first user information, m being an integer greater than 1;
an input unit, configured to detect a first operation acting on a control on the first interface for controlling n of the m second electronic devices to perform a first target operation, n being an integer not less than 1 and not greater than m;
the processing unit is further configured to control, in response to the first operation, the n electronic devices to perform the first target operation;
the processing unit is further configured to obtain, at a second moment, second user information authorized by the user;
the display unit is further configured to display a second interface according to the second user information, where the second interface includes information about k second electronic devices associated with the second user information, k being an integer greater than 1;
the input unit is further configured to detect a second operation acting on a control on the second interface for controlling j of the k second electronic devices to perform a second target operation, j being an integer not less than 1 and not greater than k;
the processing unit is further configured to control, in response to the second operation, the j electronic devices to perform the second target operation.
It should be understood that the first user information and the second user information include any one or more of the following: location information, time information, behavior information.
In a possible design, if the first user information includes information about the first home where the user is located at the first moment, the m second electronic devices include electronic devices in the first home where the user is located at the first moment;
and/or, if the second user information includes information about the second home where the user is located at the second moment, the k second electronic devices include electronic devices in the second home where the user is located at the second moment.
In a possible design, if the first user information further includes information about the first space in the first home where the user is located at the first moment, the m second electronic devices include the electronic devices in the first space, and the first interface does not include information about electronic devices other than the m second electronic devices;
and/or, if the second user information further includes information about the second space in the second home where the user is located at the second moment, the k second electronic devices include the electronic devices in the second space, and the second interface does not include information about electronic devices other than the k second electronic devices.
In a possible design, if the first user information further includes information about the first space in the first home where the user is located at the first moment, the m second electronic devices include the electronic devices in the first space where the user is located at the first moment; the identification information of the electronic devices in the first space where the user is located at the first moment is highlighted on the first interface with a preset user interface (UI) effect, and/or is ranked ahead of other electronic devices on the first interface;
and/or, if the second user information further includes information about the second space in the second home where the user is located at the second moment, the k second electronic devices include the electronic devices in the second space where the user is located at the second moment; the identification information of the electronic devices in the second space where the user is located at the second moment is highlighted on the second interface with a preset UI effect, and/or is ranked ahead of other electronic devices on the second interface.
In a possible design, if the first user information includes information about the distances between the user and second electronic devices at the first moment, the m second electronic devices include the m electronic devices that are relatively close to the user at the first moment;
and/or, if the second user information includes information about the distances between the user and second electronic devices at the second moment, the k second electronic devices include the k electronic devices that are relatively close to the user at the second moment.
In a possible design, the information of the m second electronic devices is displayed in the first interface in the form of a pop-up window, and/or the information of the k second electronic devices is displayed in the second interface in the form of a pop-up window;
the display unit is further configured to stop displaying the pop-up window after a preset duration from when the pop-up window starts to be displayed.
In a possible design, the information of the m second electronic devices is displayed in the first interface with a preset UI effect, and/or the information of the k second electronic devices is displayed in the second interface with a preset UI effect.
In a possible design, if the first user information includes information about a first behavior performed by the user at the first moment, the m second electronic devices are the m electronic devices that the user is going to control when performing the first behavior;
and/or, if the second user information includes information about a second behavior performed by the user at the second moment, the k second electronic devices are the k electronic devices that the user is going to control when performing the second behavior.
In a possible design, if the first user information includes information about the first moment, the m second electronic devices are the m electronic devices that the user is going to control at the first moment;
and/or, if the second user information includes information about the second moment, the k second electronic devices are the k electronic devices that the user is going to control at the second moment.
In a possible design, the display unit is further configured to display a third interface according to the first user information, where the third interface is used to recommend a target execution scenario to the user.
In a possible design, the input unit is further configured to receive a third operation input by the user on the third interface;
the processing unit is further configured to add the target execution scenario in response to the third operation, and to execute the target execution scenario when the trigger condition of the target execution scenario is met.
In a third aspect, an embodiment of this application provides an electronic device having the function of implementing the intelligent device control method described in the first aspect and any of its possible implementations. The function can be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above function.
In a fourth aspect, a computer-readable storage medium is provided. The computer-readable storage medium stores a computer program (which may also be called instructions or code). When the computer program is executed by an electronic device, the electronic device performs the method of the first aspect or any implementation of the first aspect.
In a fifth aspect, an embodiment of this application provides a computer program product. When the computer program product runs on an electronic device, the electronic device performs the method of the first aspect or any implementation of the first aspect.
In a sixth aspect, an embodiment of this application provides a circuit system. The circuit system includes a processing circuit configured to perform the method of the first aspect or any implementation of the first aspect.
In a seventh aspect, an embodiment of this application provides a chip system, including at least one processor and at least one interface circuit. The at least one interface circuit is configured to perform transceiving functions and send instructions to the at least one processor. When the at least one processor executes the instructions, the at least one processor performs the method of the first aspect or any implementation of the first aspect.
Brief Description of the Drawings
Figure 1A is a schematic diagram of a home scenario provided by an embodiment of this application;
Figure 1B is a schematic diagram of coordinate conversion provided by an embodiment of this application;
Figure 2 is a schematic diagram of the hardware structure of a first electronic device provided by an embodiment of this application;
Figure 3 is a schematic diagram of the software structure of an electronic device provided by an embodiment of this application;
Figure 4 is a schematic diagram of an interface provided by an embodiment of this application;
Figure 5 is a schematic diagram of an interface provided by an embodiment of this application;
Figure 6A is a schematic flowchart of an intelligent device control method provided by an embodiment of this application;
Figure 6B is a schematic flowchart of an intelligent device control method provided by an embodiment of this application;
Figure 7 is a schematic diagram of a positioning method provided by an embodiment of this application;
Figures 8-11 are schematic diagrams of interfaces provided by embodiments of this application;
Figure 12 is a schematic flowchart of an intelligent device control method provided by an embodiment of this application;
Figures 13-16 are schematic diagrams of interfaces provided by embodiments of this application;
Figure 17 is a schematic flowchart of an intelligent device control method provided by an embodiment of this application;
Figure 18 is a schematic diagram of an interface provided by an embodiment of this application;
Figure 19 is a schematic flowchart of an intelligent device control method provided by an embodiment of this application;
Figure 20 is a schematic structural diagram of an intelligent device control apparatus provided by an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of this application are described below with reference to the accompanying drawings. In the description of the embodiments of this application, the terms used in the following embodiments are only for the purpose of describing specific embodiments and are not intended to limit this application. As used in the specification and the appended claims of this application, the singular expressions "a", "an", "the", "the above", "said", and "this" are intended to also include expressions such as "one or more", unless the context clearly indicates the contrary. It should also be understood that in the following embodiments of this application, "at least one" and "one or more" mean one, two, or more than two.
Reference in this specification to "one embodiment" or "some embodiments" and the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of this application. Thus, the phrases "in one embodiment", "in some embodiments", "in some other embodiments", "in still other embodiments", and the like appearing in different places in this specification do not necessarily all refer to the same embodiment, but rather mean "one or more but not all embodiments", unless specifically emphasized otherwise. The terms "include", "comprise", "have", and their variants all mean "including but not limited to", unless specifically emphasized otherwise. The term "connected" includes direct and indirect connections, unless otherwise stated. "First" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features.
In the embodiments of this application, words such as "exemplarily" or "for example" are used to indicate an example, illustration, or explanation. Any embodiment or design described as "exemplary" or "for example" in the embodiments of this application should not be construed as being more preferred or advantageous than other embodiments or designs. Rather, the use of words such as "exemplarily" or "for example" is intended to present related concepts in a concrete manner.
An embodiment of this application provides an intelligent device control method. Exemplarily, Figure 1A is a schematic diagram of an intelligent device control system to which the method is applicable. As shown in Figure 1A, the intelligent device control system can manage and control intelligent devices on a per-home basis. A home may also be called a whole house, and the whole house can be divided into different spaces; for example, the whole house includes an entrance hallway, a kitchen, a dining room, a living room, a balcony, a master bedroom, a secondary bedroom, a bathroom, and so on.
The whole-house system may include a first electronic device 100, which is used to control second electronic devices 200 (for example, Internet of Things (IoT) devices). The first electronic device 100 includes but is not limited to a mobile phone, a PC, a tablet computer, etc. As a possible implementation, the first electronic device 100 may be installed with an application for controlling the second electronic devices 200. The application may be a system pre-installed application or a non-pre-installed application (for example, an application downloaded from an application market). It should be understood that a system pre-installed application includes a part of a system application (for example, a service, component, or plug-in in a system application), or an independent application pre-installed in the first electronic device, that is, one with an independent application icon. Exemplarily, the application may be the Smart Life (智慧生活) application.
Alternatively, the first electronic device 100 may control the second electronic devices 200 through a control center. Exemplarily, the control center may be a shortcut control page displayed by the first electronic device 100 in response to the user swiping down from the upper right corner or the top of the screen.
Alternatively, the first electronic device 100 may control the second electronic devices 200 through a corresponding function menu in the minus-one screen. Exemplarily, the minus-one screen may be a system service capability entry page displayed by the first electronic device 100 in response to the user's right-swipe operation on the leftmost home screen.
The whole house is provided with at least one third electronic device 300. Exemplarily, each room or area includes at least one third electronic device 300.
Optionally, the third electronic device 300 is used to locate the second electronic devices 200, and/or the first electronic device 100, and/or the user, and to report the location information of the second electronic devices 200, and/or the location information of the first electronic device 100, and/or the location information of the user to the hub device 400.
In some examples, the third electronic device 300 may include a sensor, which is responsible for collecting the user's spatial location information. For example, the third electronic device 300 may be a camera that collects image information of other devices and/or the user and determines the location information of each device and/or the user accordingly. Exemplarily, the third electronic device 300 detects through a sensor that the space where the user is located is the master bedroom. Optionally, the third electronic device 300 may collect the user's spatial location information in real time, periodically, or according to other strategies.
In one example, the third electronic device 300 includes an ultra-wide band (UWB) module and/or a millimeter-wave radar module. The third electronic device 300 locates the second electronic devices 200 and/or the first electronic device 100 through the UWB module, or through the millimeter-wave radar module. In other examples, the third electronic device 300 includes a wireless fidelity (Wi-Fi) module and locates the second electronic devices 200 and/or the first electronic device 100 through the Wi-Fi module. In still other examples, the third electronic device 300 may perform joint positioning through multiple of the above modules.
Optionally, the third electronic device 300 may also detect at least one of the user's physiological characteristics, identity category, human posture, and other information, and upload it to the hub device 400 in a wired or wireless manner.
The whole house is also provided with second electronic devices 200 (for example, IoT devices). A second electronic device 200 may also be called a controlled device and can be controlled by the first electronic device 100. For example, the kitchen is provided with a rice cooker or electric pressure cooker, gas equipment, etc.; the living room is provided with a speaker (for example, a smart speaker), a television (for example, a smart TV, also called a smart screen, large screen, etc.), a routing device, etc.; the balcony is provided with a clothes-drying rack (for example, a smart drying rack); the dining room is provided with a sweeping robot; the master bedroom is provided with a television (for example, a smart TV), a speaker (for example, a smart speaker), a floor lamp (for example, a smart floor lamp), a routing device, etc.; the secondary bedroom is provided with a desk lamp (for example, a smart desk lamp), a speaker (for example, a smart speaker), etc.; the bathroom is provided with a body fat scale, etc.
It should be noted that, although in Figure 1A only a smart TV is shown as a second electronic device 200, those skilled in the art should know that the second electronic devices 200 include but are not limited to smart home devices such as smart TVs, smart speakers, smart lamps (such as ceiling lamps, smart desk lamps, aroma lamps), sweeping robots, body fat scales, smart drying racks, smart rice cookers, air purifiers, humidifiers, desktop computers, routing devices, smart sockets, water dispensers, smart refrigerators, smart air conditioners, smart switches, and smart door locks. It should also be noted that a second electronic device 200 may not be a smart home device but a portable device, such as a personal computer (PC), a tablet computer, a mobile phone, or a smart remote control. The embodiments of this application do not limit the specific form of the second electronic device 200.
可选地,中枢设备400还可以用于根据房屋的户型图构建全屋地图,建立全屋坐标系,将各个第三电子设备300获取到的位置信息转换到全屋坐标系下。这样,可以将各个第三电子设备300检测获取的第二电子设备200和/或第一电子设备100和/或用户的位置信息转换到全屋坐标系中,并确定第二电子设备200或用户在全屋内的具体位置。
作为一种可能的实现方式,第三电子设备建立如图1B的(a)所示的坐标系(称为第一坐标系)。其中,Oe为原点,Xe为X轴,Ye为Y轴,Ze为Z轴。中枢设备400根据全屋户型图建立如图1B的(b)所示的全屋坐标系。其中,Oh为原点,Xh为X轴,Yh为Y轴,Zh为Z轴。
可选的,可以将第一坐标系转换为全屋坐标系,并将第一坐标系中点的坐标转换到全屋坐标系中。示例性的,如图1B,中枢设备400从第三电子设备300接收Ob点的坐标信息之后,可以通过向量的方式,求取Ob点在全屋坐标系下对应点Ob’的坐标。具体来说,两个点之间的距离在不同坐标系下是相同的,但两个点所形成的向量在不同坐标系下的方向表示可能是不同的。比如,要将Ob点在第一坐标系下的坐标转换为Ob点在全屋坐标系下对应点Ob’的坐标,可以通过向量的方式进行转换。示例性地,以通过(O_e O_b)的方式进行转换为例,向量(O_e O_b)在第一坐标系下的距离和(O_h O_b’)在全屋坐标系下的距离(都是L)是相同的,但向量(O_e O_b)用第一坐标系表示的方向,与向量(O_h O_b’)用全屋坐标系表示的方向是不同的。通过获取到第一坐标系和全屋坐标系之间的相对方向变化,在已知向量(O_e O_b)用第一坐标系表示的方向,可以获知向量(O_h O_b’)用全屋坐标系表示的方向;再结合Oe点、Ob点在第一坐标系下的坐标,以及Oh点在全屋坐标系下的坐标,便可求得Ob点在全屋坐标系下对应点Ob’的坐标。这样一来,中枢设备400可以将的第三电子设备300获取的坐标信息转换为全屋坐标系中的坐标。
可选的,中枢设备400还可以将其他电子设备获取的位置信息转换到全屋坐标系下,转换方式可参见上述方式,不再赘述。
上述的同一点在不同坐标系下的坐标转换方式(向量方式)仅为示意性的,本申请对于坐标转换方式不做限定。
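The vector-based conversion described above amounts to a rigid-body transform: rotate the point from the first (sensing) coordinate system into the whole-house frame, then translate by the whole-house coordinates of the sensing frame's origin. A minimal sketch in Python; the rotation matrix R and the origin offset t are illustrative assumptions, not values given in this application:

```python
def to_whole_house(p_e, R, t):
    """Map a point from the first coordinate system to the whole-house
    coordinate system: p_h = R * p_e + t, where R captures the relative
    orientation change between the two frames and t is the sensing
    frame origin (Oe) expressed in whole-house coordinates. The
    rotation preserves vector length, matching the observation above
    that the distance L is the same in both coordinate systems."""
    return [sum(R[i][j] * p_e[j] for j in range(3)) + t[i] for i in range(3)]

# Example: sensing frame rotated 90 degrees about the Z axis,
# with its origin at (2, 3, 0) in the whole-house frame.
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]
t = [2.0, 3.0, 0.0]
p_h = to_whole_house([1.0, 0.0, 0.0], R, t)  # [2.0, 4.0, 0.0]
```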
Optionally, the third electronic device 300 communicates with the second electronic device 200 in a wired or wireless manner.
Optionally, the second electronic device 200 and the third electronic device 300 can be connected to the hub device 400 in a wired manner (for example, power line communication (PLC)) and/or a wireless manner (for example, Wi-Fi, Bluetooth). It can be understood that the ways in which the second electronic device 200 and the third electronic device 300 connect to the hub device 400 may be the same or different. For example, both the second electronic device 200 and the third electronic device 300 connect to the hub device 400 wirelessly; or the second electronic device 200 connects wirelessly while the third electronic device 300 connects by wire; or, among the second electronic devices 200, devices such as smart speakers, smart TVs, body fat scales, and sweeping robots connect to the hub device 400 wirelessly (for example, via Wi-Fi), while devices such as smart desk lamps, smart drying racks, and smart door locks connect to the hub device 400 by wire (for example, via PLC).
Optionally, the hub device of each room or area and the hub device of the whole house may each exist independently, or may be integrated with the third electronic device or the first electronic device into one device, or may be integrated with both the third electronic device and the first electronic device into one device. This application does not limit this.
In one example, the system further includes a routing device (such as a router). The routing device is used to connect to a local area network or the Internet, using a specific protocol to select and set the path for sending signals. Exemplarily, one or more routers are deployed in the whole house to form a local area network, or to access a local area network or the Internet. The second electronic device 200 and/or the third electronic device 300 access the router and transmit data with devices in the local area network or on the Internet through the Wi-Fi channel established by the router. In one implementation, the hub device 400 may be integrated with the routing device into one device; for example, the hub device 400 and the routing device are integrated as a routing device, that is, the routing device has the functions of the hub device 400. The routing device may be one or more routing devices in a parent-child routing set, or an independent routing device.
The above is only an example of a system to which the device control method is applicable; the system may include more or fewer devices, or different device layouts, and so on.
Exemplarily, Figure 2 shows a schematic structural diagram of a first electronic device 100.
As shown in Figure 2, the first electronic device 100 may include a processor 310, a memory 320, a universal serial bus (USB) interface 330, a power module 340, a UWB module 350, a wireless communication module 360, etc. Optionally, the first electronic device 100 may further include an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, a headset interface 370D, a display screen 380, etc. Optionally, the first electronic device 100 may further include a sensor module 390, etc.
It can be understood that the structure illustrated in Figure 2 does not constitute a specific limitation on the first electronic device 100. In other embodiments of this application, the first electronic device 100 may include more or fewer components than shown, combine some components, split some components, or have a different component arrangement. The illustrated components can be implemented in hardware, software, or a combination of software and hardware. In addition, the interface connection relationships between the modules illustrated in Figure 2 are only schematic and do not constitute a structural limitation on the first electronic device 100. In other embodiments of this application, the first electronic device 100 may also adopt interface connection methods different from those in Figure 2, or a combination of multiple interface connection methods.
The processor 310 may include one or more processing units; different processing units may be independent components or may be integrated in one or more processors. For example, the processor 310 is a central processing unit (CPU), or an application specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of this application, for example: one or more microprocessors (digital signal processor, DSP), or one or more field programmable gate arrays (FPGA).
The memory 320 can be used to store computer-executable program code, which includes instructions. For example, the memory 320 can also store data processed by the processor 310. In addition, the memory 320 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), etc. By running the instructions stored in the memory 320 and/or the instructions stored in a memory arranged in the processor, the processor 310 executes the various functional applications and data processing of the first electronic device 100.
The wireless communication module 360 can provide solutions for wireless communication applied to the first electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), ZigBee, and the like. The wireless communication module 360 may be one or more devices including at least one communication processing module. The wireless communication module 360 receives electromagnetic waves via the antenna, performs frequency modulation and filtering on the electromagnetic wave signal, and sends the processed signal to the processor 310. The wireless communication module 360 can also receive the signal to be sent from the processor 310, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation via the antenna. It should be noted that the numbers of antennas of the wireless communication module 360, the UWB module 350, and the millimeter-wave radar module 160 in Figure 2 are only illustrative. It can be understood that the wireless communication module 360, the UWB module 350, and the millimeter-wave radar module 160 may include more or fewer antennas, which is not limited in the embodiments of this application.
The UWB module 350 can provide a solution for UWB-based wireless communication applied to the first electronic device 100. Exemplarily, the UWB signal can be detected and, combined with certain positioning algorithms, the time the UWB signal flies in the air can be calculated; multiplying this time by the propagation speed of the UWB signal in the air (for example, the speed of light) gives the distance between the first electronic device 100 and a second electronic device 200 (such as an IoT device). As another example, the first electronic device 100 can also determine the direction of the second electronic device 200 relative to the first electronic device 100 (that is, the incoming direction of the UWB signal) according to the phase differences of the UWB signal sent by the second electronic device 200 arriving at different antennas of the first electronic device 100.
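The phase-difference direction finding just described can be sketched as follows for a simple two-antenna case. The antenna spacing, wavelength, and far-field plane-wave geometry are illustrative assumptions, not parameters from this application:

```python
import math

def arrival_angle_deg(delta_phi, wavelength, spacing):
    """Angle of arrival from the phase difference between two antennas.
    For a far-field plane wave, delta_phi = 2*pi*spacing*sin(theta)/wavelength,
    so theta = asin(delta_phi * wavelength / (2*pi*spacing)).
    The argument is clamped to [-1, 1] to guard against measurement noise."""
    s = delta_phi * wavelength / (2.0 * math.pi * spacing)
    return math.degrees(math.asin(max(-1.0, min(1.0, s))))

# Half-wavelength antenna spacing: zero phase difference means the signal
# arrives broadside (0 degrees); a phase difference of pi means endfire.
a_broadside = arrival_angle_deg(0.0, 0.05, 0.025)      # 0.0
a_endfire = arrival_angle_deg(math.pi, 0.05, 0.025)    # 90.0
```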
The USB interface 330 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, etc. The USB interface 330 can be used to connect a charger to charge the first electronic device 100, and can also be used to transmit data between the first electronic device 100 and peripheral devices.
The power module 340 is used to supply power to the components of the first electronic device 100, such as the processor 310 and the memory 320.
The first electronic device 100 can implement audio functions, such as audio playback and recording, through the audio module 370, the speaker 370A, the receiver 370B, the microphone 370C, the headset interface 370D, the application processor, and the like.
The audio module 370 is used to convert digital audio information into an analog audio signal output, and also to convert an analog audio input into a digital audio signal. The audio module 370 can also be used to encode and decode audio signals. In some embodiments, the audio module 370 can be arranged in the processor 310, or some functional modules of the audio module 370 can be arranged in the processor 310.
The speaker 370A, also called the "loudspeaker", is used to convert audio electrical signals into sound signals. The first electronic device 100 can play audio through the speaker 370A.
The receiver 370B, also called the "earpiece", is used to convert audio electrical signals into sound signals.
The microphone 370C, also called the "mic" or "mouthpiece", is used to convert sound signals into electrical signals. The user can speak close to the microphone 370C to input a sound signal into the microphone 370C.
The headset interface 370D is used to connect wired headsets. The headset interface 370D may be the USB interface 330, or a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The display screen 380 is used to display images, videos, etc. The display screen 380 includes a display panel.
Optionally, the sensor module 390 includes an inertial measurement unit (IMU) module, etc. The IMU module may include a gyroscope, an accelerometer, etc. The gyroscope and the accelerometer can be used to determine the motion posture of the first electronic device 100.
Optionally, the first electronic device 100 further includes a filter (for example, a Kalman filter). Exemplarily, the output of the IMU module and the output of the UWB module 350 can be superimposed, and the superimposed signal can be input to the Kalman filter for filtering, thereby reducing errors.
For the structures of the second electronic device 200, the third electronic device 300, and the hub device 400, refer to the structure of the device shown in Figure 2. For example, they may have more or fewer components than the device shown in Figure 2, combine some components, split some components, or have a different component arrangement.
Optionally, the software system of an electronic device (such as the first electronic device, the third electronic device, a second electronic device (such as an IoT device), or the hub device) may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of the present invention take a layered-architecture system as an example to illustrate the software structure of the electronic device.
Taking the first electronic device as an example, Figure 3 is a software structure block diagram of the first electronic device 100 according to an embodiment of the present invention.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer can include a series of applications.
As shown in Figure 3, the applications may include camera, gallery, calendar, calls, maps, navigation, WLAN, Bluetooth, music, video, SMS, and other applications.
In the embodiments of this application, the applications also include a smart home management application and a basic service. The basic service opens the management capability of intelligent devices to the system. The smart home management application can invoke the basic service to query the smart home device to be controlled, and/or invoke the basic service to control that smart home device.
Exemplarily, the smart home management application may be the Smart Life application, or another application with similar functions. The smart home management application may be an application originally provided with the system or a third-party application; the embodiments of this application do not limit the category of the smart home management application. The following embodiments mainly take the Smart Life application as an example.
The application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer. The application framework layer includes some predefined functions.
As shown in Figure 3, the application framework layer can include a window manager, content providers, a view system, a telephony manager, a resource manager, a notification manager, and so on.
The window manager is used to manage window programs; it can obtain the display screen size, determine whether there is a status bar, lock the screen, capture the screen, etc. Content providers are used to store and obtain data and make the data accessible to applications; the data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc. The view system includes visual controls, such as controls for displaying text and controls for displaying pictures, and can be used to build applications. A display interface can be composed of one or more views; for example, a display interface including an SMS notification icon can include a view for displaying text and a view for displaying pictures. The telephony manager is used to provide the communication functions of the first electronic device 100, such as management of call states (including connected, hung up, etc.). The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, video files, and so on. The notification manager enables applications to display notification information in the status bar; it can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to announce download completion, message reminders, etc. The notification manager can also present notifications in the form of charts or scroll-bar text in the status bar at the top of the system (for example, notifications of applications running in the background), or notifications appearing on the screen in the form of dialog windows, for example, prompting text information in the status bar, sounding a prompt tone, vibrating the terminal, blinking the indicator light, etc.
The Android runtime includes core libraries and a virtual machine, and is responsible for the scheduling and management of the Android system. The core libraries contain two parts: one part is the function functions that the Java language needs to call, and the other part is the core libraries of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system libraries can include multiple functional modules, for example: the surface manager, the media libraries, the 3D graphics processing library (for example, OpenGL ES), and the 2D graphics engine (for example, SGL). The surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications. The media libraries support playback and recording of many common audio and video formats, as well as static image files, etc.; the media libraries can support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer contains at least the display driver, the camera driver, the audio driver, and the sensor driver.
It should be noted that the software architecture shown in Figure 3 is only one software architecture applicable to electronic devices; the embodiments of this application do not limit the software architecture of the electronic device. Optionally, some functional modules may be located in a layer different from that shown in Figure 3. For example, the basic service can also be arranged in the framework layer; the embodiments of this application are not limited to this.
The technical solutions of the embodiments of this application can be applied in various device control scenarios. The following description mainly takes controlling smart home devices through the Smart Life application on a mobile phone as an example, but this does not limit the scenarios to which the technical solutions apply.
In the embodiments of this application, the mobile phone can obtain, in real time, user information authorized by the user and, based on the user information, present a corresponding device control interface to the user to match the user's current device control needs. The user information includes but is not limited to the following: location information, time information, behavior information. The technical solutions of the embodiments of this application are illustrated below case by case:
Case 1: the user information is location information. In this case, the mobile phone obtains the user's location information and presents the corresponding device control interface to the user according to the user's location information.
Exemplarily, the user Jack returns home (Jack's home) and opens the Smart Life application. The mobile phone detects that the user is currently located in Jack's home and therefore displays the device control interface 401 corresponding to "Jack's home" in the Smart Life application, as shown in Figure 4.
It can be seen that, compared with the related art shown in (a) of Figure 5, in which, after returning home (Jack's home), the user Jack still needs to manually tap the control 403 in the interface 402 and, as shown in (b) of Figure 5, select "Jack's home" in the pop-up home management window 404 (that is, the user needs multiple interface operations to switch to the device control interface 401 corresponding to the current home), the device control method of the embodiments of this application can intelligently and automatically switch to the device control interface corresponding to the current home for the user, which simplifies the user's operation steps and helps improve device control efficiency during the device control process.
The device control method of the embodiments of this application is described in detail below. Exemplarily, Figure 6A shows the flow of the device control method of an embodiment of this application. The method includes:
S101. The first electronic device 100 establishes a connection with the routing device.
The first electronic device 100 may be a device, such as a mobile phone, used to control devices such as smart home devices. The first electronic device 100 can control smart home devices in different ways; this embodiment takes as an example the first electronic device controlling smart home devices through the Smart Life application.
It should be noted that the routing device (such as a router) may be integrated with the hub device 400 or may be an independently arranged device. This embodiment takes the routing device and the hub device 400 being mutually independent devices as an example.
S102. The routing device reports the network information of the first electronic device 100 to the hub device 400.
In this embodiment of this application, after the routing device establishes a connection with the first electronic device 100 (such as a mobile phone), it can report the network information of the first electronic device 100 to the hub device 400. The network information includes but is not limited to the name, identifier, etc. of the network. For example, the routing device reports to the hub device 400 the name of the Wi-Fi network to which the first electronic device 100 is connected.
S103. The hub device 400 determines, according to the network information of the first electronic device 100, information about the home where the first electronic device 100 is located.
Exemplarily, the routing device reports to the hub device 400 the name of the Wi-Fi network to which the first electronic device 100 is connected. Since the Wi-Fi network names corresponding to different homes are usually different, the hub device 400 can thereby determine which home the first electronic device 100 is currently located in.
S104. The third electronic device 300 measures the distance to the first electronic device 100.
The third electronic device 300 may include a sensor responsible for collecting the user's spatial location information, or may include communication modules such as UWB and Wi-Fi and implement the positioning of the first electronic device 100 through these communication modules. This embodiment takes positioning through the UWB module as an example. Usually, there are multiple third electronic devices 300 used to calculate the position of the first electronic device 100.
Exemplarily, suppose the master bedroom includes the third electronic devices 300 shown in Figure 7. Each third electronic device 300 sends a UWB signal to the first electronic device 100 (such as a mobile phone) through its UWB module and waits for the UWB signal fed back from the first electronic device 100. The third electronic device 300 can detect the time of flight of the UWB signal and calculate its distance r to the first electronic device 100 according to the time of flight.
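The time-of-flight ranging above (flight time multiplied by the propagation speed of the UWB signal) can be sketched as below. The two-way exchange with a responder reply delay is a common UWB ranging detail assumed here for illustration, not a specific of this application:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s, propagation speed of the UWB signal

def ranging_distance(round_trip_s, reply_delay_s=0.0):
    """Compute the anchor-to-device distance r.
    The one-way time of flight is half of the measured round-trip time
    minus the responder's processing delay; distance = tof * c."""
    tof = (round_trip_s - reply_delay_s) / 2.0
    return tof * SPEED_OF_LIGHT

# A 10 m distance corresponds to a round trip of roughly 66.7 ns.
r = ranging_distance(2 * 10.0 / SPEED_OF_LIGHT)
```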
S105. The third electronic device 300 reports the distance to the first electronic device 100 to the hub device 400.
Exemplarily, the third electronic devices 300 shown in Figure 7 report their respective distances r to the first electronic device 100 to the hub device 400.
S106. The hub device 400 determines, according to the distance information between the third electronic devices 300 and the first electronic device 100 and the house layout, information about the space where the first electronic device 100 is located.
Exemplarily, the hub device 400 has the coordinate system shown in Figure 7, where the coordinate origin is O. According to the distance r3 between the upper-left third electronic device 300 and the first electronic device 100, the hub device 400 determines a sphere centered on that third electronic device 300 with radius r3; according to the distance r1 between the upper-right third electronic device 300 and the first electronic device 100, it determines a sphere centered on that third electronic device 300 with radius r1; and according to the distance r2 between the middle third electronic device 300 and the first electronic device 100, it determines a sphere centered on that third electronic device 300 with radius r2. The hub device 400 takes the intersection point A of the three spheres as the position of the first electronic device 100. In some examples, the position of the first electronic device 100 can be regarded as the position of the user.
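The intersection-of-spheres positioning in S106 can be linearized by subtracting the sphere (circle) equations pairwise, which leaves a small linear system. A minimal 2D (floor-plane) sketch with illustrative anchor positions, not the actual layout of Figure 7:

```python
def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Solve (x-xi)^2 + (y-yi)^2 = ri^2 for i = 1..3.
    Subtracting the equations pairwise cancels the quadratic terms,
    leaving two linear equations in x and y."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    A, B = 2 * (x2 - x1), 2 * (y2 - y1)
    C = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    D, E = 2 * (x3 - x2), 2 * (y3 - y2)
    F = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    x = (C * E - F * B) / (E * A - B * D)
    y = (C * D - A * F) / (B * D - A * E)
    return (x, y)

# Anchors at (0,0), (4,0), (0,4); ranges measured from the true point (1,1).
pos = trilaterate_2d((0, 0), 2**0.5, (4, 0), 10**0.5, (0, 4), 10**0.5)
# pos is approximately (1.0, 1.0)
```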
S107. The first electronic device 100 detects the user's operation of opening the Smart Life application.
The user opening the Smart Life application may mean the user opening the Smart Life application running in the background, or the Smart Life application running in the foreground.
The user may open the Smart Life application through interface operations: for example, the user taps the icon of the Smart Life application on the home screen to trigger the first electronic device 100 to open the Smart Life application. Alternatively, the user opens the Smart Life application through a voice command.
S108. The first electronic device 100 sends a location query request to the hub device 400. The location query request is used to query the user's location.
The user's location includes multiple dimensions. When smart home devices are managed on a per-home basis, the user's location may include the home where the user is located and/or the space in that home where the user is located.
As a possible implementation, the first electronic device 100 invokes the basic service through the Smart Life application, and the basic service invokes the communication module through the corresponding driver to send the location query request to the hub device 400.
In some examples, step S108 is optional. For example, after obtaining the information about the user's home and the user's space, the hub device 400 may, periodically or according to other strategies, actively send this location information to the first electronic device 100. The first electronic device 100 can store the location information of the user's home and space, and can update it. In this case, when the first electronic device 100 detects that the user opens the Smart Life application, it can perform S110.
S109. The hub device 400 feeds back the user's location information to the first electronic device 100.
As a possible implementation, the hub device 400 feeds back to the first electronic device 100 information about the home where the user is located and/or information about the space in that home where the user is located.
S110. The first electronic device 100 displays the first interface of the Smart Life application according to the user's location information.
The first interface is the control interface of the second electronic devices (such as IoT devices) corresponding to the home where the user is currently located.
Exemplarily, taking the first electronic device 100 being a mobile phone as an example, the user Jack returns home (Jack's home) and opens the Smart Life application. The mobile phone detects that the user is currently located in Jack's home and therefore intelligently and automatically displays the device control interface 401 (first interface) corresponding to "Jack's home" in the Smart Life application, as shown in Figure 4. In this way, the user can control the devices in the home where the user is currently located through the device control interface 401. This device control method does not require multiple interface operations by the user, which reduces the complexity of user operations and improves device management efficiency.
In some embodiments, the user can control smart home devices at the granularity of "space". Exemplarily, suppose that the last time the user used the Smart Life application, the user switched to the device control interface under the "Space" tab. In response to the user's operation of opening the Smart Life application this time, as shown in (a) of Figure 8, the mobile phone displays, according to the information about the space where the user is currently located (the master bedroom), the device control interface 701 under the "Space" tab 703, which includes the device control card 702 corresponding to the master bedroom. In this way, the user can control the smart home devices in the master bedroom through the card 702 corresponding to the master bedroom.
Optionally, as shown in (a) of Figure 8, the card 702 may include the devices in the master bedroom, such as lights and curtains. Optionally, the card 702 may also include the tasks corresponding to the devices in the master bedroom, such as temperature control tasks and purification tasks. In a possible design, the card includes as many of the devices in the master bedroom and the corresponding tasks as possible. In this way, the user can preview most of the devices and corresponding tasks in the master bedroom through the card, making it easy to select the device to be controlled and the task that device should perform.
Optionally, the device control interface 701 may further include a button 706. The user can tap the button 706 to display the space cards of the whole house, for example, the cards corresponding to the bathroom, the secondary bedroom, the balcony, and other spaces.
Afterwards, the user moves. Suppose, as shown in (b) of Figure 8, the mobile phone learns that the user is in the bathroom. Then, when the user opens the Smart Life application in the bathroom, the mobile phone can display the device control interface 704 shown in (b) of Figure 8, which includes the card 705 corresponding to the bathroom. In one example, the card 705 includes as many of the devices in the bathroom and the tasks corresponding to each device as possible, making it convenient for the user to control the smart home devices in the bathroom.
As another example, suppose the mobile phone currently displays the interface corresponding to the "Devices" tab of the Smart Life application, as shown in (a) of Figure 8. After detecting that the user taps the "Space" tab 703, the mobile phone displays, according to the user's current location (the master bedroom), the device control interface 701 under the "Space" tab 703 shown in (a) of Figure 8, which includes the device control card 702 corresponding to the master bedroom. The user can control the smart home devices in the master bedroom through the card 702 corresponding to the master bedroom.
It can be seen that, in the embodiments of this application, in scenarios where the user controls smart home devices through means such as the Smart Life application on a mobile phone, the mobile phone can, according to the user's space information, show the user the device control interface of the devices in that space, so that the user can conveniently control the smart home devices in that space through it.
The above description takes as an example that the device control interface under the "Space" tab includes only the card of the space where the user is currently located. In other embodiments, the device control interface under the "Space" tab may include the card of the space where the user is currently located as well as the cards of other spaces. Moreover, the mobile phone can display the card of the user's current space in a particular way to prompt the user to notice that card. Optionally, the mobile phone can display the card corresponding to the current space at the top of the device control interface, and/or the mobile phone can display that card with a particular user interface (UI) effect.
Exemplarily, suppose that the last time the user used the Smart Life application, the user switched to the device control interface corresponding to the "Space" tab. Then, when the user opens the Smart Life application this time and, as shown in (a) of Figure 9, the mobile phone learns that the user is in the master bedroom, the mobile phone can display the device control interface 801 shown in (a) of Figure 9. In one example, in the interface 801, the card 802 corresponding to the master bedroom is displayed at the top of the interface 801, making it easy for the user to notice the card and to control the smart home devices in the master bedroom through it.
Afterwards, as shown in (b) of Figure 9, the mobile phone learns that the user is in the bathroom. Then, when the user opens the Smart Life application in the bathroom, the mobile phone can display the device control interface 803 shown in (b) of Figure 9. In one example, in the interface 803, the card 804 corresponding to the bathroom is displayed at the top of the interface 803, making it convenient for the user to control the smart home devices in the bathroom.
As another example, suppose that the last time the user used the Smart Life application, the user switched to the device control interface corresponding to the "Space" tab. Then, when the user opens the Smart Life application this time and, as shown in (a) of Figure 10, the mobile phone learns that the user is in the master bedroom, the mobile phone can display the device control interface 901 shown in (a) of Figure 10. In one example, in the interface 901, the card 902 corresponding to the master bedroom is displayed with a particular UI effect (such as a flashing border or a color change), making it convenient for the user to control the smart home devices in the master bedroom.
Afterwards, as shown in (b) of Figure 10, the mobile phone learns that the user is in the bathroom. Then, when the user opens the Smart Life application in the bathroom, the mobile phone can display the device control interface 903 shown in (b) of Figure 10. In one example, in the interface 903, the card 904 corresponding to the bathroom is displayed with a particular UI effect.
上述主要以中枢设备400计算用户的位置信息为例进行说明,在另一些实施例中,还可以由第一电子设备100计算用户的位置信息。如图6B示出了本申请实施例的设备控制方法的又一流程,该方法包括如下步骤:
S201、第一电子设备100与路由设备建立连接。
其中,步骤S201-S206的具体实现可参见图6B对应的实施例的相关描述。
S202、路由设备向中枢设备400上报第一电子设备100的网络信息。
S203、中枢设备400根据第一电子设备100的网络信息,确定第一电子设备100所在家庭的信息。
S204、第三电子设备300测量与第一电子设备100之间的距离。
S205、第三电子设备300将与第一电子设备100之间的距离上报至中枢设备400。
S206、第一电子设备100检测到用户打开智慧生活应用的操作。
S207、第一电子设备100向中枢设备400发送位置查询请求。该家庭查询请求用于查询用户的位置信息。
S208、中枢设备400向第一电子设备100反馈用户所在家庭的信息、第一电子设备100与第三电子设备300之间的距离信息,以及户型图的信息。
示例性的,中枢设备400向第一电子设备100反馈用户当前所在家庭为“Jack的家”、如图7所示的第一电子设备100与第三电子设备300之间的距离r1、r2、r3,以及户型图的信息。
可选的,中枢设备400可以从第一电子设备100接收一个位置查询请求之后,向第一电子设备100发送用户所在家庭的信息,以及用于确定用户所在空间的信息(比如上述距离r1、r2、r3,以及户型图的信息)。或者,中枢设备400也可以从第一电子设备100接收不同的请求,并基于不同请求反馈不同信息。比如,中枢设备400从第一电子设备100接收家庭查询请求,并基于该家庭查询请求向第一电子设备100反馈用户所在家庭的信息;中枢设备400从第一电子设备100接收空间查询请求,并基于该空间查询请求向第一电子设备100反馈用于确定用户所在空间的信息(比如上述距离r1、r2、r3,以及户型图的信息)。或者,中枢设备400主动向第一电子设备100发送用户所在家庭的信息,和/或用于确定用户所在空间的信息。
S209、第一电子设备100根据第三电子设备300与第一电子设备100之间的距离信息和户型图,确定第一电子设备100所在空间的信息。
示例性的,第一电子设备100根据如图7所示的r1、r2、r3,以及户型图,确定三个球体的交点A为第一电子设备100所在位置。
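上述由r1、r2、r3确定交点A的过程,本质上是三边定位(trilateration)。以下为一个平面三边定位的Python示意草图(锚点坐标与距离均为假设数值,仅示意计算思路;实际实现还需结合户型图判断所在空间):

```python
import math

# 示意性草图:已知三个第三电子设备(锚点)的平面坐标及其与第一电子设备的距离
# r1、r2、r3,求解第一电子设备所在位置,对应图7中交点A的计算思路。

def trilaterate(p1, r1, p2, r2, p3, r3):
    """二维三边定位:将三个圆方程两两相减,消去二次项后解线性方程组。"""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    d = a1 * b2 - a2 * b1  # 行列式,锚点不共线时非零
    x = (c1 * b2 - c2 * b1) / d
    y = (a1 * c2 - a2 * c1) / d
    return x, y

# 假设三个锚点位于 (0,0)、(4,0)、(0,3),第一电子设备实际位于 (1,1)
x, y = trilaterate((0, 0), math.hypot(1, 1),
                   (4, 0), math.hypot(3, 1),
                   (0, 3), math.hypot(1, 2))
```

图7所示为三个球体求交,上述草图取其平面投影情形作简化示意;三维情形可按相同思路增加一维方程。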
S210、第一电子设备100根据用户所在家庭的信息以及用户所在空间的信息,显示智慧生活应用的第一界面。
其中,第一界面包括用户所在空间的第二电子设备(比如IoT设备)的信息。
示例性的,如图8所示,用户在家里(Jack家)的主卧内,若第一电子设备100检测到用户指示切换到“空间”标签703,则第一电子设备100可自动显示第一界面701,该第一界面701为Jack家的主卧对应的设备控制界面,该设备控制界面包括主卧对应的卡片702,如此,便于用户通过该卡片702,对当前所处主卧内的智能家居设备进行控制。也就是说,用户无需从繁杂的多个空间的多个卡片中进行查找选择,就能方便直观地获得当前所处空间对应的设备控制卡片,能够简化用户的操作。
本申请实施例还提供一种设备控制方法,第一电子设备100可以自动智能的向用户推荐待控制的智能家居设备。
示例性的,如图10的(a),用户点击主卧对应的卡片902,手机可跳转到如图11所示主卧对应的设备详情界面1101。该设备详情界面1101包括主卧中全部的设备。手机还可以在界面1101中显示弹窗1102,弹窗1102用于向用户推荐待控制的智能家居设备。比如,手机通过弹窗1102向用户推荐距离用户最近的音箱、电视机。
需要说明的是,以弹窗1102形式向用户推荐智能设备,可以是在如图11的界面1101中显示弹窗1102,也可以是在其他任何可能的界面中显示弹窗1102。比如,手机可以在图4所示的界面401中显示弹窗1102。
图12示出了图11所示设备控制场景对应的方法流程。如图12,该方法包括:
S301、第一电子设备100检测到用户打开智慧生活应用的操作。
S302、第一电子设备100向中枢设备400发送空间查询请求。
S303、中枢设备400向第一电子设备100反馈第一电子设备100所在空间的信息以及第二电子设备200所在空间的信息。
示例性的,第二电子设备200可以是待控制的智能家居设备。
需要说明的是,图12对应的方案中主要以第一电子设备100向中枢设备400查询第一电子设备100所在空间为例,在另一些实施例中,第一电子设备100还可以自行确定第一电子设备100所在空间,和/或,第一电子设备100可以确定第二电子设备200所在空间。
S304、第一电子设备100向第二电子设备200发送UWB信号。
第一电子设备100测量与第二电子设备200之间的距离,可以有多种方式,本实施例中,以第一电子设备100通过UWB模块收发UWB信号实现测距、定位为例。
S305、第二电子设备200向第一电子设备100发送UWB信号。
可以理解,第二电子设备200从第一电子设备100接收UWB信号后,需向第一电子设备100反馈UWB信号,以便第一电子设备100通过该反馈的UWB信号,确定第二电子设备200与第一电子设备100之间的距离。
S306、第一电子设备100根据UWB信号,确定第二电子设备200与第一电子设备100之间的距离。
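S304-S306所述的UWB测距,其核心是由信号的往返飞行时间估算距离。以下为一个双向测距(two-way ranging)距离计算的示意草图(其中的时长数值均为假设;实际UWB协议如IEEE 802.15.4z的测距流程更为复杂,此处仅示意基本原理):

```python
# 示意性草图:UWB双向测距的距离计算。
# t_round_s:第一电子设备从发出UWB信号到收到反馈信号的总时长(秒);
# t_reply_s:第二电子设备接收信号到发出反馈信号的应答处理时长(秒)。

C = 299_792_458  # 电磁波传播速度(光速),单位 m/s

def twr_distance(t_round_s, t_reply_s):
    """由往返时间扣除应答时延,得到单程飞行时间,再换算为距离(米)。"""
    tof = (t_round_s - t_reply_s) / 2  # 单程飞行时间
    return C * tof

# 假设往返总时长约233.36ns,其中应答处理占200ns,对应约5米的距离
d = twr_distance(233.36e-9, 200e-9)
```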
S307、第一电子设备100根据第二电子设备200与第一电子设备100之间的距离、第一电子设备100所在空间的信息以及第二电子设备200所在空间的信息,显示智慧生活应用的界面。
作为一种可能的实现方式,第一电子设备100根据第二电子设备200与第一电子设备100之间的距离、第一电子设备100所在空间的信息以及第二电子设备200所在空间的信息,识别用户当前所在空间中,距离用户位置最近的N(N为正整数)个第二电子设备(比如IoT设备)。示例性的,如图10的(a),用户当前在主卧内,第一电子设备100识别出当前距离用户最近的2个第二电子设备为音箱和电视机。之后,如图11所示,第一电子设备100可以显示智慧生活应用的界面1101,该界面1101包括弹窗1102,以便向用户推荐主卧内距离用户最近的音箱和电视机。
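上述“识别用户当前所在空间中距离用户最近的N个第二电子设备”的筛选逻辑,可用如下草图示意(设备列表及距离均为假设数据):

```python
# 示意性草图:在用户当前所在空间内,按距离筛选最近的N个待推荐设备。

def recommend_nearest(devices, user_space, n):
    """devices: [(设备名, 所在空间, 与用户的距离/米)] 列表;
    返回与用户同空间、且距离最近的n个设备名。"""
    in_space = [d for d in devices if d[1] == user_space]
    in_space.sort(key=lambda d: d[2])          # 按距离升序
    return [name for name, _, _ in in_space[:n]]

devices = [("音箱", "主卧", 1.2), ("电视机", "主卧", 2.0),
           ("空调", "主卧", 3.5), ("体脂秤", "卫生间", 6.0)]
top2 = recommend_nearest(devices, "主卧", 2)   # 对应图11中推荐音箱和电视机的情形
```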
上述主要以第一电子设备100通过独立弹窗向用户推荐待控制设备为例,在另一些实施例中,第一电子设备100还可以在界面中以特定UI效果显示待控制设备,以使得用户更加关注待控制设备。示例性的,如图13,在界面1103中,第一电子设备100将音箱和电视机对应的卡片以特定UI效果展示,以使得用户能更快速的注意到音箱和电视机的卡片,并通过相应卡片对音箱和电视机进行控制。
此外,上述主要以第一电子设备100在“主卧”界面(比如图13所示界面1103)中向用户推荐距离用户最近的智能设备为例进行说明,在另一些实施例中,第一电子设备100还可以通过其他界面,以其他形式向用户推荐智能设备,本申请实施例对此不做限制。
情况二:用户信息为用户行为信息。第一电子设备100可以获取用户行为信息,若检测到预设类型的用户行为,则可以向用户推荐用户行为相关的智能家居设备。
示例性的,第一电子设备100统计已经经过用户授权的用户行为信息,获知用户在打游戏时,通常会通过电视机等大屏设备进行操作。那么,如图14的(a)所示,当检测到用户打开游戏应用时,第一电子设备100可以在加载游戏界面1401时,弹出如图14的(b)所示弹窗1402,以向用户提示游戏体验更好的大屏设备。如图14的(b),当检测到用户点击“是”选项,则第一电子设备100控制在电视机上显示游戏界面,用户可以通过电视机进行游戏操作。
可选的,第一电子设备100可以向用户推荐用户行为关联的一个设备,也可以向用户推荐用户行为关联的多个设备。比如,第一电子设备100(手机)检测到目前连接有电视机、笔记本电脑,则在检测到用户打开游戏应用之后,手机可以以弹窗形式向用户推荐电视机、笔记本电脑。若用户选择笔记本电脑,则手机控制笔记本电脑上显示游戏的相关界面,用户可以通过笔记本电脑进行游戏。
可选的,图14的(b)所示弹窗1402还可以用悬浮窗等控件代替。可选的,第一电子设备100在显示悬浮窗的时长达到预设时长之后,停止显示该悬浮窗。
可选的,在本申请实施例中,第一电子设备100除了通过界面向用户推荐/提示用户行为关联的设备,还可以通过语音等方式向用户提示用户行为关联的设备,本申请实施例对具体的设备推荐方式不做限制。
情况三:用户信息为时间信息。第一电子设备100可以获取时间信息,若当前时间在目标时间段内,则可以向用户推荐该目标时间段关联的目标设备。
示例性的,第一电子设备100可以统计用户在各个时间点(或多个时间段)控制的设备,并可以将统计结果上报给中枢设备400,中枢设备400可以根据统计结果进行数据分析。用户下次在第一电子设备100上打开智慧生活应用时,第一电子设备100根据数据分析结果,把经常操作的设备显示在前面。
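上述“按时间统计控制记录并将经常操作的设备显示在前面”的思路,可用如下草图示意(控制记录为假设数据;实际统计应在获得用户授权后进行):

```python
from collections import Counter

# 示意性草图:统计某一时段内用户控制各设备的频次,并按频次降序排序,
# 以便智慧生活应用把该时段的常用设备显示在前面。

def rank_devices_for_hour(history, hour):
    """history: [(小时, 设备名)] 的控制记录;返回该小时内按使用频次降序的设备列表。"""
    counts = Counter(dev for h, dev in history if h == hour)
    return [dev for dev, _ in counts.most_common()]

# 假设用户早晨8点常拉开窗帘、偶尔开空调,晚上21点看电视
history = [(8, "窗帘"), (8, "窗帘"), (8, "空调"), (21, "电视机")]
ranked = rank_devices_for_hour(history, 8)
```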
需要说明的是,上述主要以用户信息为位置信息、用户行为信息、时间信息为例进行说明,用户信息还可以包括其他信息。第一电子设备100可以根据用户信息中的一个或多个信息,确定该一个或多个信息关联的待控制设备,并向用户提示/推荐该待控制设备。
本申请实施例还提供一种设备控制方法,该方法中,第一电子设备100可以获取用户信息,并根据用户信息向用户推荐智能场景(也可以称为执行场景)。以手机为第一电子设备100为例,示例性的,假设手机中安装有智慧生活应用,手机启动智慧生活应用,显示如图15中(a)所示的场景界面1501。手机检测到用户点击添加控件1502的操作后,显示如图15中(b)所示界面1503。
在界面1503中,手机检测到用户点击添加任务控件1505的操作后,显示如图15中(c)所示界面1507。在界面1507上,手机检测到用户点击智能设备控件1508的操作后,确定用户需要添加控制智能设备的任务,可显示可控制的智能设备供用户选择。假设用户选择的智能设备为空调,选择的执行任务为“关闭”,则如图15中(d)所示界面1509中附图标记1512所示,手机可以显示用户选择的空调对应的执行任务“关闭”。在界面1509,手机检测到用户点击附图标记1511所示控件的操作后,可根据用户的操作设置触发条件,比如,设置的触发条件为“早晨8点”。
作为一种可能的实现方式,界面1503中还可以包括推荐场景的卡片1504。该推荐场景是根据用户信息中的一个或多个信息确定的。比如,手机通过统计经过用户授权的用户信息,确定用户在家打开手机上的游戏应用时,通常会将游戏投屏到电视机上进行操作。那么,手机可以在用户添加智能场景时,向用户推荐“当在家打开手机上的游戏应用时,打开电视机”这一智能场景。比如,在场景创建界面1503中通过卡片1504向用户推荐该智能场景。当检测到用户点击卡片1504中的“确认添加”选项时,手机将该智能场景添加到待执行的智能场景中。
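上述根据用户信息挖掘“触发行为-设备控制”高频关联、进而生成推荐场景的思路,可用如下草图示意(其中的行为记录与阈值0.8均为假设参数):

```python
# 示意性草图:若统计发现某触发行为之后,用户以足够高的比例控制同一设备,
# 则生成“该行为发生时,执行该设备控制”的推荐场景(如卡片1504所示场景)。

def recommend_scene(records, trigger, threshold=0.8):
    """records: [(触发行为, 随后控制的设备)];
    当某设备在该触发行为后被控制的比例不低于阈值时,返回推荐场景,否则返回None。"""
    after = [dev for t, dev in records if t == trigger]
    if not after:
        return None
    top = max(set(after), key=after.count)     # 出现最多的设备
    if after.count(top) / len(after) >= threshold:
        return {"trigger": trigger, "action": f"打开{top}"}
    return None

# 假设统计到:10次在家打开游戏应用,其中9次随后将游戏投屏到电视机
records = [("在家打开游戏应用", "电视机")] * 9 + [("在家打开游戏应用", "音箱")]
scene = recommend_scene(records, "在家打开游戏应用")
```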
之后,如图15中(d)所示界面1509,手机检测到用户点击确认控件1510的操作后,确认用户当前场景创建完成。手机可以显示如图15中(e)所示界面1513,该界面1513包括用户手动创建的“早上关闭空调”场景的卡片1514以及手机推荐的“打开电视机”场景的卡片1515。
可以看出,第一电子设备根据用户信息,向用户推荐智能场景,并根据用户的指示添加该智能场景,使得用户无需手动输入智能场景的执行任务和触发条件,用户可以方便快捷的为第一电子设备添加智能场景,提升用户执行设备控制过程中的交互体验。
在一些实施例中,第一电子设备100可以对设备控制相关功能进行设置。图16示出了第一电子设备100的一种示例性的设置界面。如图16所示,设置界面可包括开关1601,当开关1601被打开,第一电子设备100可以执行上述设备控制方法,比如可以智能的向用户推荐距离最近的设备。再示例性的,第一电子设备100还可以设置开启智能推荐设备功能的时间段、使用场景等,本申请实施例对此不做限制。
图17示出了本申请实施例的另一方法流程。该方法应用于第一电子设备,该方法包括:
S401、在第一时刻,获取已经经过用户授权的第一用户信息。
S402、根据所述第一用户信息,显示第一界面。
其中,所述第一界面包括所述第一用户信息关联的m个第二电子设备的信息,m为大于1的整数。
S403、检测到作用于所述第一界面上用于控制所述m个第二电子设备中的n个电子设备执行第一目标操作的控件的第一操作,响应于所述第一操作,控制所述n个电子设备执行所述第一目标操作。
其中,n为不小于1且不大于m的整数。
S404、在第二时刻,获取已经经过用户授权的第二用户信息。
S405、根据所述第二用户信息,显示第二界面。
其中,所述第二界面包括所述第二用户信息关联的k个第二电子设备的信息;k为大于1的整数。
S406、检测到作用于所述第二界面上用于控制所述k个第二电子设备中的j个电子设备执行第二目标操作的控件的第二操作,响应于所述第二操作,控制所述j个电子设备执行所述第二目标操作。其中,j为不小于1且不大于k的整数。
应理解:所述第一用户信息和所述第二用户信息包括如下任一种或多种信息:位置信息,时间信息,行为信息。
示例性的,第一用户信息包括如下任一种或多种信息:用户在所述第一时刻执行的第一行为的信息、所述第一时刻的信息、用户在所述第一时刻所在家庭的信息、用户在所述第一时刻所处的第一空间的信息、用户在所述第一时刻与所述第二电子设备之间的距离信息;所述第二用户信息包括如下任一种或多种信息:用户在所述第二时刻执行的第二行为的信息、所述第二时刻的信息、用户在所述第二时刻所在家庭的信息、用户在所述第二时刻所处的第二空间的信息、用户在所述第二时刻与所述第二电子设备之间的距离信息。
上述方案中,第一电子设备显示第一界面,并根据用户在第一界面输入的第一操作,控制第二电子设备执行第一目标操作。之后,在第二时刻,第一电子设备可获取新的用户信息(即第二用户信息),并根据第二用户信息,自动智能的将显示的界面由第一界面切换到第二界面,使得用户可以通过第二界面控制第二用户信息关联的第二电子设备。可见,在用户信息发生变化时,第一电子设备可以获取最新的用户信息(第二用户信息),并根据第二用户信息,自动切换至与第二用户信息关联的界面,以满足用户的设备控制需求。该过程中,无需用户进行繁琐的界面切换操作,降低了用户的操作复杂度,进而缩短切换界面所需的时间,提升设备控制的效率。
可选的,若所述第一用户信息包括用户在所述第一时刻所在第一家庭的信息,则所述m个第二电子设备包括用户在所述第一时刻所在所述第一家庭中的电子设备。和/或,若所述第二用户信息包括用户在所述第二时刻所在第二家庭的信息,则所述k个第二电子设备包括用户在所述第二时刻所在所述第二家庭中的电子设备。其中,第一家庭与第二家庭相同或不同。
可选的,若所述第一用户信息还包括用户在所述第一家庭中第一时刻所处的第一空间的信息,则所述m个第二电子设备包括所述第一空间中的电子设备;所述第一界面不包括除所述m个第二电子设备之外的其他电子设备的信息。和/或,若所述第二用户信息还包括用户在所述第二时刻在所述第二家庭中所处的第二空间的信息,则所述k个第二电子设备包括所述第二空间中的电子设备;所述第二界面不包括除所述k个第二电子设备之外的其他电子设备的信息。其中,第一家庭与第二家庭相同或不同。
示例性的,如图8的(a),用户在第一时刻位于主卧(第一空间),则手机可以显示界面701(第一界面),界面701包括主卧内电子设备(第一用户信息关联的第二电子设备)的信息。并且,界面701不包括除主卧之外的其他空间内电子设备的信息。若检测到作用于界面701上用于控制主卧内灯开启(即第一目标操作)的操作(即第一操作),则手机控制灯开启。
上述主要以用户通过第一界面输入第一操作为例,可选的,在另一些示例中,第一操作还可以是在其他界面中输入的。比如,仍如图8的(a)所示,用户可以点击卡片702的空白区域,触发手机跳转到卡片702的详情界面。用户可以在详情界面中输入用于控制灯开启的操作(第一操作)。
如图8的(b),用户在第二时刻位于卫生间(第二空间),则手机可以显示界面704(第二界面),界面704包括卫生间内电子设备(第二用户信息关联的第二电子设备)的信息。并且,界面704不包括除卫生间之外的其他空间内电子设备的信息。
可选的,若所述第一用户信息还包括用户在所述第一时刻在所述第一家庭所处的第一空间的信息,则所述m个第二电子设备包括所述用户在所述第一时刻在所述第一空间中的电子设备。所述第一界面还包括除所述m个第二电子设备之外的其他电子设备的信息;用户在所述第一时刻所处第一空间内的电子设备的标识信息在所述第一界面上以预设用户界面UI效果突出显示,和/或用户在所述第一时刻在所处第一空间内的电子设备的标识信息在所述第一界面上排在其他电子设备的前面。和/或,若所述第二用户信息还包括用户在所述第二时刻在所述第二家庭所处的第二空间的信息,则所述k个第二电子设备包括所述第二空间中的电子设备。所述第二界面还包括除所述k个第二电子设备之外的其他电子设备的信息;所述用户在所述第二时刻所处第二空间内的电子设备的标识信息在所述第二界面上以预设UI效果突出显示,和/或所述用户在所述第二时刻在所处第二空间内的电子设备的标识信息在所述第二界面上排在其他电子设备的前面。
可选的,预设UI效果包括但不限于如下一种或多种效果:颜色效果、动画效果。
示例性的,如图9的(a),用户在第一时刻位于主卧(第一空间),则手机可以显示界面801(第一界面),界面801包括主卧内电子设备(第一用户信息关联的第二电子设备)的信息。并且,界面801还包括除主卧之外的其他空间(比如餐厅、客厅)内电子设备的信息。界面801中,主卧对应的卡片显示在其他空间的卡片之前,或者说,主卧内电子设备的信息显示在其他空间内电子设备的前面。
如图9的(b),用户在第二时刻位于卫生间(第二空间),则手机可以显示界面803(第二界面),界面803包括卫生间内电子设备(第二用户信息关联的第二电子设备)的信息。并且,界面803还包括除卫生间之外的其他空间内电子设备的信息。界面803中,卫生间对应的卡片显示在其他空间的卡片之前,或者说,卫生间内电子设备的信息显示在其他空间内电子设备的前面。
再示例性的,如图9的(a),用户在第一时刻位于主卧(第一空间),则手机可以显示界面801(第一界面),界面801包括主卧内电子设备(第一用户信息关联的第二电子设备)的信息。并且,界面801还包括除主卧之外的其他空间(比如餐厅、客厅)内电子设备的信息。界面801中,主卧对应的卡片是按照预设UI效果(比如加粗)显示的,或者说,主卧内第二电子设备的信息是按照预设UI效果显示的。
如图9的(b),用户在第二时刻位于卫生间(第二空间),则手机可以显示界面803(第二界面),界面803包括卫生间内电子设备(第二用户信息关联的第二电子设备)的信息。并且,界面803还包括除卫生间之外的其他空间内电子设备的信息。界面803中,卫生间对应的卡片是按照预设UI效果(比如加粗)显示的,或者说,卫生间内第二电子设备的信息是按照预设UI效果显示的。
可选的,若所述第一用户信息包括用户在所述第一时刻与第二电子设备之间的距离信息,则所述m个第二电子设备包括在所述第一时刻与所述用户相距较近的m个电子设备。和/或,若所述第二用户信息包括用户在所述第二时刻与第二电子设备之间的距离信息,则所述k个第二电子设备包括在所述第二时刻与所述用户相距较近的k个电子设备。
可选的,与用户相距较近的m个电子设备,可以是与用户之间距离相等或不相同的m个设备。
可选的,与用户相距较近的电子设备可以是与用户在同一空间内的设备。
示例性的,如图11,在所述第一时刻,假设用户在主卧,手机获知与所述用户相距最近的2个电子设备为主卧内的音箱和电视机,则手机可以显示界面1101(第一界面),用于向用户推荐电视机和音箱。再示例性的,在第二时刻,假设用户在卫生间,手机获知与所述用户相距最近的电子设备为卫生间内的体脂秤,则手机可以将显示的界面自动切换至包括体脂秤的界面,以便向用户推荐体脂秤。
可选的,所述m个第二电子设备的信息以弹窗形式显示在所述第一界面中,和/或所述k个第二电子设备的信息以弹窗形式显示在所述第二界面中。示例性的,仍如图11,电视机和音箱的信息以弹窗1102方式显示在界面1101中。
可选的,自显示所述弹窗开始的预设时长后,停止显示所述弹窗。其中,预设时长可以灵活设置。示例性的,仍如图11,自显示弹窗1102开始的5秒后,停止显示弹窗1102。
可选的,所述m个第二电子设备的信息以预设UI效果显示在所述第一界面中,或者,所述m个第二电子设备的信息显示在其他电子设备的信息之前。
和/或,所述k个第二电子设备的信息以预设UI效果显示在所述第二界面中。
示例性的,如图13,推荐的设备(音箱、电视机)以预设UI效果显示在界面1103中。
可选的,若所述第一用户信息包括所述用户在所述第一时刻执行的第一行为的信息,则所述m个第二电子设备是所述用户执行所述第一行为时,所述用户将要控制的m个电子设备。和/或,若所述第二用户信息包括所述用户在所述第二时刻执行的第二行为的信息,则所述k个第二电子设备是所述用户执行所述第二行为时,所述用户将要控制的k个电子设备。
示例性的,如图14的(a),在第一时刻,手机获知用户打开游戏应用(第一行为),则手机显示界面1401(第一界面),界面1401包括弹窗1402,弹窗1402包括用户打开游戏应用时,意图控制的电视机的信息。
再示例性的,在第二时刻,手机获知用户执行第二行为(假设第二行为关联的电子设备为扫地机器人),则手机自动将界面切换至包括扫地机器人的界面,以便向用户推荐扫地机器人。
可选的,若所述第一用户信息包括所述第一时刻的信息,则所述m个第二电子设备是在所述第一时刻,所述用户将要控制的m个电子设备。和/或,若所述第二用户信息包括所述第二时刻的信息,则所述k个第二电子设备是在所述第二时刻,所述用户将要控制的k个电子设备。
可选的,该方法还包括:根据所述第一用户信息,显示第三界面,所述第三界面用于向用户推荐目标执行场景。接收所述用户在所述第三界面上输入的第三操作,响应于所述第三操作,添加所述目标执行场景;当满足所述目标执行场景的触发条件时,执行所述目标执行场景。
示例性的,手机获取用户信息之后,获知用户在家打开手机上的游戏应用时,通常会将游戏投屏到电视机上进行操作,则手机可以显示如图15的(b)所示界面1503(第三界面),用于向用户推荐场景。在一些示例中,若检测到用户在界面1503上点击控件1504中的“确认添加”按钮(第三操作),则手机添加该推荐场景。后续,当满足该推荐场景的触发条件,即用户在家中打开手机上的游戏应用时,手机执行该推荐场景,控制打开电视机。
需要说明的是,第一电子设备还可以根据上述多种用户信息,确定用户想要控制的第二电子设备,并显示包含该第二电子设备信息的界面。
在一些示例中,第一电子设备根据用户的位置信息、行为信息和时间信息,确定用户想要控制的第二电子设备。比如,在时刻A,当检测到用户在客厅打开手机上的游戏应用时,手机可向用户推荐打开客厅的电视机,以便用户将游戏投屏到电视机上进行操作。在时刻B,当检测到用户在卧室打开手机上的游戏应用时,手机可向用户推荐打开卧室的电脑,以便用户在电脑上进行游戏操作。
在一些示例中,第一电子设备根据用户的位置信息和时间信息,确定用户想要控制的第二电子设备。比如,当检测到用户在时刻A在主卧打开智慧生活应用时,手机显示图18的(a)所示界面1801。界面1801包括主卧对应的卡片1802。卡片1802可包括主卧内部分或全部电子设备的信息。界面1801还可包括当前时刻A关联的设备卡片1803。卡片1803包括当前时刻A的部分常用设备的信息。用户可通过卡片1802和卡片1803快速便捷的查找想要控制的电子设备。
之后,当检测到用户在时刻B在卫生间打开智慧生活应用时,手机显示图18的(b) 所示界面1804。界面1804包括卫生间对应的卡片1805。卡片1805可包括卫生间内部分或全部电子设备的信息。界面1805还可包括当前时刻B关联的设备卡片1806。卡片1806包括当前时刻B的部分常用设备的信息。用户可通过卡片1805和卡片1806快速便捷的查找想要控制的电子设备。
上述仅列举了第一电子设备根据多个用户信息确定所显示界面的方式(所显示界面可包括多个信息分别关联的电子设备),第一电子设备还可以根据多个用户信息以及其他策略确定所显示界面,本申请实施例对此不做限制。比如,在另一些实施例中,可以为用户信息设置优先级。当检测到不同用户信息关联到不同的设备时,优先向用户推荐优先级高的用户信息关联的第二电子设备。或者,高优先级的用户信息关联的电子设备显示在其他电子设备的前面,或者,高优先级的用户信息关联的电子设备按照预设UI效果显示。
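上述“为用户信息设置优先级,当不同信息关联到不同设备时优先推荐高优先级信息关联的设备”的策略,可用如下草图示意(优先级顺序为假设示例):

```python
# 示意性草图:按用户信息类型的优先级,从多个候选推荐设备中选出优先展示的设备。
# PRIORITY 中数值越小优先级越高,此处假设行为信息 > 位置信息 > 时间信息。

PRIORITY = {"行为信息": 0, "位置信息": 1, "时间信息": 2}

def pick_recommendation(candidates):
    """candidates: {用户信息类型: 该信息关联的设备};返回最高优先级信息关联的设备。"""
    info_type = min(candidates, key=lambda t: PRIORITY.get(t, 99))
    return candidates[info_type]

# 假设位置信息关联音箱、行为信息关联电视机,则优先推荐电视机
device = pick_recommendation({"位置信息": "音箱", "行为信息": "电视机"})
```

除“择一推荐”外,也可按该优先级对界面中各设备信息排序,即高优先级信息关联的设备显示在前面。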
本申请实施例还提供一种设备控制方法,该方法应用于第一电子设备,如图19,该方法包括:
S501、获取已经经过用户授权的第一用户信息。
S502、根据所述第一用户信息,显示第三界面,所述第三界面用于向用户推荐目标执行场景。
示例性的,假设第一用户信息包括用户所执行行为的信息,手机获取第一用户信息之后,获知用户在家打开手机上的游戏应用时,通常会将游戏投屏到电视机上进行操作,则手机可以显示如图15的(b)所示界面1503(第三界面),用于向用户推荐“在家打开手机上的游戏应用时,打开电视机”这一目标执行场景。
S503、接收所述用户在所述第三界面上输入的第三操作,并根据所述第三操作,添加所述目标执行场景。
仍如图15的(b),若检测到用户在界面1503上点击控件1504中的“确认添加”按钮(第三操作),则手机添加该目标执行场景。
S504、当满足所述目标执行场景的触发条件时,执行所述目标执行场景。
可以理解,在添加上述目标执行场景后,若检测到满足上述目标执行场景的触发条件,即检测到用户在家中打开手机上的游戏应用,则手机执行该目标执行场景,控制打开电视机。
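上述目标执行场景的“触发条件-执行任务”结构,以及满足触发条件时执行任务的逻辑,可用如下草图示意(事件名与任务名均为假设):

```python
# 示意性草图:目标执行场景由触发条件和执行任务组成;
# 当检测到的事件满足触发条件时,执行对应任务(此处以记入列表示意“控制打开电视机”)。

class Scene:
    def __init__(self, trigger, action):
        self.trigger = trigger   # 触发条件,如“在家打开游戏应用”
        self.action = action     # 执行任务,如“打开电视机”

    def on_event(self, event, executed):
        """事件满足触发条件时执行任务。"""
        if event == self.trigger:
            executed.append(self.action)

executed = []
scene = Scene("在家打开游戏应用", "打开电视机")
scene.on_event("在家打开游戏应用", executed)  # 满足触发条件,执行任务
scene.on_event("离家", executed)              # 不满足,不执行
```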
在另一些实施例中,第一电子设备还可以在检测到第一用户信息之后,自动添加第一用户信息关联的目标执行场景。
在一些方案中,可以对本申请的多个实施例进行组合,并实施组合后的方案。可选的,各方法实施例的流程中的一些操作任选地被组合,并且/或者一些操作的顺序任选地被改变。并且,各流程的步骤之间的执行顺序仅是示例性的,并不构成对步骤之间执行顺序的限制,各步骤之间还可以是其他执行顺序。并非旨在表明所述执行次序是可以执行这些操作的唯一次序。本领域的普通技术人员会想到多种方式来对本文所述的操作进行重新排序。另外,应当指出的是,本文某个实施例涉及的过程细节同样以类似的方式适用于其他实施例,或者,不同实施例之间可以组合使用。
示例性的,图6A中,对于步骤S102和步骤S105之间的执行顺序不做限制。
此外,方法实施例中的某些步骤可等效替换成其他可能的步骤。或者,方法实施 例中的某些步骤可以是可选的,在某些使用场景中可以删除。或者,可以在方法实施例中增加其他可能的步骤。
并且,各方法实施例之间可以单独实施,或结合起来实施。
可以理解的是,本申请实施例中的电子设备为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。结合本申请中所公开的实施例描述的各示例的单元及算法步骤,本申请实施例能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。本领域技术人员可以对每个特定的应用来使用不同的方法来实现所描述的功能,但是这种实现不应认为超出本申请实施例的技术方案的范围。
本申请实施例可以根据上述方法示例对电子设备进行功能单元的划分,例如,可以对应各个功能划分各个功能单元,也可以将两个或两个以上的功能集成在一个处理单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。需要说明的是,本申请实施例中对单元的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
图20示出了本申请实施例中提供的智能设备控制装置的一种示意性框图,该装置可以为上述的第一电子设备或具有相应功能的组件。该装置1700可以以软件的形式存在,还可以为可用于设备的芯片。装置1700包括:处理单元1702和通信单元1703。可选的,通信单元1703还可以划分为发送单元(并未在图20中示出)和接收单元(并未在图20中示出)。其中,发送单元,用于支持装置1700向其他电子设备发送信息。接收单元,用于支持装置1700从其他电子设备接收信息。
可选的,装置1700还可以包括存储单元1701,用于存储装置1700的程序代码和数据,数据可以包括不限于原始数据或者中间数据等。
处理单元1702可以用于支持装置1700执行诸如图19中的S501等,和/或用于本文所描述的方案的其它过程。通信单元1703用于支持该装置1700和其他电子设备(例如上述第二电子设备等)之间的通信,例如支持执行图12中的S304等。
可选的,装置1700还可以包括输入单元(未在图20中示出),用于接收用户的输入信息,比如,接收用户输入的第一操作、第二操作等。
可选的,装置1700还可以包括显示单元(未在图20中示出),用于显示界面和/或其他信息。
一种可能的方式中,处理单元1702可以是控制器或图2所示的处理器310,例如可以是中央处理器(Central Processing Unit,CPU),通用处理器,数字信号处理(Digital Signal Processing,DSP),应用专用集成电路(Application Specific Integrated Circuit,ASIC),现场可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、晶体管逻辑器件、硬件部件或者其任意组合。其可以实现或执行结合本申请公开内容所描述的各种示例性的逻辑方框,模块和电路。处理器也可以是实现计算功能的组合,例如包含一个或多个微处理器组合,DSP和微处理器的组合等等。
一种可能的方式中,通信单元1703可以包括图2所示的无线通信模块360、还可以包括收发电路、收发器、射频器件等。
一种可能的方式中,存储单元1701可以是图2所示的存储器320。
本申请实施例还提供一种电子设备,包括一个或多个处理器以及一个或多个存储器。该一个或多个存储器与一个或多个处理器耦合,一个或多个存储器用于存储计算机程序代码,计算机程序代码包括计算机指令,当一个或多个处理器执行计算机指令时,使得电子设备执行上述相关方法步骤实现上述实施例中的智能设备控制方法。
本申请实施例还提供一种芯片系统,包括:处理器,所述处理器与存储器耦合,所述存储器用于存储程序或指令,当所述程序或指令被所述处理器执行时,使得该芯片系统实现上述任一方法实施例中的方法。
可选地,该芯片系统中的处理器可以为一个或多个。该处理器可以通过硬件实现也可以通过软件实现。当通过硬件实现时,该处理器可以是逻辑电路、集成电路等。当通过软件实现时,该处理器可以是一个通用处理器,通过读取存储器中存储的软件代码来实现。
可选地,该芯片系统中的存储器也可以为一个或多个。该存储器可以与处理器集成在一起,也可以和处理器分离设置,本申请并不限定。示例性的,存储器可以是非瞬时性处理器,例如只读存储器ROM,其可以与处理器集成在同一块芯片上,也可以分别设置在不同的芯片上,本申请对存储器的类型,以及存储器与处理器的设置方式不作具体限定。
示例性的,该芯片系统可以是现场可编程门阵列(field programmable gate array,FPGA),可以是专用集成芯片(application specific integrated circuit,ASIC),还可以是系统芯片(system on chip,SoC),还可以是中央处理器(central processor unit,CPU),还可以是网络处理器(network processor,NP),还可以是数字信号处理电路(digital signal processor,DSP),还可以是微控制器(micro controller unit,MCU),还可以是可编程控制器(programmable logic device,PLD)或其他集成芯片。
应理解,上述方法实施例中的各步骤可以通过处理器中的硬件的集成逻辑电路或者软件形式的指令完成。结合本申请实施例所公开的方法步骤可以直接体现为硬件处理器执行完成,或者用处理器中的硬件及软件模块组合执行完成。
本申请实施例还提供一种计算机可读存储介质,该计算机可读存储介质中存储有计算机指令,当该计算机指令在电子设备上运行时,使得电子设备执行上述相关方法步骤实现上述实施例中的智能设备控制方法。
本申请实施例还提供一种计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行上述相关步骤,以实现上述实施例中的智能设备控制方法。
另外,本申请的实施例还提供一种装置,该装置具体可以是组件或模块,该装置可包括相连的处理器和存储器;其中,存储器用于存储计算机执行指令,当装置运行时,处理器可执行存储器存储的计算机执行指令,以使装置执行上述各方法实施例中的智能设备控制方法。
其中,本申请实施例提供的电子设备、计算机可读存储介质、计算机程序产品或芯片均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
可以理解的是,为了实现上述功能,电子设备包含了执行各个功能相应的硬件和/或软件模块。结合本文中所公开的实施例描述的各示例的算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。本领域技术人员可以结合实施例对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本实施例可以根据上述方法示例对电子设备进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块可以采用硬件的形式实现。需要说明的是,本实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。上述描述的系统,装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的方法,可以通过其它的方式实现。例如,以上所描述的终端设备实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,模块或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:快闪存储器、移动硬盘、只读存储器、随机存取存储器、磁碟或者光盘等各种可以存储程序指令的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (13)

  1. 一种智能设备控制方法,应用于第一电子设备,其特征在于,所述方法包括:
    在第一时刻,获取已经经过用户授权的第一用户信息,并根据所述第一用户信息,显示第一界面,所述第一界面包括所述第一用户信息关联的m个第二电子设备的信息,m为大于1的整数;
    检测到作用于所述第一界面上用于控制所述m个第二电子设备中的n个电子设备执行第一目标操作的控件的第一操作,n为不小于1且不大于m的整数;
    响应于所述第一操作,控制所述n个电子设备执行所述第一目标操作;
    在第二时刻,获取已经经过所述用户授权的第二用户信息,并根据所述第二用户信息,显示第二界面,所述第二界面包括所述第二用户信息关联的k个第二电子设备的信息,k为大于1的整数;
    检测到作用于所述第二界面上用于控制所述k个第二电子设备中的j个电子设备执行第二目标操作的控件的第二操作,j为不小于1且不大于k的整数;
    响应于所述第二操作,控制所述j个电子设备执行所述第二目标操作。
  2. 根据权利要求1所述的方法,其特征在于,若所述第一用户信息包括所述用户在所述第一时刻所在第一家庭的信息,则所述m个第二电子设备包括所述用户在所述第一时刻所在所述第一家庭中的电子设备;
    和/或,若所述第二用户信息包括所述用户在所述第二时刻所在第二家庭的信息,则所述k个第二电子设备包括所述用户在所述第二时刻所在所述第二家庭中的电子设备。
  3. 根据权利要求2所述的方法,其特征在于,若所述第一用户信息还包括所述用户在所述第一家庭中第一时刻所处的第一空间的信息,则所述m个第二电子设备包括所述第一空间中的电子设备;所述第一界面不包括除所述m个第二电子设备之外的其他电子设备的信息;
    和/或,若所述第二用户信息还包括所述用户在所述第二时刻在所述第二家庭中所处的第二空间的信息,则所述k个第二电子设备包括所述第二空间中的电子设备;所述第二界面不包括除所述k个第二电子设备之外的其他电子设备的信息。
  4. 根据权利要求2所述的方法,其特征在于,若所述第一用户信息还包括所述用户在所述第一时刻在所述第一家庭所处的第一空间的信息,则所述m个第二电子设备包括所述用户在所述第一时刻所在所述第一空间中的电子设备,所述用户在所述第一时刻所处第一空间内的电子设备的标识信息在所述第一界面上以预设用户界面UI效果突出显示,和/或所述用户在所述第一时刻在所处第一空间内的电子设备的标识信息在所述第一界面上排在其他电子设备的前面;
    和/或,若所述第二用户信息还包括所述用户在所述第二时刻在所述第二家庭所处的第二空间的信息,则所述k个第二电子设备包括所述用户在所述第二时刻所在所述第二空间中的电子设备,所述用户在所述第二时刻所处第二空间内的电子设备的标识信息在所述第二界面上以预设UI效果突出显示,和/或所述用户在所述第二时刻在所处第二空间内的电子设备的标识信息在所述第二界面上排在其他电子设备的前面。
  5. 根据权利要求1所述的方法,其特征在于,若所述第一用户信息包括所述用户在所述第一时刻与第二电子设备之间的距离信息,则所述m个第二电子设备包括在所述第一时刻与所述用户相距较近的m个电子设备;
    和/或,若所述第二用户信息包括所述用户在所述第二时刻与第二电子设备之间的距离信息,则所述k个第二电子设备包括在所述第二时刻与所述用户相距较近的k个 电子设备。
  6. 根据权利要求5所述的方法,其特征在于,所述m个第二电子设备的信息以弹窗形式显示在所述第一界面中,和/或所述k个第二电子设备的信息以弹窗形式显示在所述第二界面中;
    所述方法还包括:自显示所述弹窗开始的预设时长后,停止显示所述弹窗。
  7. 根据权利要求5所述的方法,其特征在于,所述m个第二电子设备的信息以预设UI效果显示在所述第一界面中,和/或,所述k个第二电子设备的信息以预设UI效果显示在所述第二界面中。
  8. 根据权利要求1所述的方法,其特征在于,若所述第一用户信息包括所述用户在所述第一时刻执行的第一行为的信息,则所述m个第二电子设备是所述用户执行所述第一行为时,所述用户将要控制的m个电子设备;
    和/或,若所述第二用户信息包括所述用户在所述第二时刻执行的第二行为的信息,则所述k个第二电子设备是所述用户执行所述第二行为时,所述用户将要控制的k个电子设备。
  9. 根据权利要求1所述的方法,其特征在于,若所述第一用户信息包括第一时刻的信息,则所述m个第二电子设备是在所述第一时刻,所述用户将要控制的m个电子设备;
    和/或,若所述第二用户信息包括第二时刻的信息,则所述k个第二电子设备是在所述第二时刻,所述用户将要控制的k个电子设备。
  10. 根据权利要求1-9中任一项所述的方法,其特征在于,在获取已经经过用户授权的第一用户信息之后,所述方法还包括:
    根据所述第一用户信息,显示第三界面,所述第三界面用于向所述用户推荐目标执行场景。
  11. 根据权利要求10所述的方法,其特征在于,在显示第三界面之后,所述方法还包括:
    接收所述用户在所述第三界面上输入的第三操作,响应于所述第三操作,添加所述目标执行场景;
    当满足所述目标执行场景的触发条件时,执行所述目标执行场景。
  12. 一种电子设备,其特征在于,包括:处理器、存储器和显示屏,所述存储器和所述显示屏与所述处理器耦合,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,当所述处理器从所述存储器中读取所述计算机指令,使得所述电子设备执行如权利要求1-11中任意一项所述的方法。
  13. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质包括计算机程序,当所述计算机程序在电子设备上运行时,使得所述电子设备执行如权利要求1-11中任意一项所述的方法。
PCT/CN2023/082333 2022-03-18 2023-03-17 智能设备控制方法及电子设备 WO2023174429A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210270161.7A CN116804854A (zh) 2022-03-18 2022-03-18 智能设备控制方法及电子设备
CN202210270161.7 2022-03-18

Publications (1)

Publication Number Publication Date
WO2023174429A1 true WO2023174429A1 (zh) 2023-09-21

Family

ID=88022431

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/082333 WO2023174429A1 (zh) 2022-03-18 2023-03-17 智能设备控制方法及电子设备

Country Status (2)

Country Link
CN (1) CN116804854A (zh)
WO (1) WO2023174429A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105634881A (zh) * 2014-10-30 2016-06-01 腾讯科技(深圳)有限公司 应用场景推荐方法及装置
US20170068807A1 (en) * 2014-10-23 2017-03-09 Vivint, Inc. Interface of an automation system
CN106647313A (zh) * 2017-02-14 2017-05-10 长沙零冰电子科技有限公司 一种智能家居控制界面的显示方法及显示设备
CN106909396A (zh) * 2017-03-03 2017-06-30 宇龙计算机通信科技(深圳)有限公司 一种智能家居控制应用的界面显示方法及装置
CN108803444A (zh) * 2018-07-25 2018-11-13 北京小米智能科技有限公司 智能设备的控制方法、装置及存储介质
CN109710134A (zh) * 2018-12-29 2019-05-03 联想(北京)有限公司 一种显示方法和电子设备


Also Published As

Publication number Publication date
CN116804854A (zh) 2023-09-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23769931

Country of ref document: EP

Kind code of ref document: A1