CN116804854A - Intelligent device control method and electronic device - Google Patents


Info

Publication number
CN116804854A
CN116804854A (application CN202210270161.7A)
Authority
CN
China
Prior art keywords
user
information
electronic devices
interface
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210270161.7A
Other languages
Chinese (zh)
Inventor
高晓强
李乐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210270161.7A priority Critical patent/CN116804854A/en
Priority to PCT/CN2023/082333 priority patent/WO2023174429A1/en
Publication of CN116804854A publication Critical patent/CN116804854A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 - Systems controlled by a computer
    • G05B15/02 - Systems controlled by a computer electric
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/20 - Pc systems
    • G05B2219/26 - Pc applications
    • G05B2219/2642 - Domotique, domestic, home control, automation, smart house

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The application provides an intelligent device control method and an electronic device, and relates to the technical field of terminals. The application can automatically switch the device control interface used to control intelligent devices for the user, thereby simplifying the device control flow and improving the user experience. The method is applied to a first electronic device, and comprises the following steps: at a first moment, acquiring first user information that has been authorized by the user, and displaying a first interface according to the first user information, wherein the first interface comprises information of m second electronic devices associated with the first user information; detecting a first operation; controlling n electronic devices to execute a first target operation in response to the first operation; at a second moment, acquiring second user information that has been authorized by the user, and displaying a second interface according to the second user information, wherein the second interface comprises information of k second electronic devices associated with the second user information; detecting a second operation; and controlling j electronic devices to execute a second target operation in response to the second operation.

Description

Intelligent device control method and electronic device
Technical Field
The embodiment of the application relates to the technical field of communication, in particular to an intelligent device control method and electronic equipment.
Background
As technology advances, users own more and more devices. As shown in fig. 1A, in a home scenario, various devices in the home (such as audio and video devices, lighting system devices, environmental control devices, security devices, etc.) are connected through internet of things technology to form a smart home system, which enables centralized control of the devices and provides the user with functions such as home appliance control, lighting control, and anti-theft alarm.
However, because there are so many devices, a user who needs to operate a particular device must switch through multiple interfaces in the smart home application to find it, which is cumbersome and time-consuming.
Disclosure of Invention
In order to solve the technical problems, the embodiment of the application provides an intelligent device control method and electronic equipment. According to the technical scheme provided by the embodiment of the application, the equipment control interface for controlling the intelligent equipment can be automatically switched for the user, so that the equipment control flow is simplified, and the user experience is improved.
In order to achieve the technical purpose, the embodiment of the application provides the following technical scheme:
in a first aspect, an intelligent device control method is provided and is applied to a first electronic device. The method comprises the following steps:
At a first moment, acquiring first user information which is authorized by a user, and displaying a first interface according to the first user information, wherein the first interface comprises information of m second electronic devices associated with the first user information, and m is an integer greater than 1;
detecting a first operation of a control acting on the first interface for controlling n electronic devices in the m second electronic devices to execute a first target operation, wherein n is an integer not less than 1 and not more than m;
controlling the n electronic devices to execute the first target operation in response to the first operation;
at a second moment, acquiring second user information which has been authorized by the user, and displaying a second interface according to the second user information, wherein the second interface comprises information of k second electronic devices associated with the second user information, and k is an integer greater than 1;
detecting a second operation of a control which acts on the second interface and is used for controlling j electronic devices in the k second electronic devices to execute a second target operation, wherein j is an integer not less than 1 and not more than k;
and responding to the second operation, and controlling the j electronic devices to execute the second target operation.
It should be understood that the first user information and the second user information each include any one or more of the following: location information, time information, and behavior information.
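The claimed flow can be made concrete with a minimal Python sketch. All names here (`Device`, `Controller`, `interface_for`) are hypothetical illustrations, not part of the patent; the sketch only shows the shape of the first aspect: associate m devices with the acquired user information, then execute a target operation on n of them.

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    home: str
    space: str

class Controller:
    def __init__(self, devices):
        self.devices = devices

    def interface_for(self, user_info):
        # The "second electronic devices associated with the user information":
        # filter by home, and by space when the user info carries one.
        return [d for d in self.devices
                if d.home == user_info.get("home")
                and ("space" not in user_info or d.space == user_info["space"])]

    def execute(self, selected, operation):
        # Stand-in for sending the target operation to each selected device.
        return [(d.name, operation) for d in selected]

ctl = Controller([
    Device("lamp", "home_A", "living_room"),
    Device("speaker", "home_A", "bedroom"),
    Device("tv", "home_B", "living_room"),
])
# First moment: the user is in home_A, so the first interface shows m=2 devices.
first = ctl.interface_for({"home": "home_A"})
# First operation: control n=1 of those m devices.
result = ctl.execute(first[:1], "turn_on")
```

At the second moment the same two calls run again with the newly acquired user information, which is what lets the interface switch without any manual searching.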
In the prior art, the user needs to perform a series of interface operations to switch to the device control interface corresponding to the target home and target space, so the device control process is not intelligent enough. In contrast, the device control method provided by the application does not depend on manual searching by the user: it automatically and intelligently switches to the corresponding device control interface and, through that interface, recommends the second electronic devices associated with the user information. This simplifies user operation, improves device control efficiency, and improves the interactive experience of the user.
In one possible design, if the first user information includes information of a first home in which the user is located at the first time, the m second electronic devices include electronic devices in the first home in which the user is located at the first time;
and/or if the second user information includes information of a second family where the user is located at the second moment, the k second electronic devices include electronic devices in the second family where the user is located at the second moment.
It can be seen that, in the embodiment of the present application, the first electronic device may display, according to the location information of the user, a device control interface associated with the location of the user to the user, so that the user may conveniently control, through the device control interface, a second electronic device (such as an intelligent home device) near the location of the user.
In one possible design, if the first user information further includes information of a first space in which the user is located at a first moment in the first home, the m second electronic devices include electronic devices in the first space; the first interface does not include information of other electronic devices except the m second electronic devices;
and/or if the second user information further includes information of a second space in which the user is located in the second home at the second moment, the k second electronic devices include electronic devices in the second space; the second interface does not include information of other electronic devices except the k second electronic devices.
In one possible design, if the first user information further includes information of a first space where the user is located in the first home at the first time, the m second electronic devices include electronic devices where the user is located in the first space at the first time, the identification information of the electronic devices where the user is located in the first space at the first time is highlighted on the first interface with a preset UI effect, and/or the identification information of the electronic devices where the user is located in the first space at the first time is arranged in front of other electronic devices on the first interface;
and/or if the second user information further includes information of a second space where the user is located in the second home at the second moment, the k second electronic devices include electronic devices where the user is located in the second space at the second moment, the identification information of the electronic devices in the second space where the user is located at the second moment is highlighted on the second interface with a preset UI effect, and/or the identification information of the electronic devices where the user is located in the second space at the second moment is arranged in front of other electronic devices on the second interface.
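One plausible realization of the ordering rule in the design above is a simple partition: devices in the user's current space are listed first and flagged for the preset highlight effect. The data shapes and names below are assumed for illustration, not taken from the patent.

```python
def order_for_interface(devices, current_space):
    """List devices in the user's current space first and flag them for
    the preset highlight UI effect."""
    in_space = [(d, True) for d in devices if d["space"] == current_space]
    others = [(d, False) for d in devices if d["space"] != current_space]
    return in_space + others

rows = order_for_interface(
    [{"name": "tv", "space": "living_room"},
     {"name": "lamp", "space": "bedroom"},
     {"name": "speaker", "space": "bedroom"}],
    current_space="bedroom",
)
# bedroom devices come first and carry the highlight flag
```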
In one possible design, if the first user information includes distance information between the user and a second electronic device at the first time, the m second electronic devices include m electronic devices that are closer to the user at the first time;
and/or if the second user information includes distance information between the user and the second electronic device at the second moment, the k second electronic devices include k electronic devices that are closer to the user at the second moment.
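The distance-based design amounts to a nearest-k selection. The sketch assumes 2-D coordinates are already available from some positioning source; function and device names are illustrative.

```python
import math

def nearest_devices(user_pos, device_positions, k):
    """Keep the k devices closest (Euclidean distance) to the user."""
    ranked = sorted(device_positions.items(),
                    key=lambda item: math.dist(user_pos, item[1]))
    return [name for name, _ in ranked[:k]]

names = nearest_devices(
    user_pos=(0.0, 0.0),
    device_positions={"lamp": (1.0, 0.0), "tv": (5.0, 5.0), "speaker": (0.5, 0.5)},
    k=2,
)
```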
In one possible design, the information of the m second electronic devices is displayed in the first interface in a popup window form, and/or the information of the k second electronic devices is displayed in the second interface in a popup window form;
The method further comprises: stopping displaying the popup window after the popup window has been displayed for a preset duration.
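The popup-timeout behavior can be modeled as a tiny clock-driven state machine, which keeps the logic testable without a real UI timer. The `Popup` class is an illustrative stand-in, not the patent's implementation.

```python
class Popup:
    """Device-list popup that stops displaying after a preset duration."""
    def __init__(self, duration_s):
        self.duration_s = duration_s
        self.shown_at = None
        self.visible = False

    def show(self, now):
        self.shown_at, self.visible = now, True

    def tick(self, now):
        # Dismiss once the preset duration has elapsed since show().
        if self.visible and now - self.shown_at >= self.duration_s:
            self.visible = False

p = Popup(duration_s=5)
p.show(now=0)
p.tick(now=3)   # before the deadline: still visible
p.tick(now=5)   # deadline reached: dismissed
```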
In one possible design, the information of the m second electronic devices is displayed in the first interface with a preset UI effect, and/or the information of the k second electronic devices is displayed in the second interface with a preset UI effect.
In this way, the electronic device can prompt the user to pay attention to important information in the first interface and/or the second interface, thereby helping the user to more quickly find the second electronic device to be controlled from the interface.
In one possible design, if the first user information includes information of a first behavior performed by the user at the first time, the m second electronic devices are m electronic devices to be controlled by the user when the user performs the first behavior;
and/or if the second user information includes information of a second behavior executed by the user at the second moment, the k second electronic devices are k electronic devices to be controlled by the user when the user executes the second behavior.
In one possible design, if the first user information includes information of a first time, the m second electronic devices are m electronic devices to be controlled by the user at the first time;
and/or if the second user information includes information of a second moment, the k second electronic devices are k electronic devices to be controlled by the user at the second moment.
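The behavior- and time-based designs amount to predicting, from usage history, which devices the user tends to control at a given moment. A minimal frequency-count sketch, with assumed data shapes (the patent does not specify how the prediction is made):

```python
from collections import Counter

def predict_devices(history, hour, top=2):
    """From logged (hour, device) pairs, rank the devices the user most
    often controls within +/- 1 hour of the given hour."""
    counts = Counter(dev for h, dev in history if abs(h - hour) <= 1)
    return [dev for dev, _ in counts.most_common(top)]

history = [(22, "lamp"), (22, "tv"), (23, "lamp"), (7, "kettle"), (22, "lamp")]
suggested = predict_devices(history, hour=22)
```

A real system would learn richer patterns, but the interface contract is the same: user information in, a ranked list of second electronic devices out.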
In one possible design, after acquiring the first user information that has been authorized by the user, the method further includes:
and displaying a third interface according to the first user information, wherein the third interface is used for recommending a target execution scene to the user.
In one possible design, after displaying the third interface, the method further includes:
receiving a third operation input by the user on the third interface, and adding the target execution scene in response to the third operation;
and executing the target execution scene when the trigger condition of the target execution scene is met.
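The scene flow (adding the recommended scene in response to the user's third operation, then executing it once its trigger condition is met) can be sketched as follows; the classes are illustrative, not the patent's API.

```python
class Scene:
    def __init__(self, name, trigger, actions):
        self.name = name
        self.trigger = trigger      # predicate over a context dict
        self.actions = actions      # device operations to run

class SceneManager:
    def __init__(self):
        self.scenes = []
        self.executed = []

    def add(self, scene):
        # Corresponds to adding the recommended target execution scene
        # in response to the user's third operation.
        self.scenes.append(scene)

    def evaluate(self, context):
        # Execute every scene whose trigger condition is met.
        for scene in self.scenes:
            if scene.trigger(context):
                self.executed.extend(scene.actions)

mgr = SceneManager()
mgr.add(Scene("good night",
              trigger=lambda ctx: ctx["hour"] >= 23,
              actions=["lights_off", "lock_door"]))
mgr.evaluate({"hour": 22})   # condition not met: nothing runs
mgr.evaluate({"hour": 23})   # condition met: scene executes
```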
In a second aspect, an intelligent device control apparatus is provided, where the apparatus may be the first electronic device, or a component capable of implementing the functions of the first electronic device (for example, a chip system that supports the first electronic device in performing the corresponding functions). The apparatus comprises:
the processing unit is used for acquiring first user information which is authorized by a user at a first moment;
The display unit is used for displaying a first interface according to the first user information, wherein the first interface comprises information of m second electronic devices associated with the first user information, and m is an integer greater than 1;
an input unit, configured to detect a first operation of a control acting on the first interface to control n electronic devices of the m second electronic devices to perform a first target operation, where n is an integer not less than 1 and not more than m;
the processing unit is further used for responding to the first operation and controlling the n electronic devices to execute the first target operation;
the processing unit is also used for acquiring second user information which is authorized by the user at a second moment;
the display unit is further used for displaying a second interface according to the second user information, wherein the second interface comprises information of k second electronic devices associated with the second user information, and k is an integer greater than 1;
the input unit is further used for detecting a second operation of a control which acts on the second interface and is used for controlling j electronic devices in the k second electronic devices to execute a second target operation, wherein j is an integer which is not less than 1 and not more than k;
And the processing unit is also used for responding to the second operation and controlling the j electronic devices to execute the second target operation.
It should be appreciated that the first user information and the second user information include any one or more of the following: location information, time information, behavior information.
In one possible design, if the first user information includes information of a first home in which the user is located at the first time, the m second electronic devices include electronic devices in the first home in which the user is located at the first time;
and/or if the second user information includes information of a second family where the user is located at the second moment, the k second electronic devices include electronic devices in the second family where the user is located at the second moment.
In one possible design, if the first user information further includes information of a first space in which the user is located at a first moment in the first home, the m second electronic devices include electronic devices in the first space; the first interface does not include information of other electronic devices except the m second electronic devices;
and/or if the second user information further includes information of a second space in which the user is located in the second home at the second moment, the k second electronic devices include electronic devices in the second space; the second interface does not include information of other electronic devices than the k second electronic devices.
In one possible design, if the first user information further includes information of a first space where the user is located in the first home at the first time, the m second electronic devices include electronic devices where the user is located in the first space at the first time, the identification information of the electronic devices where the user is located in the first space at the first time is highlighted on the first interface with a preset UI effect, and/or the identification information of the electronic devices where the user is located in the first space at the first time is arranged in front of other electronic devices on the first interface;
and/or if the second user information further includes information of a second space where the user is located in the second home at the second moment, the k second electronic devices include electronic devices where the user is located in the second space at the second moment, the identification information of the electronic devices in the second space where the user is located at the second moment is highlighted on the second interface with a preset UI effect, and/or the identification information of the electronic devices where the user is located in the second space at the second moment is arranged in front of other electronic devices on the second interface.
In one possible design, if the first user information includes distance information between the user and a second electronic device at the first time, the m second electronic devices include m electronic devices that are closer to the user at the first time;
and/or if the second user information includes distance information between the user and the second electronic device at the second moment, the k second electronic devices include k electronic devices that are closer to the user at the second moment.
In one possible design, the information of the m second electronic devices is displayed in the first interface in a popup window form, and/or the information of the k second electronic devices is displayed in the second interface in a popup window form;
and the display unit is further configured to stop displaying the popup window after the popup window has been displayed for a preset duration.
In one possible design, the information of the m second electronic devices is displayed in the first interface with a preset UI effect, and/or the information of the k second electronic devices is displayed in the second interface with a preset UI effect.
In one possible design, if the first user information includes information of a first behavior performed by the user at the first time, the m second electronic devices are m electronic devices to be controlled by the user when the user performs the first behavior;
and/or if the second user information includes information of a second behavior executed by the user at the second moment, the k second electronic devices are k electronic devices to be controlled by the user when the user executes the second behavior.
In one possible design, if the first user information includes information of a first time, the m second electronic devices are m electronic devices to be controlled by the user at the first time;
and/or if the second user information includes information of a second moment, the k second electronic devices are k electronic devices to be controlled by the user at the second moment.
In a possible design, the display unit is further configured to display a third interface according to the first user information, where the third interface is configured to recommend a target execution scenario to the user.
In one possible design, the input unit is further configured to receive a third operation input by the user on the third interface;
the processing unit is further used for responding to the third operation and adding the target execution scene; and executing the target execution scene when the trigger condition of the target execution scene is met.
In a third aspect, an embodiment of the present application provides an electronic device, where the electronic device has a function of implementing the method for controlling an intelligent device as described in the first aspect and any one of possible implementation manners of the first aspect. The functions may be implemented by hardware, or by corresponding software executed by hardware. The hardware or software includes one or more modules corresponding to the functions described above.
In a fourth aspect, a computer-readable storage medium is provided. The computer readable storage medium stores a computer program (which may also be referred to as instructions or code) which, when executed by an electronic device, causes the electronic device to perform the method of the first aspect or any implementation of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product for causing an electronic device to perform the method of the first aspect or any one of the embodiments of the first aspect when the computer program product is run on the electronic device.
In a sixth aspect, an embodiment of the application provides circuitry comprising processing circuitry configured to perform the first aspect or the method of any one of the embodiments of the first aspect.
In a seventh aspect, an embodiment of the present application provides a chip system, including at least one processor and at least one interface circuit, where the at least one interface circuit is configured to perform a transceiver function and send an instruction to the at least one processor, and when the at least one processor executes the instruction, the at least one processor performs the method of the first aspect or any implementation manner of the first aspect.
Drawings
Fig. 1A is a schematic view of a home scenario provided in an embodiment of the present application;
FIG. 1B is a schematic diagram of coordinate transformation according to an embodiment of the present application;
fig. 2 is a schematic hardware structure of a first electronic device according to an embodiment of the present application;
fig. 3 is a schematic software structure of an electronic device according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an interface provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of an interface provided by an embodiment of the present application;
fig. 6A is a schematic flow chart of a control method of an intelligent device according to an embodiment of the present application;
fig. 6B is a schematic flow chart of a method for controlling an intelligent device according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a positioning method according to an embodiment of the present application;
FIGS. 8-11 are schematic diagrams illustrating interfaces provided by embodiments of the present application;
fig. 12 is a schematic flow chart of a method for controlling an intelligent device according to an embodiment of the present application;
FIGS. 13-16 are schematic diagrams illustrating interfaces provided by embodiments of the present application;
fig. 17 is a schematic flow chart of a method for controlling an intelligent device according to an embodiment of the present application;
FIG. 18 is a schematic diagram of an interface provided by an embodiment of the present application;
fig. 19 is a schematic flow chart of a method for controlling an intelligent device according to an embodiment of the present application;
fig. 20 is a schematic structural diagram of an intelligent device control apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. In the description of the embodiments, the terminology used below is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms (for example, "one or more"), unless the context clearly indicates otherwise. It should also be understood that in the following embodiments of the present application, "at least one" and "one or more" mean one or more (including two).
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise. The term "coupled" includes both direct and indirect connections, unless stated otherwise. The terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In embodiments of the application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
An embodiment of the present application provides a control method for an intelligent device, and fig. 1A is a schematic diagram of an intelligent device control system to which the method is applicable. As shown in fig. 1A, the smart device control system may manage smart devices in units of households. One home may be called a whole house, and the whole house may be divided into different spaces; for example, the whole house includes an entrance hallway, a kitchen, a dining room, a living room, a balcony, a primary bedroom, a secondary bedroom, a bathroom, and the like.
The full-house system may comprise a first electronic device 100, the first electronic device 100 being for controlling a second electronic device 200, such as an internet of things (IoT) device. The first electronic device 100 includes, but is not limited to, a mobile phone, a PC, a tablet computer, etc. As one possible implementation, the first electronic device 100 may be installed with an application for controlling the second electronic device 200. The application may be a system pre-installed application or a non-pre-installed application (such as an application downloaded from an application market). It should be understood that a system pre-installed application may be part of a system application (such as a service, component, or plug-in within a system application), or a separate application pre-installed in the first electronic device with its own application icon, for example an intelligent living application.
Alternatively, the first electronic device 100 may also control the second electronic device 200 through a control center. For example, the control center may be a shortcut control page displayed by the first electronic device 100 in response to a user's operation to slide down from the upper right corner or top of the screen.
Alternatively, the first electronic device 100 may also control the second electronic device 200 through a corresponding function menu in the negative one-screen. For example, the negative one screen may be a displayed system service capability entry page of the first electronic device 100 in response to a user's right-hand sliding operation on the leftmost main interface.
The whole house is provided with at least one third electronic device 300. Illustratively, each room or region includes at least one third electronic device 300.
Optionally, the third electronic device 300 is configured to locate the second electronic device 200, and/or the first electronic device 100, and/or the user, and report the location information of the second electronic device 200, and/or the location information of the first electronic device 100, and/or the location information of the user to the hub device 400.
In some examples, the third electronic device 300 may include a sensor responsible for collecting the spatial location information of the user. For example, the third electronic device 300 may be a camera that captures image information of other devices and/or users and determines the location of each device and/or user from that image information. Illustratively, the third electronic device 300 mainly detects, via the sensor, the space in which the user is located. Alternatively, the third electronic device 300 may collect the spatial location information of the user in real time, periodically, or according to other strategies.
In one example, the third electronic device 300 includes an ultra-wideband (UWB) module and/or a millimeter-wave radar module. The third electronic device 300 locates the second electronic device 200 and/or the first electronic device 100 via the UWB module, or alternatively via the millimeter-wave radar module. In other examples, the third electronic device 300 includes a wireless fidelity (Wi-Fi) module by which it locates the second electronic device 200 and/or the first electronic device 100. In other examples, the third electronic device 300 may combine several of the above modules for positioning.
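Ranging modules such as UWB give distances to known anchors; a standard way to turn three such distances into a 2-D position is linearized trilateration. The sketch below is generic textbook math, not the patent's positioning method.

```python
import math

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve a 2-D position from distances r1..r3 to three known anchors
    p1..p3, by subtracting the circle equations pairwise (linearization)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a = 2 * (x2 - x1); b = 2 * (y2 - y1)
    c = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2); e = 2 * (y3 - y2)
    f = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    x = (c * e - f * b) / (e * a - b * d)
    y = (c * d - a * f) / (b * d - a * e)
    return x, y

# Anchors at three corners of a room; the true position is (2, 3),
# so the measured ranges are the exact anchor-to-target distances.
target = (2.0, 3.0)
x, y = trilaterate((0, 0), math.dist((0, 0), target),
                   (10, 0), math.dist((10, 0), target),
                   (0, 10), math.dist((0, 10), target))
```

Real ranging is noisy, so a deployed system would use more anchors and a least-squares or filtering step, but the geometry is the same.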
Optionally, the third electronic device 300 may also detect at least one of the physiological characteristics, identity class, and posture of the user, and upload the detected information to the hub device 400 in a wired or wireless manner.
The whole house is also provided with second electronic devices 200 (e.g., IoT devices). The second electronic device 200 may also be referred to as a controlled device, and may be controlled by the first electronic device 100. For example, the kitchen is provided with an electric cooker or electric pressure cooker, a gas appliance, and the like; the living room is provided with a sound box (such as a smart sound box), a television (such as a smart television, also called a smart screen, large screen, etc.), a routing device, and the like; the balcony is provided with a clothes hanger (such as a smart clothes hanger); the dining room is provided with a sweeping robot and the like; the master bedroom is provided with a television (such as a smart television), a sound box (such as a smart sound box), a floor lamp (such as a smart floor lamp), a routing device, and the like; the secondary bedroom is provided with a desk lamp (such as a smart desk lamp), a sound box (such as a smart sound box), and the like; the bathroom is provided with a body fat scale and the like.
It should be noted that, although in fig. 1A the second electronic device 200 is shown only as a smart television, those skilled in the art should understand that the second electronic device 200 includes, but is not limited to, smart televisions, smart speakers, smart lamps (e.g., ceiling lamps, smart desk lamps, aromatherapy lamps, etc.), sweeping robots, body fat scales, smart clothes hangers, smart electric cookers, air purifiers, humidifiers, desktop computers, routing devices, smart sockets, water dispensers, smart refrigerators, smart air conditioners, smart switches, smart door locks, and other smart home devices. It should also be noted that the second electronic device 200 may be a portable device rather than a smart home device, such as a personal computer (PC), a tablet computer, a mobile phone, or a smart remote control. The embodiment of the present application does not limit the specific form of the second electronic device 200.
Optionally, the system may also include a hub device 400. The hub device 400 is also known as a hub, central control system, host, etc. In some examples, the hub device 400 may be configured to receive information (e.g., location information) transmitted by the third electronic device 300. The hub device 400 may determine the space in which the user and/or the first electronic device 100 is located, and determine a specific location (e.g., coordinates) of the user and/or the first electronic device 100 in that space, based on the location information and the house layout information. Optionally, the hub device 400 also notifies or controls the second electronic device 200 based on the received information, including but not limited to positioning information. For example, when a user wakes up a smart speaker by voice, the hub device 400 notifies or controls the one or more smart speakers nearest to the user to wake up, based on the locations of the multiple smart speakers throughout the house. As another example, when a user moves from one room to another in the house, the hub device 400 controls the smart speaker in the room the user left to stop playing audio, and controls the smart speaker in the room the user entered to start playing (e.g., continue playing) the audio.
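The nearest-speaker selection described above can be sketched as a simple distance comparison in the full-house coordinate system. This is an illustrative sketch only, not the patent's implementation; the speaker names and coordinates are hypothetical.

```python
import math

def nearest_speakers(user_pos, speakers, count=1):
    """Return the `count` speakers closest to the user.

    `user_pos` and each speaker's "pos" are (x, y, z) tuples expressed
    in the same full-house coordinate system.
    """
    return sorted(speakers, key=lambda s: math.dist(user_pos, s["pos"]))[:count]

# Hypothetical speaker registry with full-house coordinates
speakers = [
    {"name": "living-room", "pos": (1.0, 2.0, 0.5)},
    {"name": "master-bedroom", "pos": (6.0, 3.0, 0.5)},
]
# A user standing near (5.5, 3.2, 1.0) is closest to the master-bedroom speaker
print(nearest_speakers((5.5, 3.2, 1.0), speakers)[0]["name"])  # → master-bedroom
```

The same comparison, run whenever the user's position changes rooms, would drive the audio hand-off example: stop playback on the speaker that is no longer nearest and resume it on the new nearest one.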
Alternatively, the hub device 400 may be further configured to construct a full-house map according to the house layout, establish a full-house coordinate system, and convert the position information acquired by each third electronic device 300 into the full-house coordinate system. In this way, the position information of the second electronic device 200 and/or the first electronic device 100 and/or the user detected by each third electronic device 300 may be converted into the full-house coordinate system, and the specific position of the second electronic device 200 or the user in the full house may be determined.
As one possible implementation, the third electronic device establishes a coordinate system (referred to as a first coordinate system) as shown in fig. 1B (a), where O_e is the origin, X_e is the X axis, Y_e is the Y axis, and Z_e is the Z axis. The hub device 400 establishes a full-house coordinate system as shown in fig. 1B (b) from the full-house layout, where O_h is the origin, X_h is the X axis, Y_h is the Y axis, and Z_h is the Z axis.
Alternatively, the first coordinate system may be converted into the full-house coordinate system, and the coordinates of a point in the first coordinate system may be converted into coordinates in the full-house coordinate system. For example, as shown in fig. 1B, after the hub device 400 receives the coordinate information of point O_b from the third electronic device 300, it can obtain, by means of vectors, the coordinates of the point O_b' corresponding to O_b in the full-house coordinate system. Specifically, the distance between two points is the same in different coordinate systems, but the direction of the vector formed by the two points may be represented differently in different coordinate systems. For example, to convert the coordinates of point O_b in the first coordinate system into the coordinates of the corresponding point O_b' in the full-house coordinate system, a vector transformation may be used. Illustratively, the length of the vector from O_e to O_b in the first coordinate system and the length of the vector from O_h to O_b' in the full-house coordinate system are the same (both L), but the direction of the former as expressed in the first coordinate system differs from the direction of the latter as expressed in the full-house coordinate system. By obtaining the relative direction change between the first coordinate system and the full-house coordinate system, the direction of the vector expressed in the full-house coordinate system can be derived from its direction expressed in the first coordinate system; then, combining the coordinates of points O_e and O_b in the first coordinate system with the coordinates of O_h in the full-house coordinate system, the coordinates of the point O_b' corresponding to O_b in the full-house coordinate system can be obtained. In this way, the hub device 400 may convert the coordinate information acquired by the third electronic device 300 into coordinates in the full-house coordinate system.
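The vector-based conversion just described can be sketched as a rigid-body transform: apply the relative direction change (a rotation) to the point's coordinates in the first coordinate system, then add the offset of the first system's origin as expressed in the full-house system. The rotation matrix and translation vector below are hypothetical inputs standing in for the calibration data the hub device would actually derive.

```python
def to_full_house(point, rotation, translation):
    """Convert a point from the first coordinate system into the
    full-house coordinate system: p' = R @ p + t.

    `rotation` is a 3x3 matrix (list of rows) encoding the relative
    direction change between the two systems; `translation` is the
    first system's origin expressed in full-house coordinates.
    """
    rotated = [sum(rotation[i][j] * point[j] for j in range(3)) for i in range(3)]
    return [rotated[i] + translation[i] for i in range(3)]

# Example: first system rotated 90 degrees about Z relative to the
# full-house system, with its origin at (10, 5, 0) in full-house terms
R = [[0, -1, 0],
     [1,  0, 0],
     [0,  0, 1]]
t = [10.0, 5.0, 0.0]
print(to_full_house([1.0, 0.0, 0.0], R, t))  # → [10.0, 6.0, 0.0]
```

Because a rotation preserves lengths, the distance between any two converted points equals their distance in the first coordinate system, consistent with the observation above that only the direction representation changes.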
Optionally, the hub device 400 may further convert the position information obtained by other electronic devices into a full-house coordinate system, and the conversion method may refer to the above method, which is not described herein.
The coordinate conversion method (vector method) of the same point in different coordinate systems is merely illustrative, and the present application is not limited to the coordinate conversion method.
Optionally, the third electronic device 300 communicates with the second electronic device 200 by wired or wireless means.
Alternatively, the second electronic device 200 and the third electronic device 300 may be connected to the hub device 400 in a wired (e.g., power line communication (PLC)) and/or wireless (e.g., Wi-Fi, Bluetooth) manner. It will be appreciated that the manners in which the second electronic device 200 and the third electronic device 300 are connected to the hub device 400 may be the same or different. For example, both the second electronic device 200 and the third electronic device 300 are connected to the hub device 400 wirelessly. Alternatively, the second electronic device 200 is connected to the hub device 400 wirelessly, and the third electronic device 300 is connected to the hub device 400 by wire. Or, devices such as the smart sound box, smart television, body fat scale, and sweeping robot among the second electronic devices 200 are connected to the hub device 400 wirelessly (e.g., via Wi-Fi), while devices such as the smart desk lamp, smart clothes hanger, and smart door lock among the second electronic devices 200 are connected to the hub device 400 by wire (e.g., via PLC).
Alternatively, the hub device of each room or area and the hub device of the whole house may exist separately, or may be integrated with the third electronic device or the first electronic device into one device, or may be integrated with both the third electronic device and the first electronic device into one device. The application is not limited in this regard.
In one example, the system further includes a routing device (such as a router). The routing device is used to connect to a local area network or the Internet, using a specific protocol to select and set the path for transmitted signals. Illustratively, one or more routers in the whole house form a local area network, or access the local area network or the Internet. The second electronic device 200 and/or the third electronic device 300 access the router and perform data transmission with devices in the local area network or on the Internet through the Wi-Fi channel established by the router. In one embodiment, the hub device 400 may be integrated with the routing device as one device. For example, the hub device 400 and the routing device are integrated into a single routing device, i.e., the routing device has the functionality of the hub device 400. The routing device may be one or more of the primary and secondary routing devices, or may be an independent routing device.
The above is merely an example of a system to which the device control method is applied, and more, fewer, or different device layout positions may be included in the system.
Illustratively, fig. 2 shows a schematic structural diagram of a first electronic device 100.
As shown in fig. 2, the first electronic device 100 may include a processor 310, a memory 320, a universal serial bus (USB) interface 330, a power module 340, a UWB module 350, a wireless communication module 360, and the like. Optionally, the first electronic device 100 may further include an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, an earphone interface 370D, a display 380, etc. Optionally, the first electronic device 100 may also include a sensor module 390 and the like.
It is to be understood that the structure illustrated in fig. 2 does not constitute a specific limitation on the first electronic device 100. In other embodiments of the application, the first electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interface connection relationship between the modules shown in fig. 2 is merely illustrative, and is not limited to the configuration of the first electronic device 100. In other embodiments of the present application, the first electronic device 100 may also use a different interface from that of fig. 2, or a combination of multiple interfaces.
The processor 310 may include one or more processing units, and the different processing units may be separate devices or integrated into one or more processors. For example, the processor 310 is a central processing unit (CPU), but may also be an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present application, such as one or more digital signal processors (DSPs) or one or more field-programmable gate arrays (FPGAs).
Memory 320 may be used to store computer-executable program code that includes instructions. For example, the memory 320 may also store data processed by the processor 310. In addition, the memory 320 may include high-speed random access memory, and may also include nonvolatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash memory (universal flash storage, UFS), and the like. The processor 310 performs various functional applications and data processing of the first electronic device 100 by executing instructions stored in the memory 320 and/or instructions stored in a memory provided in the processor.
The wireless communication module 360 may provide solutions for wireless communication applied to the first electronic device 100, including wireless local area network (WLAN) (e.g., wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), ZigBee, etc. The wireless communication module 360 may be one or more devices integrating at least one communication processing module. The wireless communication module 360 receives electromagnetic waves via an antenna, performs frequency modulation and filtering on the electromagnetic wave signal, and transmits the processed signal to the processor 310. The wireless communication module 360 may also receive a signal to be transmitted from the processor 310, frequency modulate and amplify it, and convert it into electromagnetic waves for radiation via an antenna. It should be noted that the number of antennas of the wireless communication module 360, the UWB module 350, and the millimeter wave radar module 160 shown in fig. 2 is only illustrative. It will be appreciated that the wireless communication module 360, the UWB module 350, and the millimeter wave radar module 160 may include more or fewer antennas, as embodiments of the present application are not limited in this respect.
The UWB module 350 may provide a solution for wireless communication based on UWB technology applied to the first electronic device 100. For example, the distance between the first electronic device 100 and the second electronic device 200 (e.g., an IoT device) may be obtained, in conjunction with a positioning algorithm, by detecting the UWB signal, calculating its time of flight in the air, and multiplying that duration by the propagation speed of the UWB signal in the air (e.g., the speed of light). As a further example, the first electronic device 100 may also determine the direction of the second electronic device 200 relative to the first electronic device 100 (i.e., the direction of the UWB signal) based on the phase differences of the UWB signal transmitted by the second electronic device 200 as received at different antennas of the first electronic device 100.
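The time-of-flight ranging described above can be sketched as a minimal calculation, assuming a two-way exchange where the responder's reply delay is known (the function name and the specific timing values are illustrative, not from the patent):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s, propagation speed of the UWB signal

def uwb_distance(round_trip_time_s, reply_delay_s=0.0):
    """Estimate device-to-device distance from a two-way UWB exchange.

    The one-way time of flight is half the round-trip time after
    subtracting the responder's known reply delay; multiplying by the
    propagation speed gives the distance in meters.
    """
    tof = (round_trip_time_s - reply_delay_s) / 2
    return tof * SPEED_OF_LIGHT

# A round trip of about 33.36 ns (no reply delay) corresponds to roughly 5 m
print(round(uwb_distance(33.356e-9), 2))  # → 5.0
```

In practice UWB chipsets measure these timestamps in hardware; the sketch only shows why sub-nanosecond timing resolution is needed, since 1 ns of error already corresponds to about 30 cm of distance.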
The USB interface 330 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 330 may be used to connect a charger to charge the first electronic device 100, and may also be used to transfer data between the first electronic device 100 and a peripheral device.
The power module 340 is used to power various components of the first electronic device 100, such as the processor 310, the memory 320, and the like.
The first electronic device 100 may implement audio functionality, such as audio playback and recording, through the audio module 370, the speaker 370A, the receiver 370B, the microphone 370C, the earphone interface 370D, the application processor, and the like.
The audio module 370 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 370 may also be used to encode and decode audio signals. In some embodiments, the audio module 370 may be disposed in the processor 310, or some of the functional modules of the audio module 370 may be disposed in the processor 310.
The speaker 370A, also known as a "horn," is used to convert audio electrical signals into sound signals. The first electronic device 100 may play audio through the speaker 370A.
The receiver 370B, also referred to as an "earpiece," is used to convert an audio electrical signal into a sound signal.
The microphone 370C, also referred to as a "mic," is used to convert sound signals into electrical signals. The user can speak near the microphone 370C to input a sound signal into it.
The earphone interface 370D is used to connect wired earphones. The earphone interface 370D may be the USB interface 330, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The display 380 is used to display images, videos, and the like. The display 380 includes a display panel.
Optionally, the sensor module 390 includes an inertial measurement unit (inertial measurement unit, IMU) module or the like. The IMU module may include gyroscopes, accelerometers, and the like. Gyroscopes and accelerometers may be used to determine the motion pose of the first electronic device 100.
Optionally, the first electronic device 100 further comprises a filter (e.g., a Kalman filter). Illustratively, the outputs of the IMU module and the UWB module 350 may be superimposed, and the superimposed signal input to the Kalman filter for filtering, thereby reducing errors.
The structures of the first electronic device 100, the third electronic device 300, and the hub device 400 may all refer to the description of fig. 2. For example, such a device may have more or fewer components than shown in fig. 2, may combine certain components, may split certain components, or may have a different arrangement of components.
Optionally, the software system of an electronic device (such as the first electronic device, the third electronic device, a second electronic device (e.g., an IoT device), or the hub device) may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. Embodiments of the invention take the Android system with a layered architecture as an example to illustrate the software architecture of the electronic device.
Taking the first electronic device as an example, fig. 3 is a block diagram of a software structure of the first electronic device 100 according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime (Android Runtime) and system libraries, and the kernel layer.
The application layer may include a series of applications.
As shown in fig. 3, the applications may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
In the embodiment of the application, the applications also include a smart home management application and a basic service. The basic service opens the management capability for smart devices to the system. The smart home management application may invoke the basic service to query the smart home devices to be controlled and/or invoke the basic service to control the smart home devices.
For example, the smart home management application may be the smart life application, or another application with similar functionality. The smart home management application may be a system native application or a third-party application; the embodiment of the application does not limit the category of the smart home management application. The following embodiments mainly take the smart life application as an example.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether there is a status bar, lock the screen, capture the screen, and the like. The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc. The view system includes visual controls, such as controls for displaying text and controls for displaying pictures, and may be used to build applications. A display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view for displaying text and a view for displaying a picture. The telephony manager is used to provide communication functions of the first electronic device 100, such as management of call status (including connected, hung up, etc.). The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files. The notification manager allows an application to display notification information in the status bar; it can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to give notice of download completion, message alerts, etc. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, a text message is prompted in the status bar, a prompt tone is emitted, the terminal vibrates, or an indicator light blinks.
The Android Runtime includes core libraries and a virtual machine, and is responsible for scheduling and management of the Android system. The core libraries consist of two parts: one part is the functions that the Java language needs to call, and the other part is the core libraries of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc. The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications. Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
It should be noted that, the software architecture shown in fig. 3 is only one software architecture applicable to the electronic device, and the embodiment of the application does not limit the software architecture of the electronic device. Alternatively, some of the functional modules may be located in a different hierarchy than that shown in FIG. 3. For example, the base service may also be provided in a framework layer, to which embodiments of the present application are not limited.
The technical solution of the embodiment of the application can be applied to various device control scenarios. The following description mainly takes controlling smart home devices through the smart life application on a mobile phone as an example, but the scenarios to which the technical solution applies are not limited thereto.
In the embodiment of the application, the mobile phone can acquire user information authorized by the user in real time and present a corresponding device control interface to the user based on that information, so as to meet the user's current device control requirements. The user information includes, but is not limited to, location information, time information, and behavior information. The technical solution of the embodiment of the application is illustrated below through several cases:
Case one: the user information is location information. In this case, the mobile phone acquires the location information of the user, and presents a corresponding device control interface to the user according to the location information of the user.
Illustratively, when the user Jack returns home (Jack's home) and opens the smart life application, and the mobile phone detects that the user is currently located in Jack's home, the device control interface 401 corresponding to "Jack's home" in the smart life application is displayed, as shown in fig. 4.
In contrast, in the related art shown in fig. 5 (a), after the user Jack returns home (Jack's home), the user still needs to manually click the control 403 in the interface 402 and select "Jack's home" in the pop-up home management window 404 shown in fig. 5 (b); that is, the user must operate the interface multiple times to switch to the device control interface 401 corresponding to the current home.
The device control method according to the embodiment of the present application is described in detail as follows. Fig. 6A illustrates a flow of a device control method according to an embodiment of the present application, including:
S101, the first electronic device 100 establishes a connection with the routing device.
The first electronic device 100 may be a device such as a mobile phone used to control smart home devices. The first electronic device 100 may control the smart home devices in different manners; in this embodiment, the first electronic device controls the smart home devices through the smart life application.
It should be noted that, the routing device (such as a router) may be integrated with the hub device 400, or may be a device that is separately disposed. In this embodiment, the routing device and the hub device 400 are independent devices.
S102, the routing equipment reports the network information of the first electronic equipment 100 to the central equipment 400.
In the embodiment of the present application, after the routing device establishes a connection with the first electronic device 100 (such as a mobile phone), the network information of the first electronic device 100 may be reported to the hub device 400. Network information includes, but is not limited to, the name, identification, etc. of the network. For example, the routing device reports the name of the Wi-Fi network to which the first electronic device 100 is connected to the hub device 400.
S103, the hub device 400 determines the information of the home where the first electronic device 100 is located according to the network information of the first electronic device 100.
Illustratively, the routing device reports the name of the Wi-Fi network to which the first electronic device 100 is connected to the hub device 400, and since the Wi-Fi network names corresponding to different homes are generally different, the hub device 400 may determine which home the first electronic device 100 is currently located in.
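The home resolution just described amounts to a lookup from the reported network name to a registered home. The sketch below is illustrative only; the mapping table, SSIDs, and function name are hypothetical, standing in for whatever registry the hub device maintains.

```python
# Hypothetical registry mapping Wi-Fi network names to registered homes
NETWORK_TO_HOME = {
    "Jack_Home_WiFi": "Jack's home",
    "Parents_WiFi": "Jack's parents' home",
}

def home_for_network(ssid):
    """Resolve which home a device is in from its connected Wi-Fi SSID,
    falling back to "unknown" for unregistered networks."""
    return NETWORK_TO_HOME.get(ssid, "unknown")

print(home_for_network("Jack_Home_WiFi"))  # → Jack's home
```

Because the mapping relies on network names being distinct across homes, a real system would key on a stable identifier (e.g., the router's identity) rather than the SSID alone when names collide.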
S104, the third electronic device 300 measures the distance between itself and the first electronic device 100.
The third electronic device 300 may include a sensor responsible for collecting spatial location information of the user, or may include a communication module such as a UWB or Wi-Fi module and implement positioning of the first electronic device 100 through that communication module. In this embodiment, positioning by a UWB module is described as an example. Typically, a plurality of third electronic devices 300 are used to locate the first electronic device 100.
For example, assuming that the master bedroom includes third electronic devices 300 as shown in fig. 7, each third electronic device 300 transmits a UWB signal to the first electronic device 100 (such as a mobile phone) through its UWB module and waits for the UWB signal fed back by the first electronic device 100. The third electronic device 300 may detect the time of flight (ToF) of the UWB signals and calculate its distance r from the first electronic device 100 according to the time of flight.
S105, the third electronic device 300 reports the distance between itself and the first electronic device 100 to the hub device 400.
Illustratively, each third electronic device 300 shown in fig. 7 reports its distance r from the first electronic device 100 to the hub device 400.
S106, the hub device 400 determines information of the space where the first electronic device 100 is located according to the distance information between the third electronic devices 300 and the first electronic device 100 and the house layout.
Illustratively, the hub device 400 has a coordinate system as shown in fig. 7, with coordinate origin O. According to the distance r3 between the third electronic device 300 in the upper left corner and the first electronic device 100, the hub device 400 determines a sphere centered on that device with radius r3; according to the distance r1 between the third electronic device 300 in the upper right corner and the first electronic device 100, it determines a sphere centered on that device with radius r1; and according to the distance r2 between the middle third electronic device 300 and the first electronic device 100, it determines a sphere centered on that device with radius r2. The hub device 400 takes the intersection point A of the three spheres as the location of the first electronic device 100. In some examples, the location of the first electronic device 100 may be regarded as the location of the user.
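Finding the intersection point of the ranging spheres is the classic trilateration problem. As an illustrative sketch (not the patent's implementation), the planar version below recovers a point from three anchor positions and measured distances by subtracting the circle equations pairwise, which yields a linear system; the anchor coordinates are hypothetical.

```python
import math

def trilaterate_2d(anchors, distances):
    """Locate a point in the plane from three anchor positions and
    measured distances (a simplified, planar analogue of intersecting
    the three ranging spheres)."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    # Subtracting the circle equations pairwise removes the quadratic
    # terms and leaves a 2x2 linear system in (x, y)
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # nonzero when anchors are not collinear
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# Three hypothetical anchors; distances measured to the true point (2, 1)
anchors = [(0.0, 0.0), (5.0, 0.0), (0.0, 4.0)]
truth = (2.0, 1.0)
dists = [math.dist(a, truth) for a in anchors]
print(trilaterate_2d(anchors, dists))  # → (2.0, 1.0)
```

In three dimensions a fourth anchor (or a height constraint) is needed for a unique solution, which is one reason multiple third electronic devices per space are deployed.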
S107, the first electronic device 100 detects an operation of opening the smart life application by the user.
The smart life application opened by the user may be one that was already running in the background or one launched to run in the foreground.
The user may open the smart life application through an operation interface, for example, the user clicks an icon of the smart life application on the desktop to trigger the first electronic device 100 to open the smart life application. Or the user opens the intelligent life application through the voice instruction.
S108, the first electronic device 100 transmits a location query request to the hub device 400. The location query request is for querying a location of a user.
Wherein the location of the user comprises a plurality of dimensions. When the smart home devices are managed in units of households, the location of the user may include a household in which the user is located and/or a space in which the user is located in the household.
As one possible implementation, the first electronic device 100 invokes a basic service through the smart life application, and the basic service invokes the communication module through the corresponding driver to send the location query request to the hub device 400.
In some examples, step S108 is an optional step. For example, after acquiring information about the home of the user and the space in which the user is located, the hub device 400 may actively send the location information to the first electronic device 100 periodically or according to other policies. The first electronic device 100 may store location information of a home and a space where a user is located, and may update the location information of the home and the space where the user is located. In this case, when the first electronic device 100 detects that the user opens the smart life application, S110 may be performed.
S109, the hub device 400 feeds back the location information of the user to the first electronic device 100.
As one possible implementation, the hub device 400 feeds back to the first electronic device 100 information of the home in which the user is located, and/or information of the space in which the user is located in the home.
S110, the first electronic device 100 displays a first interface of the smart life application according to the location information of the user.
The first interface is a control interface of a second electronic device (such as an IoT device) corresponding to a home where the user is currently located.
Taking the first electronic device 100 being a mobile phone as an example, if the user Jack returns home (Jack's home) and opens the smart life application, and the mobile phone detects that the user is currently located at Jack's home, the mobile phone automatically and intelligently displays the device control interface 401 (first interface) corresponding to "Jack's home" in the smart life application, as shown in fig. 4. In this way, the user can control the devices in the home through the device control interface 401. With this device control method, the user does not need to perform multiple operations in the interface, so the complexity of user operation is reduced and device management efficiency is improved.
In some embodiments, the user may control smart home devices at the granularity of "space". Illustratively, suppose that the last time the user used the smart life application, the user switched to the device control interface under the "space" tab. In response to the user opening the smart life application this time, as shown in fig. 8 (a), the mobile phone displays, according to information of the space where the user is currently located (the master bedroom), a device control interface 701 under a "space" tab 703, where the device control interface 701 includes a device control card 702 corresponding to the master bedroom. In this way, the user can control the smart home devices in the master bedroom through the device control card 702 corresponding to the master bedroom.
Alternatively, as shown in fig. 8 (a), the card 702 may include devices in the master bedroom, such as a light, a curtain, and the like. Optionally, the card 702 may also include tasks corresponding to the devices in the master bedroom, such as a temperature control task, an air purification task, and the like. In one possible design, the card includes as many of the devices in the master bedroom and their corresponding tasks as possible. In this way, the user can preview most of the devices and corresponding tasks in the master bedroom via the card, which facilitates selecting, from among those devices and tasks, the device to be controlled and the task that the device is required to perform.
Optionally, the device control interface 701 may further include a button 706, and the user may click the button 706 to display the space cards of the whole house, for example, cards corresponding to spaces such as the bathroom, the secondary bedroom, or the balcony.
Thereafter, if the user moves and the mobile phone learns that the user is located in the bathroom, then when the user opens the smart life application in the bathroom, the mobile phone may display a device control interface 704 as shown in fig. 8 (b), the device control interface 704 including a card 705 corresponding to the bathroom. In one example, the card 705 includes as many of the devices in the bathroom and the corresponding tasks of each device as possible, to facilitate user operation of the smart home devices in the bathroom.
As another example, assuming that the mobile phone currently displays the interface corresponding to the "device" tab of the smart life application, after detecting that the user clicks the "space" tab 703, the mobile phone displays, according to the user's current position (the master bedroom), a device control interface 701 under the "space" tab 703 as shown in fig. 8 (a), where the device control interface 701 includes the device control card 702 corresponding to the master bedroom. The user can control the smart home devices in the master bedroom through the device control card 702 corresponding to the master bedroom.
It can be seen that, in the embodiment of the application, in the scenario where the user controls smart home devices through the smart life application of the mobile phone, the mobile phone can display to the user the device control interface of the devices in the space according to the information of the space where the user is located, so that the user can conveniently control the smart home devices in that space through this interface.
The above description takes as an example the case where the device control interface under the "space" tab includes only the card of the space where the user is currently located. In other embodiments, the device control interface under the "space" tab may include both the card of the space where the user is currently located and cards of other spaces. Moreover, the mobile phone may display the card of the user's current space in a specific manner to draw the user's attention to it. Optionally, the mobile phone may display the card corresponding to the current space at the top of the device control interface, and/or display the card with a specific user interface (UI) effect.
For example, if the user switched to the device control interface corresponding to the "space" tab the last time the smart life application was used, then when the user opens the smart life application this time, assuming the mobile phone learns that the user is in the master bedroom, the mobile phone may display the device control interface 801 as shown in fig. 9 (a). In one example, the interface 801 includes a card 802 corresponding to the master bedroom, displayed at the top of the interface 801, so that the user notices the card and controls the smart home devices in the master bedroom through operations on the card.
Thereafter, as in fig. 9 (b), the mobile phone learns that the user is located in the bathroom, and then, when the user opens the smart life application in the bathroom, the mobile phone may display the device control interface 803 as shown in fig. 9 (b). In one example, in this interface 803, the card 804 corresponding to the bathroom is presented at the top of the interface 803, to facilitate user operation to control the smart home devices in the bathroom.
As a further example, if the user switched to the device control interface corresponding to the "space" tab the last time the smart life application was used, then when the user opens the smart life application this time, assuming the mobile phone learns that the user is in the master bedroom, the mobile phone may display the device control interface 901 as shown in fig. 10 (a). In one example, in this interface 901, the card 902 corresponding to the master bedroom is presented with a specific UI effect (such as a flashing frame, a color change, etc.), to facilitate user operation to control the smart home devices in the master bedroom.
Thereafter, as shown in fig. 10 (b), the mobile phone learns that the user is located in the bathroom, and then, when the user opens the smart life application in the bathroom, the mobile phone may display the device control interface 903 as shown in fig. 10 (b). In one example, in this interface 903, the card 904 corresponding to the bathroom is presented with a specific UI effect.
The above description mainly takes as an example the case where the hub device 400 calculates the location information of the user; in other embodiments, the first electronic device 100 may also calculate the location information of the user. Fig. 6B shows a further flow of a device control method according to an embodiment of the application, the method comprising the steps of:
S201, the first electronic device 100 establishes a connection with the routing device.
For the specific implementation of steps S201 to S206, reference may be made to the related description of the corresponding embodiment of fig. 6A.
S202, the routing device reports the network information of the first electronic device 100 to the hub device 400.
S203, the hub device 400 determines information of the family where the first electronic device 100 is located according to the network information of the first electronic device 100.
S204, the third electronic device 300 measures the distance between the third electronic device 300 and the first electronic device 100.
S205, the third electronic device 300 reports the distance between the third electronic device 300 and the first electronic device 100 to the hub device 400.
S206, the first electronic device 100 detects an operation of opening the smart life application by the user.
S207, the first electronic device 100 transmits a location query request to the hub device 400. The location query request is used to query the location information of the user.
S208, the hub device 400 feeds back to the first electronic device 100 the information of the home where the user is located, the information of the distances between the first electronic device 100 and the third electronic devices 300, and the floor plan.
Illustratively, the hub device 400 feeds back to the first electronic device 100 the information that the user is currently at "Jack's home", the distances r1, r2, r3 between the first electronic device 100 and the third electronic devices 300 as shown in fig. 7, and the floor plan.
Alternatively, after receiving the location query request from the first electronic device 100, the hub device 400 may send, to the first electronic device 100, the information of the home in which the user is located, and information for determining the space in which the user is located (for example, the distances r1, r2, r3 and the floor plan). Alternatively, the hub device 400 may receive different requests from the first electronic device 100 and feed back different information based on the different requests. For example, the hub device 400 receives a home query request from the first electronic device 100, and feeds back information of the home in which the user is located to the first electronic device 100 based on the home query request; the hub device 400 receives a space query request from the first electronic device 100, and feeds back information for determining the space in which the user is located (such as the distances r1, r2, r3 and the floor plan described above) to the first electronic device 100 based on the space query request. Alternatively, the hub device 400 actively transmits the information of the home in which the user is located and/or the information for determining the space in which the user is located to the first electronic device 100.
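The hub's handling of these request variants might be dispatched as sketched below. The request names and the in-memory state layout are illustrative assumptions, not the actual protocol.

```python
# Hypothetical hub-side dispatch over the three query variants described
# above: a combined location query, a home-only query, and a space query.
def handle_request(request_type: str, state: dict) -> dict:
    if request_type == "home_query":
        return {"home": state["home"]}
    if request_type == "space_query":
        return {"distances": state["distances"],
                "floor_plan": state["floor_plan"]}
    if request_type == "location_query":  # combined variant
        return {"home": state["home"],
                "distances": state["distances"],
                "floor_plan": state["floor_plan"]}
    raise ValueError(f"unknown request type: {request_type}")

hub_state = {"home": "Jack's home",
             "distances": {"r1": 2.1, "r2": 3.4, "r3": 1.8},
             "floor_plan": "<floor-plan data>"}
```

Whether the hub answers queries or pushes updates proactively (as in the optional step S108 variant) changes only who initiates the exchange, not the payload.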
S209, the first electronic device 100 determines information of the space where the first electronic device 100 is located according to the distance information between the third electronic device 300 and the first electronic device 100 and the floor plan.
The first electronic device 100 determines that the intersection point A of the three spheres is the location of the first electronic device 100 according to r1, r2, r3 and the floor plan shown in fig. 7.
S210, the first electronic device 100 displays a first interface of the smart life application according to the information of the home where the user is located and the information of the space where the user is located.
Wherein the first interface includes information of a second electronic device (e.g., ioT device) of the space in which the user is located.
As shown in fig. 8, for example, if the first electronic device 100 detects that the user indicates switching to the "space" tab 703 while the user is in the master bedroom of the home (Jack's home), the first electronic device 100 may automatically display the first interface 701, where the first interface 701 is the device control interface corresponding to the master bedroom and includes the card 702 corresponding to the master bedroom, so that the user can control the smart home devices in the master bedroom through the card 702. That is, the user can conveniently and intuitively obtain the device control card corresponding to the current space without searching and selecting among cumbersome cards of multiple spaces, which simplifies the user's operation.
The embodiment of the application also provides a device control method, and the first electronic device 100 can automatically and intelligently recommend the intelligent home device to be controlled to the user.
Illustratively, as in fig. 10 (a), the user clicks the card 902 corresponding to the master bedroom, and the mobile phone can jump to the corresponding device details interface 1101 of fig. 11. The device details interface 1101 includes all devices in the master bedroom. The mobile phone may also display a popup window 1102 in the interface 1101, where the popup window 1102 is used to recommend to the user the smart home devices to be controlled. For example, the mobile phone recommends, through the popup window 1102, the speaker and the television closest to the user.
It should be noted that, the smart device may be recommended to the user in the form of a popup window 1102, and the popup window 1102 may be displayed in the interface 1101 of fig. 11, or may be displayed in any other possible interface. For example, the handset may display a pop-up window 1102 in the interface 401 shown in fig. 4.
Fig. 12 shows a method flow corresponding to the device control scenario shown in fig. 11. As in fig. 12, the method comprises:
S301, the first electronic device 100 detects an operation of opening the smart life application by the user.
S302, the first electronic device 100 sends a space query request to the hub device 400.
S303, the hub device 400 feeds back the information of the space where the first electronic device 100 is located and the information of the space where the second electronic device 200 is located to the first electronic device 100.
The second electronic device 200 may be, for example, a smart home device to be controlled.
It should be noted that, in the scheme corresponding to fig. 12, the first electronic device 100 mainly queries the hub device 400 for the space where the first electronic device 100 is located; in other embodiments, the first electronic device 100 may also determine by itself the space where it is located, and/or the first electronic device 100 may determine the space where the second electronic device 200 is located.
S304, the first electronic device 100 transmits a UWB signal to the second electronic device 200.
The distance between the first electronic device 100 and the second electronic device 200 can be measured in various manners; in this embodiment, ranging and positioning are implemented by the first electronic device 100 transmitting and receiving UWB signals through its UWB module, as an example.
S305, the second electronic device 200 transmits a UWB signal to the first electronic device 100.
It will be appreciated that after the second electronic device 200 receives the UWB signal from the first electronic device 100, the UWB signal needs to be fed back to the first electronic device 100, so that the first electronic device 100 determines the distance between the second electronic device 200 and the first electronic device 100 through the fed back UWB signal.
S306, the first electronic device 100 determines the distance between the second electronic device 200 and the first electronic device 100 according to the UWB signal.
S307, the first electronic device 100 displays an interface of the smart life application according to the distance between the second electronic device 200 and the first electronic device 100, the information of the space where the first electronic device 100 is located, and the information of the space where the second electronic device 200 is located.
As one possible implementation, the first electronic device 100 identifies, according to the distance between the second electronic device 200 and the first electronic device 100, the information of the space where the first electronic device 100 is located, and the information of the space where the second electronic device 200 is located, the N (N is a positive integer) second electronic devices (such as IoT devices) that are closest to the user's location in the space where the user is currently located. Illustratively, as in fig. 10 (a), the user is currently in the master bedroom, and the first electronic device 100 recognizes that the 2 second electronic devices currently closest to the user are the speaker and the television. Thereafter, as shown in fig. 11, the first electronic device 100 may display an interface 1101 of the smart life application, the interface 1101 including a popup window 1102 to recommend to the user the speaker and the television closest to the user within the master bedroom.
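The selection just described reduces to filtering by space and sorting by distance. A minimal sketch, with a hypothetical tuple representation of the information gathered in S303-S306:

```python
# Pick the N devices in the user's current space that are closest to the
# user, as candidates to recommend via the popup window.
def recommend_nearest(devices, user_space, n=2):
    """devices: list of (name, space, distance_to_user_m) tuples."""
    in_space = [d for d in devices if d[1] == user_space]
    in_space.sort(key=lambda d: d[2])  # nearest first
    return [name for name, _, _ in in_space[:n]]

devices = [
    ("speaker", "master bedroom", 1.2),
    ("television", "master bedroom", 2.5),
    ("ceiling light", "master bedroom", 3.0),
    ("washing machine", "bathroom", 4.0),
]
nearest = recommend_nearest(devices, "master bedroom")  # speaker, television
```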
In the above example, the first electronic device 100 recommends the devices to be controlled to the user through a separate popup window; in other embodiments, the first electronic device 100 may also display the devices to be controlled with a specific UI effect in the interface, so that the user pays more attention to them. As illustrated in fig. 13, in the interface 1103, the first electronic device 100 displays the cards corresponding to the speaker and the television with a specific UI effect, so that the user can notice the cards of the speaker and the television more quickly and control the speaker and the television through the corresponding cards.
In addition, the foregoing description mainly takes as an example the case where the first electronic device 100 recommends the smart devices closest to the user in the "home" interface (such as the interface 1103 shown in fig. 13); in other embodiments, the first electronic device 100 may also recommend smart devices to the user in other forms through other interfaces, which is not limited in the embodiments of the present application.
Case two: the user information is user behavior information. The first electronic device 100 may acquire user behavior information, and if a preset type of user behavior is detected, may recommend to the user the smart home devices related to that user behavior.
For example, the first electronic device 100 analyzes the user behavior information that the user has authorized, and learns that the user typically operates through a large-screen device such as a television when playing a game. Then, as shown in fig. 14 (a), when detecting that the user opens a game application, the first electronic device 100 may, while loading the game interface 1401, pop up the popup window 1402 shown in fig. 14 (b) to prompt the user that a larger-screen device offers a better game experience. As in fig. 14 (b), when it is detected that the user clicks the "yes" option, the first electronic device 100 controls the television to display the game interface, through which the user can perform game operations.
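The learned association between a behavior and the devices worth recommending can be sketched as a simple rule table; the rule contents here are hypothetical examples, and a real system would derive them from the authorized usage statistics.

```python
# Illustrative mapping from a detected user behavior to associated devices.
BEHAVIOR_RULES = {
    "open_game_app": ["television", "laptop"],
    "start_video_call": ["television"],
}

def devices_for_behavior(behavior, connected_devices):
    """Recommend only the associated devices that are currently reachable,
    mirroring the check that a television/laptop is connected."""
    return [d for d in BEHAVIOR_RULES.get(behavior, [])
            if d in connected_devices]
```

For instance, when only the television is connected, opening the game application would yield a single recommendation; with both screens connected, the popup could offer a choice, as in the paragraph below.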
Alternatively, the first electronic device 100 may recommend to the user one device associated with the user behavior, or may recommend multiple devices associated with the user behavior. For example, when the first electronic device 100 (mobile phone) detects that a television and a notebook computer are currently connected, the mobile phone may recommend both the television and the notebook computer to the user in a popup window after detecting that the user opens the game application. If the user selects the notebook computer, the mobile phone controls the notebook computer to display the relevant game interface, and the user can play the game through the notebook computer.
Alternatively, the popup window 1402 shown in fig. 14 (b) may be replaced with a control such as a floating window. Optionally, after the duration of displaying the floating window reaches the preset duration, the first electronic device 100 stops displaying the floating window.
Optionally, in the embodiment of the present application, the first electronic device 100 may recommend/prompt to the user, through an interface, the devices associated with the user behavior, or the first electronic device 100 may prompt the user about the devices associated with the user behavior by voice, or the like.
Case three: the user information is time information. The first electronic device 100 may acquire time information, and if the current time is within a target time period, may recommend to the user the target devices associated with the target time period.
For example, the first electronic device 100 may count the devices controlled by the user at various time points (or time periods) and report the statistics to the hub device 400, and the hub device 400 may perform data analysis on the statistics. The next time the user opens the smart life application on the first electronic device 100, the first electronic device 100 displays the frequently operated devices first according to the data analysis result.
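The time-based ranking can be sketched as a frequency count over usage records for the current time period. The record format is a hypothetical simplification of the statistics reported to the hub device.

```python
from collections import Counter
from datetime import datetime

def rank_devices(usage_log, now=None):
    """Order devices by how often the user controlled them during the
    current hour of day, so frequently used ones are displayed first.

    usage_log: list of (device_name, hour_of_day) records.
    """
    hour = (now or datetime.now()).hour
    counts = Counter(dev for dev, h in usage_log if h == hour)
    return [dev for dev, _ in counts.most_common()]

log = [("air conditioner", 8), ("light", 8),
       ("air conditioner", 8), ("television", 20)]
morning_order = rank_devices(log, datetime(2024, 1, 1, 8))
```

A production system would likely use coarser periods (morning/evening) and decay old records, but the display-order decision is the same sort by frequency.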
The above description mainly takes the user information being location information and user behavior information as examples; the user information may also include other information. The first electronic device 100 may determine, according to one or more pieces of the user information, the devices to be controlled that are associated with the one or more pieces of information, and prompt/recommend the devices to be controlled to the user.
The embodiment of the present application further provides a device control method, where the first electronic device 100 may acquire user information and recommend an intelligent scene (which may also be referred to as an execution scene) to the user according to the user information. Taking the mobile phone as the first electronic device 100 as an example, assume that a smart life application is installed in the mobile phone, and the mobile phone starts the smart life application and displays a scene interface 1501 as shown in fig. 15 (a). After detecting the operation of the user clicking the add control 1502, the mobile phone displays an interface 1503 as shown in fig. 15 (b).
In the interface 1503, when the mobile phone detects the user clicking the add task control 1505, it displays the interface 1507 shown in fig. 15 (c). On the interface 1507, after detecting the user clicking the smart device control 1508, the mobile phone determines that the user needs to add a task to control a smart device, and may display the controllable smart devices for the user to select. Assuming that the smart device selected by the user is an air conditioner and the selected execution task is "off", the mobile phone may display the execution task "off" corresponding to the air conditioner selected by the user, as shown by reference numeral 1512 in the interface 1509 of fig. 15 (d). At the interface 1509, after the mobile phone detects the user's operation of clicking the control shown by reference numeral 1511, a trigger condition may be set according to the user's operation; for example, the set trigger condition is "8 a.m.".
As one possible implementation, a card 1504 recommending a scene may also be included in the interface 1503. The recommended scene is determined from one or more pieces of the user information. For example, the mobile phone analyzes the user information authorized by the user, and determines that when the user opens a game application on the mobile phone at home, the user will typically cast the game to the television to operate. Then, when the user adds an intelligent scene, the mobile phone may recommend to the user the intelligent scene "turn on the television when the game application on the mobile phone is opened at home". For example, the intelligent scene is recommended to the user through the card 1504 in the scene creation interface 1503. When it is detected that the user clicks the "confirm add" option in the card 1504, the mobile phone adds the intelligent scene to be executed.
Then, as shown in the interface 1509 of fig. 15 (d), after detecting the operation of the user clicking the confirm control 1510, the mobile phone confirms that the user's current scene creation is complete. The mobile phone may display an interface 1513 as shown in fig. 15 (e), the interface 1513 including a card 1514 for the "turn off the air conditioner in the morning" scene manually created by the user and a card 1515 for the "turn on the television" scene recommended by the mobile phone.
According to this method, the first electronic device recommends an intelligent scene to the user according to the user information and adds the intelligent scene according to the user's indication, so the user does not need to manually input the execution task and trigger condition of the intelligent scene. This allows the user to conveniently and quickly add an intelligent scene on the first electronic device, and improves the user's interactive experience during device control.
In some embodiments, the first electronic device 100 may set functions related to device control. Fig. 16 shows an exemplary settings interface of the first electronic device 100. As shown in fig. 16, the settings interface may include a switch 1601; when the switch 1601 is turned on, the first electronic device 100 may perform the above device control method, for example, intelligently recommending the device closest to the user. As another example, the first electronic device 100 may further allow setting the time period during which the intelligent device recommendation function is enabled, its usage scenarios, and so on, which is not limited in the embodiments of the present application.
Fig. 17 shows another method flow of an embodiment of the present application. The method is applied to a first electronic device, and comprises the following steps:
S401, at a first moment, acquiring first user information that the user has authorized.
S402, displaying a first interface according to the first user information.
The first interface comprises information of m second electronic devices associated with the first user information, and m is an integer greater than 1.
S403, detecting a first operation of a control which acts on the first interface and is used for controlling n electronic devices in the m second electronic devices to execute a first target operation, and controlling the n electronic devices to execute the first target operation in response to the first operation.
Wherein n is an integer of not less than 1 and not more than m.
S404, at a second moment, acquiring second user information that the user has authorized.
S405, displaying a second interface according to the second user information.
Wherein the second interface comprises information of k second electronic devices associated with the second user information; k is an integer greater than 1.
S406, detecting a second operation of a control which acts on the second interface and is used for controlling j electronic devices in the k second electronic devices to execute a second target operation, and controlling the j electronic devices to execute the second target operation in response to the second operation. Wherein j is an integer of not less than 1 and not more than k.
It should be understood that: the first user information and the second user information include any one or more of the following: location information, time information, behavior information.
Illustratively, the first user information includes any one or more of the following: information of a first action executed by a user at the first moment, information of a family where the user is located at the first moment, information of a first space where the user is located at the first moment, and distance information between the user and the second electronic equipment at the first moment; the second user information includes any one or more of the following: information of a second action executed by the user at the second moment, information of a family where the user is located at the second moment, information of a second space where the user is located at the second moment, and distance information between the user and the second electronic device at the second moment.
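The information dimensions enumerated above can be summarized in a single container; the field names below are hypothetical, chosen only to mirror the list in the text, and any subset of fields may be populated.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class UserInfo:
    """Hypothetical container for the user information acquired at the
    first or second moment (S401/S404)."""
    action: Optional[str] = None              # behavior info, e.g. "open_game_app"
    home: Optional[str] = None                # home/family the user is in
    space: Optional[str] = None               # space within the home
    device_distances: Optional[Dict[str, float]] = None  # device -> distance (m)
    timestamp: Optional[float] = None         # time information

info = UserInfo(home="Jack's home", space="master bedroom")
```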
In the above scheme, the first electronic device displays the first interface, and controls the second electronic device to execute the first target operation according to the first operation input by the user on the first interface. And then, at a second moment, the first electronic equipment can acquire new user information (namely second user information), and automatically and intelligently switch the displayed interface from the first interface to the second interface according to the second user information, so that a user can control the second electronic equipment associated with the second user information through the second interface. When the user information changes, the first electronic device can acquire the latest user information (second user information) and automatically switch to an interface associated with the second user information according to the second user information so as to meet the device control requirement of the user. In the process, the user does not need to carry out complicated interface switching operation, the operation complexity of the user is reduced, the time for switching the interface is further shortened, and the equipment control efficiency is improved.
Optionally, if the first user information includes information of a first home in which the user is located at the first time, the m second electronic devices include electronic devices in the first home in which the user is located at the first time. And/or if the second user information includes information of a second home in which the user is located at the second moment, the k second electronic devices include electronic devices in the second home in which the user is located at the second moment. Wherein the first household is the same as or different from the second household.
Optionally, if the first user information further includes information of a first space in which the user is located in the first home at the first moment, the m second electronic devices include electronic devices in the first space, and the first interface does not include information of electronic devices other than the m second electronic devices. And/or, if the second user information further includes information of a second space in which the user is located in the second home at the second moment, the k second electronic devices include electronic devices in the second space, and the second interface does not include information of electronic devices other than the k second electronic devices. The first home may be the same as or different from the second home.
For example, as shown in fig. 8 (a), when the user is located in the master bedroom (the first space) at the first moment, the mobile phone may display an interface 701 (the first interface), and the interface 701 includes information of the electronic devices in the master bedroom (the second electronic devices associated with the first user information). The interface 701 does not include information of electronic devices in spaces other than the master bedroom. If an operation (i.e., the first operation) acting on the interface 701 for controlling the lamp in the master bedroom to be turned on (i.e., the first target operation) is detected, the mobile phone controls the lamp to be turned on.
The above description mainly takes the first operation being input by the user on the first interface as an example; alternatively, in other examples, the first operation may also be input on another interface. For example, as shown in fig. 7 (a), the user may click on a blank area of the card 702 to trigger the mobile phone to jump to the details interface of the card 702. The user may then input, on the details interface, an operation (the first operation) for controlling the lamp to be turned on.
As shown in fig. 8 (b), when the user is located in the bathroom (the second space) at the second moment, the mobile phone may display an interface 704 (the second interface), and the interface 704 includes information of the electronic devices in the bathroom (the second electronic devices associated with the second user information). The interface 704 does not include information of electronic devices in spaces other than the bathroom.
Optionally, if the first user information further includes information of a first space in which the user is located in the first home at the first moment, the m second electronic devices include electronic devices in the first space. The first interface further includes information of electronic devices other than the m second electronic devices; on the first interface, the identification information of the electronic devices in the first space is highlighted with a preset user interface (UI) effect, and/or is arranged ahead of that of the other electronic devices. And/or, if the second user information further includes information of a second space in which the user is located in the second home at the second moment, the k second electronic devices include electronic devices in the second space. The second interface further includes information of electronic devices other than the k second electronic devices; on the second interface, the identification information of the electronic devices in the second space is highlighted with the preset UI effect, and/or is arranged ahead of that of the other electronic devices.
Optionally, the preset UI effects include, but are not limited to, one or more of the following: color effects, animation effects.
For example, as shown in fig. 9 (a), when the user is in the master bedroom (the first space) at the first moment, the mobile phone may display an interface 801 (the first interface), and the interface 801 includes information of the electronic devices in the master bedroom (the second electronic devices associated with the first user information). The interface 801 also includes information of electronic devices in other spaces (e.g., the dining room and the living room). On the interface 801, the card corresponding to the master bedroom is displayed ahead of the cards of the other spaces; in other words, the information of the electronic devices in the master bedroom is displayed ahead of that of the electronic devices in the other spaces.
As shown in fig. 9 (b), when the user is located in the bathroom (the second space) at the second moment, the mobile phone may display an interface 803 (the second interface), and the interface 803 includes information of the electronic devices in the bathroom (the second electronic devices associated with the second user information). The interface 803 also includes information of electronic devices in spaces other than the bathroom. On the interface 803, the card corresponding to the bathroom is displayed ahead of the cards of the other spaces; in other words, the information of the electronic devices in the bathroom is displayed ahead of that of the electronic devices in the other spaces.
As another example, as shown in fig. 10 (a), when the user is in the master bedroom (the first space) at the first moment, the mobile phone may display an interface 801 (the first interface), where the interface 801 includes information of the electronic devices in the master bedroom (the second electronic devices associated with the first user information). The interface 801 also includes information of electronic devices in other spaces (e.g., the dining room and the living room). On the interface 801, the card corresponding to the master bedroom is displayed with a preset UI effect (for example, in bold); in other words, the information of the second electronic devices in the master bedroom is displayed with the preset UI effect.
As shown in fig. 10 (b), when the user is located in the bathroom (the second space) at the second moment, the mobile phone may display an interface 803 (the second interface), and the interface 803 includes information of the electronic devices in the bathroom (the second electronic devices associated with the second user information). The interface 803 also includes information of electronic devices in spaces other than the bathroom. On the interface 803, the card corresponding to the bathroom is displayed with a preset UI effect (for example, in bold); in other words, the information of the second electronic devices in the bathroom is displayed with the preset UI effect.
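The card ordering in the examples above (the card for the space in which the user is currently located displayed ahead of the cards for the other spaces, with the rest of the order unchanged) can be sketched with a stable sort. The function, card fields, and spaces below are hypothetical illustrations of that ordering rule:

```python
def order_cards(cards: list[dict], current_space: str) -> list[dict]:
    # Cards for the space the user is currently in sort first (key False
    # < True); Python's sort is stable, so the relative order of the
    # remaining cards is preserved.
    return sorted(cards, key=lambda card: card["space"] != current_space)

cards = [
    {"space": "living room", "device": "television"},
    {"space": "master bedroom", "device": "lamp"},
    {"space": "dining room", "device": "speaker"},
]
ordered = order_cards(cards, "master bedroom")
```

Highlighting with a preset UI effect could be applied analogously, by tagging the cards whose `space` matches the user's current space.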
Optionally, if the first user information includes distance information between the user and the second electronic devices at the first moment, the m second electronic devices include the m electronic devices closest to the user at the first moment. And/or, if the second user information includes distance information between the user and the second electronic devices at the second moment, the k second electronic devices include the k electronic devices closest to the user at the second moment.
Optionally, the distances between the user and the m electronic devices closest to the user may be equal or different.
Alternatively, an electronic device that is close to the user may be a device in the same space as the user.
For example, as shown in fig. 11, at the first moment, assuming that the user is in the master bedroom and the mobile phone learns that the 2 electronic devices closest to the user are the speaker and the television in the master bedroom, the mobile phone may display an interface 1101 (the first interface) for recommending the television and the speaker to the user. As another example, at the second moment, assuming that the user is in the bathroom and the mobile phone learns that the electronic device closest to the user is the body fat scale in the bathroom, the mobile phone may automatically switch the displayed interface to an interface that includes the body fat scale, so as to recommend the body fat scale to the user.
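Selecting the m second electronic devices closest to the user, as in the example above, amounts to a smallest-m selection over the distance information. A minimal sketch, with hypothetical device names and distances:

```python
import heapq

def closest_devices(distances: dict[str, float], m: int) -> list[str]:
    """Return the names of the m devices closest to the user,
    nearest first."""
    return heapq.nsmallest(m, distances, key=distances.get)

# Hypothetical distance information (meters) acquired at one moment.
distances = {"television": 1.5, "speaker": 0.8, "body fat scale": 6.0}
recommended = closest_devices(distances, 2)
```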
Optionally, the information of the m second electronic devices is displayed on the first interface in the form of a popup window, and/or the information of the k second electronic devices is displayed on the second interface in the form of a popup window. Illustratively, as also shown in fig. 11, information of the television and the speaker is displayed on the interface 1101 in a popup window 1102.
Optionally, the popup window stops being displayed after a preset duration from when it is first displayed. The preset duration can be set flexibly. Illustratively, as in fig. 11, the popup window 1102 stops being displayed 5 seconds after it is displayed.
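The preset-duration dismissal of the popup window can be sketched as a visibility check against an injected clock. The class and the 5-second figure below are illustrative; the clock is injectable here purely so the behavior can be demonstrated deterministically, whereas a real implementation would more likely use a UI timer.

```python
import time

class Popup:
    """A popup that stops being displayed after a preset duration."""

    def __init__(self, timeout_s: float, now=time.monotonic):
        self._now = now            # injectable clock, for testing
        self._timeout = timeout_s
        self._shown_at = now()     # moment the popup was displayed

    def visible(self) -> bool:
        # Visible only while the preset duration has not yet elapsed.
        return self._now() - self._shown_at < self._timeout

# Drive the popup with a fake clock to show the 5-second dismissal.
clock = [0.0]
popup = Popup(5.0, now=lambda: clock[0])
before = popup.visible()   # within the preset duration
clock[0] = 5.1
after = popup.visible()    # preset duration has elapsed
```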
Optionally, the information of the m second electronic devices is displayed on the first interface with a preset UI effect, or the information of the m second electronic devices is displayed ahead of the information of the other electronic devices.
And/or the information of the k second electronic devices is displayed on the second interface with a preset UI effect.
Illustratively, as shown in fig. 13, the recommended devices (the speaker and the television) are displayed on the interface 1103 with a preset UI effect.
Optionally, if the first user information includes information of a first behavior performed by the user at the first moment, the m second electronic devices are the m electronic devices that the user intends to control when performing the first behavior. And/or, if the second user information includes information of a second behavior performed by the user at the second moment, the k second electronic devices are the k electronic devices that the user intends to control when performing the second behavior.
For example, as shown in fig. 14 (a), at the first moment, when the mobile phone learns that the user has opened a game application (the first behavior), the mobile phone displays an interface 1401 (the first interface). The interface 1401 includes a popup window 1402, and the popup window 1402 includes information of the television that the user intends to control when opening the game application.
As another example, at the second moment, the mobile phone learns that the user performs the second behavior (assuming that the electronic device associated with the second behavior is a sweeping robot), and the mobile phone then automatically switches the displayed interface to an interface that includes the sweeping robot, so as to recommend the sweeping robot to the user.
Optionally, if the first user information includes information of the first moment, the m second electronic devices are the m electronic devices that the user intends to control at the first moment. And/or, if the second user information includes information of the second moment, the k second electronic devices are the k electronic devices that the user intends to control at the second moment.
Optionally, the method further includes: displaying a third interface according to the first user information, where the third interface is used to recommend a target execution scene to the user; receiving a third operation input by the user on the third interface, and adding the target execution scene in response to the third operation; and executing the target execution scene when a trigger condition of the target execution scene is met.
For example, after the mobile phone acquires the user information and learns that, when the user opens a game application on the mobile phone at home, the user typically casts the game screen to the television for operation, the mobile phone may display an interface 1503 (the third interface) shown in fig. 15 (b) for recommending a scene to the user. In some examples, if it is detected that the user clicks the "confirm add" button (the third operation) in the control 1504 on the interface 1503, the mobile phone adds the recommended scene. Subsequently, when the trigger condition of the recommended scene is met, that is, when the user opens the game application on the mobile phone at home, the mobile phone executes the recommended scene and controls the television to be turned on.
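The add-scene-then-trigger flow above can be sketched as a small rule engine: a scene pairs a trigger condition with an action, and each incoming event executes the scenes whose conditions it satisfies. The class names, event fields, and action strings below are hypothetical, not the patented implementation:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Scene:
    trigger: Callable[[dict], bool]  # condition on an observed event
    action: str                      # operation executed when triggered

class SceneManager:
    def __init__(self):
        self._scenes: list[Scene] = []

    def add(self, scene: Scene) -> None:
        self._scenes.append(scene)

    def on_event(self, event: dict) -> list[str]:
        # Execute every added scene whose trigger condition the event meets.
        return [s.action for s in self._scenes if s.trigger(event)]

manager = SceneManager()
# The recommended scene: "when the game application is opened at home,
# turn on the television" (added after the user confirms).
manager.add(Scene(lambda e: e.get("app") == "game" and e.get("at_home", False),
                  "turn on television"))
actions = manager.on_event({"app": "game", "at_home": True})
```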
It should be noted that the first electronic device may also determine, according to a combination of the foregoing types of user information, the second electronic devices that the user wants to control, and display an interface that includes the information of those second electronic devices.
In some examples, the first electronic device determines the second electronic device that the user wants to control based on the user's location information, behavior information, and time information. For example, at moment A, when it is detected that the user opens a game application on the mobile phone in the living room, the mobile phone may recommend that the user turn on the television in the living room, so that the user can cast the game to the television for operation. At moment B, when it is detected that the user opens the game application on the mobile phone in the bedroom, the mobile phone may recommend that the user turn on the computer in the bedroom, so that the user performs the game operation on the computer.
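The location-plus-behavior recommendation in this example can be sketched as a lookup keyed by (behavior, space). The table entries mirror the example above; the function name and key strings are otherwise hypothetical:

```python
from typing import Optional

# Recommendation table keyed by (behavior, space), as in the example:
# the same behavior maps to different devices in different spaces.
RECOMMENDATIONS = {
    ("open game application", "living room"): "television",
    ("open game application", "bedroom"): "computer",
}

def recommend(behavior: str, space: str) -> Optional[str]:
    """Look up the device to recommend for a behavior in a given space."""
    return RECOMMENDATIONS.get((behavior, space))
```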
In some examples, the first electronic device determines the second electronic device that the user wants to control based on the user's location information and time information. For example, when it is detected that the user opens the smart life application at home at moment A, the mobile phone displays an interface 1801 shown in fig. 18 (a). The interface 1801 includes a card 1802 corresponding to the home, and the card 1802 may include information of some or all of the electronic devices in the home. The interface 1801 may also include a device card 1803 associated with the current moment A; the card 1803 includes information of some of the devices commonly used at moment A. Through the card 1802 and the card 1803, the user can quickly and easily find the electronic device to be controlled.
Thereafter, when it is detected that the user opens the smart life application in the bathroom at moment B, the mobile phone displays an interface 1804 shown in fig. 18 (b). The interface 1804 includes a card 1805 corresponding to the bathroom, and the card 1805 may include information of some or all of the electronic devices in the bathroom. The interface 1804 may also include a device card 1806 associated with the current moment B; the card 1806 includes information of some of the devices commonly used at moment B. Through the card 1805 and the card 1806, the user can quickly and conveniently find the electronic device to be controlled.
The foregoing merely illustrates, by way of example, the manner in which the first electronic device determines the displayed interface according to multiple types of user information (the displayed interface may include the electronic devices respectively associated with the multiple types of information). The first electronic device may also determine the displayed interface according to the multiple types of user information and other policies, which is not limited in the embodiments of the present application. For example, in other embodiments, the types of user information may be prioritized. When it is detected that different types of user information are associated with different devices, the second electronic device associated with the higher-priority user information is preferentially recommended to the user. Alternatively, the electronic device associated with the higher-priority user information is displayed ahead of the other electronic devices, or is displayed with a preset UI effect.
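The priority policy just described (recommend the device associated with the highest-priority type of user information when different types disagree) could look as follows. The particular priority order is an assumption, since the text leaves it open:

```python
# Hypothetical priority order among the information types discussed above.
PRIORITY = ("behavior", "space", "distance", "time")

def pick_device(candidates: dict[str, str]) -> str:
    """candidates maps an information type to the device it recommends;
    the device tied to the highest-priority type wins."""
    for info_type in PRIORITY:
        if info_type in candidates:
            return candidates[info_type]
    raise ValueError("no candidate device")

# Space information and distance information recommend different devices;
# space has the higher priority here, so its device is chosen.
choice = pick_device({"distance": "speaker", "space": "lamp"})
```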
The embodiment of the application also provides a device control method, applied to the first electronic device. As shown in fig. 19, the method includes:
S501, acquiring first user information authorized by the user.
S502, displaying a third interface according to the first user information, wherein the third interface is used for recommending a target execution scene to a user.
For example, assuming that the first user information includes information of a behavior performed by the user, after the mobile phone acquires the first user information and learns that, when the user opens a game application on the mobile phone at home, the user typically casts the game screen to the television for operation, the mobile phone may display the interface 1503 (the third interface) shown in fig. 15 (b), which is used to recommend to the user the scene "when the game application on the mobile phone is opened at home, turn on the television".
S503, receiving a third operation input by the user on the third interface, and adding the target execution scene in response to the third operation.
As also shown in fig. 15 (b), if it is detected that the user clicks the "confirm add" button (the third operation) in the control 1504 on the interface 1503, the mobile phone adds the target execution scene.
S504, executing the target execution scene when the trigger condition of the target execution scene is met.
It can be understood that, after the target execution scene is added, if it is detected that the trigger condition of the target execution scene is met, that is, if the user opens the game application on the mobile phone at home, the mobile phone executes the target execution scene and controls the television to be turned on.
In other embodiments, the first electronic device may also automatically add the target execution scene associated with the first user information after detecting the first user information.
In some aspects, the various embodiments of the application may be combined, and the combined solutions implemented. Optionally, some operations in the flows of the method embodiments may be combined, and/or the order of some operations may be changed. The order of execution of the steps in each flow is merely exemplary and does not constitute a limitation: other orders of execution may be used between the steps, and the order shown is not intended to suggest that it is the only order in which the operations may be performed. Those of ordinary skill in the art will recognize a variety of ways to reorder the operations described herein. In addition, the details of the processes involved in one embodiment apply in a similar manner to other embodiments, and different embodiments may be used in combination.
Illustratively, in fig. 6A, there is no limitation on the execution order between step S102 and step S105.
Moreover, some steps in method embodiments may be equivalently replaced with other possible steps. Alternatively, some steps in method embodiments may be optional and may be deleted in some usage scenarios. Alternatively, other possible steps may be added to the method embodiments.
Moreover, the method embodiments may be implemented alone or in combination.
It will be appreciated that, in order to implement the above-mentioned functions, the electronic device in the embodiment of the present application includes corresponding hardware structures and/or software modules for performing the respective functions. The various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementation is not to be considered as beyond the scope of the embodiments of the present application.
The embodiment of the application can divide the functional units of the electronic device according to the method example, for example, each functional unit can be divided corresponding to each function, and two or more functions can be integrated in one processing unit. The integrated units may be implemented in hardware or in software functional units. It should be noted that, in the embodiment of the present application, the division of the units is schematic, which is merely a logic function division, and other division manners may be implemented in actual practice.
Fig. 20 shows a schematic block diagram of an intelligent device control apparatus provided in an embodiment of the present application, where the apparatus may be the first electronic device or a component with a corresponding function. The apparatus 1700 may exist in the form of software or as a chip that can be used in a device. The apparatus 1700 includes a processing unit 1702 and a communication unit 1703. Optionally, the communication unit 1703 may be further divided into a sending unit (not shown in fig. 20) and a receiving unit (not shown in fig. 20). The sending unit is configured to support the apparatus 1700 in sending information to other electronic devices, and the receiving unit is configured to support the apparatus 1700 in receiving information from other electronic devices.
Optionally, the apparatus 1700 may further include a storage unit 1701 for storing program codes and data of the apparatus 1700, and the data may include, but is not limited to, raw data or intermediate data, etc.
The processing unit 1702 may be configured to support the apparatus 1700 in performing processes such as S501 in fig. 19, and/or other processes for the schemes described herein. The communication unit 1703 is configured to support communication between the apparatus 1700 and other electronic devices (e.g., the second electronic device described above), for example, to support execution of S304 in fig. 12.
Optionally, the apparatus 1700 may further comprise an input unit (not shown in fig. 20) for receiving input information of a user, such as a first operation, a second operation, etc. for receiving user input.
Optionally, the apparatus 1700 may further comprise a display unit (not shown in fig. 20) for displaying an interface and/or other information.
In one possible implementation, the processing unit 1702 may be a controller or the processor 310 shown in fig. 2, such as a central processing unit (Central Processing Unit, CPU), a general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. The processor may implement or perform the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. A processor may also be a combination that performs computing functions, e.g., a combination including one or more microprocessors, or a combination of a DSP and a microprocessor.
In one possible implementation, the communication unit 1703 may include the wireless communication module 360 shown in fig. 2, and may further include a transceiver circuit, a transceiver, a radio frequency device, and so on.
In one possible approach, the memory unit 1701 may be the memory 320 shown in FIG. 2.
The embodiment of the application also provides electronic equipment which comprises one or more processors and one or more memories. The one or more memories are coupled to the one or more processors, the one or more memories being configured to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the related method steps described above to implement the smart device control method of the above embodiments.
The embodiment of the application also provides a chip system, which comprises: a processor coupled to a memory for storing programs or instructions which, when executed by the processor, cause the system-on-a-chip to implement the method of any of the method embodiments described above.
Alternatively, the processor in the system-on-chip may be one or more. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general purpose processor, implemented by reading software code stored in a memory.
Alternatively, there may be one or more memories in the chip system. The memory may be integrated with the processor or separate from the processor, which is not limited in the present application. The memory may be a non-transitory memory, such as a ROM, and may be integrated on the same chip as the processor or provided separately on a different chip; the type of the memory and the manner in which the memory and the processor are arranged are not particularly limited in the present application.
The chip system may be, for example, a field-programmable gate array (field programmable gate array, FPGA), an application-specific integrated circuit (application specific integrated circuit, ASIC), a system on chip (SoC), a central processing unit (central processor unit, CPU), a network processor (network processor, NP), a digital signal processor (digital signal processor, DSP), a microcontroller (micro controller unit, MCU), a programmable logic device (programmable logic device, PLD), or another integrated chip.
It should be understood that the steps in the above-described method embodiments may be accomplished by integrated logic circuitry in hardware in a processor or instructions in the form of software. The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution.
The embodiment of the application also provides a computer readable storage medium, wherein computer instructions are stored in the computer readable storage medium, and when the computer instructions run on the electronic device, the electronic device is caused to execute the related method steps to realize the intelligent device control method in the embodiment.
The embodiment of the application also provides a computer program product, which when run on a computer, causes the computer to execute the related steps to realize the intelligent device control method in the embodiment.
In addition, embodiments of the present application also provide an apparatus, which may be a component or module in particular, which may include a processor and a memory connected; the memory is configured to store computer-executable instructions, and when the apparatus is running, the processor may execute the computer-executable instructions stored in the memory, so that the apparatus executes the intelligent device control method in the above method embodiments.
The electronic device, the computer readable storage medium, the computer program product or the chip provided by the embodiments of the present application are used to execute the corresponding method provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding method provided above, and will not be described herein.
It will be appreciated that in order to achieve the above-described functionality, the electronic device comprises corresponding hardware and/or software modules that perform the respective functionality. The present application can be implemented in hardware or a combination of hardware and computer software, in conjunction with the example algorithm steps described in connection with the embodiments disclosed herein. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application in conjunction with the embodiments, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The present embodiment may divide the functional modules of the electronic device according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules described above may be implemented in hardware. It should be noted that, in this embodiment, the division of the modules is schematic, only one logic function is divided, and another division manner may be implemented in actual implementation.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above. The specific working processes of the above-described systems, devices and units may refer to the corresponding processes in the foregoing method embodiments, which are not described herein.
In the several embodiments provided in the present application, it should be understood that the disclosed method may be implemented in other manners. For example, the above-described embodiments of the terminal device are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via interfaces, modules or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in whole or in part in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: flash memory, removable hard disk, read-only memory, random access memory, magnetic or optical disk, and the like.
The foregoing is merely illustrative of specific embodiments of the present application, and the scope of the present application is not limited thereto, but any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. An intelligent device control method applied to a first electronic device, the method comprising:
at a first moment, acquiring first user information authorized by a user, and displaying a first interface according to the first user information, wherein the first interface comprises information of m second electronic devices associated with the first user information, and m is an integer greater than 1;
detecting a first operation acting on a control on the first interface, the control being used for controlling n of the m second electronic devices to execute a first target operation, wherein n is an integer not less than 1 and not greater than m;
in response to the first operation, controlling the n electronic devices to execute the first target operation;
at a second moment, acquiring second user information authorized by the user, and displaying a second interface according to the second user information, wherein the second interface comprises information of k second electronic devices associated with the second user information, and k is an integer greater than 1;
detecting a second operation acting on a control on the second interface, the control being used for controlling j of the k second electronic devices to execute a second target operation, wherein j is an integer not less than 1 and not greater than k; and
in response to the second operation, controlling the j electronic devices to execute the second target operation.
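The control flow of claim 1 can be illustrated with a short sketch. All names here (`Device`, `ControlApp`, the string-based contexts and operations) are hypothetical stand-ins, not structures from the patent: the point is only to show the claimed sequence of associating devices with user information, displaying them, and controlling a subset in response to one operation.

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str

    def execute(self, operation: str) -> str:
        # Stand-in for sending a control command to the device.
        return f"{self.name}:{operation}"

@dataclass
class ControlApp:
    # Maps user context (e.g. home, space, moment) to associated devices.
    associations: dict

    def display_interface(self, user_info: str) -> list:
        # Claimed step: show the m devices associated with the user
        # information obtained at this moment.
        return self.associations.get(user_info, [])

    def handle_operation(self, devices: list, selected: list, operation: str) -> list:
        # Claimed step: in response to the user's operation on the
        # interface, control n of the m displayed devices (1 <= n <= m).
        chosen = [d for d in devices if d.name in selected]
        return [d.execute(operation) for d in chosen]

app = ControlApp(associations={
    "home_A": [Device("lamp"), Device("speaker"), Device("fan")],
    "home_B": [Device("heater"), Device("blinds")],
})

shown = app.display_interface("home_A")                       # m = 3 devices
results = app.handle_operation(shown, ["lamp", "fan"], "on")  # n = 2
print(results)  # → ['lamp:on', 'fan:on']
```

At a second moment the same flow repeats with second user information (e.g. `"home_B"`), yielding a different interface and device set, which is what distinguishes the two halves of the claim.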
2. The method of claim 1, wherein, if the first user information comprises information of a first home in which the user is located at the first moment, the m second electronic devices comprise electronic devices in the first home;
and/or, if the second user information comprises information of a second home in which the user is located at the second moment, the k second electronic devices comprise electronic devices in the second home.
3. The method of claim 2, wherein, if the first user information further comprises information of a first space in which the user is located in the first home at the first moment, the m second electronic devices comprise electronic devices in the first space, and the first interface does not include information of electronic devices other than the m second electronic devices;
and/or, if the second user information further comprises information of a second space in which the user is located in the second home at the second moment, the k second electronic devices comprise electronic devices in the second space, and the second interface does not include information of electronic devices other than the k second electronic devices.
4. The method of claim 2, wherein, if the first user information further comprises information of a first space in which the user is located in the first home at the first moment, the m second electronic devices comprise electronic devices in the first space, and identification information of the electronic devices in the first space is highlighted on the first interface with a preset user interface (UI) effect and/or is arranged before that of the other electronic devices on the first interface;
and/or, if the second user information further comprises information of a second space in which the user is located in the second home at the second moment, the k second electronic devices comprise electronic devices in the second space, and identification information of the electronic devices in the second space is highlighted on the second interface with a preset UI effect and/or is arranged before that of the other electronic devices on the second interface.
5. The method of claim 1, wherein, if the first user information comprises distance information between the user and second electronic devices at the first moment, the m second electronic devices comprise the m electronic devices closest to the user at the first moment;
and/or, if the second user information comprises distance information between the user and second electronic devices at the second moment, the k second electronic devices comprise the k electronic devices closest to the user at the second moment.
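The distance-based selection of claim 5 reduces to picking the m nearest devices from per-device distance measurements. A minimal sketch, assuming distances are already available as a name-to-meters mapping (how they are measured is outside this illustration):

```python
import heapq

def nearest_devices(distances: dict, m: int) -> list:
    """Return the names of the m devices closest to the user,
    nearest first. `distances` maps device name -> distance."""
    # nsmallest iterates the dict's keys and orders them by distance.
    return heapq.nsmallest(m, distances, key=distances.get)

# Hypothetical snapshot of distances at one moment.
distances = {"tv": 4.2, "lamp": 0.8, "speaker": 1.5, "fridge": 7.0}
print(nearest_devices(distances, 2))  # → ['lamp', 'speaker']
```

At a different moment the snapshot changes, so the same call yields a different set of k devices, matching the second half of the claim.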
6. The method of claim 5, wherein the information of the m second electronic devices is displayed in the first interface in a popup window, and/or the information of the k second electronic devices is displayed in the second interface in a popup window;
and the method further comprises: stopping displaying the popup window after the popup window has been displayed for a preset duration.
7. The method of claim 5, wherein the information of the m second electronic devices is displayed in the first interface with a preset UI effect and/or the information of the k second electronic devices is displayed in the second interface with a preset UI effect.
8. The method of claim 1, wherein, if the first user information comprises information of a first behavior performed by the user at the first moment, the m second electronic devices are m electronic devices to be controlled by the user when performing the first behavior;
and/or, if the second user information comprises information of a second behavior performed by the user at the second moment, the k second electronic devices are k electronic devices to be controlled by the user when performing the second behavior.
9. The method of claim 1, wherein, if the first user information comprises information of the first moment, the m second electronic devices are m electronic devices to be controlled by the user at the first moment;
and/or, if the second user information comprises information of the second moment, the k second electronic devices are k electronic devices to be controlled by the user at the second moment.
10. The method according to any one of claims 1-9, wherein, after obtaining the first user information authorized by the user, the method further comprises:
displaying a third interface according to the first user information, wherein the third interface is used for recommending a target execution scene to the user.
11. The method of claim 10, wherein after displaying the third interface, the method further comprises:
receiving a third operation input by the user on the third interface, and adding the target execution scene in response to the third operation;
and executing the target execution scene when the trigger condition of the target execution scene is met.
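Claims 10-11 describe a recommend-add-trigger cycle: a scene is recommended, added on the user's operation, then executed once its trigger condition holds. A hedged sketch of that cycle; the `Scene` structure, the callable trigger, and the polling approach are illustrative assumptions, not the patent's implementation:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Scene:
    name: str
    trigger: Callable[[dict], bool]  # condition over the current context
    actions: list                    # device operations to run

added_scenes: list = []

def add_scene(scene: Scene) -> None:
    # Claimed step: add the recommended target execution scene in
    # response to the user's operation on the recommendation interface.
    added_scenes.append(scene)

def poll(context: dict) -> list:
    # Claimed step: execute each added scene whose trigger condition
    # is met by the current context; return the operations performed.
    executed = []
    for scene in added_scenes:
        if scene.trigger(context):
            executed.extend(scene.actions)
    return executed

add_scene(Scene("good_night",
                trigger=lambda ctx: ctx["hour"] >= 22,
                actions=["lamp:off", "blinds:close"]))

print(poll({"hour": 21}))  # → [] (condition not yet met)
print(poll({"hour": 23}))  # → ['lamp:off', 'blinds:close']
```

A real system would more likely react to events than poll, but the claim only requires that execution happens when the trigger condition is satisfied.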
12. An electronic device, comprising: a processor, a memory and a display screen, the memory and the display screen being coupled to the processor, the memory being for storing computer program code, the computer program code comprising computer instructions which, when read from the memory by the processor, cause the electronic device to perform the method of any of claims 1-11.
13. A computer readable storage medium, characterized in that the computer readable storage medium comprises a computer program which, when run on an electronic device, causes the electronic device to perform the method according to any of claims 1-11.
CN202210270161.7A 2022-03-18 2022-03-18 Intelligent device control method and electronic device Pending CN116804854A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210270161.7A CN116804854A (en) 2022-03-18 2022-03-18 Intelligent device control method and electronic device
PCT/CN2023/082333 WO2023174429A1 (en) 2022-03-18 2023-03-17 Intelligent-device control method and electronic device

Publications (1)

Publication Number Publication Date
CN116804854A true CN116804854A (en) 2023-09-26

Family

ID=88022431



Also Published As

Publication number Publication date
WO2023174429A1 (en) 2023-09-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination