CN111650840A - Intelligent household scene arranging method and terminal - Google Patents

Intelligent household scene arranging method and terminal

Info

Publication number
CN111650840A
CN111650840A
Authority
CN
China
Prior art keywords
user
scene
state information
smart home
intelligent household
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910160998.4A
Other languages
Chinese (zh)
Other versions
CN111650840B (en)
Inventor
王建新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201910160998.4A
Priority to CN202111385425.5A
Publication of CN111650840A
Application granted
Publication of CN111650840B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 15/00: Systems controlled by a computer
    • G05B 15/02: Systems controlled by a computer electric
    • G05B 19/00: Programme-control systems
    • G05B 19/02: Programme-control systems electric
    • G05B 19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B 2219/00: Program-control systems
    • G05B 2219/20: Pc systems
    • G05B 2219/26: Pc applications
    • G05B 2219/2642: Domotique, domestic, home control, automation, smart house

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a smart home scene orchestration method and a terminal, relates to the field of terminals, and solves the problem that the operation steps of scene orchestration are very cumbersome. The specific scheme is as follows: after receiving a first operation, the electronic device responds to the first operation by requesting state information of the smart home devices from a server; after the electronic device receives the state information of the smart home devices, it generates a control rule according to that state information, and the control rule is used to control the smart home devices. Thus, after the electronic device receives a second operation that starts the smart home scene, it responds to the second operation by controlling the smart home devices to enter the states corresponding to the state information according to the control rule.

Description

Intelligent household scene arranging method and terminal
Technical Field
The present application relates to the field of terminals, and in particular to a smart home scene orchestration method and a terminal.
Background
At present, smart home devices have become part of people's daily lives. A user can install a smart home application (app) on a terminal and remotely control smart home devices by operating the app on the terminal; the user can, of course, also operate a smart home device directly to control it. To meet the need to control several smart home devices at the same time, the user can create a one-tap scene through the app on the terminal. The process of creating a one-tap scene may be referred to as scene orchestration. Afterwards, the user can control several smart home devices simultaneously with a single tap on the app.
However, in the process of orchestrating a scene, the user has to manually add the smart home devices to be managed one by one and then select a state for each device, so operating on a single smart home device takes at least two steps. If the user wants to create a scene that manages many smart home devices, even more operations are required. The operation steps of scene orchestration are therefore very cumbersome, resulting in a poor user experience.
Disclosure of Invention
The smart home scene orchestration method and terminal provided by the present application solve the problem that the operation steps of scene orchestration are very cumbersome.
To achieve this, the following technical solutions are adopted:
In a first aspect, the present application provides a smart home scene orchestration method applied to an electronic device. The method includes: after receiving a first operation, the electronic device responds to the first operation by requesting state information of the smart home devices from a server; after the electronic device receives the state information of the smart home devices, it generates a control rule according to that state information, the control rule being used to control the smart home devices, thereby generating a smart home scene. After the electronic device receives a second operation that starts the smart home scene, the electronic device responds to the second operation by controlling the smart home devices to enter the states corresponding to the state information according to the control rule.
With this smart home scene orchestration method, a user can orchestrate a scene with one tap rather than adding devices to the scene one by one, which reduces the number of operation steps. At the same time, the user can conveniently save the current preferred states of the smart home devices and restore those states with one tap later. For example, after getting up one morning, the user may start the multigrain soy-milk program on a smart soy-milk maker, turn on a smart speaker to play favorite music, and turn on the smart bedroom lamp. If the user feels that these devices have reached the most desirable states, the states can be captured with one tap and orchestrated into a scene. The name of the scene may be entered by the user or generated automatically; illustratively, the scene is named the wake-up scene. When the user gets up the next day, tapping the wake-up scene once restores the smart soy-milk maker, the smart speaker, and the smart bedroom lamp to the preferred states, that is, the states orchestrated the previous day.
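The one-tap capture-and-restore flow described above can be sketched as follows. This is a minimal illustrative sketch only: the device names, state fields, and the in-memory dictionary standing in for the server are assumptions, not part of the claimed method.

```python
# Minimal sketch of one-tap scene orchestration (illustrative only).

def request_device_states(server, account):
    """First operation: fetch the current states of all devices bound to the account."""
    return {dev: dict(state) for dev, state in server[account].items()}

def generate_control_rules(snapshot):
    """Turn a state snapshot into control rules (one per device)."""
    return [{"device": dev, "target_state": state} for dev, state in snapshot.items()]

def apply_scene(rules, devices):
    """Second operation: drive every device back into its recorded state."""
    for rule in rules:
        devices[rule["device"]].update(rule["target_state"])

# Example: capture a "wake-up scene", let states drift, then restore.
server = {"user1": {"soymilk_maker": {"program": "multigrain"},
                    "speaker": {"power": "on", "playlist": "favorites"},
                    "bedroom_lamp": {"power": "on"}}}
snapshot = request_device_states(server, "user1")   # one tap captures the states
rules = generate_control_rules(snapshot)

devices = server["user1"]
devices["speaker"]["power"] = "off"                 # state changes during the day
apply_scene(rules, devices)                         # one tap restores the scene
```

Note that the snapshot is copied when it is captured, so later changes to the live device states do not alter the orchestrated scene.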
In a possible implementation, requesting the state information of the smart home devices from the server includes: requesting state information of the smart home devices associated with the user account logged in on the electronic device. Specifically, this may be the state information of all smart home devices associated with that account.
In another possible implementation, before the electronic device receives the first operation, the method further includes: the electronic device receives a start operation and, in response to the start operation, starts recording the smart home devices operated by the user. After the electronic device has recorded the smart home devices operated by the user, requesting the state information from the server includes: sending the identifiers of the smart home devices operated by the user to the server and requesting the state information of the smart home devices corresponding to those identifiers.
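This record-then-request variant might look as follows; the class, the device identifiers, and the dictionary standing in for the server's state store are hypothetical illustrations.

```python
# Illustrative sketch: after a start operation, record only the devices the
# user operates, then request state information for those identifiers only.

class SceneRecorder:
    def __init__(self):
        self.recording = False
        self.operated_ids = []

    def start(self):
        """The start operation: begin recording user-operated devices."""
        self.recording = True
        self.operated_ids = []

    def on_user_operation(self, device_id):
        """Record each device the user operates (duplicates ignored)."""
        if self.recording and device_id not in self.operated_ids:
            self.operated_ids.append(device_id)

    def request_states(self, server_states):
        """Send the recorded identifiers and fetch only their states."""
        return {i: server_states[i] for i in self.operated_ids}

recorder = SceneRecorder()
recorder.start()
recorder.on_user_operation("lamp-01")
recorder.on_user_operation("ac-02")
recorder.on_user_operation("lamp-01")     # operated twice, recorded once

server_states = {"lamp-01": {"power": "on"},
                 "ac-02": {"mode": "cooling", "temp": 26},
                 "tv-03": {"power": "off"}}
states = recorder.request_states(server_states)   # excludes the untouched TV
```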
In another possible implementation, requesting the state information of the smart home devices from the server further includes: sending a type identifier of a smart home device to the server and requesting the state information of the smart home devices associated with that type identifier.
In combination with the foregoing possible implementations, in another possible implementation, the electronic device may further receive a scene name and associate the control rule with the scene name.
In one possible example, the electronic device receiving a scene name includes: the electronic device receives a scene name input by the user. For example, the scene name may be received after the first operation is received, before the first operation is received, or before the start operation is received.
In another possible example, the electronic device receiving a scene name includes: the electronic device generates the scene name according to a preset rule.
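One way such a preset naming rule might work is to pick a default from the time of day and append a suffix to keep names unique; the rule below is an assumption for illustration, not the rule claimed in the application.

```python
# Illustrative preset rule for auto-generating a scene name.

def generate_scene_name(hour, existing_names):
    """Pick a default name from the time of day, keeping names unique."""
    buckets = [(5, 11, "Morning scene"),
               (11, 17, "Afternoon scene"),
               (17, 23, "Evening scene")]
    base = next((name for lo, hi, name in buckets if lo <= hour < hi),
                "Night scene")
    candidate, k = base, 2
    while candidate in existing_names:        # "Morning scene 2", "Morning scene 3", ...
        candidate, k = f"{base} {k}", k + 1
    return candidate

print(generate_scene_name(8, set()))                 # Morning scene
print(generate_scene_name(8, {"Morning scene"}))     # Morning scene 2
```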
In a second aspect, the present application provides an electronic device comprising one or more processors, a memory, a touch screen, and a communication module, the communication module including a mobile communication module and a wireless communication module. The memory is configured to store one or more programs; the touch screen is configured to receive a first operation and a second operation; and the one or more processors are configured to execute the one or more programs to perform the following: in response to the first operation, instruct the communication module to request state information of the smart home devices from a server and, after the state information is received, generate a control rule according to that state information, the control rule being used to control the smart home devices; and in response to the second operation, control the smart home devices to enter the states corresponding to the state information according to the control rule.
In a possible implementation, the communication module is configured to request state information of the smart home devices associated with the user account logged in on the electronic device. Specifically, this may be the state information of all smart home devices associated with that account.
In another possible implementation, the touch screen is further configured to receive a start operation, and the processor is further configured to start recording the smart home devices operated by the user in response to the start operation. After the electronic device has recorded the smart home devices operated by the user, the communication module is configured to send the identifiers of those devices to the server and request the state information of the smart home devices corresponding to the identifiers.
In another possible implementation, the communication module is further configured to send a type identifier of a smart home device to the server and request the state information of the smart home devices associated with that type identifier.
In combination with the foregoing possible implementations, in another possible implementation, the touch screen is further configured to receive a scene name, and the processor is further configured to associate the control rule with the scene name.
In one possible example, the touch screen is configured to receive a scene name input by the user. For example, the scene name may be received after the first operation is received, before the first operation is received, or before the start operation is received.
In another possible example, the processor is configured to generate the scene name according to a preset rule.
It should be noted that, in the first and second aspects, the smart home scene may correspond to a first rule, the first rule indicating that the smart home devices enter the states corresponding to the state information when a first event occurs. In addition, the first operation may be a preset gesture or a voice instruction, and so may the second operation. The preset gesture may be any one of a single-tap gesture, a slide gesture, a pressure-recognition gesture, a long-press gesture, an area-change gesture, a double-press gesture, and a double-tap gesture.
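The first rule can be read as a mapping from a triggering event to target device states; a minimal sketch follows, with all event and device names assumed for illustration.

```python
# Illustrative first rule: when the first event occurs, the smart home
# devices enter the states recorded in the rule.

def make_first_rule(trigger_event, state_info):
    """Bind a triggering event to the target state information."""
    return {"trigger": trigger_event, "targets": state_info}

def on_event(event, rules, devices):
    """Dispatch an event; matching rules push devices into their states."""
    for rule in rules:
        if rule["trigger"] == event:
            for dev, state in rule["targets"].items():
                devices[dev].update(state)

devices = {"lamp": {"power": "off"}, "speaker": {"power": "off"}}
rules = [make_first_rule("gesture:double_tap",
                         {"lamp": {"power": "on"},
                          "speaker": {"power": "on"}})]
on_event("gesture:double_tap", rules, devices)   # both devices switch on
```

The same dispatch loop would serve a voice-instruction trigger; only the event string differs.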
In a third aspect, the present application provides a smart home scene orchestration apparatus comprising a processing unit, a storage unit, and a transceiver unit. The storage unit is configured to store one or more programs; the processing unit is configured to execute the one or more programs; and the one or more programs include instructions for executing the smart home scene orchestration method according to the first aspect or any possible implementation of the first aspect.
In a fourth aspect, the present application provides a smart home system comprising one or more smart home devices, an electronic device, and a server, the smart home devices, the electronic device, and the server being interconnected, wherein:
the electronic device is configured to receive a first operation;
the electronic device is further configured to respond to the first operation by requesting state information of the smart home devices from the server;
the server is configured to determine the state information of the smart home devices and feed it back to the electronic device;
after receiving the state information of the smart home devices, the electronic device is further configured to generate a control rule according to that state information, the control rule being used to control the smart home devices;
after generating the smart home scene, the electronic device is further configured to receive a second operation that starts the smart home scene and, in response to the second operation, control the smart home devices to enter the states corresponding to the state information according to the control rule.
In a possible implementation, before receiving the first operation, the electronic device is further configured to receive a start operation and, in response to the start operation, start recording the smart home devices operated by the user.
In a fifth aspect, the present application provides a computer storage medium comprising computer instructions that, when run on an electronic device, cause the electronic device to perform the method according to the first aspect or any possible implementation of the first aspect.
In a sixth aspect, the present application provides a computer program product that, when run on a computer, causes the computer to perform the method according to the first aspect or any possible implementation of the first aspect.
In a seventh aspect, the present application provides an apparatus having the functionality to implement the behavior of the electronic device in the method of the first aspect. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above functions, for example, a processing unit or module, a storage unit or module, a display unit or module, and a transceiver unit or module.
It should be appreciated that descriptions of technical features, solutions, benefits, or similar language in this application do not imply that all of the features and advantages can be realized in any single embodiment. Rather, a description of a feature or advantage means that at least one embodiment includes that specific feature, aspect, or advantage; such descriptions in this specification therefore do not necessarily refer to the same embodiment. Furthermore, the technical features, solutions, and advantages described in the embodiments may be combined in any suitable manner. A person skilled in the relevant art will recognize that an embodiment may be practiced without one or more of the specific features, aspects, or advantages of a particular embodiment, and that additional features and advantages may be recognized in certain embodiments that are not present in all embodiments.
Drawings
Fig. 1 is an example diagram of a smart home system provided in the present application;
Fig. 2 is a first schematic structural diagram of an electronic device provided in the present application;
Fig. 3 is a schematic diagram of the software structure of an electronic device provided in the present application;
Fig. 4A is a first example diagram of a scene orchestration interface provided in the prior art;
Fig. 4B is a second example diagram of a scene orchestration interface provided in the prior art;
Fig. 5 is a first schematic flowchart of a smart home scene orchestration method provided in the present application;
Fig. 6A is a first example diagram of a scene orchestration interface provided in the present application;
Fig. 6B is a second example diagram of a scene orchestration interface provided in the present application;
Fig. 6C is a third example diagram of a scene orchestration interface provided in the present application;
Fig. 7 is a first example diagram of a scene orchestration process interface provided in the present application;
Fig. 8 is a first example diagram of a scene orchestration result interface provided in the present application;
Fig. 9 is a second schematic flowchart of a smart home scene orchestration method provided in the present application;
Fig. 10 is a third schematic flowchart of a smart home scene orchestration method provided in the present application;
Fig. 11A is an example diagram of a scene orchestration interface provided in the present application;
Fig. 11B is an example diagram of a scene orchestration interface provided in the present application;
Fig. 11C is an example diagram of a scene orchestration interface provided in the present application;
Fig. 12 is a second example diagram of a scene orchestration process interface provided in the present application;
Fig. 13 is a fourth schematic flowchart of a smart home scene orchestration method provided in the present application;
Fig. 14 is a second example diagram of an electronic device provided in the present application;
Fig. 15 is an example composition diagram of a smart home scene orchestration apparatus provided in the present application.
Detailed Description
The terms "first", "second", "third", and the like in the description, claims, and drawings of this application are used to distinguish between different objects, not to indicate a particular order.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described as "exemplary" or "for example" should not be construed as preferred or more advantageous than other embodiments or designs; rather, these words are intended to present related concepts in a concrete fashion.
For clarity and conciseness of the following descriptions of the various embodiments, a brief introduction to the related art is first given:
the smart home (smart home or home automation) is characterized in that a home is used as a platform, facilities related to home life are integrated by utilizing a comprehensive wiring technology, a network communication technology, a safety precaution technology, an automatic control technology and an audio and video technology, a high-efficiency management system of home facilities and home schedule affairs is constructed, home safety, convenience, comfort and artistry are improved, and an environment-friendly and energy-saving living environment is realized.
Fig. 1 is an example diagram of a smart home system provided in an embodiment of the present application. The smart home system comprises a smart home device 101, an electronic device 102, and a server 103. The smart home device 101 and the electronic device 102 may connect to the server 103 wirelessly; for example, they may communicate with the server 103 by connecting to the home wireless router 104 shown in fig. 1. It is to be understood that the smart home device 101 has already been registered on the server 103 and associated with the electronic device 102. In other words, the smart home device 101 has been successfully paired with the electronic device 102, and the user can therefore control the smart home device 101 through the electronic device 102. For example, the smart home device 101 is associated with a user account, or the smart home device 101 is associated with a device identifier. For convenience of description, in the embodiments of the present application the smart home device is assumed to have been paired with a specific electronic device or a specific user.
Specifically, the user may send a control rule to the server 103 through the electronic device 102, and the server 103 forwards the control rule to the smart home device 101 after receiving it. In other embodiments, the smart home device 101 and the electronic device 102 may also communicate over a wireless connection. For example, the user may operate a smart home app installed on the electronic device 102 to remotely control the smart home device 101, transmitting the control rule through a wireless communication network (e.g., a 3G or 4G network) or in another wireless manner (e.g., wireless fidelity (Wi-Fi)); the embodiments of the present application do not limit the specific communication method. Optionally, the user may also operate the smart home device 101 directly to control it. It should be noted that the control rule may also be understood as a control instruction: the smart home device executes the corresponding operation after receiving the control instruction.
After receiving the control rule, the smart home device 101 executes the corresponding operation according to the control rule and reports its own state information to the server 103. For example, suppose the smart home device 101 is a smart air conditioner that is in a standby state before receiving the control rule. When the smart air conditioner receives a control rule that starts the cooling mode, it starts the cooling mode and reports its current cooling-mode state information to the server 103. In other embodiments, the smart home device 101 may also be an electric rice cooker, an air conditioner, a water heater, or the like. For convenience of description, the home devices in this application refer to smart home devices; different smart home devices may have different current states.
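The execute-then-report behavior of the smart air conditioner above might be sketched like this; the class, field names, and the dictionary standing in for server 103 are illustrative assumptions.

```python
# Illustrative execute-then-report loop for a smart home device.

class SmartAirConditioner:
    def __init__(self, server_store, device_id):
        self.state = {"mode": "standby"}      # standby before any rule arrives
        self.server_store = server_store      # stand-in for server 103
        self.device_id = device_id

    def receive_control_rule(self, rule):
        """Execute the operation the control rule describes, then report."""
        self.state.update(rule)
        self.report_state()

    def report_state(self):
        """Report the device's own state information to the server."""
        self.server_store[self.device_id] = dict(self.state)

server_store = {}
ac = SmartAirConditioner(server_store, "ac-living-room")
ac.receive_control_rule({"mode": "cooling", "temp": 26})
```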
The server 103 may be configured to store the state information of the smart home device 101. As one example, the server 103 may store only the latest state information of the smart home device 101; as another example, the server 103 may store the state information of the smart home device 101 at different time intervals, which is not limited herein. In addition, the server 103 may be a cloud server.
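The two storage policies just mentioned (latest state only, or states sampled over time) could be sketched as follows; the data structures are assumptions for illustration.

```python
# Illustrative server-side store: the latest state is always kept, and a
# timestamped history is kept optionally.

class StateStore:
    def __init__(self, keep_history=False):
        self.keep_history = keep_history
        self.latest = {}      # device_id -> latest reported state
        self.history = {}     # device_id -> [(timestamp, state), ...]

    def report(self, device_id, state, ts):
        """Record a device's reported state under both policies."""
        self.latest[device_id] = dict(state)
        if self.keep_history:
            self.history.setdefault(device_id, []).append((ts, dict(state)))

store = StateStore(keep_history=True)
store.report("lamp-01", {"power": "on"}, ts=1)
store.report("lamp-01", {"power": "off"}, ts=2)   # latest wins; history keeps both
```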
In some embodiments, the electronic device 102 may be a mobile phone (as shown in fig. 1), a tablet computer, a desktop computer, a laptop computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a wearable electronic device, a smart watch, or the like; the specific forms of the smart home device, the server, and the electronic device are not particularly limited in this application. In this embodiment, the structure of the electronic device 102 may be as shown in fig. 2, which is a schematic structural diagram of the electronic device 102 in the smart home system shown in fig. 1 provided in an embodiment of the present application.
As shown in fig. 2, the electronic device 102 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the present embodiment does not constitute a specific limitation on the electronic device 102. In other embodiments, the electronic device 102 may include more or fewer components than illustrated, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. For example, in the present application, the processor 110 may orchestrate a scene according to the state information of the smart home devices, that is, generate a control rule used to control the smart home devices. After the electronic device receives a second operation that starts the smart home scene, it responds to the second operation by controlling the smart home devices to enter the states corresponding to the state information according to the control rule. The smart home devices are associated with the user account logged in on the electronic device; further, they may be the devices, among those associated with that account, that the user has operated.
The controller may be the nerve center and command center of the electronic device 102. The controller generates operation control signals according to the instruction opcode and timing signals, completing the control of instruction fetching and execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically; if the processor 110 needs the instruction or data again, it can call it directly from this memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 102.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus, and it converts the data to be transmitted between serial and parallel forms. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the UART interface, so as to implement the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a display screen serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the capture functionality of the electronic device 102. The processor 110 and the display screen 194 communicate via the DSI interface to implement the display function of the electronic device 102.
The GPIO interface may be configured by software. A GPIO pin may be configured to carry a control signal or a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 102, to transmit data between the electronic device 102 and peripheral devices, or to connect headphones and play audio through them. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the present embodiment is only an exemplary illustration, and does not constitute a limitation on the structure of the electronic device 102. In other embodiments of the present application, the electronic device 102 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 102. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 102 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 102 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the electronic device 102. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium- or high-frequency signal. The demodulator is used for demodulating a received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 102, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 102 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 102 can communicate with networks and other devices via wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou satellite navigation system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 102 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 102 may include 1 or N display screens 194, where N is a positive integer greater than 1.
A series of graphical user interfaces (GUIs) may be displayed on the display screen 194 of the electronic device 102; these GUIs serve as the main screen of the electronic device 102. Generally, the size of the display screen 194 of the electronic device 102 is fixed, and only a limited number of controls can be displayed on it. A control is a GUI element: a software component, contained in an application program, that governs the data processed by the application program and the interactive operations related to that data. A user can interact with a control through direct manipulation to read or edit information related to the application program. Generally, controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and Widgets. For example, in the present embodiment, the display screen 194 may display virtual keys (one-key arrangement, start arrangement, stop arrangement).
The electronic device 102 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera's photosensitive element through the lens, and the optical signal is converted into an electrical signal; the photosensitive element transmits the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image, and can optimize parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 102 may include 1 or N cameras 193, N being a positive integer greater than 1.
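The DSP's conversion of the digital image signal from YUV to RGB can be sketched with a standard color-space transform. The sketch below uses the BT.601 full-range coefficients as an illustrative assumption; the embodiment does not specify which coefficients the DSP uses.

```python
def yuv_to_rgb(y: int, u: int, v: int) -> tuple:
    """Convert one 8-bit YUV pixel to RGB using BT.601 full-range
    coefficients, clamping each channel to the 0..255 range."""
    def clamp(c: float) -> int:
        return max(0, min(255, round(c)))

    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    return clamp(r), clamp(g), clamp(b)

# Neutral chroma (U = V = 128) maps to a gray level equal to Y:
print(yuv_to_rgb(128, 128, 128))  # -> (128, 128, 128)
```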
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 102 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 102 may support one or more video codecs. In this way, the electronic device 102 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor that processes input information quickly by drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, and can also learn continuously by itself. Applications such as intelligent cognition of the electronic device 102 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 102. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 102 and data processing by executing instructions stored in the internal memory 121. For example, in the present embodiment, the processor 110 may perform scene arrangement by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, phone book, etc.) created during use of the electronic device 102, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the electronic device 102 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 102 may implement audio functions, such as music playing and recording, via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset interface 170D, and the application processor.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert an audio electrical signal into a sound signal. The electronic device 102 may play music or conduct a hands-free call through the speaker 170A.
The receiver 170B, also called an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 102 answers a call or plays a voice message, the receiver 170B can be held close to the ear to hear the voice.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal by speaking with the mouth close to the microphone 170C. The electronic device 102 may be provided with at least one microphone 170C. In other embodiments, the electronic device 102 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 102 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. Pressure sensors 180A come in many types, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 102 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 102 detects the intensity of the touch operation via the pressure sensor 180A, and may also calculate the position of the touch from the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
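The pressure-threshold behavior described above can be sketched as a simple dispatch on touch intensity. The threshold value and the action names below are illustrative assumptions, not values from the embodiment.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # illustrative threshold, arbitrary units

def message_icon_action(touch_intensity: float) -> str:
    """Map the intensity of a touch on the short message application
    icon to an operation instruction, as described for pressure
    sensor 180A: a light press views, a firm press creates."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_message"   # intensity below the first pressure threshold
    return "new_message"        # intensity at or above the threshold

print(message_icon_action(0.2))  # -> view_message
print(message_icon_action(0.8))  # -> new_message
```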
The gyro sensor 180B may be used to determine the motion posture of the electronic device 102. In some embodiments, the angular velocity of the electronic device 102 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for anti-shake photography. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 102, calculates the distance the lens module needs to compensate for according to the shake angle, and allows the lens to counteract the shake of the electronic device 102 through a reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used for navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 102 calculates altitude from the barometric pressure value measured by the air pressure sensor 180C, to assist in positioning and navigation.
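The altitude calculation from barometric pressure can be sketched with the international barometric formula, a common approximation; the embodiment does not specify the exact method or reference pressure, so the constants below are assumptions.

```python
SEA_LEVEL_PA = 101325.0  # assumed standard sea-level reference pressure

def altitude_from_pressure(pressure_pa: float) -> float:
    """Estimate altitude in metres from a barometric pressure reading
    using the international barometric formula approximation."""
    return 44330.0 * (1.0 - (pressure_pa / SEA_LEVEL_PA) ** (1.0 / 5.255))

print(round(altitude_from_pressure(101325.0)))  # sea level -> 0
print(round(altitude_from_pressure(100000.0)))  # roughly 110 m
```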
The magnetic sensor 180D includes a Hall sensor. The electronic device 102 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 102 is a flip phone, the electronic device 102 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon opening can then be set according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device 102 in various directions (typically along three axes). When the electronic device 102 is stationary, the magnitude and direction of gravity may be detected. The acceleration sensor 180E can also be used to recognize the posture of the electronic device, for applications such as landscape/portrait switching and pedometers.
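The landscape/portrait recognition from the measured gravity vector can be sketched as follows. The axis convention (y along the device's long edge) and the comparison rule are illustrative assumptions, not details from the embodiment.

```python
def screen_orientation(ax: float, ay: float) -> str:
    """Infer screen orientation from the gravity components reported by
    the acceleration sensor: gravity mostly along the device's y axis
    means the device is upright (portrait); mostly along the x axis
    means it is lying on its side (landscape)."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

print(screen_orientation(0.0, 9.81))  # device upright -> portrait
print(screen_orientation(9.81, 0.0))  # device on its side -> landscape
```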
The distance sensor 180F is used to measure distance. The electronic device 102 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 102 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 102 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device 102 may determine that there is an object nearby; when insufficient reflected light is detected, it may determine that there is no object nearby. The electronic device 102 can use the proximity light sensor 180G to detect that the user is holding the electronic device 102 close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
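The near/far decision of the proximity light sensor, and the screen-off behavior during a call, can be sketched as a threshold test on the detected reflected light. The threshold value and the state names are illustrative assumptions.

```python
REFLECTION_THRESHOLD = 0.6  # illustrative "sufficient reflected light" level

def handle_proximity(reflected_light: float, in_call: bool) -> str:
    """Decide whether an object is near the device and, during a call,
    whether the screen should be turned off to save power."""
    near = reflected_light >= REFLECTION_THRESHOLD
    if near and in_call:
        return "screen_off"  # ear held close to the device during a call
    return "near" if near else "far"

print(handle_proximity(0.9, in_call=True))   # -> screen_off
print(handle_proximity(0.1, in_call=False))  # -> far
```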
The ambient light sensor 180L is used to sense the ambient light level. The electronic device 102 may adaptively adjust the brightness of the display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 102 is in a pocket to prevent inadvertent contact.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 102 may utilize the collected fingerprint characteristics to implement fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 102 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 102 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 102 heats the battery 142 to prevent the low temperature from causing an abnormal shutdown. In still other embodiments, when the temperature is below a further threshold, the electronic device 102 boosts the output voltage of the battery 142 to prevent an abnormal shutdown caused by low temperature.
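The layered temperature processing strategy above can be sketched as a dispatch over three thresholds. The threshold values are illustrative assumptions; the embodiment does not specify them.

```python
HIGH_TEMP_C = 45.0      # above this: throttle the nearby processor
LOW_TEMP_C = 0.0        # below this: heat the battery
CRITICAL_LOW_C = -10.0  # below this: also boost the battery output voltage

def thermal_policy(temp_c: float) -> list:
    """Return the actions the temperature processing strategy takes for
    a temperature reported by the temperature sensor 180J."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("reduce_processor_performance")
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")
    if temp_c < CRITICAL_LOW_C:
        actions.append("boost_battery_voltage")
    return actions or ["normal"]

print(thermal_policy(50.0))   # -> ['reduce_processor_performance']
print(thermal_policy(-15.0))  # -> ['heat_battery', 'boost_battery_voltage']
```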
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 102 at a different position than the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the bone mass vibrated by the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vocal-part bone mass acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can parse out heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 102 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 102.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenarios (such as time reminders, receiving information, alarm clocks, games, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be attached to or detached from the electronic device 102 by being inserted into or pulled out of the SIM card interface 195. The electronic device 102 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards may be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The electronic device 102 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 102 uses an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 102 and cannot be separated from it.
In addition, an operating system runs on the above components, for example, the iOS operating system developed by Apple, the open-source Android operating system developed by Google, or the Windows operating system developed by Microsoft. Applications can be installed and run on the operating system.
The operating system of the electronic device 102 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiment of the application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 102.
Fig. 3 is a block diagram of a software structure of the electronic device 102 according to an embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages. As shown in fig. 3, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. For example, in the embodiment of the present application, the application package may further include a smart home. When the scene is arranged, the intelligent home application can access the scene arrangement interface management service provided by the application program framework layer.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 3, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like. For example, in the embodiment of the present application, when a scene is arranged, the application framework layer may provide an API related to a scene arrangement function for the application layer, and provide a scene arrangement interface management service for the application layer, so as to implement the scene arrangement function.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions for the electronic device 102. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications that appear in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. Examples include prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, and flashing an indicator light.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library comprises two parts: one part consists of the function interfaces that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
Although the Android system is taken as an example for description in the embodiments of the present application, the basic principles are also applicable to electronic devices based on operating systems such as iOS or Windows.
The following illustrates the workflow of the software and hardware of the electronic device 102 in connection with a scene arrangement scenario.
When the touch sensor 180K receives a touch operation, the touch operation is reported to the processor 110, so that the processor responds to the touch operation, starts the application, and displays the user interface of the application on the display screen 194. For example, after receiving a touch operation on the smart home icon 401, the touch sensor 180K reports the touch operation to the processor 110, so that the processor 110 responds to the touch operation, starts the application corresponding to the smart home icon 401 (which may be referred to as smart home), and displays the user interface of the smart home on the display screen 194. In addition, in the embodiment of the application, the terminal can start the smart home in other manners and display its user interface on the display screen 194. For example, when the screen is off, when the lock-screen interface is displayed, or when some user interface is displayed after unlocking, the terminal may respond to a voice instruction or shortcut operation of the user to start the smart home and display the user interface of the smart home on the display screen 194.
The user interface of the smart home can include various function buttons, such as an add-device button. In some embodiments, the user interface of the smart home may further include options for the added smart home devices, and the like. Specifically, the options of the added smart home devices may include names, icons, or control buttons (for example, switch buttons) of the smart home devices. For example, as shown in fig. 4A (b), the user interface of the smart home may include an add-device button and the added hall lamp, air conditioner, television, water heater, and electric cooker. Taking the added hall lamp as an example, the user can click the hall lamp to operate it. The user can add corresponding smart home devices through an operation on the add-device button.
With the popularization of smart home, a user may need to control a plurality of smart home devices separately after getting up, before leaving home, after returning home, before sleeping, and so on. For example, a user may turn on the hall lamp after returning home. Further, if the hall lamp has a plurality of modes, for example, a daytime mode, a reading mode, and a nighttime mode, the user can select the daytime mode. Then, the user turns on the television to play a favorite program and turns on the air conditioner. The user may set the mode of the air conditioner to the cooling mode, set its temperature to 15 degrees, and set its fan speed to low, or set the mode of the air conditioner to the heating mode, set its temperature to 25 degrees, and set its fan speed to low, and then turn on the electric cooker to cook. Generally, if the user's life is regular, the operations on the smart home devices are similar every day. For example, a user who returns home at a regular time every day may repeat the same operations of turning on the hall lamp, the television, and the air conditioner. This causes great inconvenience to the user.
In order to solve this problem, in the prior art, a user can edit the state information of different smart home devices into a scene; by selecting the scene, the user can simultaneously send control rules to a plurality of smart home devices without operating them one by one. For example, fig. 4A is a diagram illustrating scene arrangement provided in the prior art. Fig. 4A (a) is a schematic front view of a mobile phone according to an embodiment of the present disclosure. The smart home APP icon 401 is displayed on the display screen of the mobile phone. The user can click the smart home APP icon 401; as shown in (b) of fig. 4A, the mobile phone responds to the click operation and displays the user interface of the smart home APP, and the user interface can include pre-configured scene names, such as go home, sleep, leave, and the like. If the user has not logged in to the smart home APP, a user name and a password are required. The user name may be the user's mobile phone number or WeChat account, which is not limited here. After the user logs in to the smart home APP, the user interface may further include the user name. For convenience of description, hereinafter, as shown in (b) of fig. 4A, the user name may be 138 x 7332. In addition, the user can click the "+" icon on the user interface to add a smart home device, and the user interface can also include icons of the configured smart home devices. Understandably, the configured smart home devices may be smart home devices associated with the user name, or smart home devices associated with a device identification. For example, as shown in fig. 4A (b), the user interface may further include the configured hall lamp icon, air conditioner icon, television icon, water heater icon, and rice cooker icon. The user can click a pre-configured scene name, and the mobile phone responds to the click operation and displays the scene arrangement interface.
For example, as shown in (c) of fig. 4A, the user may click the pre-configured scene name "go home" icon; as shown in (d) of fig. 4A, the mobile phone displays the go-home scene arrangement interface in response to the click operation. At this time, as shown in (a) of fig. 4B, the user can click the add-device icon to add smart home devices and set their states. As shown in (b) of fig. 4B, the mobile phone displays the add-device interface in response to the click operation. Of course, in practical applications, the add-device interface may further include other smart home devices, which is not limited in this embodiment of the application. The user can click the icon of a smart home device displayed on the add-device interface to set it, and can click all the smart home devices to be added. For example, the user may click the "hall lamp" icon; as shown in fig. 4B (c), the mobile phone displays the hall lamp setting interface in response to the click operation. The user can set the switch state of the hall lamp to on through a selection operation, set the color of the hall lamp to white through the option icon ">" of the color, and set the brightness of the hall lamp to normal through the option icon ">" of the brightness. As shown in (d) of fig. 4B, the mobile phone displays the hall lamp setting result in response to the selection operation. Optionally, the user may also add the air conditioner, set the switch state of the air conditioner to on, set the mode of the air conditioner to the cooling mode, set the temperature of the air conditioner to 15 degrees, set the fan speed of the air conditioner to low, and so on. Therefore, when a user performs scene arrangement by adopting the prior art, the operation process is very complex: arranging only two smart home devices, a lamp and an air conditioner, already requires more than 10 operation steps. In addition, during arrangement, the user does not know whether the configured smart home devices can achieve the optimal state.
For example, the hall lamp should typically be bright enough to illuminate the entire living room, but the user does not know in advance whether the configured brightness achieves this. Similarly, when the air conditioner temperature is set to 15 degrees, it is uncertain whether the room will be cool enough. If the result of the arrangement does not reach the optimal state, the user may need to reset it.
In the embodiment of the application, a user can arrange a scene with one key, without adding the devices of the scene one by one, which reduces the operation steps. Meanwhile, the user can conveniently save the current optimal state of the smart home devices and restore that optimal state with one key in the future. For example, after getting up one day, the user may turn on the five-cereal soymilk function of the smart soymilk maker, turn on the smart speaker to play favorite music, and turn on the smart bedroom lamp. If the user feels that the states of these smart home devices have reached the most desired states, the states can be obtained with one key and arranged into a scene. The name of the scene may be entered by the user or may be automatically generated; illustratively, the name of the scene is the get-up scene. When the user gets up the next day, the user can click the get-up scene to restore, with one key, the optimal states of the smart soymilk maker, the smart speaker, and the smart bedroom lamp, namely the states arranged by the user the previous day.
Embodiments of the present application will be described below with particular reference to fig. 5 to 13. For convenience of description, the electronic device is a mobile phone as an example.
Fig. 5 is a first flowchart of a method for arranging smart home scenes according to an embodiment of the present application. As shown in fig. 5, the smart home scene arrangement method may include:
S501, the mobile phone receives a one-key arrangement operation.
When a user needs to perform scene arrangement, the user can perform a one-key arrangement operation to trigger the mobile phone to start the scene arrangement function. The one-key arrangement operation may be a preset gesture or a voice command input by the user. The preset gesture can be any one of a single-click gesture, a sliding gesture, a pressure-recognition gesture, a long-press gesture, an area-change gesture, a double-press gesture, and a double-click gesture. In some embodiments, the one-key arrangement operation may be an operation on a virtual key of the mobile phone, or an operation of clicking a physical button of the mobile phone.
For example, after entering the smart home APP, as shown in (a) of fig. 6A, the mobile phone displays the user interface of the smart home APP, and the user interface may include a one-key arrangement button 601. The user can click the one-key arrangement button 601, and the mobile phone receives the one-key arrangement operation and executes S502 to S509.
For another example, as shown in (a) of fig. 6B, in the user interface of the smart home APP, the user performs a sliding operation along a sliding track 602. As shown in (b) of fig. 6B, the mobile phone displays a pull-down menu 603 in response to the sliding operation, and the pull-down menu 603 includes the one-key arrangement button 601. The user can click the one-key arrangement button 601, and the mobile phone receives the one-key arrangement operation and executes S502 to S509.
As another example, the user may also determine the scene name first. In some embodiments, the user may select a scene name pre-configured by the system. For example, as shown in (c) of fig. 4A, the user may click the pre-configured scene name "go home" icon. In response to the click operation, the interface displayed by the mobile phone switches from the interface shown in (c) of fig. 4A to the interface shown in (a) of fig. 6C. As shown in (a) of fig. 6C, in the go-home scene arrangement interface, the user performs a sliding operation along the sliding track 602. As shown in (b) of fig. 6C, the mobile phone displays the pull-down menu 603 in response to the sliding operation, and the pull-down menu 603 includes the one-key arrangement button 601. The user can click the one-key arrangement button 601, and the mobile phone receives the one-key arrangement operation and executes S502 to S509.
The embodiment of the present application does not limit the form of the one-key arrangement operation.
S502, the mobile phone responds to the one-key arrangement operation and sends device query request information to the server.
The user clicks the one-key arrangement button, and after receiving the one-key arrangement operation, the mobile phone can generate device query request information and send it to the server. In some examples, if the server stores the state information of the smart home devices associated with the user account logged in on the mobile phone (in other words, associated with the user name), the device query request information is used by the mobile phone to obtain that state information from the server. The device query request information may include the user name and request the state information of the smart home devices associated with it. For example, the user name may be 138 x 7332. In other examples, the user name may be replaced by a device identifier: the device query request information may include a device identifier and request the state information of the smart home devices associated with that identifier; for example, the device identifier may be the unique identifier of the mobile phone. For convenience of description, hereinafter, it is assumed that the device query request information includes a user name.
In addition, in some embodiments, the device query request information requests the latest states of all smart home devices associated with the user name. In other embodiments, it requests the latest states of only some of the smart home devices associated with the user name. For example, the device query request information may further include a type identifier of the smart home devices. The type identifier may be user-defined or an identifier of the smart home device itself, and is used to distinguish smart home devices with different purposes. For example, class I indicates smart home devices related to cooking in the kitchen, such as the electric rice cooker, the soymilk maker, the microwave oven, and the like. Class II indicates smart home devices related to washing in the bathroom, such as the water heater. The type identifier of a smart home device may be stored in the mobile phone or the server, which is not described in detail here. If the server receives device query request information containing a user name and a type identifier, it obtains the state information of the smart home devices associated with both, and sends device query response information to the mobile phone; the device query response information may include the state information of the smart home devices associated with the user name and the type identifier.
After the mobile phone receives the device query response information, it arranges a scene according to the state information of the smart home devices associated with the user name and the type identifiers, that is, it generates control rules from that state information. For example, if the device query request information includes the user name 138 x 7332 and class I, after receiving them the server may first obtain the state information of all the smart home devices associated with the user name 138 x 7332, including the state information of the hall lamp, the air conditioner, the television, the electric cooker, and the water heater. If the type of the electric cooker is identified as class I, the server may select the state information of the electric cooker from among these according to class I, and the device query response information may include the state information of the electric cooker. The mobile phone can receive the state information of the electric cooker and perform scene arrangement. The detailed explanation of the scene arrangement is described with reference to S506 to S509 and is not repeated.
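The query exchange above can be sketched in code. This is a minimal illustrative sketch, not the patent's implementation: the class names, the "lighting" type identifier, and the in-memory store are all assumptions made for the example.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

/** Hypothetical sketch of the device-query exchange: the server holds, per user
 *  name, a list of device states, and a query may be narrowed by type identifier. */
public class DeviceQuery {
    /** One stored device state: device name, type identifier, key-value state info. */
    public record DeviceState(String device, String typeId, Map<String, String> state) {}

    /** Server side: return the states associated with userName, optionally
     *  narrowed to one type identifier (null means "all devices"). */
    public static List<DeviceState> query(Map<String, List<DeviceState>> store,
                                          String userName, String typeId) {
        return store.getOrDefault(userName, List.of()).stream()
                .filter(d -> typeId == null || d.typeId().equals(typeId))
                .collect(Collectors.toList());
    }

    /** Sample store matching the example: only the rice cooker carries class I. */
    public static Map<String, List<DeviceState>> sampleStore() {
        return Map.of("138x7332", List.of(
                new DeviceState("hall lamp", "lighting", Map.of("switch", "on", "mode", "daytime")),
                new DeviceState("rice cooker", "class I", Map.of("switch", "on")),
                new DeviceState("water heater", "class II", Map.of("switch", "on"))));
    }
}
```

With this sketch, a request carrying the user name and "class I" yields only the rice cooker's state, while a request without a type identifier yields all three device states.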
In another possible implementation, the server may further store the state information of the smart home devices at time points in different time periods. A time period may be preset by the system or user-defined, and different time periods may be set for different scenes. For example, the go-home time period corresponding to the go-home scene may be 18:00 to 19:00; the smart home devices have a plurality of states during this period, and the server may store their state information at the time point 19:00. The sleep time period corresponding to the sleep scene may be 22:00 to 8:00, and the server may store the state information of the smart home devices at the time point 8:00. The leave time period corresponding to the leave scene may be 8:00 to 18:00, and the server may store the state information of the smart home devices at the time point 18:00. After the mobile phone receives a one-key arrangement operation, a preset gesture, or a voice instruction, it sends device query request information to the server; the device query request information may include the user name and time information, and the time information may indicate the time for which state information is needed. After receiving the device query request information, the server determines the time period to which the time indicated by the time information belongs, and acquires the state information of the smart home devices corresponding to that time period under the user name. The server then sends device query response information to the mobile phone, which includes the state information of the smart home devices corresponding to that time period under the user name.
After the mobile phone receives the device query response information, it arranges a scene according to the state information of the smart home devices corresponding to the time period under the user name. For example, if the server determines that the time indicated by the time information belongs to the go-home time period, it may obtain the state information of the smart home devices corresponding to the go-home time period under the user name 138 x 7332, and the device query response information includes that state information. After receiving the device query response information, the mobile phone arranges a scene according to the state information of the smart home devices corresponding to the go-home time period under the user name 138 x 7332. The detailed explanation of the scene arrangement is described with reference to S506 to S509 and is not repeated.
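The server's step of mapping the indicated time to a time period can be sketched as follows, using only the example periods given above (go-home 18:00-19:00, sleep 22:00-8:00, leave 8:00-18:00); the method name and the "unspecified" fallback are assumptions, since the embodiment does not define what happens outside these periods.

```java
import java.time.LocalTime;

/** Minimal sketch: determine which example scene period a clock time falls in. */
public class ScenePeriod {
    public static String periodOf(LocalTime t) {
        // Go-home period: 18:00 (inclusive) to 19:00 (exclusive).
        if (!t.isBefore(LocalTime.of(18, 0)) && t.isBefore(LocalTime.of(19, 0))) return "go home";
        // Sleep period crosses midnight, so check "from 22:00 OR before 8:00".
        if (!t.isBefore(LocalTime.of(22, 0)) || t.isBefore(LocalTime.of(8, 0))) return "sleep";
        // Leave period: 8:00 to 18:00.
        if (!t.isBefore(LocalTime.of(8, 0)) && t.isBefore(LocalTime.of(18, 0))) return "leave";
        return "unspecified"; // 19:00-22:00 is not covered by the example periods
    }
}
```

Note that the sleep period wraps past midnight, which is why it cannot be tested with a single interval comparison.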
S503, the server receives the device query request information sent by the mobile phone.
S504, the server sends device query response information to the mobile phone.
After receiving the device query request information, the server acquires the state information of the smart home devices associated with the user name. For example, after receiving the device query request information, the server obtains the user name from it, and then obtains the state information of the smart home devices associated with that user name. The device query response information comprises the state information of the smart home devices associated with the user name.
In some embodiments, the device query request information is used to request the latest state information of all smart home devices under the user name. The latest state information of a smart home device may be information about its current state. After receiving the device query request information, the server retrieves, according to the user name, the latest state information of all the smart home devices under that user name, and sends device query response information to the mobile phone; the device query response information comprises the latest state information of the smart home devices associated with the user name.
Exemplarily, as shown in fig. 6A (a), the smart home devices associated with the user name 138 x 7332 are the hall lamp, the air conditioner, the television, the water heater, and the electric cooker. When the server receives the device query request information sent by the mobile phone, it retrieves the latest state information of the hall lamp, the air conditioner, the television, the water heater, and the electric cooker associated with the user name 138 x 7332, and then sends device query response information containing that latest state information to the mobile phone.
For example, if the hall lamp is in the on state when the server receives the device query request information, the device query response information returned by the server includes the information that the hall lamp is on. Further, if the hall lamp is in the daytime mode while on, the device query response information may further include the information that the hall lamp is in the daytime mode.
For another example, if the air conditioner is on and in the cooling mode when the server receives the device query request information, the device query response information returned by the server may include the information that the air conditioner is on and in the cooling mode. Further, if the temperature of the air conditioner is 15 degrees, the device query response information may further include the state information that the temperature of the air conditioner is 15 degrees.
S505, the mobile phone receives the device query response information sent by the server.
S506, the mobile phone arranges a scene according to the state information of the smart home devices associated with the user name.
After the mobile phone receives the state information of the smart home devices associated with the user name, it can convert that state information into control rules according to a conversion rule. The conversion rule may refer to format conversion; for example, the state information may be converted into control rules in json format or xml format. Illustratively, the mobile phone receives the state information of the hall lamp, the air conditioner, the television, the water heater, and the electric cooker. The state information of the hall lamp may include that its switch state is on and its mode is the daytime mode. The state information of the air conditioner may include that its switch state is on, its mode is the cooling mode, its temperature is 15 degrees, and its fan speed is low. The state information of the television, the water heater, and the electric cooker may each include that the switch state is on. Correspondingly, the control rule of the hall lamp may include turning on the hall lamp and setting its mode to daytime. The control rule of the air conditioner may include turning on the air conditioner, setting its mode to the cooling mode, setting its temperature to 15 degrees, and setting its fan speed to low. The control rules of the television, the water heater, and the electric cooker may each include turning on the respective device.
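The format conversion described above can be sketched as follows. The JSON layout ("device"/"actions" keys) is an assumption for illustration only; the embodiment says only that the control rules may use json or xml format.

```java
import java.util.Map;
import java.util.StringJoiner;
import java.util.TreeMap;

/** Illustrative sketch: rewrite a device's state information as a JSON-style
 *  control rule string. */
public class RuleConverter {
    /** Build a json control rule from a device name and its state entries, e.g.
     *  {"device":"hall lamp","actions":{"mode":"daytime","switch":"on"}}. */
    public static String toJsonRule(String device, Map<String, String> state) {
        StringBuilder sb = new StringBuilder("{\"device\":\"").append(device)
                .append("\",\"actions\":{");
        StringJoiner sj = new StringJoiner(",");
        // TreeMap keeps the key order deterministic for the example output.
        new TreeMap<>(state).forEach((k, v) -> sj.add("\"" + k + "\":\"" + v + "\""));
        return sb.append(sj).append("}}").toString();
    }
}
```

A real implementation would use a JSON library and escape special characters; the sketch only shows the state-to-rule mapping itself.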
Then, the mobile phone associates the control rules with the scene name and performs S507, or S508 and S509.
Optionally, the scene names may be obtained as follows:
1. The user manually inputs the scene name after performing the one-key arrangement operation.
Specifically, in some embodiments, as shown in (a) of fig. 6A, the user clicks the one-key arrangement button 601, the mobile phone displays the set-scene-name prompt box shown in (b) of fig. 6A, and the user can set the scene name as desired. The scene name prompt box can be displayed in the user interface of the smart home APP. As shown in fig. 6A (b), the user inputs "go home", and the mobile phone associates the control rules of the hall lamp, the air conditioner, the television, the water heater, and the electric cooker with the scene name "go home" to obtain the control rules of the go-home scene.
In other embodiments, as shown in (b) of fig. 6B, the user clicks the one-key arrangement button 601, the mobile phone displays the set-scene-name prompt box shown in (c) of fig. 6B, and the user can set the scene name as desired. The scene name prompt box can be displayed in the user interface of the smart home APP. As shown in fig. 6B (c), the user inputs "go home", and the mobile phone associates the control rules of the hall lamp, the air conditioner, the television, the water heater, and the electric cooker with the scene name "go home" to obtain the control rules of the go-home scene.
It should be noted that the mobile phone may display the scene name prompt box after obtaining the state information of the smart home device, or may display the scene name prompt box before obtaining the state information of the smart home device.
2. The user selects the scene name first and then performs a one-key arrangement operation.
The user can select the scene name first and then perform the one-key arrangement operation on the arrangement interface of that scene name. For example, as shown in fig. 6C, when the mobile phone performs scene arrangement on the go-home scene arrangement interface, the scene name "go home" can be obtained. In other embodiments, the user may also create a scene name and then perform the one-key arrangement operation on the arrangement interface of that scene name.
3. The mobile phone automatically generates a scene name.
After the user performs the one-key arrangement operation, the mobile phone automatically generates a scene name according to a preset rule. In some embodiments, the mobile phone determines the scene name according to the time at which the states of the smart home devices were acquired. For example, if the state acquisition time of the smart home devices is any time point between 7 a.m. and 9 a.m., the mobile phone automatically generates the scene name "on duty". If the state acquisition time is any time point between 7 p.m. and 9 p.m., the mobile phone automatically generates the scene name "off duty".
In other embodiments, if the smart home devices have corresponding type identifiers, the mobile phone may further automatically generate the scene name according to the type identifier. For example, if the type identifier of the smart home devices is "kitchen", the mobile phone automatically generates the scene name "kitchen".
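The two automatic-naming rules can be sketched together. Only the rules stated in the examples are encoded (7-9 a.m. "on duty", 7-9 p.m. "off duty", type identifier reused as name); returning null when no rule matches is an assumption of the sketch.

```java
import java.time.LocalTime;

/** Minimal sketch of the automatic scene-name generation examples. */
public class SceneNamer {
    /** Time-based rule: 7-9 a.m. -> "on duty", 7-9 p.m. -> "off duty". */
    public static String nameFromTime(LocalTime t) {
        if (!t.isBefore(LocalTime.of(7, 0)) && !t.isAfter(LocalTime.of(9, 0))) return "on duty";
        if (!t.isBefore(LocalTime.of(19, 0)) && !t.isAfter(LocalTime.of(21, 0))) return "off duty";
        return null; // no preset rule matches this acquisition time
    }

    /** Type-identifier rule: reuse the device type identifier as the scene name. */
    public static String nameFromType(String typeId) {
        return typeId; // e.g. "kitchen" -> scene name "kitchen"
    }
}
```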
The embodiment of the present application does not limit the manner of obtaining the scene name.
Further, after S506, the mobile phone may further store the association relationship between the control rules and the scene name. As shown in fig. 5, the embodiment of the present application may further include S507.
S507, the mobile phone stores the association relationship between the control rules and the scene name.
In some embodiments, after the mobile phone associates the control rules with the scene name, the association relationship between them may be saved in the memory. In this way, a user can perform a scene-trigger operation through the smart home APP on the mobile phone at any time and any place; after receiving the scene-trigger operation, the mobile phone responds to it, obtains the control rules associated with the scene name, and controls the smart home devices according to those control rules.
For example, the mobile phone may associate the control rules with "go home", thereby storing the control rules for the go-home scene. The user can click the preconfigured scene name "go home", and the mobile phone, in response to the click operation, sends the control rules to the hall lamp, the air conditioner, the television, the water heater and the electric cooker respectively. If the hall lamp is in the off state, the hall lamp turns on after receiving its control rule, switching from the off state to the on state, with its mode set to daytime and its brightness set to bright. If the air conditioner is in the off state, the air conditioner turns on after receiving its control rule, switching from the off state to the on state, with its mode set to cooling, its temperature set to 15 °C, and its wind speed set to low. If the television is in the off state, the television turns on after receiving its control rule, switching from the off state to the on state. If the water heater is in the off state, the water heater turns on after receiving its control rule, switching from the off state to the on state. If the electric cooker is in the off state, the electric cooker turns on after receiving its control rule.
Optionally, the user may set an execution delay time. For example, after receiving the scene trigger operation, the mobile phone waits for the set period of time, then responds to the operation, acquires the control rules associated with the scene name, and controls the smart home devices according to those control rules.
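The optional delayed execution might be sketched as follows; `trigger_scene` and the `send_rule` callback are hypothetical names standing in for the phone-to-device control channel:

```python
import threading

def trigger_scene(scene_name, rules_by_scene, send_rule, delay_s=0.0):
    """Dispatch the control rules bound to scene_name, optionally after
    a user-configured delay of delay_s seconds."""
    def apply_rules():
        for device, rule in rules_by_scene.get(scene_name, {}).items():
            send_rule(device, rule)
    if delay_s > 0:
        # Wait for the configured period, then act on the trigger.
        threading.Timer(delay_s, apply_rules).start()
    else:
        apply_rules()
```

With `delay_s=0` the rules are dispatched immediately; otherwise a timer defers the dispatch.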
In other embodiments, in order to save the storage space of the mobile phone, after the mobile phone arranges the scene according to the state information of the smart home devices associated with the user name, that is, after S506, the embodiment of the present application may further include S508 and S509.
And S508, the mobile phone sends the user name and the association relationship between the scene name and the control rules to the server.
S509, the server receives the user name and the association relationship between the scene name and the control rules sent by the mobile phone.
In some embodiments, after receiving the user name and the association relationship between the scene name and the control rules from the mobile phone, the server stores the association relationship under that user name. In this way, the user can trigger a scene operation through the smart home APP on the mobile phone at any time and any place; after receiving the trigger, the mobile phone sends the user name and the scene name to the server, and the server acquires the control rules of the smart home devices associated with the scene name according to the user name and the scene name, and sends the control rules to those devices respectively to control them.
For example, after receiving the user name 138 x 7332 and the association relationship between the scene name "go home" and the control rules of the hall lamp, the air conditioner, the television, the water heater and the electric cooker, the server stores that association relationship under the user name 138 x 7332. After the user clicks the scene name "go home" icon, the mobile phone sends the user name 138 x 7332 and the scene name "go home" to the server. The server then obtains, under the user name 138 x 7332 and the scene name "go home", the control rules of the hall lamp, the air conditioner, the television, the water heater and the electric cooker, and sends the control rule of the hall lamp to the hall lamp, the control rule of the air conditioner to the air conditioner, the control rule of the television to the television, the control rule of the water heater to the water heater, and the control rule of the electric cooker to the electric cooker. After receiving their control rules, the hall lamp, the air conditioner, the television, the water heater and the electric cooker execute them; for a specific explanation, reference may be made to the explanation above, which is not repeated.
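A minimal in-memory sketch of the server-side bookkeeping described above (class and method names are assumptions; a real server would persist the data):

```python
class SceneStore:
    """user name -> scene name -> {device: control rule}."""

    def __init__(self):
        self._store = {}

    def save(self, user_name, scene_name, rules_by_device):
        # Store the scene's control rules under the user name (S509).
        self._store.setdefault(user_name, {})[scene_name] = rules_by_device

    def rules_for(self, user_name, scene_name):
        # Look up the rules to dispatch when the scene is triggered.
        return self._store.get(user_name, {}).get(scene_name, {})
```

Triggering "go home" then reduces to one lookup keyed by user name and scene name.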
In other embodiments, the mobile phone may send only the association relationship between the scene name and the control rules to the server, and the server receives it. After receiving the association relationship, the server may store it according to the source address or the identifier of the mobile phone. In this way, the user can trigger a scene operation through the smart home APP on the mobile phone at any time and any place; after receiving the trigger, the mobile phone sends its identifier and the scene name to the server, and the server acquires the control rules of the smart home devices associated with the scene name and sends them to those devices respectively to control them. For a detailed explanation, reference may be made to the description above, which is not repeated.
In the embodiment of the present application, the mobile phone can acquire, from the server side, the state information of the smart home devices associated with the user name according to the user's one-key arrangement operation, and automatically convert that state information into control rules. This avoids the user manually adding the smart home devices to be arranged one by one, and effectively reduces the operation steps of scene arrangement. In addition, if the user feels that the states of the smart home devices have deviated from the optimal state, the optimal state can be restored with one key, thereby improving the user experience.
An embodiment of the smart home scene arrangement method provided by the present application is specifically described below by taking fig. 7 as an example.
For example, as shown in (a) in fig. 7, a smart home APP icon 401 is displayed on the display screen of the mobile phone, and the user may click the smart home APP icon 401. As shown in (b) in fig. 7, the mobile phone displays the user interface of the smart home APP in response to the click operation. The user interface comprises the user name 138 x 7332, the scene names "go home", "sleep" and "leave home", the paired smart home devices (a hall lamp, an air conditioner, a television, a water heater and an electric cooker), and a one-key arrangement button 601. The user may click the one-key arrangement button 601; the mobile phone receives the one-key arrangement operation and, in response to the click operation, sends device query request information to the server, where the device query request information includes the user name 138 x 7332. After receiving the device query request information, the server acquires, according to the user name 138 x 7332, the state information of all the smart home devices associated with that user name, namely the state information of the hall lamp, the air conditioner, the television, the water heater and the electric cooker. The server then sends device query response information to the mobile phone, where the device query response information includes the state information of the hall lamp, the air conditioner, the television, the water heater and the electric cooker.
After receiving the device query response information, the mobile phone converts the state information of the hall lamp, the air conditioner, the television, the water heater and the electric cooker into control rules respectively, and associates the control rules with a scene name. For example, if the hall lamp is in the on state when the server receives the device query request information, the device query response information returned by the server includes the information that the hall lamp is on. Further, if the hall lamp is in the daytime mode while on, the device query response information may also include the information that the hall lamp is in the daytime mode. The control rule of the hall lamp may then include turning on the hall lamp and setting its mode to daytime. Since the scene name has not been determined before the user clicks the one-key arrangement button 601, as shown in (c) in fig. 7, the mobile phone displays a scene name setting prompt box on the user interface of the smart home APP; the user inputs "go home", and the mobile phone associates the control rules of the hall lamp, the air conditioner, the television, the water heater and the electric cooker with the scene name "go home". Subsequently, as shown in (d) in fig. 7, the user may click the scene name "go home" icon on the user interface of the smart home APP, and the mobile phone, in response to the click operation, sends the control rules to the hall lamp, the air conditioner, the television, the water heater and the electric cooker respectively.
If the hall lamp is in the off state, the hall lamp turns on after receiving its control rule, switching from the off state to the on state, with its mode set to daytime and its brightness set to bright. If the air conditioner is in the off state, the air conditioner turns on after receiving its control rule, switching from the off state to the on state, with its mode set to cooling, its temperature set to 15 °C, and its wind speed set to low. If the television is in the off state, the television turns on after receiving its control rule, switching from the off state to the on state. If the water heater is in the off state, the water heater turns on after receiving its control rule, switching from the off state to the on state. If the electric cooker is in the off state, the electric cooker turns on after receiving its control rule.
In some examples, as shown in (a) in fig. 8, the user may click the scene name "go home" icon on the user interface of the smart home APP. As shown in (b) in fig. 8, the mobile phone may display the go-home scene interface in response to the click operation, where the go-home scene interface includes the smart home devices associated with the scene name "go home" under the user name 138 x 7332, including the hall lamp, the air conditioner, the television, the water heater, and the like. The user can click "…" to view the other smart home devices associated with the scene name "go home", such as the electric cooker.
Fig. 9 is a flowchart illustrating a second method for arranging smart home scenes according to an embodiment of the present application. As shown in fig. 9, the smart home scene arrangement method may include:
S901, the mobile phone receives a first operation.
In the embodiment of the present application, the first operation may be a one-key arrangement operation. The one-key arrangement operation may be, but is not limited to, the user clicking a virtual button, the user pressing a physical key, the user issuing a voice command, or the user inputting a preset gesture.
In some embodiments, as shown in (a) in fig. 6A, the mobile phone displays the user interface of the smart home APP, and the user interface may include a one-key arrangement button 601. The user can click the one-key arrangement button 601, and the mobile phone receives the one-key arrangement operation.
For another example, as shown in (a) in fig. 6B, on the user interface of the smart home APP, the user performs a sliding operation along the sliding track 602. As shown in (b) in fig. 6B, the mobile phone displays a pull-down menu 603 in response to the sliding operation, and the pull-down menu 603 includes the one-key arrangement button 601. The user can click the one-key arrangement button 601, and the mobile phone receives the one-key arrangement operation.
As another example, the user may also determine the scene name first. In some embodiments, the user may select a scene name preconfigured by the system. For example, as shown in (c) in fig. 4A, the user may click the preconfigured scene name "go home" icon. As shown in (d) in fig. 4A, the mobile phone displays the go-home scene arrangement interface in response to the click operation. As shown in (a) in fig. 6C, on the go-home scene arrangement interface, the user performs a sliding operation along the sliding track 602. As shown in (b) in fig. 6C, the mobile phone displays a pull-down menu 603 in response to the sliding operation, and the pull-down menu 603 includes the one-key arrangement button 601. The user can click the one-key arrangement button 601, and the mobile phone receives the one-key arrangement operation. The embodiment of the present application does not limit the manner of the one-key arrangement operation. For a specific explanation, reference may be made to the detailed description of S501, which is not repeated here.
S902, the mobile phone responds to the first operation and requests the state information of the intelligent household equipment from the server.
And S903, receiving the state information of the intelligent household equipment by the mobile phone.
After the mobile phone receives the one-key arrangement operation, it can obtain the state information of the smart home devices associated with the user name according to that operation. For example, the mobile phone may send device query request information including the user name to the server. The server receives the device query request information sent by the mobile phone and obtains the state information of the smart home devices associated with the user name. The server then sends device query response information to the mobile phone, where the device query response information includes that state information, and the mobile phone receives it. For a specific explanation, reference may be made to the detailed descriptions of S502 to S505, which are not repeated here.
And S904, generating a control rule by the mobile phone according to the state information of the intelligent household equipment.
After the mobile phone acquires the state information of the smart home devices associated with the user name, it converts that state information into control rules according to a conversion rule. The conversion rule may refer to format conversion; for example, the state information may be converted into a control rule in json format or xml format. The control rules are then associated with the scene name, as specifically described in S506 above and not repeated here.
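The format conversion mentioned above might look as follows for the json case; the field names are illustrative assumptions, since the patent specifies only that state information is converted into a json- or xml-format control rule:

```python
import json

def state_to_rule(state_info):
    """Serialize reported state information as a JSON control rule."""
    return json.dumps(state_info, ensure_ascii=False)

# Hypothetical state report for the hall lamp of the running example.
hall_lamp_rule = state_to_rule(
    {"power": "on", "mode": "daytime", "brightness": "bright"}
)
```

The resulting JSON string can then be stored under the scene name and later sent to the device.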
S905, the mobile phone stores the association relationship between the control rule and the scene name.
In this way, the user can trigger a scene operation through the smart home APP on the mobile phone at any time and any place; after receiving the trigger, the mobile phone responds to it, obtains the control rules associated with the scene name, and controls the smart home devices according to those control rules, as described in S507 above and not repeated here.
Optionally, in order to save the storage space of the mobile phone, the association relationship between the control rules and the scene name may be stored by the server, as described in S508 to S509 above and not repeated here.
After the mobile phone stores the association relationship between the control rule and the scene name, if the user opens the smart home scene, the following steps can be executed:
S906, the mobile phone receives a second operation of starting the smart home scene.
And S907, the mobile phone responds to the second operation and controls the smart home devices to enter the states corresponding to the state information according to the control rules.
For details, reference may be made to the corresponding description of fig. 7 above, which is omitted here for brevity.
According to the smart home scene arrangement method provided by the embodiment of the present application, the mobile phone can acquire, from the server side, the state information of the smart home devices associated with the user name according to the user's one-key arrangement operation, and automatically convert that state information into control rules. This avoids the user manually adding the smart home devices to be arranged one by one, effectively reduces the operation steps of scene arrangement, makes it convenient for the user to arrange the optimal states of the smart home devices, and improves the user experience.
Fig. 10 is a third flowchart of a method for arranging smart home scenes according to an embodiment of the present application. As shown in fig. 10, the smart home scene arrangement method may include:
S1001, the mobile phone receives a start arrangement operation.
When the user needs to perform scene arrangement, the user can perform the start arrangement operation to trigger the mobile phone to start the scene arrangement function. In some embodiments, the start arrangement operation may specifically be the user operating a virtual key of the mobile phone, or the user clicking a physical button of the mobile phone.
For example, as shown in (a) in fig. 11A, the mobile phone displays the user interface of the smart home APP, which may include a start arrangement button 1101. The user can click the start arrangement button 1101, and the mobile phone receives the start arrangement operation. After the mobile phone receives the start arrangement operation, S1002 is executed.
For another example, as shown in (a) in fig. 11B, on the user interface of the smart home APP, the user performs a sliding operation along the sliding track 1102. As shown in (b) in fig. 11B, the mobile phone displays a pull-down menu 1103 in response to the sliding operation, and the pull-down menu 1103 includes the start arrangement button 1101. The user can click the start arrangement button 1101, and the mobile phone receives the start arrangement operation. After the mobile phone receives the start arrangement operation, S1002 is executed.
As another example, the user may also determine the scene name first. In some embodiments, the user may select a scene name preconfigured by the system. For example, as shown in (c) in fig. 4A, the user may click the preconfigured scene name "go home" icon. As shown in (d) in fig. 4A, the mobile phone displays the go-home scene arrangement interface in response to the click operation. As shown in (a) in fig. 11C, when the user wants to arrange the go-home scene, the user performs a sliding operation along the sliding track 1102 on the go-home scene arrangement interface. As shown in (b) in fig. 11C, the mobile phone displays a pull-down menu 1103 in response to the sliding operation, and the pull-down menu 1103 includes the start arrangement button 1101. The user can click the start arrangement button 1101, and the mobile phone receives the start arrangement operation. After the mobile phone receives the start arrangement operation, S1002 is executed.
And S1002, the mobile phone responds to the start arrangement operation and starts to record the smart home devices operated by the user.
After the mobile phone receives the start arrangement operation, it starts to record the smart home devices operated by the user. For example, the user operates a smart home device on the smart home APP of the mobile phone, or operates the smart home device directly, and the mobile phone starts to record the smart home devices operated by the user.
In some examples, after the mobile phone receives the start arrangement operation, the user turns on the hall lamp and the air conditioner, and the mobile phone records the identifier of the hall lamp and the identifier of the air conditioner operated by the user.
And S1003, the mobile phone receives a stop arrangement operation.
In some embodiments, the stop arrangement operation may specifically be the user operating a virtual key of the mobile phone, or the user clicking a physical button of the mobile phone.
For example, as shown in (b) in fig. 11A, the mobile phone displays the user interface of the smart home APP, which may include a stop arrangement button 1104. The user can click the stop arrangement button 1104, and the mobile phone receives the stop arrangement operation. After the mobile phone receives the stop arrangement operation, S1004 is executed.
For another example, as shown in (a) in fig. 11B, on the user interface of the smart home APP, the user performs a sliding operation along the sliding track 1102. As shown in (c) in fig. 11B, the mobile phone displays a pull-down menu 1103 in response to the sliding operation, and the pull-down menu 1103 includes the stop arrangement button 1104. The user can click the stop arrangement button 1104, and the mobile phone receives the stop arrangement operation. After the mobile phone receives the stop arrangement operation, S1004 is executed.
As shown in (a) in fig. 11C, when the user wants to arrange the go-home scene, the user performs a sliding operation along the sliding track 1102 on the go-home scene arrangement interface. As shown in (c) in fig. 11C, the mobile phone displays a pull-down menu 1103 in response to the sliding operation, and the pull-down menu 1103 includes the stop arrangement button 1104. The user can click the stop arrangement button 1104, and the mobile phone receives the stop arrangement operation. After the mobile phone receives the stop arrangement operation, S1004 is executed.
And S1004, the mobile phone responds to the stop arrangement operation and sends device query request information to the server.
The device query request information includes the user name and the identifiers of the smart home devices operated by the user. The smart home devices operated by the user are the devices recorded between the time when the mobile phone receives the start arrangement operation and the time when it receives the stop arrangement operation. In other examples, the user name may be replaced with a device identifier. Specifically, reference may be made to the description of S502, which is not repeated here.
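A hypothetical wire format for this device query request; the patent specifies the fields but not the encoding, so the key names and device identifiers below are assumptions:

```python
import json

# Device query request of S1004: the user name plus the identifiers of
# the devices recorded between the start and stop arrangement operations.
query_request = {
    "user_name": "138 x 7332",
    "device_ids": ["hall_lamp_01", "air_conditioner_01"],
}
payload = json.dumps(query_request)
```

The server would parse this payload to look up the per-device state information under the given user name.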
S1005, the server receives the device query request information sent by the mobile phone.
S1006, the server sends the equipment inquiry response information to the mobile phone.
After receiving the device query request information, the server acquires the user name from it, then acquires the state information of the smart home devices operated by the user under that user name, and sends device query response information to the mobile phone, where the device query response information includes that state information.
Illustratively, after clicking the start arrangement button, the user turns on the hall lamp and the air conditioner. Further, the hall lamp may have a plurality of modes, such as a daytime mode, a reading mode, and a night mode; in this case the user selects the daytime mode. The user also sets the mode of the air conditioner to cooling, the temperature of the air conditioner to 15 °C, and the wind speed of the air conditioner to low. The hall lamp and the air conditioner upload their state information to the server, and the mobile phone records the smart home devices operated by the user, namely the identifier of the hall lamp and the identifier of the air conditioner. Then the user clicks the stop arrangement button, and the mobile phone receives the stop arrangement operation. In response to the click operation, the mobile phone sends device query request information to the server, which includes the user name 138 x 7332, the identifier of the hall lamp and the identifier of the air conditioner. The server receives the device query request information, obtains the state information of the hall lamp and of the air conditioner under the user name 138 x 7332 according to the user name and the two identifiers, and sends device query response information to the mobile phone, which includes the state information of the hall lamp and the state information of the air conditioner.
And S1007, the mobile phone receives the device query response information sent by the server.
And S1008, the mobile phone arranges the scene according to the state information of the smart home devices operated by the user under the user name.
After receiving the state information of the smart home devices operated by the user under the user name, the mobile phone can convert that state information into control rules according to a conversion rule. The conversion rule may refer to format conversion; for example, the state information may be converted into a control rule in json format or xml format. Illustratively, the mobile phone receives the state information of the hall lamp and the state information of the air conditioner. The state information of the hall lamp may include that the hall lamp is switched on and is in the daytime mode. The state information of the air conditioner may include that the air conditioner is switched on, its mode is cooling, its temperature is 15 °C, and its wind speed is low. The control rule of the hall lamp may then include turning on the hall lamp and setting its mode to daytime. The control rule of the air conditioner may include turning on the air conditioner, setting its mode to cooling, its temperature to 15 °C, and its wind speed to low.
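Putting S1008 together as a sketch: per-device state information becomes per-device control rules bound to the scene name. The values follow the running example, while the field names and the one-to-one state-to-rule mapping are assumptions:

```python
def arrange_scene(scene_name, states_by_device):
    """Convert each device's state information into a control rule and
    associate the rules with the scene name."""
    rules_by_device = {dev: dict(state) for dev, state in states_by_device.items()}
    return {scene_name: rules_by_device}

association = arrange_scene("go home", {
    "hall lamp": {"power": "on", "mode": "daytime"},
    "air conditioner": {"power": "on", "mode": "cooling",
                        "temperature_c": 15, "wind_speed": "low"},
})
```

The resulting mapping is what S1009 stores locally, or what S1010 sends to the server together with the user name.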
Then, the terminal associates the control rule with the scene name, and performs S1009 or S1010 and S1011.
Optionally, the scene names may be obtained as follows:
1. After the user performs the stop arrangement operation, the scene name may be manually input.
Specifically, in some embodiments, as shown in (b) in fig. 11A, the user clicks the stop arrangement button 1104; the mobile phone displays a scene name setting prompt box, as shown in (c) in fig. 11A, and the user can set the scene name as desired. The scene name setting prompt box can be displayed in the user interface of the smart home APP. As shown in (c) in fig. 11A, the user inputs "go home", and the mobile phone associates the control rule of the hall lamp and the control rule of the air conditioner with the scene name "go home", resulting in the control rules for the go-home scene.
In other embodiments, as shown in (c) in fig. 11B, the user clicks the stop arrangement button 1104; the mobile phone displays a scene name setting prompt box, as shown in (d) in fig. 11B, and the user can set the scene name as desired. The scene name setting prompt box can be displayed in the user interface of the smart home APP. As shown in (d) in fig. 11B, the user inputs "go home", and the mobile phone associates the control rule of the hall lamp and the control rule of the air conditioner with the scene name "go home", resulting in the control rules for the go-home scene.
It should be noted that the mobile phone may display the scene name setting prompt box either after or before obtaining the state information of the smart home devices.
2. The user selects the scene name first, and then performs the operation of starting the arrangement and the operation of stopping the arrangement.
The user can select the scene name first, and then perform the start arrangement operation and the stop arrangement operation on the arrangement interface of that scene name. For example, as shown in fig. 11C, when the mobile phone performs scene arrangement on the go-home scene arrangement interface, the scene name "go home" can be acquired. In other embodiments, the user may also create a scene name and then perform the start arrangement operation and the stop arrangement operation on the arrangement interface of that scene name.
For other explanations of associating the control rules with the scene name by the terminal, reference may be made to the explanation of S506, which is not repeated here. The embodiment of the present application does not limit the manner of obtaining the scene name.
Further, after S1008, the mobile phone may further store the association relationship between the control rules and the scene name. As shown in fig. 10, the embodiment of the present application may further include S1009.
S1009, the mobile phone stores the association relation between the control rule and the scene name.
In some embodiments, after the mobile phone associates the control rules with the scene name, it may save the association relationship between the control rules and the scene name in its memory. In this way, the user can trigger a scene operation through the smart home APP on the mobile phone at any time and any place; after receiving the trigger, the mobile phone responds to it, obtains the control rules associated with the scene name, and controls the smart home devices according to those control rules. For a detailed explanation, reference may be made to S507, which is not repeated here.
Optionally, in other embodiments, to save storage space on the mobile phone, after the mobile phone arranges the scene according to the state information of the smart home devices associated with the user name (that is, after S1008), the embodiments of the present application may further include S1010 and S1011.
S1010, the mobile phone sends the user name and the association relationship between the scene name and the control rule to the server.
S1011, the server receives the user name and the association relationship between the scene name and the control rule sent by the mobile phone.
In some embodiments, after receiving the user name and the association relationship between the scene name and the control rule sent by the mobile phone, the server stores that association relationship under the user name. In this way, a user can trigger the scene at any time and any place through the smart home APP on the mobile phone: after receiving the scene-triggering operation, the mobile phone sends the user name and the scene name to the server; the server acquires, according to the user name and the scene name, the control rules of the smart home devices associated with the scene name, and sends the control rules to those smart home devices respectively to control them. For a detailed explanation, reference may be made to the detailed explanations of S508 and S509, which are not repeated.
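A minimal sketch of the server-side storage just described, with the associations keyed by user name (all names hypothetical):

```python
class ServerSceneStore:
    """Hypothetical server-side store: associations are kept per user name."""

    def __init__(self):
        self._by_user = {}  # user name -> {scene name -> control rules}

    def save(self, user_name, scene_name, control_rules):
        # Store the scene-name/control-rule association under the user name
        # (S1010/S1011), so the phone need not keep it locally.
        self._by_user.setdefault(user_name, {})[scene_name] = control_rules

    def rules_for(self, user_name, scene_name):
        # Look up the control rules for this user's scene so they can be
        # sent to the associated smart home devices.
        return self._by_user.get(user_name, {}).get(scene_name, [])


server = ServerSceneStore()
server.save("138****7332", "go home",
            [{"device": "hall_light", "power": "on"}])
rules = server.rules_for("138****7332", "go home")
```

Keying by user name first means two users may reuse the same scene name ("go home") without their rules colliding.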
An advantage of this embodiment of the application is that, in some cases, a user wants to acquire and arrange the state information of smart home devices only in a specific scene. For example, a user may operate the hall light, the air conditioner, and the television every day upon getting home, and wants to arrange only the state information of those devices rather than that of other smart home devices. According to this embodiment of the application, the user can arrange specific smart home devices. This effectively reduces the operation steps needed to arrange a scene, makes it convenient for the user to arrange the optimal state of the smart home devices, and improves user experience.
Fig. 12 is taken as an example to describe in detail an embodiment of the smart home scene arranging method provided in the present application.
For example, as shown in (a) in fig. 12, a smart home APP icon 401 is displayed on the display screen of the mobile phone, and the user may click the smart home APP icon 401. As shown in (b) in fig. 12, the mobile phone displays the user interface of the smart home APP in response to the click operation. The user interface includes the user name 138****7332; the scene names "go home", "sleep", and "leave home"; the paired smart home devices, including a hall light, an air conditioner, a television, a water heater, and an electric cooker; and a start arranging button 1101 and a stop arranging button 1104. The user may click the start arranging button 1101; after receiving the start arranging operation, the mobile phone records the identifiers of the smart home devices operated by the user, such as the identifier of the hall light and the identifier of the air conditioner. For example, the user turns on the hall light and the air conditioner. Further, the hall light may have a plurality of modes, such as a daytime mode, a reading mode, and a night mode; in this case the user may select the daytime mode. The user may also set the mode of the air conditioner to cooling, set the temperature of the air conditioner to 15°C, and set the wind speed of the air conditioner to low. The hall light and the air conditioner upload their state information to the server. Then the user clicks the stop arranging button 1104; the mobile phone receives the stop arranging operation and, in response to the click operation, sends device query request information to the server, where the device query request information includes the user name 138****7332, the identifier of the hall light, and the identifier of the air conditioner.
After receiving the device query request information, the server acquires, according to the user name 138****7332, the state information of the smart home devices operated by the user under that user name, that is, the state information of the hall light and the state information of the air conditioner. The server sends device query response information to the mobile phone, where the device query response information includes the state information of the hall light and the state information of the air conditioner. After receiving the device query response information, the mobile phone converts the state information of the hall light into a control rule for the hall light and converts the state information of the air conditioner into a control rule for the air conditioner. For example, if the hall light is in the on state when the server receives the device query request information, the device query response information returned by the server includes the information that the hall light is on. Further, if the hall light is on in the daytime mode, the device query response information may further include the information that the hall light is in the daytime mode. The control rule for the hall light may then include turning on the hall light and setting its mode to daytime. If the scene name was not determined before the user clicked the start arranging button 1101 and the stop arranging button 1104, then, as shown in (c) in fig. 12, the mobile phone displays a scene name setting prompt box on the user interface of the smart home APP; the user inputs "go home", and the mobile phone associates the control rule for the hall light and the control rule for the air conditioner with the scene name "go home". Subsequently, as shown in (d) in fig. 12, the user may click the "go home" scene name icon on the user interface of the smart home APP, and in response to the click operation the mobile phone sends the control rules to the hall light and the air conditioner, respectively. If the hall light is in the off state, it turns on after receiving its control rule, switching from the off state to the on state, and sets its mode to daytime and its brightness to bright. If the air conditioner is in the off state, it starts up after receiving its control rule, switching from the off state to the on state, and sets its mode to cooling, its temperature to 15°C, and its wind speed to low.
Fig. 13 is a fourth flowchart of a method for arranging smart home scenes according to an embodiment of the present application. As shown in fig. 13, the smart home scene arrangement method may include:
S1301, the mobile phone receives a start operation.
In this embodiment of the application, the start operation may be a start arranging operation, and the start arranging operation may be, without limitation, the user clicking a virtual button, the user pressing a physical key, the user issuing a voice instruction, the user inputting a preset gesture, or the like.
In some embodiments, as shown in (a) in fig. 11A, the mobile phone displays the user interface of the smart home APP, which may include a start arranging button 1101. The user may click the start arranging button 1101, and the mobile phone receives the start arranging operation. After the mobile phone receives the start arranging operation, S1302 is executed.
For another example, as shown in (a) in fig. 11B, on the user interface of the smart home APP, the user performs a sliding operation along a sliding trajectory 1102. As shown in (b) in fig. 11B, the mobile phone displays a pull-down menu 1103 in response to the sliding operation, and the pull-down menu 1103 includes the start arranging button 1101. The user may click the start arranging button 1101, and the mobile phone receives the start arranging operation. After the mobile phone receives the start arranging operation, S1302 is executed.
As another example, the user may also determine the scene name first. In some embodiments, the user may select a scene name preconfigured by the system. For example, as shown in (c) in fig. 4A, the user may click the preconfigured "go home" scene name icon. As shown in (d) in fig. 4A, the mobile phone displays the "go home" scene arranging interface in response to the click operation. As shown in (a) in fig. 11C, when the user wants to arrange the "go home" scene, the user performs a sliding operation along the sliding trajectory 1102 on the "go home" scene arranging interface. As shown in (b) in fig. 11C, the mobile phone displays a pull-down menu 1103 in response to the sliding operation, and the pull-down menu 1103 includes the start arranging button 1101. The user may click the start arranging button 1101, and the mobile phone receives the start arranging operation. After the mobile phone receives the start arranging operation, S1302 is executed. For a specific explanation, reference may be made to the detailed description of S1001; details are not repeated here.
S1302, in response to the start operation, the mobile phone starts recording the smart home devices operated by the user.
After receiving the start operation, the mobile phone may record, according to the start operation, the smart home devices operated by the user. For a specific explanation, reference may be made to the detailed description of S1002; details are not repeated here.
S1303, the mobile phone receives the first operation.
In an embodiment of the present application, the first operation may be a stop arranging operation. The stop arranging operation may be, without limitation, the user clicking a virtual button, the user pressing a physical key, the user issuing a voice instruction, the user inputting a preset gesture, or the like. For a specific explanation, reference may be made to the detailed description of S1003; details are not repeated here.
S1304, in response to the first operation, the mobile phone requests the state information of the smart home devices operated by the user from the server.
S1305, the mobile phone receives state information of the intelligent household equipment operated by the user.
After receiving the stop arranging operation, the mobile phone may send device query request information to the server, where the device query request information includes the user name and the identifiers of the smart home devices operated by the user. The smart home devices operated by the user are those recorded between the time when the mobile phone receives the start arranging operation and the time when it receives the stop arranging operation.
The server receives the device query request information sent by the mobile phone, acquires the user name from it, and then acquires the state information of the smart home devices operated by the user under that user name. The server sends device query response information to the mobile phone, where the query response information includes that state information. The mobile phone receives the device query response information sent by the server. For a specific explanation, reference may be made to the detailed descriptions of S1004 to S1006; details are not repeated here.
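The request/response exchange above can be sketched as two small JSON payloads. This is only an illustration; the field names (`user_name`, `device_ids`, `states`) are hypothetical, and the embodiment does not prescribe a wire format:

```python
import json

def build_device_query_request(user_name, device_ids):
    # The request carries the user name plus the identifiers of the devices
    # recorded between the start- and stop-arranging operations.
    return json.dumps({"user_name": user_name, "device_ids": device_ids})

def build_device_query_response(states):
    # The response carries the state information of the queried devices
    # under the given user name.
    return json.dumps({"states": states})

request = build_device_query_request(
    "138****7332", ["hall_light", "air_conditioner"])
response = build_device_query_response(
    {"hall_light": {"power": "on", "mode": "daytime"},
     "air_conditioner": {"power": "on", "mode": "cool", "temp_c": 15}})
states = json.loads(response)["states"]
```

The phone would send `request` to the server and parse `states` out of the returned `response` before generating control rules.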
S1306, the mobile phone generates a control rule according to the state information of the smart home devices operated by the user.
After the mobile phone receives the state information of the smart home devices operated by the user under the user name, it can convert that state information into a control rule according to a conversion rule. The conversion rule may be a format conversion; for example, the state information may be converted into a control rule in JSON format or XML format. The control rule is then associated with the scene name. For a specific explanation, reference may be made to the detailed description of S1008; details are not repeated here.
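One possible shape of such a format conversion, taking the JSON case, is sketched below. The `command` field and the exact rule layout are assumptions for illustration, not a format defined by the embodiment:

```python
import json

def state_to_control_rule(device_id, state_info):
    # Format conversion: reshape the reported state fields into a
    # command-style control rule and serialize it as JSON, so that
    # replaying the rule later restores the recorded state.
    rule = {"device": device_id, "command": "set_state"}
    rule.update(state_info)
    return json.dumps(rule)

rule_json = state_to_control_rule(
    "air_conditioner", {"power": "on", "mode": "cool", "temp_c": 15})
```

The resulting JSON string is what gets associated with the scene name and later sent back to the device.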
S1307, the mobile phone stores the association relationship between the control rule and the scene name.
In this way, a user can trigger the scene at any time and any place through the smart home APP on the mobile phone; after receiving the scene-triggering operation, the mobile phone responds to it by obtaining the control rule associated with the scene name and controlling the smart home devices according to that control rule. For details, refer to the description of S507 above; they are not repeated here.
Optionally, to save storage space on the mobile phone, the association relationship between the control rule and the scene name may instead be stored by the server. For details, refer to S508 to S509 above; they are not repeated here.
After the mobile phone stores the association relationship between the control rule and the scene name, if the user starts the smart home scene, the following steps may be executed:
S1308, the mobile phone receives a second operation of starting the smart home scene.
S1309, in response to the second operation, the mobile phone controls the smart home devices to enter the states corresponding to the state information according to the control rules.
For details, reference may be made to the corresponding description of fig. 12 above; they are not repeated here for brevity.
An advantage of this embodiment of the application is that, in some cases, a user wants to acquire and arrange the state information of smart home devices only in a specific scene. For example, a user may operate the hall light, the air conditioner, and the television every day upon getting home, and wants to arrange only the state information of those devices rather than that of other smart home devices. According to this embodiment of the application, the user can arrange specific smart home devices. This effectively reduces the operation steps needed to arrange a scene, makes it convenient for the user to arrange the optimal state of the smart home devices, and improves user experience.
In an implementation manner, the smart home scene may also correspond to a first rule, where the first rule indicates that the smart home devices enter the states corresponding to the state information when a first event occurs. For example, the entrance door is provided with a sensor. When the user opens the entrance door, the sensor senses that the door has been opened and sends the door-opened state information to the terminal or/and the server. After the terminal or/and the server receives the door-opened state information, it sends the control rules associated with that state information to the smart home devices; after receiving the control rules, the smart home devices enter the states, corresponding to the state information, specified by the control rules.
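The event-driven first rule can be sketched as a small dispatcher that binds an event to a set of control rules. All names here are hypothetical illustrations of the idea, not part of the embodiment:

```python
class RuleEngine:
    """Hypothetical dispatcher for the event-triggered first rule."""

    def __init__(self):
        self._bindings = {}  # event name -> list of control rules

    def bind(self, event, control_rules):
        # Associate the control rules of a scene with a triggering event.
        self._bindings[event] = control_rules

    def on_event(self, event, send_rule):
        # When the first event occurs (e.g. the door sensor reports
        # "opened"), send each associated control rule to its device.
        for rule in self._bindings.get(event, []):
            send_rule(rule)


engine = RuleEngine()
engine.bind("entrance_door_opened",
            [{"device": "hall_light", "power": "on"}])
delivered = []
engine.on_event("entrance_door_opened", delivered.append)
```

Whether this dispatch runs on the terminal or on the server is left open by the embodiment; the same binding logic applies in either place.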
In the embodiments provided by the application, the method provided by the embodiments of the application is introduced from the perspective of interaction among the electronic device, the smart home device, and the server. It is understood that, to implement each function in the method provided in the embodiments of the present application, each network element, for example the electronic device, includes a corresponding hardware structure and/or software module for executing each function. Those skilled in the art will readily appreciate that the various illustrative algorithm steps described in connection with the embodiments disclosed herein can be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Some embodiments of the present application also provide an electronic device. As shown in fig. 14, the electronic device may include: a touch screen 1401, where the touch screen 1401 may include a touch-sensitive surface 1406 and a display screen 1407; one or more processors 1402; a memory 1403; and one or more computer programs 1404; these components may be connected by one or more communication buses 1405. The one or more computer programs 1404 are stored in the memory 1403 and configured to be executed by the one or more processors 1402, and the one or more computer programs 1404 include instructions that can be used to perform the steps performed by the mobile phone in the embodiments of fig. 5, 9, 10, and 13. Of course, the electronic device shown in fig. 14 may further include other components such as a sensor module, an audio module, and a SIM card interface, which is not limited in this embodiment. When the electronic device shown in fig. 14 further includes such components, it may be the electronic device shown in fig. 2.
In the embodiment of the present application, the electronic device may be divided into the functional modules according to the method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation.
In the case of dividing the functional modules according to each function, fig. 15 shows a possible composition diagram of the smart home scene arrangement apparatus described in the above embodiments, where the smart home scene arrangement apparatus can execute the steps executed by the electronic device in any method embodiment of the present application. As shown in fig. 15, the smart home scene arrangement apparatus is an electronic device, or a communication apparatus that supports the electronic device in implementing the method provided in the embodiments; for example, the communication apparatus may be a chip system. The smart home scene arrangement apparatus may include: a processing unit 1501, a display unit 1502, and a transceiving unit 1503.
The processing unit 1501 is configured to support the smart home scene arrangement apparatus in executing the method described in this embodiment. For example, the processing unit 1501 is configured to execute, or to support the smart home scene arrangement apparatus in executing, S506 in the smart home scene arrangement method shown in fig. 5, S904 in the smart home scene arrangement method shown in fig. 9, S1008 in the smart home scene arrangement method shown in fig. 10, and S1306 in the smart home scene arrangement method shown in fig. 13.
The display unit 1502 is configured to execute or be used to support the smart home scene arrangement apparatus to execute S501 in the smart home scene arrangement method shown in fig. 5, S901 and S906 in the smart home scene arrangement method shown in fig. 9, S1001 and S1003 in the smart home scene arrangement method shown in fig. 10, and S1301, S1303 and S1308 in the smart home scene arrangement method shown in fig. 13.
The transceiving unit 1503 is configured to execute or be used to support the smart home scene arrangement apparatus to execute S502 and S505 in the smart home scene arrangement method shown in fig. 5, S903 in the smart home scene arrangement method shown in fig. 9, S1004 and S1007 in the smart home scene arrangement method shown in fig. 10, and S1305 in the smart home scene arrangement method shown in fig. 13.
In this embodiment of the application, further, as shown in fig. 15, the smart home scene arrangement apparatus may further include: a storage unit 1504.
The storage unit 1504 is configured to execute or be used to support the smart home scene arrangement apparatus to execute S507 in the smart home scene arrangement method shown in fig. 5, S905 in the smart home scene arrangement method shown in fig. 9, S1009 in the smart home scene arrangement method shown in fig. 10, and S1307 in the smart home scene arrangement method shown in fig. 13.
It should be noted that all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
The intelligent home scene arrangement device provided by the embodiment of the application is used for executing the method of any embodiment, so that the same effect as the method of the embodiment can be achieved.
The present embodiment also provides a computer-readable storage medium, which includes instructions that, when executed on an electronic device, cause the electronic device to execute the relevant method steps in fig. 5, fig. 9, fig. 10, and fig. 13 to implement the method in the above-described embodiment.
The present embodiment also provides a computer program product containing instructions, which when run on an electronic device, causes the electronic device to perform the relevant method steps as in fig. 5, 9, 10 and 13, to implement the method in the above-mentioned embodiments.
The present embodiment also provides a control device comprising a processor and a memory, where the memory is configured to store computer program code comprising computer instructions; when the processor executes the computer instructions, the control device performs the relevant method steps in fig. 5, 9, 10, and 13 to implement the method in the above embodiments. The control device may be an integrated circuit (IC) or a system on chip (SOC). The integrated circuit may be a general-purpose integrated circuit, a field programmable gate array (FPGA), or an application-specific integrated circuit (ASIC).
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
In the several embodiments provided in this embodiment, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, each functional unit in the embodiments of the present embodiment may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present embodiment essentially or partially contributes to the prior art, or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the method described in the embodiments. And the aforementioned storage medium includes: flash memory, removable hard drive, read only memory, random access memory, magnetic or optical disk, and the like.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (19)

1. An intelligent household scene arranging method, characterized by comprising the following steps:
the electronic equipment receives a first operation;
the electronic equipment responds to the first operation and requests the state information of the intelligent household equipment from the server;
the electronic equipment receives state information of the intelligent household equipment;
the electronic equipment generates a control rule according to the state information of the intelligent household equipment, and the control rule is used for controlling the intelligent household equipment;
the electronic equipment receives a second operation of starting an intelligent home scene;
and the electronic equipment responds to the second operation and controls the intelligent household equipment to enter a state corresponding to the state information according to the control rule.
2. The method according to claim 1, wherein the requesting the server for the state information of the smart home device comprises:
and requesting state information of the intelligent household equipment associated with the user account logged in by the electronic equipment.
3. The method of claim 1, wherein prior to the electronic device receiving the first operation, the method further comprises:
the electronic equipment receives a start operation;
and the electronic equipment responds to the starting operation and starts to record the intelligent household equipment operated by the user.
4. The method according to claim 3, wherein the requesting the server for the state information of the smart home device comprises:
and sending the identification of the intelligent household equipment operated by the user to a server, and requesting the state information of the intelligent household equipment corresponding to the identification.
5. The method according to claim 2 or 4, wherein the requesting the server for the state information of the smart home device further comprises:
and sending the type identifier of the intelligent household equipment to a server, and requesting the state information of the intelligent household equipment associated with the type identifier of the intelligent household equipment.
6. The method according to any one of claims 1-5, further comprising:
the electronic equipment receives a scene name;
the electronic device associates the control rule with the scene name.
7. The method of claim 6, wherein the electronic device receives a scene name, comprising:
the electronic equipment receives the scene name input by a user.
8. The method of claim 6, wherein the electronic device receives a scene name, comprising:
and the electronic equipment generates the scene name according to a preset rule.
9. The method according to any one of claims 1-8, wherein the smart home scenario corresponds to a first rule indicating that the smart home device enters a state corresponding to the state information when a first event occurs.
10. An electronic device, characterized in that the electronic device comprises: the mobile terminal comprises one or more processors, a memory, a touch screen and a communication module, wherein the communication module comprises a mobile communication module and a wireless communication module; wherein,
the memory is used for storing one or more programs;
the touch screen is used for receiving a first operation and a second operation;
the one or more processors are configured to execute the one or more programs to perform the following acts:
responding to the first operation, and indicating the communication module to request state information of the intelligent home equipment from a server, and after the electronic equipment receives the state information of the intelligent home equipment, generating a control rule according to the state information of the intelligent home equipment, wherein the control rule is used for controlling the intelligent home equipment; and responding to the second operation, and controlling the intelligent household equipment to enter a state corresponding to the state information according to the control rule.
11. The electronic device of claim 10, wherein the communication module is configured to:
and requesting state information of the intelligent household equipment associated with the user account logged in by the electronic equipment.
12. The electronic device of claim 10,
the touch screen is also used for receiving starting operation;
the processor is further used for responding to the starting operation and starting recording the intelligent household equipment operated by the user.
13. The electronic device of claim 12, wherein the communication module is configured to:
and sending the identification of the intelligent household equipment operated by the user to a server, and requesting the state information of the intelligent household equipment corresponding to the identification.
14. The electronic device of claim 11 or 13, wherein the communication module is further configured to:
and sending the type identifier of the intelligent household equipment to a server, and requesting the state information of the intelligent household equipment associated with the type identifier of the intelligent household equipment.
15. The electronic device of any of claims 10-14,
the touch screen is also used for receiving scene names;
the processor is further configured to associate the control rule with the scene name.
16. The electronic device of claim 15, wherein the touch screen is configured to receive the scene name input by a user.
17. The electronic device of claim 15, wherein the processor is configured to generate the scene name according to a preset rule.
18. The electronic device according to any one of claims 10-17, wherein the smart home scene corresponds to a first rule, and the first rule indicates that the smart home device enters the state corresponding to the state information when a first event occurs.
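The "first rule" of claim 18 is an event-triggered rule: when a given trigger event occurs, the scene's control rule is applied. A minimal sketch (function and event names are hypothetical illustrations):

```python
def make_event_rule(event_name, apply_rule):
    """Builds a handler that applies the scene's control rule when the
    named trigger event (claim 18's "first event") occurs."""
    def handler(event):
        if event == event_name:
            apply_rule()
            return True   # rule fired
        return False      # unrelated event, rule not fired
    return handler
```

For example, a "coming home" scene could bind `apply_rule` to the replay step of claim 10 and `event_name` to a door-sensor event.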
19. A smart home scene arrangement apparatus, wherein the smart home scene arrangement apparatus comprises a processing unit, a storage unit, a display unit, and a transceiver unit; the storage unit is configured to store one or more programs; the processing unit is configured to execute the one or more programs; and the one or more programs include instructions for performing the smart home scene arrangement method according to any one of claims 1-9.
CN201910160998.4A 2019-03-04 2019-03-04 Intelligent household scene arranging method and terminal Active CN111650840B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910160998.4A CN111650840B (en) 2019-03-04 2019-03-04 Intelligent household scene arranging method and terminal
CN202111385425.5A CN114217532A (en) 2019-03-04 2019-03-04 Intelligent household scene arranging method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910160998.4A CN111650840B (en) 2019-03-04 2019-03-04 Intelligent household scene arranging method and terminal

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202111385425.5A Division CN114217532A (en) 2019-03-04 2019-03-04 Intelligent household scene arranging method and terminal

Publications (2)

Publication Number Publication Date
CN111650840A true CN111650840A (en) 2020-09-11
CN111650840B CN111650840B (en) 2021-12-03

Family

ID=72350734

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111385425.5A Pending CN114217532A (en) 2019-03-04 2019-03-04 Intelligent household scene arranging method and terminal
CN201910160998.4A Active CN111650840B (en) 2019-03-04 2019-03-04 Intelligent household scene arranging method and terminal

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202111385425.5A Pending CN114217532A (en) 2019-03-04 2019-03-04 Intelligent household scene arranging method and terminal

Country Status (1)

Country Link
CN (2) CN114217532A (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115412391A (en) * 2022-11-02 2022-11-29 长沙朗源电子科技有限公司 Method and system for building intelligent scene of multiple small household appliances and storage medium
CN118276456A (en) * 2022-12-30 2024-07-02 华为技术有限公司 Smart home configuration method and electronic equipment

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103616876A (en) * 2013-12-02 2014-03-05 从兴技术有限公司 Intelligent home centralized control device and intelligent home contextual model establishing method
CN104394045A (en) * 2014-10-29 2015-03-04 小米科技有限责任公司 Scene mode recommending method and device for an intelligent device
CN104460328A (en) * 2014-10-29 2015-03-25 小米科技有限责任公司 Intelligent device control method and device based on set scenario mode
CN105487394A (en) * 2015-11-30 2016-04-13 青岛海尔智能家电科技有限公司 Intelligent household appliance control method and device and gateway
CN105652824A (en) * 2014-12-03 2016-06-08 比亚迪股份有限公司 Intelligent household control system and method
CN105744479A (en) * 2016-03-09 2016-07-06 深圳微自然创新科技有限公司 Device control method and related device based on self-adaption Geo-fencing technique
CN105897527A (en) * 2016-05-30 2016-08-24 海信集团有限公司 Method and device for setting running parameter of smart home device in smart scene
CN106292306A (en) * 2015-06-01 2017-01-04 丰唐物联技术(深圳)有限公司 Scene setting method and terminal
KR20170053238A (en) * 2015-11-06 2017-05-16 김창호 Internet (IoT) smart home / building automation system and its control method for energy saving and power saving by crowd service
CN106951758A (en) * 2017-02-28 2017-07-14 美的智慧家居科技有限公司 For the method for intelligent domestic system, device and intelligent domestic system
CN107819651A (en) * 2017-09-30 2018-03-20 深圳市艾特智能科技有限公司 Intelligent home equipment control method, device, storage medium and computer equipment
CN108153158A (en) * 2017-12-19 2018-06-12 美的集团股份有限公司 Switching method, device, storage medium and the server of household scene
CN108449241A (en) * 2018-02-09 2018-08-24 深圳绿米联创科技有限公司 Configuration method and device, the terminal of Intelligent household scene
CN108845503A (en) * 2018-08-11 2018-11-20 深圳市百创网络科技有限公司 The providing method and its system of Intelligent household scene service
CN109150672A (en) * 2017-06-13 2019-01-04 美的智慧家居科技有限公司 Configuration method, device, system and the machine readable storage medium of smart home
KR20190014969A (en) * 2017-08-04 2019-02-13 김재영 Energy saving by cloud service Smart home / building automation system

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112180753A (en) * 2020-10-16 2021-01-05 惠州莫思特智照科技有限公司 Intelligent home control method, system and server
CN112180753B (en) * 2020-10-16 2024-08-16 惠州莫思特智照科技有限公司 Smart home control method, system and server
CN112306968B (en) * 2020-11-10 2023-11-24 珠海格力电器股份有限公司 Scene establishment method and device
WO2022100181A1 (en) * 2020-11-10 2022-05-19 珠海格力电器股份有限公司 Scene establishment method and apparatus, and computer device and computer-readable storage medium
CN112306968A (en) * 2020-11-10 2021-02-02 珠海格力电器股份有限公司 Scene establishing method and device
EP4246258A4 (en) * 2020-11-10 2024-05-08 Gree Electric Appliances, Inc. of Zhuhai Scene establishment method and apparatus, and computer device and computer-readable storage medium
CN112415971A (en) * 2020-11-24 2021-02-26 武汉虹信技术服务有限责任公司 Method for controlling conference room by one key, computer equipment and readable medium
CN112488555A (en) * 2020-12-11 2021-03-12 青岛海尔科技有限公司 Intelligent scene configuration method and device, storage medium and electronic equipment
CN115202217A (en) * 2021-03-25 2022-10-18 阿里巴巴新加坡控股有限公司 Intelligent equipment control system, method, device and equipment
WO2022218138A1 (en) * 2021-04-16 2022-10-20 华为技术有限公司 Event processing method and system, and device
CN113268004A (en) * 2021-04-22 2021-08-17 深圳Tcl新技术有限公司 Scene creating method and device, computer equipment and storage medium
CN113325745A (en) * 2021-04-28 2021-08-31 青岛海尔科技有限公司 Equipment control method, terminal, storage medium and electronic device
CN113434844A (en) * 2021-06-23 2021-09-24 青岛海尔科技有限公司 Intelligent scene building method and device, storage medium and electronic equipment
CN114280953A (en) * 2021-12-29 2022-04-05 河南紫联物联网技术有限公司 Scene mode creating method and device, electronic equipment and storage medium
WO2023216892A1 (en) * 2022-05-12 2023-11-16 华为技术有限公司 Scenario setting method and electronic device
WO2023226923A1 (en) * 2022-05-25 2023-11-30 华为技术有限公司 Method for controlling plc device, and electronic device
WO2024067308A1 (en) * 2022-09-30 2024-04-04 华为技术有限公司 Smart device control method, electronic device, and system

Also Published As

Publication number Publication date
CN111650840B (en) 2021-12-03
CN114217532A (en) 2022-03-22

Similar Documents

Publication Publication Date Title
CN111650840B (en) Intelligent household scene arranging method and terminal
WO2021052263A1 (en) Voice assistant display method and device
CN109584879B (en) Voice control method and electronic equipment
CN113272745B (en) Smart home equipment sharing system and method and electronic equipment
CN111752443A (en) Method, related device and system for controlling page by display equipment
CN111316199B (en) Information processing method and electronic equipment
CN110489215A (en) The treating method and apparatus of scene is waited in a kind of application program
CN115866122A (en) Application interface interaction method, electronic device and computer-readable storage medium
CN111614524A (en) Multi-intelligent-device linkage control method, device and system
CN110633043A (en) Split screen processing method and terminal equipment
CN113805487B (en) Control instruction generation method and device, terminal equipment and readable storage medium
CN109857401B (en) Display method of electronic equipment, graphical user interface and electronic equipment
CN113452945A (en) Method and device for sharing application interface, electronic equipment and readable storage medium
CN111492678B (en) File transmission method and electronic equipment
CN114115770A (en) Display control method and related device
CN113970888A (en) Household equipment control method, terminal equipment and computer readable storage medium
WO2023071454A1 (en) Scenario synchronization method and apparatus, and electronic device and readable storage medium
CN114500732B (en) Interface display method, electronic equipment and storage medium
CN112312410B (en) Deployment method and device of wireless access point
CN112449101A (en) Shooting method and electronic equipment
CN115941836A (en) Interface display method, electronic equipment and storage medium
WO2022052713A1 (en) Interaction method and apparatus, and electronic device
CN115550702A (en) Awakening method and system
CN115883714B (en) Message reply method and related equipment
WO2024060968A1 (en) Service widget management method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant