CN115701032A - Device control method, electronic device, and storage medium - Google Patents


Info

Publication number
CN115701032A
Authority
CN
China
Prior art keywords
target
equipment
image
user
displaying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110861878.4A
Other languages
Chinese (zh)
Inventor
曹宇玮
徐文亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110861878.4A priority Critical patent/CN115701032A/en
Priority to PCT/CN2022/106218 priority patent/WO2023005706A1/en
Publication of CN115701032A publication Critical patent/CN115701032A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B17/00Monitoring; Testing
    • H04B17/30Monitoring; Testing of propagation channels
    • H04B17/309Measuring or estimating channel quality parameters
    • H04B17/318Received signal strength
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/28Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Abstract

The embodiment of the application provides a device control method, an electronic device, and a storage medium, and relates to the field of computer technologies. The method includes the following steps: capturing an image of a second device; acquiring the device type of the second device, wherein the device type of the second device is determined according to the image of the second device; matching the device type of the second device with a first device list, and determining a plurality of target devices of the same device type as the second device, wherein the first device list includes a plurality of devices to be selected, the devices to be selected include the second device, and each device to be selected has a preset device type; acquiring the signal strength between the first device and each target device, wherein the signal strength is used for determining the second device from the target devices; and displaying a control interface of the second device. The method provided by the embodiment of the application can improve the efficiency of controlling smart home appliances.

Description

Device control method, electronic device, and storage medium
Technical Field
The embodiment of the application relates to the technical field of computers, and in particular relates to an equipment control method, electronic equipment and a storage medium.
Background
With the popularization of smart homes, more and more smart home appliances (e.g., smart speakers, smart lighting, smart sockets, smart televisions, smart refrigerators, etc.) have moved into homes. However, controlling these smart home devices requires corresponding applications. Generally, a user can control a smart home device through such an application. In practice, the application needs to identify the smart home appliance to be controlled before it can control that appliance. However, current applications can only identify the type of a smart home appliance; when a user has multiple smart home appliances of the same type at home, the efficiency of controlling them is reduced, which brings a poor experience to the user.
Disclosure of Invention
The embodiment of the application provides a device control method, an electronic device, and a storage medium, and aims to provide a way to identify and control a smart home appliance based on signal strength, which can improve recognition efficiency, improve the efficiency of controlling the smart home appliance, and improve the user experience.
In a first aspect, an embodiment of the present application provides an apparatus control method, which is applied to a first apparatus, and includes:
capturing an image of a second device; the first device may be a terminal device such as a mobile phone or a tablet, and the second device may be a smart home appliance such as a smart air conditioner, a smart television, a smart refrigerator, a smart light, or a smart switch. The first device can obtain the image of the second device by shooting with its camera.
Acquiring the device type of the second device, wherein the device type of the second device is determined according to the image of the second device; the device type may be used to distinguish between the types of the second device, such as a television type, a refrigerator type, an air conditioner type, a lighting type, a switch type, and the like.
Matching the device type of the second device with a first device list, and determining a plurality of target devices of the same device type as the second device, wherein the first device list includes a plurality of devices to be selected, and each device to be selected has a preset device type; the devices to be selected may be candidate smart home appliances; these appliances may join a local area network established in the home, and the appliances connected to the local area network form the device list.
Acquiring signal strength between the first device and each target device, wherein the signal strength is used for determining a second device from the target devices; and displaying a control interface of the second device. Wherein the signal strength between the first device and each target device may be a received signal strength of bluetooth, the control interface being for controlling the second device.
In the embodiment of the application, the candidate target intelligent household appliances are obtained based on the appliance types, the intelligent household appliances to be controlled are determined in the candidate target intelligent household appliances according to the signal intensity, and the control interfaces of the intelligent household appliances to be controlled are displayed, so that the identification efficiency can be improved, and the efficiency of controlling the intelligent household appliances can be improved.
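As an illustration only (not part of the patent text), the type-matching and signal-strength selection flow described above can be sketched in Python; the device names, RSSI values, and data layout below are all assumptions for the sketch:

```python
from dataclasses import dataclass

@dataclass
class CandidateDevice:
    device_id: str
    device_type: str   # e.g. "television", "air_conditioner"
    rssi: float        # received signal strength in dBm (higher = stronger)

def select_second_device(recognized_type, first_device_list):
    """Match the type recognized from the image against the first device
    list, then pick the same-type target with the strongest signal as
    the second device (the appliance to control)."""
    targets = [d for d in first_device_list if d.device_type == recognized_type]
    if not targets:
        return None  # no device of this type is in the list
    return max(targets, key=lambda d: d.rssi)

# Hypothetical first device list (two TVs of the same type, one fridge):
candidates = [
    CandidateDevice("tv-living-room", "television", -45.0),
    CandidateDevice("tv-bedroom", "television", -70.0),
    CandidateDevice("fridge-kitchen", "refrigerator", -60.0),
]
print(select_second_device("television", candidates).device_id)  # tv-living-room
```

Because the user stands closest to the appliance being photographed, the strongest RSSI is a reasonable proxy for "the device in the image" when several same-type appliances are registered.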
In order to improve the efficiency of image recognition, in one possible implementation manner, the acquiring the device type of the second device includes:
and performing image recognition on the image of the second device, and determining the device type of the second device according to the recognition result.
In one possible implementation manner, the obtaining the device type of the second device includes:
sending the image of the second device to a third device; the image of the second device is used for the third device to determine the device type of the second device; and receiving the device type of the second device sent by the third device.
In the embodiment of the application, the image is sent to the third device (cloud server) for recognition processing, so that the calculation burden of the first device can be reduced, and the accuracy of image recognition can be improved through the powerful calculation force of the third device.
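Purely as a sketch of this offloading step (the patent does not specify a protocol), the round trip to the third device might look as follows; the endpoint path, JSON fields, and transport function are all hypothetical:

```python
import base64
import json

def recognize_on_cloud(image_bytes, post):
    """Send the captured image to the third device (cloud server) and
    return the recognized device type. `post` is an injected transport
    function so the network layer can be swapped or stubbed; the
    endpoint and payload shape are assumptions, not from the source."""
    payload = json.dumps({"image": base64.b64encode(image_bytes).decode("ascii")})
    response = post("/v1/recognize-device", payload)
    return json.loads(response)["device_type"]

# Stubbed transport standing in for the real network call:
def fake_post(path, body):
    return json.dumps({"device_type": "television"})

print(recognize_on_cloud(b"\x89PNG...", fake_post))  # television
```

Injecting the transport keeps the first device's logic testable offline while the heavy recognition work stays on the server.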
In order to facilitate the shooting operation of a user and improve the experience of the user, in one possible implementation manner, a first application is installed on a first device; before the image of the second device is shot, the method further comprises the following steps:
the first application responds to the detected first operation of the user and starts a device control shooting mode;
capturing an image of the second device includes:
the first application captures an image of the second device in response to a detected second operation of the user in the device control capture mode.
In one possible implementation manner, the displaying the control interface of the second device includes:
displaying a second device list based on the signal strength; wherein the second device list includes some or all of the plurality of target devices;
and responding to the detected operation that the user selects the second equipment in the second equipment list, and displaying a control interface of the second equipment.
In the embodiment of the application, the candidate smart home appliances are screened according to the signal strength, and the user selects the second device from the screened appliances, so that the second device is controlled; this can improve the accuracy of controlling the second device.
In one possible implementation manner, the displaying the control interface of the second device includes:
and determining the target equipment corresponding to the highest signal intensity as second equipment, and displaying a control interface of the second equipment.
In the embodiment of the application, the second equipment is automatically determined in the candidate intelligent household appliances, and the control interface of the second equipment is displayed, so that the identification efficiency can be improved, and the operation of a user is simple and convenient.
To improve the efficiency of selecting the second device from the device list, in one possible implementation manner, the displaying the second device list based on the signal strength includes:
acquiring a first target device corresponding to the highest signal strength and a second target device corresponding to the second highest signal strength;
and sequentially displaying the first target device and the second target device based on the sequence of the signal intensity from strong to weak.
In one possible implementation manner, sequentially displaying the first target device and the second target device based on the sequence from strong to weak of the signal strength includes:
and if the difference value between the signal intensity of the first target equipment and the signal intensity of the second target equipment is greater than or equal to a preset signal intensity threshold value, sequentially displaying the first target equipment and the second target equipment based on the sequence of the signal intensities from strong to weak.
In the embodiment of the application, when the signal intensity difference between the target devices is large, the candidate intelligent household appliances are determined according to the signal intensity, and the accuracy of identification of the second device can be improved.
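A minimal sketch of this threshold check, assuming a 10 dB threshold (the patent leaves the threshold value unspecified) and a simple named-tuple device record:

```python
from collections import namedtuple

Target = namedtuple("Target", ["name", "rssi"])  # rssi in dBm

SIGNAL_DIFF_THRESHOLD = 10.0  # dB; an assumed value, not given in the source

def order_by_signal(a, b, threshold=SIGNAL_DIFF_THRESHOLD):
    """Order two same-type targets strongest-first, but only when the
    signal gap is decisive; a small gap means signal strength alone
    cannot reliably tell the devices apart."""
    strongest, runner_up = sorted([a, b], key=lambda t: t.rssi, reverse=True)
    if strongest.rssi - runner_up.rssi >= threshold:
        return [strongest.name, runner_up.name]
    return None  # gap below threshold: defer to usage-habit ordering

print(order_by_signal(Target("tv-bedroom", -70.0), Target("tv-living-room", -45.0)))
# ['tv-living-room', 'tv-bedroom']
```

Returning `None` for an indecisive gap mirrors the fallback described next, where ordering switches to the user's historical usage habits.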
In one possible implementation manner, the displaying the second device list based on the signal strength includes:
acquiring a first target device corresponding to the highest signal strength and a second target device corresponding to the second highest signal strength;
and if the difference value between the signal intensity of the first target equipment and the signal intensity of the second target equipment is smaller than a preset signal intensity threshold value, displaying the first target equipment and the second target equipment according to the historical use habit of the user, wherein the historical use habit of the user is used for representing statistical information of the use of the first target equipment and the second target equipment by the user.
In the embodiment of the application, when the signal strength difference between the target devices is small, the candidate smart home appliances are determined according to the historical usage habits of the user, so that the accuracy of identifying the second device can be improved.
In one possible implementation manner, the displaying the first target device and the second target device according to the historical usage habits of the user includes:
determining the use probabilities respectively corresponding to the first target equipment and the second target equipment according to the historical use habits of the user; and sequentially displaying the first target device and the second target device based on the sequence from high to low of the use probability.
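One simple way to realize this step, sketched here under the assumption that "usage habits" are a log of past control events (the patent does not define the statistic), is to turn event counts into empirical probabilities and sort by them:

```python
from collections import Counter

def usage_probabilities(history):
    """Estimate each device's usage probability as its share of the
    user's past control events (an assumed definition of 'historical
    usage habit')."""
    counts = Counter(history)
    total = sum(counts.values())
    return {device: n / total for device, n in counts.items()}

# Hypothetical event log: the user controlled the living-room TV far more often.
history = ["tv-living-room"] * 8 + ["tv-bedroom"] * 2
probs = usage_probabilities(history)
ordered = sorted(probs, key=probs.get, reverse=True)  # high probability first
print(ordered)  # ['tv-living-room', 'tv-bedroom']
```

The first and second target devices are then displayed in this high-to-low probability order, so the appliance the user most likely wants appears at the top of the list.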
In one possible implementation manner, the first device is a mobile terminal.
In one possible implementation manner, the second device is an intelligent home appliance.
In a second aspect, an embodiment of the present application provides an apparatus control device, which is applied to a first apparatus, and includes:
the shooting module is used for shooting an image of the second equipment;
the acquisition module is used for acquiring the equipment type of the second equipment, and the equipment type of the second equipment is determined according to the image of the second equipment;
the device type matching module is used for matching the device type of the second device with a first device list and determining a plurality of target devices of the same device type as the second device, wherein the first device list comprises a plurality of devices to be selected, and each device to be selected is of a preset device type;
the display module is used for acquiring signal strength between the first equipment and each target equipment, and the signal strength is used for determining second equipment from the target equipment; and displaying a control interface of the second device.
In one possible implementation manner, the obtaining module is further configured to perform image recognition on an image of the second device, and determine a device type of the second device according to a recognition result.
In one possible implementation manner, the obtaining module is further configured to send the image of the second device to a third device, where the image of the second device is used by the third device to determine the device type of the second device; and receiving the device type of the second device sent by the third device.
In one possible implementation manner, the first device is installed with a first application, and the apparatus further includes:
the starting module is used for responding to the detected first operation of the user by the first application and starting the equipment to control the shooting mode;
the shooting module is also used for responding to the detected second operation of the user in the equipment control shooting mode by the first application and shooting to obtain an image of the second equipment.
In one possible implementation manner, the display module is further configured to display a second device list based on the signal strength; wherein the second device list includes some or all of the plurality of target devices;
and responding to the detected operation that the user selects the second equipment in the second equipment list, and displaying a control interface of the second equipment.
In one possible implementation manner, the display module is further configured to determine the target device corresponding to the highest signal strength as the second device, and display a control interface of the second device.
In one possible implementation manner, the display module is further configured to acquire a first target device corresponding to the highest signal strength and a second target device corresponding to the second highest signal strength;
and sequentially displaying the first target device and the second target device based on the sequence of the signal intensity from strong to weak.
In one possible implementation manner, the display module is further configured to sequentially display the first target device and the second target device based on a sequence from strong to weak of the signal strength if a difference between the signal strength of the first target device and the signal strength of the second target device is greater than or equal to a preset signal strength threshold.
In one possible implementation manner, the display module is further configured to obtain a first target device corresponding to the highest signal strength and a second target device corresponding to the second highest signal strength;
and if the difference value between the signal intensity of the first target equipment and the signal intensity of the second target equipment is smaller than a preset signal intensity threshold value, displaying the first target equipment and the second target equipment according to the historical use habit of the user, wherein the historical use habit of the user is used for representing statistical information of the use of the first target equipment and the second target equipment by the user.
In one possible implementation manner, the display module is further configured to determine, according to a historical usage habit of a user, usage probabilities respectively corresponding to the first target device and the second target device;
and sequentially displaying the first target device and the second target device based on the sequence from high to low of the use probability.
In one possible implementation manner, the first device is a mobile terminal.
In one possible implementation manner, the second device is an intelligent household appliance.
In a third aspect, an embodiment of the present application provides a first device, including:
a memory for storing computer program code, the computer program code including instructions that, when read from the memory by the first device, cause the first device to perform the following steps:
capturing an image of a second device;
acquiring the device type of the second device, wherein the device type of the second device is determined according to the image of the second device;
matching the device type of the second device with a first device list, and determining a plurality of target devices of the same device type as the second device, wherein the first device list comprises a plurality of devices to be selected, and each device to be selected has a preset device type;
acquiring signal strength between the first device and each target device, wherein the signal strength is used for determining a second device from the target devices;
and displaying a control interface of the second device.
In one possible implementation manner, when the instruction is executed by the first device, the step of the first device executing to acquire the device type of the second device includes:
and performing image recognition on the image of the second device, and determining the device type of the second device according to the recognition result.
In one possible implementation manner, when the instruction is executed by the first device, the step of the first device executing to acquire the device type of the second device includes:
sending the image of the second device to a third device, wherein the image of the second device is used for the third device to determine the device type of the second device;
and receiving the device type of the second device sent by the third device.
In one possible implementation manner, the first device is provided with a first application; when the instruction is executed by the first device, before the first device executes the step of shooting the image of the second device, the method further comprises the following steps:
the first application responds to the detected first operation of the user and starts a device control shooting mode;
the step of causing the first device to perform capturing an image of a second device comprises:
the first application captures an image of the second device in response to a detected second operation of the user in the device control capture mode.
In one possible implementation manner, when the instruction is executed by the first device, the step of causing the first device to execute displaying of a control interface of a second device includes:
displaying a second device list based on the signal strength; wherein the second device list includes some or all of the plurality of target devices;
and responding to the detected operation that the user selects the second equipment in the second equipment list, and displaying a control interface of the second equipment.
In one possible implementation manner, when the instruction is executed by the first device, the step of causing the first device to execute displaying of a control interface of a second device includes:
and determining the target equipment corresponding to the highest signal intensity as second equipment, and displaying a control interface of the second equipment.
In one possible implementation manner, when the instruction is executed by the first device, the step of causing the first device to display a second device list based on signal strength includes:
acquiring a first target device corresponding to the highest signal strength and a second target device corresponding to the second highest signal strength;
and sequentially displaying the first target device and the second target device based on the sequence of the signal intensity from strong to weak.
In one possible implementation manner, when the instruction is executed by the first device, the step of causing the first device to sequentially display the first target device and the second target device based on a sequence from strong to weak of the signal strength includes:
and if the difference value between the signal intensity of the first target equipment and the signal intensity of the second target equipment is greater than or equal to a preset signal intensity threshold value, sequentially displaying the first target equipment and the second target equipment based on the sequence of the signal intensities from strong to weak.
In one possible implementation manner, when the instruction is executed by the first device, the step of causing the first device to display a second device list based on signal strength includes:
acquiring a first target device corresponding to the highest signal strength and a second target device corresponding to the second highest signal strength;
and if the difference value between the signal intensity of the first target equipment and the signal intensity of the second target equipment is smaller than a preset signal intensity threshold value, displaying the first target equipment and the second target equipment according to the historical use habit of the user, wherein the historical use habit of the user is used for representing statistical information of the use of the first target equipment and the second target equipment by the user.
In one possible implementation manner, when the instruction is executed by the first device, the step of causing the first device to display the first target device and the second target device according to the historical usage habit of the user includes:
determining the use probabilities respectively corresponding to the first target equipment and the second target equipment according to the historical use habits of the user;
and sequentially displaying the first target device and the second target device based on the sequence from high to low of the use probability.
In one possible implementation manner, the first device is a mobile terminal.
In one possible implementation manner, the second device is an intelligent home appliance.
In a fourth aspect, an embodiment of the present application provides an apparatus control system, including the first apparatus and the third apparatus described in the third aspect, wherein,
the third device is used for receiving the image of the second device sent by the first device, performing image recognition on the image, and sending a recognition result to the first device, wherein the recognition result is the device type of the second device.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium, in which a computer program is stored, which, when run on a computer, causes the computer to perform the method according to the first aspect.
In a sixth aspect, the present application provides a computer program, which is used to execute the method of the first aspect when the computer program is executed by a computer.
In a possible design, the program in the sixth aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a software architecture of an electronic device according to an embodiment of the present application;
fig. 3 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of an embodiment of a device control method provided in the present application;
fig. 5a and 5b are schematic diagrams illustrating a device controlling shooting mode setting provided by an embodiment of the present application;
fig. 6a to fig. 6c are schematic diagrams of control interface displays provided by embodiments of the present application;
fig. 7 is a schematic structural diagram of an embodiment of a device control apparatus provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association between associated objects, and means that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, the meaning of "a plurality" is two or more unless otherwise specified.
With the popularization of smart homes, more and more smart home appliances (e.g., smart audio, smart bulbs, smart sockets, smart televisions, smart refrigerators, etc.) have moved into homes. However, the control of these smart home devices requires corresponding applications. Generally, a user can control the smart home device through the related applications. In practical application, the related applications need to identify the intelligent household appliance to be controlled before controlling the intelligent household appliance, so that the identified intelligent household appliance can be controlled. However, the current related applications can only identify the type of the intelligent household electrical appliance, and when a user has multiple intelligent household electrical appliances of the same type in a home, the control efficiency of the intelligent household electrical appliance is reduced, and poor experience is brought to the user.
In order to solve the above problem, an embodiment of the present application proposes a device control method, which is applied to a first device 10. The first device 10 may be a mobile terminal having a camera and a display screen. The first device 10 may also be referred to as a terminal device, user equipment (UE), access terminal, subscriber unit, subscriber station, mobile station, remote terminal, mobile device, user terminal, wireless communication device, or user agent. The first device 10 may be a cellular telephone, a cordless telephone, a personal digital assistant (PDA) device, a handheld device with wireless communication capability, a computing device or other processing device connected to a wireless modem, a computer, a laptop, a handheld communication device, a handheld computing device, a satellite radio, customer premises equipment (CPE), and/or another device used for communication over a wireless system, including next-generation communication systems, e.g., a mobile terminal in a 5G network or a mobile terminal in a future evolved public land mobile network (PLMN), etc. The embodiment of the present application does not specifically limit the form of the first device 10.
Fig. 1 is an application scenario of the device control method, and as shown in fig. 1, the application scenario includes a first device 10, a plurality of second devices 20, and a third device 30. The second device 20 may be an intelligent household appliance, such as an intelligent sound box, an intelligent lighting, an intelligent socket, an intelligent television, an intelligent refrigerator, and the like. The embodiment of the present application does not specifically limit the form of the second apparatus 20. The third device 30 may be a cloud device (e.g., a cloud server).
The first device 10 and the second device 20 may be wirelessly connected, where the wireless connection may be WIFI, Bluetooth, or another wireless communication method; the embodiment of the present application does not specifically limit the wireless connection mode between the first device 10 and the second device 20. The first device 10 and the third device 30 may be connected through WIFI; for example, the first device 10 may connect to a WIFI router and establish a connection with the third device 30 through the router. Optionally, the first device 10 may also communicate with a base station through a mobile communication network (e.g., 4G, 5G, etc.) and establish a connection with the third device 30 through the base station; the embodiment of the present application likewise does not specifically limit the wireless connection mode between the first device 10 and the third device 30.
Fig. 2 is a software architecture diagram of the first device 10. As shown in fig. 2, the first device 10 includes a hardware layer 11, a driver layer 12, a hardware abstraction layer 13, and an application layer 14, wherein:
The hardware layer 11 may be used to drive a device (e.g., a camera) in the first device 10 to take an image. Illustratively, when the first device 10 is in the smart shooting mode, the hardware layer 11 may drive the camera to shoot an image.
The driver layer 12 may be used to acquire the image captured by the hardware layer 11 and upload the image to the hardware abstraction layer 13.
The hardware abstraction layer 13 may be used to receive the image uploaded by the driver layer 12 and may upload the image to the application layer 14.
The application layer 14 may be configured to recognize the image uploaded by the hardware abstraction layer 13 and to display a control interface of the second device 20 based on the recognition result. Alternatively, the application layer 14 may send the image uploaded by the hardware abstraction layer 13 to the third device 30; the third device 30 recognizes the image and sends the recognition result to the first device 10. When the first device 10 receives the recognition result sent by the third device 30, it may display the control interface of the second device 20 based on that result, thereby implementing control over the second device 20.
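As an illustration only, the application layer's two recognition paths described above (recognize locally, or fall back to the third device 30 in the cloud) can be sketched as follows. All function and field names here are hypothetical and are not taken from the specification.

```python
# Hypothetical sketch of the application-layer flow: try local recognition of a
# captured image, fall back to the cloud (third device 30), then decide whether
# a control interface for the second device 20 can be displayed.

def recognize_locally(image):
    # Placeholder local recognizer: returns a device label or None on failure.
    return image.get("label")

def recognize_in_cloud(image):
    # Placeholder for sending the image to the third device 30 and awaiting a result.
    return image.get("cloud_label")

def handle_image(image):
    result = recognize_locally(image)
    if result is None:
        result = recognize_in_cloud(image)
    if result is not None:
        return f"control interface: {result}"
    return "no device recognized"
```

The fallback order (local first, then cloud) is one possible policy; the specification allows either path.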
An exemplary electronic device provided in the following embodiments of the present application is first described below with reference to fig. 3. Fig. 3 shows a schematic structural diagram of an electronic device 100, which electronic device 100 may be the first device 10 described above.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, combine certain components, split certain components, or use a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal, so as to control instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of receiving a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit the audio signal to the wireless communication module 160 through the PCM interface, so as to implement the function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus, and it converts data to be transmitted between serial and parallel forms. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example, the processor 110 communicates with a Bluetooth module in the wireless communication module 160 through a UART interface to implement a Bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through a Bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It can also be used to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code Division Multiple Access (CDMA), wideband Code Division Multiple Access (WCDMA), time division code division multiple access (time-division multiple access, TD-SCDMA), long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened and light is transmitted to the camera's photosensitive element through the lens; the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye. The ISP can also perform algorithmic optimization on the noise, brightness, and skin color of the image, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
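The DSP step above converts the digital image signal between formats such as YUV and RGB. As a generic illustration of one such conversion (not the device's actual implementation), the following is a minimal sketch of the standard BT.601 full-range YUV-to-RGB formula for a single pixel:

```python
# BT.601 full-range YUV -> RGB for one pixel. Inputs are 0-255 with the chroma
# components (u, v) centered at 128; output channels are clamped to 0-255.

def yuv_to_rgb(y, u, v):
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)
```

A neutral gray pixel (y=128, u=128, v=128) maps to (128, 128, 128), since both chroma offsets vanish.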
The digital signal processor is used to process digital signals; it can process digital image signals as well as other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency-point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor, which processes input information quickly by referring to a biological neural network structure, for example, by referring to a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the electronic apparatus 100 receives a call or voice information, it is possible to receive voice by placing the receiver 170B close to the human ear.
The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak with the mouth close to the microphone 170C to input a sound signal into it. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and perform directional recording.
The earphone interface 170D is used to connect a wired earphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation via the pressure sensor 180A, and may also calculate the touch position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
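The pressure-threshold dispatch described above can be sketched as follows. The threshold value and the instruction names are illustrative assumptions, not values from the specification.

```python
# Sketch of mapping touch intensity at the same position to different
# instructions: a light press views a short message, a firm press creates one.

FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized pressure value

def dispatch_touch(pressure, target="sms_icon"):
    if target != "sms_icon":
        return "ignore"
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"  # intensity below the first pressure threshold
    return "new_sms"       # intensity greater than or equal to the threshold
```

Note that, as in the text, the boundary case (intensity exactly equal to the threshold) triggers the "new message" instruction.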
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
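The anti-shake compensation described above (computing a lens compensation distance from the detected shake angle) can be illustrated with a simple pinhole-lens approximation, where the image shift on the sensor is roughly f·tan(θ). This lens model and the parameter values are assumptions for illustration; the actual compensation algorithm is not given in the specification.

```python
import math

# Estimate the image displacement (in mm) that the lens module must counteract
# for a given shake angle, under a simple f*tan(theta) lens model.

def compensation_distance_mm(shake_angle_deg, focal_length_mm):
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))
```

For a typical phone focal length of about 4 mm, a 1-degree shake corresponds to roughly 0.07 mm of displacement to be compensated by the reverse lens movement.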
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flipping open may then be set according to the detected open or closed state of the holster or flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of the electronic device, and is applied in landscape/portrait switching, pedometers, and the like.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
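The reflected-light decision above is a simple threshold test; combined with the in-call condition, it yields the screen-off behavior. The threshold value and function names below are hypothetical, chosen only to illustrate the logic.

```python
# Proximity decision: sufficient reflected infrared light means an object is
# near. During a call, a near object (the user's ear) turns the screen off.

REFLECTED_LIGHT_THRESHOLD = 100  # illustrative sensor units

def object_nearby(reflected_light):
    return reflected_light >= REFLECTED_LIGHT_THRESHOLD

def should_turn_off_screen(reflected_light, in_call):
    return in_call and object_nearby(reflected_light)
```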
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect fingerprints. The electronic device 100 may use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint-based photographing, fingerprint-based call answering, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown due to low temperature. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
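The tiered temperature policy above can be sketched as a small dispatch function. The three threshold values are hypothetical; the specification names the thresholds but gives no numbers.

```python
# Tiered thermal policy: throttle when hot; heat the battery when cold; also
# boost the battery output voltage when critically cold. Thresholds are
# illustrative placeholders.

HIGH_TEMP_C = 45      # above this: reduce nearby processor performance
LOW_TEMP_C = 0        # below this: heat the battery 142
CRITICAL_LOW_C = -10  # below this: also boost the battery output voltage

def thermal_actions(temp_c):
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("reduce_processor_performance")
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")
    if temp_c < CRITICAL_LOW_C:
        actions.append("boost_battery_output_voltage")
    return actions
```

In the normal operating range no action is taken; the two cold tiers stack, matching the reading that voltage boosting supplements, rather than replaces, battery heating.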
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting thereon or nearby. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a bone mass vibrated by the human voice. The bone conduction sensor 180M may also contact the human pulse to receive a blood-pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vocal-part bone mass acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can parse heart rate information based on the blood-pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive a key input and generate a key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, and the like) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to and detached from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 uses an eSIM, i.e., an embedded SIM card. The eSIM card may be embedded in the electronic device 100 and cannot be separated from it.
Fig. 4 is a schematic flowchart of an embodiment of an intelligent device control method provided in the embodiment of the present application, including:
In step 401, in response to a detected first operation by the user, the first device 10 starts a device control shooting mode.
Specifically, the user may operate on the display screen 194 of the first device 10 to turn on the device control shooting mode of the first device 10. The device control shooting mode may also be referred to as a smart shooting mode. Illustratively, the first device 10 may be pre-installed with a first application. The first application may be a system application or a third-party application; the embodiment of the present application does not specifically limit the type of the first application. The first application may be used to provide a smart shooting service and, in a specific implementation, may be referred to as a smart remote control application. The user may click the icon of the first application on the desktop of the first device 10 to start the first application and enter a device control shooting mode setting page. It is to be understood that the above example merely illustrates opening the first application by clicking and does not constitute a limitation to the embodiments of the present application; in some embodiments, the first application may also be opened in other manners, for example, by double-clicking, sliding, and the like.
Next, the user can turn on the device control shooting mode on the device control shooting mode setting page. For example, the user may activate a smart remote control switch to turn on the device control shooting mode.
The turning on of the device control shooting mode will now be described with reference to fig. 5a and 5b. When the user opens the first application and enters the device control shooting mode setting page, the page 500 shown in fig. 5a may be displayed, where the page 500 is the device control shooting mode setting page. As shown in fig. 5a, the page 500 includes a smart remote control switch 501, which is in an off state at this time, that is, the device control shooting mode is not yet turned on. The user can then click the smart remote control switch 501 on the page 500, whereupon the page 510 shown in fig. 5b is displayed. As shown in fig. 5b, the smart remote control switch 501 is now in an on state, that is, the first device 10 has entered the device control shooting mode.
In step 402, in response to detecting a second operation of the user in the device control shooting mode, the first device 10 acquires an image of the second device 20.
Specifically, the user may return to the desktop of the first device 10 and aim the camera 193 of the first device 10 at an object to be photographed (e.g., the second device 20) to acquire an image of the second device 20. In a specific implementation, since the first device 10 has already turned on the device control shooting mode in step 401, the user does not need to manually open the camera application; when the user holds the first device 10 upright and points it at the second device 20, the first device 10 may automatically turn on the camera and capture an image of the second device 20, which improves the convenience of image capture for the user and thus the user experience.
In step 403, the first device 10 obtains the recognition result of the image of the second device 20.
Specifically, when the first device 10 acquires an image of the second device 20, the recognition result may be obtained based on that image. One manner of obtaining the recognition result is for the first device 10 to recognize the image of the second device 20 locally; for example, an image recognition algorithm in the first device 10 may be used to recognize the image of the second device 20 to obtain the recognition result.
Alternatively, the first device 10 may upload the image of the second device 20 to the third device 30, and the stronger computing power of the third device 30 may be used to recognize the image and obtain the recognition result; the third device 30 may then return the recognition result to the first device 10, which reduces the computing load of the first device 10.
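The local-or-cloud recognition choice described above can be sketched as follows. This is a minimal illustration, not the embodiment's actual implementation; the function and class names are hypothetical, and the recognition backends are represented by stand-in objects.

```python
def classify_device(image_bytes, local_model=None, cloud=None):
    """Return a device-type string (e.g. "television") for a captured image.

    Prefers on-device recognition; falls back to a remote service
    (playing the role of the third device 30) when no local model exists.
    """
    if local_model is not None:
        return local_model.predict(image_bytes)   # local image recognition
    if cloud is not None:
        return cloud.recognize(image_bytes)       # offload to the third device
    raise RuntimeError("no recognition backend available")
```

A caller with only a weak device would pass `cloud=...`, trading upload latency for reduced local compute, exactly the trade-off the paragraph describes.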
It is understood that the above-mentioned recognition result may be the device type of a smart home device; the device type may be, for example, a television type, a lighting type, an air conditioner type, or the like. Taking the television type as an example, a user may have multiple televisions at home, e.g., one in the living room, one in the master bedroom, and one in the second bedroom. Even if the device type of the second device 20 is recognized as television from the image, it is still not known which television it is; therefore, it is necessary to further determine which television the second device 20 is so that the second device 20 can be controlled.
In step 404, the first device 10 obtains a preset device list.
Specifically, the preset device list may include a plurality of to-be-selected smart home devices. It can be understood that the smart home devices can be interconnected through WIFI, and each to-be-selected smart home device has a preset device type. For example, the smart home devices may build a local area network in advance. The topology of the local area network may be one of, or a combination of, bus, ring, star, and tree; the embodiment of the present application does not specifically limit the type of the local area network.
Next, the first device 10 may acquire the preset device list in either of the following two manners. First, the first device 10 may connect to the local area network through WIFI and obtain the preset device list through the local area network; that is, the first device 10 may query the to-be-selected smart home devices that have accessed the local area network. Second, after the smart home devices form the local area network, the preset device list may be sent to the third device 30. The first device 10 may send a device list request to the third device 30 to request the device list of the to-be-selected smart home devices. When the third device 30 receives the device list request from the first device 10, it may send the device list of the to-be-selected smart home devices to the first device 10.
It can be understood that, after any to-be-selected smart home device exits the local area network (for example, when it goes offline), it is no longer present in the preset device list. When any smart home device accesses the local area network, it becomes a to-be-selected smart home device and appears in the preset device list. Whenever a to-be-selected smart home device exits the local area network or a smart home device joins it, the preset device list may be synchronized with the third device 30.
In step 405, the first device 10 determines the number of target devices in the preset device list that have the same device type as the device type in the identification result.
Specifically, the first device 10 may match the recognized device type with the device type of each to-be-selected smart home device in the preset device list to obtain the number of target devices in the local area network that are of the same device type as the second device 20. Illustratively, assume that a user has multiple televisions at home, e.g., one in the living room, one in the master bedroom, and one in the second bedroom. If the device type of the second device 20 is recognized as television, the number of target devices (televisions) is determined to be 3 (one each in the living room, master bedroom, and second bedroom).
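The matching step in step 405 amounts to filtering the preset device list by the recognized type. A minimal sketch follows; the dict schema (`name`/`type` keys) is an assumption made for illustration, not part of the embodiment.

```python
def match_target_devices(recognized_type, preset_device_list):
    """Return the to-be-selected devices whose preset type equals the
    type recognized from the image of the second device."""
    return [d for d in preset_device_list if d["type"] == recognized_type]

# The three-television example from the paragraph above:
devices = [
    {"name": "living-room TV", "type": "television"},
    {"name": "master-bedroom TV", "type": "television"},
    {"name": "second-bedroom TV", "type": "television"},
    {"name": "hallway light", "type": "lighting"},
]
targets = match_target_devices("television", devices)
# len(targets) == 3, matching the count determined in step 405
```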
In step 406, the first device 10 displays a control interface of the second device 20.
Specifically, after the first device 10 determines the number of target devices, the second device 20 may be determined among the target devices, and a control interface of the second device 20 may be displayed. In a specific implementation, the control interface may be implemented as a control card. Each control card may correspond to one smart home device, that is, each control card may be used to control one smart home device; the control card may include a function menu and control components for controlling the corresponding smart home device. The manner of displaying the control interface may include, but is not limited to, the following two manners.
Manner 1: manual display
Specifically, the first device 10 may first determine the priorities of the plurality of target devices and display a device list of the target devices on the desktop of the first device 10 in descending order of priority; the device list may include the plurality of target devices. After the user selects one of the devices in the device list, the interface of the first device 10 may display the corresponding control interface for controlling the second device 20.
In a specific implementation, when the first device 10 obtains the number of target devices, the number may be compared with a preset first threshold, so that it may be determined whether to directly display a device list of the target devices on the interface of the first device 10.
For example, the preset first threshold may be set to 2. When the number of target devices is less than 2, that is, when the first device 10 matches only one target device, the first device 10 may directly display the control interface on its interface.
When the number of target devices is greater than or equal to 2, that is, when the first device 10 matches a plurality of target devices, the first device 10 may further determine the priorities of the plurality of target devices and display the device list in order of priority. The priority may be used to determine the display order of the target devices; for example, a target device with a high priority may be displayed first in the device list. In a specific implementation, the priority of a target device may be determined using the Received Signal Strength Indicator (RSSI), which represents the received Bluetooth signal strength between the first device 10 and the target device.
The first device 10 may obtain the RSSI of each target device and compare the RSSIs of all target devices. If the RSSI difference between the target devices is significant, the priority may be determined according to the RSSI value of each target device, and the devices may be sorted by priority. The RSSI difference may be evaluated as follows: obtain the highest RSSI and the second-highest RSSI and calculate the difference between them. If the difference is greater than or equal to a preset second threshold, the RSSI difference between the target devices is considered significant; if the difference is smaller than the preset second threshold, it is considered not significant.
Then, if the RSSI difference between the target devices is significant, the priority of each target device may be determined according to its RSSI. For example, the target device corresponding to the highest RSSI (hereinafter "the first target device") may have the highest priority, the target device corresponding to the second-highest RSSI (hereinafter "the second target device") may have the second-highest priority, and so on. A device list including the first target device and the second target device may then be displayed on the interface of the first device 10 in order of priority. It is understood that the above example only shows a scenario with a first target device and a second target device; the display is not limited to those two devices. In addition, the device list of the target devices may be displayed as a pop-up window; the display manner of the device list is not particularly limited in this embodiment of the application.
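The RSSI comparison described above can be sketched roughly as follows. The threshold value and the dict schema are assumptions for illustration only; the embodiment leaves the "preset second threshold" unspecified.

```python
RSSI_GAP_THRESHOLD = 10  # the "preset second threshold", in dB (assumed value)

def order_by_rssi(targets):
    """Sort target devices by RSSI, strongest first, when the gap between
    the highest and second-highest RSSI is significant.

    Returns the ordered list, or None when the gap is below the threshold,
    signalling that priority should instead come from usage habits.
    """
    ranked = sorted(targets, key=lambda d: d["rssi"], reverse=True)
    if len(ranked) < 2:
        return ranked                       # a single match needs no ordering
    if ranked[0]["rssi"] - ranked[1]["rssi"] >= RSSI_GAP_THRESHOLD:
        return ranked                       # significant gap: RSSI decides
    return None                             # gap not significant
```

Returning `None` here mirrors the embodiment's fallback: when RSSI alone cannot discriminate, the first device switches to the history-based ordering described in the following paragraphs.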
Referring now to fig. 6a to 6c, two target devices (e.g., a first target device and a second target device) are taken as an example. Fig. 6a shows the interface before the first device 10 displays the device list of the target devices. After comparing the RSSIs, the first device 10 determines that the RSSI value of the first target device is greater than that of the second target device, and that the RSSI difference between them is greater than the preset second threshold. That is, the RSSI difference between the first target device and the second target device is significant, and the first target device has a higher priority than the second target device. The first device 10 may then display the interface 600 shown in fig. 6b. As shown in fig. 6b, the interface 600 may include a device list 601 containing a first target device 6011 and a second target device 6012. Since the first target device 6011 has a higher priority than the second target device 6012, its position in the device list 601 may be to the left of the second target device 6012, following the common left-to-right, top-to-bottom reading habit. Alternatively, the first target device 6011 may be placed above the second target device 6012, following the common top-to-bottom reading habit. It is to be understood that expressing the priority of a target device through its display position in the device list is only an exemplary illustration and does not constitute a limitation to the embodiments of the present application; in some embodiments, the priority of a target device may also be expressed through other display manners.
Next, the user may operate on the first target device 6011 or the second target device 6012 in the device list 601, for example, by clicking or sliding, so that the control interface corresponding to the selected target device is displayed. Taking the case where the user clicks the first target device 6011, the interface 610 shown in fig. 6c may be displayed. As shown in fig. 6c, the interface 610 may include a control interface 611 corresponding to the first target device 6011, where the control interface 611 may be used to control the first target device 6011 (i.e., the second device 20).
If the RSSI difference between the target devices (e.g., the first target device 6011 and the second target device 6012) is not significant, the priority of the target devices may be determined according to the user's historical usage habits, which represent statistical information about the user's use of the target devices (e.g., the first target device 6011 and the second target device 6012). For example, the statistical information may record which smart home device the user uses at which time, and may be obtained by aggregating user data over a period (e.g., a month, half a year, a year, etc.). In a specific implementation, a user prediction model may be obtained by pre-training; the user prediction model may be used to predict the probability that the user will use a target device at a particular time. For example, the current time may be input to the user prediction model to obtain the probability that the user uses each smart home device in the home at the current time; the first device 10 may then filter out, from the usage probabilities of all smart home devices, those of the target devices corresponding to the currently recognized device type.
Alternatively, the current time and the device type may be input to the user prediction model, so that the probability that the user uses the target device corresponding to the device type at the current time may be obtained.
It is to be understood that determining the usage probability of a target device according to time alone is only an exemplary illustration and does not constitute a limitation to the embodiments of the present application. In some embodiments, an identifier of the current user may also be input to the user prediction model to jointly decide the probability that the current user uses a target device. Different users may use the same target device at the same time with different probabilities: illustratively, the elderly at home are more likely than children to turn on the radio in the morning, and the middle-aged and young at home are more likely than the elderly to turn on the television at night. Therefore, combining the user identifier with the time in the decision can improve the prediction accuracy.
Then, the priority of each target device can be determined according to its usage probability. Illustratively, the target device with the highest usage probability has the highest priority, the target device with the second-highest usage probability has the second-highest priority, and so on.
Alternatively, since the user prediction model is usually a neural network model trained with a neural network, running inference with it consumes considerable computing power on the first device 10, burdens the system, and introduces latency. Therefore, the first device 10 may instead preset a mapping relationship between time and/or user identifier and target device, so that the target device the user is likely to use at the current time can be determined from the time and/or the user identifier. For example, the elderly at home typically turn on the radio at 7 am, and the middle-aged or young at home typically turn on the television at 8 pm. Through such a simple mapping, the target device corresponding to the current time and/or user can be output, thereby reducing latency and system overhead.
It is to be understood that the mapping relationship may be one-to-one; for example, a time and a user identifier may map to a single target device. Alternatively, the mapping relationship may be one-to-many; for example, a time and a user identifier may map to a plurality of target devices, each with a corresponding usage probability. Illustratively, when the middle-aged and young at home turn on the television at 8 pm, there may be an 80% probability of turning on the living-room television and a 20% probability of turning on the bedroom television. The priority of the target devices may then be determined according to their usage probabilities; for example, a target device with a high usage probability has a high priority.
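The one-to-many mapping with usage probabilities can be sketched as a simple lookup table. The table contents, key format, and function name are all hypothetical, chosen only to mirror the 80%/20% television example and the 7 am radio example above.

```python
# Hypothetical mapping table: (hour, user group) -> [(device, usage probability)].
USAGE_MAP = {
    (20, "young-adult"): [("living-room TV", 0.8), ("bedroom TV", 0.2)],
    (7, "elderly"): [("radio", 1.0)],
}

def rank_by_habit(hour, user_id):
    """Return candidate target devices ordered by usage probability,
    highest first; an unknown (hour, user) pair yields an empty list."""
    candidates = USAGE_MAP.get((hour, user_id), [])
    ranked = sorted(candidates, key=lambda t: t[1], reverse=True)
    return [name for name, _prob in ranked]
```

Compared with running a neural-network prediction model, this table lookup is constant-time and essentially free, which is the latency/overhead trade-off the preceding paragraph motivates.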
After the priorities of the target devices are determined, the device list of the target devices may be displayed on the interface of the first device 10 in order of priority, and after the user selects a target device in the device list, the control interface corresponding to the second device 20 may be displayed on the interface of the first device 10. For the specific process of displaying the device list of the target devices and the control interface of the second device 20 on the interface of the first device 10, reference may be made to fig. 6a to 6c, and details are not described here again.
Manner 2: automatic display
Specifically, the first device 10 may first determine the priorities of the plurality of target devices and display, on its interface, the control interface of the highest-priority target device; that control interface may be regarded as the control interface of the second device 20 and used to control the second device 20.
In a specific implementation, when the first device 10 obtains the number of target devices, the number may be compared with a preset first threshold, so that it may be determined whether to directly display the control interface of the second device 20 on the interface of the first device 10.
For example, the preset first threshold may be set to 2. When the number of target devices is less than 2, that is, when the first device 10 matches only one target device, the first device 10 may directly display the control interface of that target device on its interface, where the control interface may be used to control the second device 20.
When the number of target devices is greater than or equal to 2, that is, when the first device 10 matches a plurality of target devices, the first device 10 may further determine the priorities of the plurality of target devices and display the control interface of the highest-priority target device on its interface, thereby facilitating user operation and improving the control efficiency of the smart home device. For the display effect of the control interface, refer to fig. 6c; details are not described here again. In a specific implementation, the priority of a target device may be determined by RSSI or according to the user's historical usage habits; for the specific processes, refer to the manual-display manner described above, and details are not described here again.
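The automatic-display decision, combining the "preset first threshold" with the priority ordering, can be sketched as follows. The function name and the `rank` callable are assumptions; `rank` stands for whichever priority scheme (RSSI-based or habit-based) the first device applies.

```python
FIRST_THRESHOLD = 2  # the "preset first threshold" from the embodiment

def choose_control_interface(targets, rank):
    """Pick the device whose control card is shown automatically.

    targets: matched target devices (non-empty).
    rank: callable ordering targets by priority, highest first.
    """
    if len(targets) < FIRST_THRESHOLD:
        return targets[0]        # exactly one match: show its card directly
    return rank(targets)[0]      # several matches: show highest priority
```

Example usage with a simple RSSI-based ranking:

```python
by_rssi = lambda ts: sorted(ts, key=lambda d: d["rssi"], reverse=True)
choose_control_interface(
    [{"name": "a", "rssi": -60}, {"name": "b", "rssi": -40}], by_rssi
)
# selects device "b", the stronger signal
```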
It is understood that in the above embodiments, steps 401 to 406 are optional steps; this application provides only one possible embodiment, and more or fewer steps than steps 401 to 406 may be included, which is not limited in this application.
Fig. 7 is a schematic structural diagram of an embodiment of the device control apparatus of the present application. As shown in fig. 7, the device control apparatus 70, applied to the first device, may include: a shooting module 71, an acquisition module 72, a matching module 73, and a display module 74; wherein:
a shooting module 71, configured to capture an image of the second device;
an obtaining module 72, configured to obtain a device type of the second device, where the device type of the second device is determined according to an image of the second device;
a matching module 73, configured to match a device type of a second device with a first device list, and determine multiple target devices of the same device type as the second device, where the first device list includes multiple devices to be selected, and each device to be selected has a preset device type;
a display module 74, configured to obtain a signal strength between the first device and each target device, where the signal strength is used to determine a second device from the target devices; and displaying a control interface of the second device.
In one possible implementation manner, the obtaining module 72 is further configured to perform image recognition on an image of the second device, and determine the device type of the second device according to a recognition result.
In one possible implementation manner, the obtaining module 72 is further configured to send the image of the second device to the third device, where the image of the second device is used by the third device to determine the device type of the second device; and receiving the device type of the second device sent by the third device.
In one possible implementation manner, the first device is installed with a first application, and the device control apparatus 70 further includes:
an opening module 75, configured to, in response to a detected first operation of the user, cause the first application to start the device control shooting mode;
the above-mentioned shooting module 71 is further configured to cause the first application to capture an image of the second device in response to a detected second operation of the user in the device control shooting mode.
In one possible implementation manner, the display module 74 is further configured to display a second device list based on the signal strength; wherein the second device list includes some or all of the plurality of target devices;
and responding to the detected operation that the user selects the second equipment in the second equipment list, and displaying a control interface of the second equipment.
In one possible implementation manner, the display module 74 is further configured to determine the target device corresponding to the highest signal strength as the second device, and display a control interface of the second device.
In one possible implementation manner, the display module 74 is further configured to obtain a first target device corresponding to the highest signal strength and a second target device corresponding to the second highest signal strength;
and sequentially displaying the first target device and the second target device based on the sequence of the signal intensity from strong to weak.
In one possible implementation manner, the display module 74 is further configured to sequentially display the first target device and the second target device based on a sequence from strong to weak of the signal strength if a difference between the signal strength of the first target device and the signal strength of the second target device is greater than or equal to a preset signal strength threshold.
In one possible implementation manner, the display module 74 is further configured to obtain a first target device corresponding to the highest signal strength and a second target device corresponding to the second highest signal strength;
and if the difference value between the signal intensity of the first target equipment and the signal intensity of the second target equipment is smaller than a preset signal intensity threshold value, displaying the first target equipment and the second target equipment according to the historical use habits of the user, wherein the historical use habits of the user are used for representing statistical information of the user for using the first target equipment and the second target equipment.
In one possible implementation manner, the display module 74 is further configured to determine, according to the historical usage habits of the user, usage probabilities respectively corresponding to the first target device and the second target device;
and sequentially displaying the first target device and the second target device based on the sequence from high to low of the use probability.
In one possible implementation manner, the first device is a mobile terminal.
In one possible implementation manner, the second device is an intelligent household appliance.
The device control apparatus 70 provided in the embodiment shown in fig. 7 may be used to implement the technical solutions of the method embodiments shown in fig. 1 to fig. 6 of the present application; for the implementation principles and technical effects, reference may further be made to the related descriptions in the method embodiments.
It should be understood that the division of the modules of the device control apparatus 70 shown in fig. 7 is merely a logical division; in an actual implementation, the modules may be wholly or partially integrated into one physical entity or physically separated. These modules may all be implemented as software invoked by a processing element, or all implemented as hardware, or some implemented as software invoked by a processing element and others as hardware. For example, the shooting module may be a separately established processing element, or may be integrated into a chip of the electronic device; the other modules are implemented similarly. In addition, all or part of the modules may be integrated together or implemented independently. In implementation, each step of the above method, or each of the above modules, may be implemented by an integrated logic circuit in hardware within a processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more Application Specific Integrated Circuits (ASICs), one or more digital signal processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). For another example, these modules may be integrated together and implemented in the form of a System-On-a-Chip (SOC).
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
It is to be understood that the electronic devices and the like described above include hardware structures and/or software modules for performing the respective functions in order to realize the functions described above. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed in hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
In the embodiment of the present application, the electronic device and the like may be divided into functional modules according to the method example, for example, each functional module may be divided according to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and another division manner may be available in actual implementation.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and brevity of description, the foregoing division of functional modules is merely an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. For the specific working processes of the system, apparatus, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not repeated here.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a flash memory, a removable hard drive, a read-only memory, a random access memory, a magnetic disk, an optical disc, and the like.
The above description is only an embodiment of the present application, but the protection scope of the present application is not limited thereto; any change or substitution within the technical scope of the present disclosure shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A device control method applied to a first device, the method comprising:
capturing an image of a second device;
acquiring the device type of the second device, wherein the device type of the second device is determined according to the image of the second device;
matching the device type of the second device with a first device list, and determining a plurality of target devices of the same device type as the second device, wherein the first device list comprises a plurality of devices to be selected, the plurality of devices to be selected comprise the second device, and each device to be selected has a preset device type;
acquiring a signal strength between the first device and each target device, wherein the signal strength is used for determining the second device from among the target devices;
and displaying a control interface of the second device.
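Outside the claim language, the flow of claim 1 (with the claim 6 variant of the final step) can be sketched in Python. Every name here (`Candidate`, `resolve_second_device`, the RSSI values) is hypothetical and only illustrates the claimed steps under stated assumptions, not the patentee's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    device_type: str      # preset device type of each device to be selected
    signal_strength: int  # assumed to be an RSSI in dBm, measured from the first device

def resolve_second_device(image_type: str, device_list: list) -> Candidate:
    """Claim 1 in miniature: match the device type recognized from the photo
    against the first device list to get the same-type target devices, then
    use signal strength to determine the second device (claim 6 variant:
    the strongest signal wins)."""
    targets = [d for d in device_list if d.device_type == image_type]
    if not targets:
        raise LookupError(f"no candidate of type {image_type!r}")
    return max(targets, key=lambda d: d.signal_strength)

# Usage: two speakers are registered; the nearer one (stronger signal)
# is resolved as the photographed device.
home = [
    Candidate("living-room speaker", "speaker", -40),
    Candidate("bedroom speaker", "speaker", -70),
    Candidate("hallway lamp", "lamp", -35),
]
print(resolve_second_device("speaker", home).name)  # living-room speaker
```

The type filter runs before any signal comparison, mirroring the claim order: image recognition narrows the candidate set, and signal strength only disambiguates among same-type targets.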
2. The method of claim 1, wherein the acquiring the device type of the second device comprises:
and performing image recognition on the image of the second device and determining the device type of the second device according to the recognition result.
3. The method of claim 1, wherein the acquiring the device type of the second device comprises:
sending the image of the second device to a third device, wherein the image of the second device is used for the third device to determine the device type of the second device;
and receiving the device type of the second device sent by the third device.
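Claims 3 and 14 delegate recognition to a third device. A minimal sketch of that round trip, with a stub callable standing in for the network exchange (all names are hypothetical):

```python
def remote_classify(image_bytes: bytes, send_to_third_device) -> str:
    """Claim 3 round trip: the first device sends the photographed image
    to the third device and receives the recognized device type back."""
    return send_to_third_device(image_bytes)

# Stub standing in for the third device's recognition service (claim 14);
# a real system would make a network call and run a classifier remotely.
def third_device_stub(image_bytes: bytes) -> str:
    return "speaker" if image_bytes.startswith(b"SPK") else "unknown"

print(remote_classify(b"SPK...", third_device_stub))  # speaker
```

Passing the transport as a callable keeps the first-device logic identical whether recognition happens locally (claim 2) or on the third device (claim 3).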
4. The method of any of claims 1-3, wherein a first application is installed on the first device;
before the capturing the image of the second device, the method further comprises: starting, by the first application, a device control shooting mode in response to a detected first operation of the user;
the capturing an image of the second device comprises:
capturing, by the first application, an image of the second device in response to a detected second operation of the user in the device control shooting mode.
5. The method of any of claims 1-4, wherein the displaying the control interface of the second device comprises:
displaying a second device list based on the signal strength, wherein the second device list includes some or all of the plurality of target devices;
and displaying a control interface of the second device in response to a detected operation of the user selecting the second device from the second device list.
6. The method of any of claims 1-4, wherein the displaying the control interface of the second device comprises:
and determining the target device corresponding to the highest signal strength as the second device, and displaying a control interface of the second device.
7. The method of claim 5, wherein the displaying the second device list based on the signal strength comprises:
acquiring a first target device corresponding to the highest signal strength and a second target device corresponding to the second-highest signal strength;
and displaying the first target device and the second target device in descending order of signal strength.
8. The method of claim 7, wherein the displaying the first target device and the second target device in descending order of signal strength comprises:
if a difference between the signal strength of the first target device and the signal strength of the second target device is greater than or equal to a preset signal strength threshold, displaying the first target device and the second target device in descending order of signal strength.
9. The method of claim 5, wherein the displaying the second device list based on the signal strength comprises:
acquiring a first target device corresponding to the highest signal strength and a second target device corresponding to the second-highest signal strength;
and if the difference between the signal strength of the first target device and the signal strength of the second target device is smaller than a preset signal strength threshold, displaying the first target device and the second target device according to historical usage habits of the user, wherein the historical usage habits of the user represent statistical information about the user's use of the first target device and the second target device.
10. The method of claim 9, wherein the displaying the first target device and the second target device according to the historical usage habits of the user comprises:
determining usage probabilities respectively corresponding to the first target device and the second target device according to the historical usage habits of the user;
and displaying the first target device and the second target device in descending order of usage probability.
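The display-ordering rule of claims 8-10 combines into one decision: when the signal-strength gap is decisive, order by strength; otherwise fall back to the user's historical usage probabilities. A sketch in Python, where the 10 dB threshold, the tuple layout, and all names are illustrative assumptions rather than values taken from the patent:

```python
def order_targets(first, second, threshold=10):
    """first/second: (name, signal_strength, usage_probability) tuples for
    the strongest and second-strongest target devices.  Claim 8: a gap at
    or above the threshold keeps the strength ordering; claims 9-10: a
    smaller gap defers to the usage probabilities derived from the user's
    historical usage habits."""
    gap = first[1] - second[1]
    if gap >= threshold:  # claim 8: gap is decisive
        return [first[0], second[0]]
    # claims 9-10: gap below threshold, rank by usage probability instead
    ranked = sorted([first, second], key=lambda t: t[2], reverse=True)
    return [t[0] for t in ranked]

# A 5 dB gap is below the assumed 10 dB threshold, so the device the user
# operates more often is listed first despite its weaker signal.
print(order_targets(("speaker A", -40, 0.2), ("speaker B", -45, 0.8)))
# ['speaker B', 'speaker A']
```

The threshold acts as a confidence test on the physical-proximity signal: only when RSSI alone cannot distinguish the two targets does behavioral history break the tie.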
11. The method of any of claims 1-10, wherein the first device is a mobile terminal.
12. The method of any one of claims 1-11, wherein the second device is a smart home device.
13. A first device, comprising: a memory for storing computer program code, the computer program code comprising instructions that, when read from the memory by the first device, cause the first device to perform the method of any of claims 1-12.
14. A device control system, comprising the first device according to claim 13 and a third device, wherein
the third device is configured to receive the image of the second device sent by the first device, perform image recognition on the image, and send a recognition result to the first device, wherein the recognition result is the device type of the second device.
15. A computer readable storage medium comprising computer instructions which, when executed on the first device, cause the first device to perform the method of any one of claims 1-12.
CN202110861878.4A 2021-07-29 2021-07-29 Device control method, electronic device, and storage medium Pending CN115701032A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110861878.4A CN115701032A (en) 2021-07-29 2021-07-29 Device control method, electronic device, and storage medium
PCT/CN2022/106218 WO2023005706A1 (en) 2021-07-29 2022-07-18 Device control method, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110861878.4A CN115701032A (en) 2021-07-29 2021-07-29 Device control method, electronic device, and storage medium

Publications (1)

Publication Number Publication Date
CN115701032A true CN115701032A (en) 2023-02-07

Family

ID=85086261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110861878.4A Pending CN115701032A (en) 2021-07-29 2021-07-29 Device control method, electronic device, and storage medium

Country Status (2)

Country Link
CN (1) CN115701032A (en)
WO (1) WO2023005706A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117132036A (en) * 2023-02-17 2023-11-28 荣耀终端有限公司 Material distribution method and distribution system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107682237A (en) * 2017-09-14 2018-02-09 珠海格力电器股份有限公司 A kind of method that household electrical appliances are controlled by mobile terminal, mobile terminal and storage device
CN109410943A (en) * 2018-12-10 2019-03-01 珠海格力电器股份有限公司 Sound control method, system and the intelligent terminal of equipment
US20190171689A1 (en) * 2017-12-05 2019-06-06 Google Llc Optimizing item display on graphical user interfaces
CN110262264A (en) * 2019-06-24 2019-09-20 珠海格力电器股份有限公司 Simplify home equipment control method, device and the home equipment of user's operation
CN110908340A (en) * 2018-09-14 2020-03-24 珠海格力电器股份有限公司 Smart home control method and device
CN112671620A (en) * 2020-12-21 2021-04-16 珠海格力电器股份有限公司 Equipment control method and device, storage medium and mobile terminal

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102363828B1 (en) * 2014-11-01 2022-02-16 삼성전자주식회사 Method and system for generating a signal strength map
CN108370492B (en) * 2017-01-20 2021-08-20 华为技术有限公司 Indoor positioning method and equipment
CN110308660B (en) * 2019-06-06 2020-12-22 美的集团股份有限公司 Intelligent equipment control method and device


Also Published As

Publication number Publication date
WO2023005706A1 (en) 2023-02-02


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination