WO2023005706A1 - Device control method, electronic device and storage medium

Device control method, electronic device and storage medium

Info

Publication number
WO2023005706A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
signal strength
target device
image
user
Prior art date
Application number
PCT/CN2022/106218
Other languages
English (en)
Chinese (zh)
Inventor
Cao Yuwei (曹宇玮)
Xu Wenliang (徐文亮)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2023005706A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B17/00Monitoring; Testing
    • H04B17/30Monitoring; Testing of propagation channels
    • H04B17/309Measuring or estimating channel quality parameters
    • H04B17/318Received signal strength
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/28Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Definitions

  • the embodiments of the present application relate to the technical field of computers, and in particular, to a device control method, an electronic device, and a storage medium.
  • smart home appliances (e.g., smart speakers, smart lighting, smart sockets, smart TVs, smart refrigerators, etc.)
  • corresponding applications are required for the control of the aforementioned smart home appliances.
  • users can control smart home appliances through the above-mentioned related applications.
  • the above related applications need to identify the smart home appliances to be controlled, so that the identified smart home appliances can be controlled.
  • the current related applications can only identify the types of smart home appliances.
  • the control efficiency of the above-mentioned smart home appliances will be reduced, which degrades the user experience.
  • the embodiment of the present application provides a device control method, an electronic device, and a storage medium, so as to provide a method for identifying and controlling smart home appliances based on signal strength, thereby improving the identification efficiency, further improving the control efficiency of smart home appliances, and improving the user experience.
  • the embodiment of the present application provides a device control method applied to the first device, including:
  • the first device may be a terminal device such as a mobile phone or a tablet.
  • the second device may be a smart home appliance such as a smart air conditioner, a smart TV, a smart refrigerator, a smart lighting, or a smart switch.
  • the first device can obtain the image of the second device by shooting with the camera.
  • the device type of the second device is determined according to the image of the second device; where the device type can be used to distinguish the type of the second device, for example, TV, refrigerator, air conditioner, lighting, switch, etc.
  • the device to be selected may be a smart home appliance to be selected, and the smart home appliance to be selected may form a local area network, and the smart home appliances connected to the local area network form a device list.
  • the signal strength between the first device and each target device is acquired, and the signal strength is used to determine the second device from the target devices; and a control interface of the second device is displayed.
  • the signal strength between the first device and each target device may be the received signal strength of Bluetooth, and the control interface is used to control the second device.
  • the candidate target smart home appliances are obtained based on the device type, the smart home appliance to be controlled is determined among the candidate target smart home appliances according to the signal strength, and the control interface of the smart home appliance to be controlled is displayed. In this way, the identification efficiency can be improved, and the efficiency of controlling the smart home appliance can be further improved.
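The selection flow summarized above can be sketched as follows; the `CandidateDevice` class, the helper name, and the sample device names are hypothetical illustrations for this sketch, not part of the application:

```python
from dataclasses import dataclass

@dataclass
class CandidateDevice:
    name: str
    device_type: str   # preset device type, e.g. "tv" or "fridge"
    rssi_dbm: float    # received (Bluetooth) signal strength, in dBm

def select_second_device(image_type: str, device_list: list[CandidateDevice]) -> CandidateDevice:
    """Match the recognized device type against the first device list,
    then pick the target device with the strongest signal."""
    targets = [d for d in device_list if d.device_type == image_type]
    if not targets:
        raise LookupError(f"no candidate of type {image_type!r}")
    # A higher (less negative) RSSI usually means the device is closer.
    return max(targets, key=lambda d: d.rssi_dbm)

devices = [
    CandidateDevice("living-room TV", "tv", -48.0),
    CandidateDevice("bedroom TV", "tv", -71.0),
    CandidateDevice("kitchen fridge", "fridge", -60.0),
]
print(select_second_device("tv", devices).name)  # living-room TV
```

The filter step corresponds to matching the device type against the first device list, and the `max` step to determining the second device from the targets by signal strength.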
  • obtaining the device type of the second device includes: performing image recognition on the image of the second device, and determining the device type of the second device according to the recognition result.
  • alternatively, obtaining the device type of the second device includes:
  • the image of the second device is sent to the third device; the image of the second device is used by the third device to determine the device type of the second device; and the device type of the second device sent by the third device is received.
  • the image of the second device may be sent to the third device (a cloud server) for recognition processing.
  • the computational burden on the first device can be reduced, and the accuracy of image recognition can be improved through the powerful computing capability of the third device.
  • the first device is installed with the first application; before taking the image of the second device, it also includes:
  • the first application, in response to the detected first operation of the user, enables the device-control shooting mode.
  • Taking images of the second device includes:
  • the first application captures an image of the second device in response to the detected second operation of the user in the device-control shooting mode.
  • displaying the control interface of the second device includes:
  • the second device list includes some or all of the plurality of target devices
  • a control interface of the second device is displayed.
  • the candidate smart home appliances are screened according to the signal strength, and the user selects the second device from the smart home appliances obtained after screening to realize the control of the second device, which can improve the accuracy of controlling the second device.
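A minimal sketch of this screening step, assuming signal strength is a Bluetooth RSSI in dBm and using an arbitrary 15 dB cut-off (both are assumptions for illustration, not values from the application):

```python
def screen_targets(targets: list[tuple[str, float]], keep_within_db: float = 15.0) -> list[str]:
    """Sort the matched target devices by signal strength (strong to weak)
    and keep only those reasonably close to the strongest, so the user
    picks the second device from a short list."""
    ranked = sorted(targets, key=lambda t: t[1], reverse=True)
    strongest = ranked[0][1]
    return [name for name, rssi in ranked if strongest - rssi <= keep_within_db]

targets = [("bedroom TV", -72.0), ("living-room TV", -50.0), ("study TV", -58.0)]
print(screen_targets(targets))  # ['living-room TV', 'study TV']
```

The returned list corresponds to the second device list containing part (or, with a large cut-off, all) of the target devices.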
  • displaying the control interface of the second device includes:
  • the target device corresponding to the highest signal strength is determined as the second device, and the control interface of the second device is displayed.
  • the identification efficiency can be improved and the user's operation can be simplified.
  • displaying the second device list based on the signal strength includes:
  • the first target device corresponding to the highest signal strength and the second target device corresponding to the second highest signal strength are obtained, and the first target device and the second target device are sequentially displayed.
  • displaying the first target device and the second target device sequentially based on the order of signal strength from strong to weak includes:
  • if the difference between the signal strength of the first target device and the signal strength of the second target device is greater than or equal to a preset signal strength threshold, the first target device and the second target device are displayed sequentially in order of signal strength from strong to weak.
  • the candidate smart home appliances are determined according to the signal strengths, which can improve the accuracy of identifying the second device.
  • displaying the second device list based on the signal strength includes:
  • the first target device and the second target device are displayed according to the user's historical usage habits, wherein the user's historical usage habits are used to characterize statistical information about the user's use of the first target device and the second target device.
  • the candidate smart home appliances are determined according to the user's historical usage habits, which can improve the accuracy of identifying the second device.
  • displaying the first target device and the second target device according to the user's historical usage habits includes: determining the usage probabilities respectively corresponding to the first target device and the second target device according to the user's historical usage habits, and displaying the first target device and the second target device sequentially in descending order of usage probability.
  • the first device is a mobile terminal.
  • the second device is a smart home appliance.
  • the embodiment of the present application provides a device control device applied to the first device, including:
  • a photographing module configured to photograph the image of the second device
  • An acquisition module configured to acquire the device type of the second device, where the device type of the second device is determined according to the image of the second device;
  • a matching module configured to match the device type of the second device with the first device list, and determine a plurality of target devices of the same device type as the second device, wherein the first device list includes a plurality of candidate devices, and each candidate device has a preset device type;
  • the display module is used to acquire the signal strength between the first device and each target device, and the signal strength is used to determine the second device from the target devices; and display the control interface of the second device.
  • the acquisition module is further configured to perform image recognition on the image of the second device, and determine the device type of the second device according to the recognition result.
  • the acquisition module is further configured to send the image of the second device to the third device, and the image of the second device is used by the third device to determine the device type of the second device; receiving the image sent by the third device The device type of the second device.
  • the first device is installed with the first application, and the above apparatus further includes:
  • the starting module is used for the first application to enable the device-control shooting mode in response to the detected first operation of the user;
  • the above photographing module is further used for the first application to capture an image of the second device in response to the detected second operation of the user in the device-control shooting mode.
  • the display module is further configured to display a second device list based on signal strength; wherein, the second device list includes part or all of multiple target devices;
  • a control interface of the second device is displayed.
  • the display module is further configured to determine the target device corresponding to the highest signal strength as the second device, and display a control interface of the second device.
  • the above display module is further configured to obtain the first target device corresponding to the highest signal strength, and the second target device corresponding to the second highest signal strength;
  • the first target device and the second target device are sequentially displayed.
  • the display module is further configured to: if the difference between the signal strength of the first target device and the signal strength of the second target device is greater than or equal to a preset signal strength threshold, display the first target device and the second target device sequentially in order of signal strength from strong to weak.
  • the above display module is further configured to obtain the first target device corresponding to the highest signal strength, and the second target device corresponding to the second highest signal strength;
  • the first target device and the second target device are displayed according to the user's historical usage habits, wherein the user's historical usage habits are used to characterize statistical information about the user's use of the first target device and the second target device.
  • the display module is further configured to determine the usage probabilities respectively corresponding to the first target device and the second target device according to the user's historical usage habits;
  • the first target device and the second target device are sequentially displayed based on the descending order of usage probability.
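A possible sketch of deriving usage probabilities from historical statistics and ordering the two targets accordingly; the event-log representation is a hypothetical simplification of the statistical information the application refers to:

```python
from collections import Counter

def usage_probabilities(history: list[str]) -> dict[str, float]:
    """Estimate a per-device usage probability from a log of past control events."""
    counts = Counter(history)
    total = sum(counts.values())
    return {name: n / total for name, n in counts.items()}

def order_by_habit(first: str, second: str, history: list[str]) -> list[str]:
    """Order the two target devices by descending usage probability."""
    probs = usage_probabilities(history)
    return sorted([first, second], key=lambda name: probs.get(name, 0.0), reverse=True)

history = ["bedroom TV", "living-room TV", "bedroom TV", "bedroom TV"]
print(order_by_habit("living-room TV", "bedroom TV", history))  # ['bedroom TV', 'living-room TV']
```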
  • the first device is a mobile terminal.
  • the second device is a smart home appliance.
  • the embodiment of the present application provides a first device, including:
  • the above-mentioned memory is used to store computer program code, and the computer program code includes instructions which, when read by the above-mentioned first device from the memory, cause the first device to perform the following steps:
  • the device type of the second device is determined according to the image of the second device
  • matching the device type of the second device with the first device list to determine multiple target devices of the same device type as the second device, wherein the first device list includes a plurality of candidate devices, and each candidate device has a preset device type;
  • making the above-mentioned first device execute the step of obtaining the device type of the second device includes: performing image recognition on the image of the second device, and determining the device type of the second device according to the recognition result.
  • alternatively, making the above-mentioned first device execute the step of obtaining the device type of the second device includes: sending the image of the second device to the third device, where the image of the second device is used by the third device to determine the device type of the second device; and receiving the device type of the second device sent by the third device.
  • the first device is installed with a first application; when the above-mentioned instructions are executed by the above-mentioned first device, before the above-mentioned first device executes the step of capturing an image of the second device, the following steps are further included:
  • the first application, in response to the detected first operation of the user, enables the device-control shooting mode.
  • the step of causing the above-mentioned first device to perform capturing the image of the second device includes:
  • the first application captures an image of the second device in response to the detected second operation of the user in the device-control shooting mode.
  • the step of causing the above-mentioned first device to display the control interface of the second device includes:
  • the second device list includes some or all of the plurality of target devices
  • a control interface of the second device is displayed.
  • the step of causing the above-mentioned first device to display the control interface of the second device includes:
  • the target device corresponding to the highest signal strength is determined as the second device, and the control interface of the second device is displayed.
  • making the above-mentioned first device perform the step of displaying the second device list based on signal strength includes:
  • the first target device corresponding to the highest signal strength and the second target device corresponding to the second highest signal strength are obtained, and the first target device and the second target device are sequentially displayed.
  • making the above-mentioned first device execute the step of sequentially displaying the first target device and the second target device based on the order of signal strength from strong to weak includes:
  • if the difference between the signal strength of the first target device and the signal strength of the second target device is greater than or equal to a preset signal strength threshold, the first target device and the second target device are displayed sequentially in order of signal strength from strong to weak.
  • making the above-mentioned first device perform the step of displaying the second device list based on signal strength includes:
  • the first target device and the second target device are displayed according to the user's historical usage habits, wherein the user's historical usage habits are used to characterize statistical information about the user's use of the first target device and the second target device.
  • making the above-mentioned first device execute the step of displaying the first target device and the second target device according to the user's historical usage habits includes:
  • the usage probabilities respectively corresponding to the first target device and the second target device are determined according to the user's historical usage habits, and the first target device and the second target device are sequentially displayed based on the descending order of usage probability.
  • the first device is a mobile terminal.
  • the second device is a smart home appliance.
  • an embodiment of the present application provides a device control system, including the first device and the third device as described in the third aspect, wherein,
  • the third device is configured to receive the image of the second device sent by the first device, perform image recognition on the image, and send the recognition result to the first device, where the recognition result is the device type of the second device.
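The first-device/third-device exchange can be mocked as below; the class names and the byte-signature lookup are purely illustrative stand-ins for a real network round trip and a real image-classification model:

```python
class ThirdDevice:
    """Mock cloud server: maps an uploaded image to a device type.
    A real implementation would run an image-classification model."""
    def recognize(self, image: bytes) -> str:
        # Toy rule for the sketch; a real classifier replaces this lookup.
        signatures = {b"\x89TV": "tv", b"\x89FR": "fridge"}
        return signatures.get(image[:3], "unknown")

class FirstDevice:
    """Mock terminal: offloads recognition to the third device."""
    def __init__(self, cloud: ThirdDevice):
        self.cloud = cloud

    def get_device_type(self, image: bytes) -> str:
        # Send the captured image to the third device and receive the
        # recognition result (the device type of the second device).
        return self.cloud.recognize(image)

print(FirstDevice(ThirdDevice()).get_device_type(b"\x89TVxxxx"))  # tv
```

Offloading the call this way mirrors the stated motivation: the first device avoids the recognition workload, at the cost of a round trip to the cloud.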
  • the embodiment of the present application provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when it is run on a computer, it causes the computer to execute the method as described in the first aspect.
  • an embodiment of the present application provides a computer program, which is used to execute the method described in the first aspect when the above computer program is executed by a computer.
  • all or part of the program in the sixth aspect may be stored on a storage medium packaged with the processor, or part or all may be stored on a memory not packaged with the processor.
  • FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of the software architecture of the electronic device provided by the embodiment of the present application.
  • FIG. 3 is a schematic diagram of a hardware structure of an electronic device provided in an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of an embodiment of a device control method provided by the present application.
  • Fig. 5a and Fig. 5b are schematic diagrams of device control shooting mode settings provided by the embodiment of the present application.
  • Fig. 6a-Fig. 6c are schematic diagrams showing the control interface provided by the embodiment of the present application.
  • Fig. 7 is a schematic structural diagram of an embodiment of an equipment control device provided by the present application.
  • “first” and “second” are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of indicated technical features. Thus, a feature defined with “first” or “second” may explicitly or implicitly include one or more of these features. In the description of the embodiments of the present application, unless otherwise specified, “plurality” means two or more.
  • smart home appliances (for example, smart speakers, smart light bulbs, smart sockets, smart TVs, smart refrigerators, etc.)
  • corresponding applications are required for the control of the aforementioned smart home appliances.
  • users can control smart home appliances through the above-mentioned related applications.
  • the above related applications need to identify the smart home appliances to be controlled, so that the identified smart home appliances can be controlled.
  • the current related applications can only identify the types of smart home appliances.
  • the control efficiency of the above-mentioned smart home appliances will be reduced, which degrades the user experience.
  • the first device 10 may be a mobile terminal with a camera and a display screen.
  • the first device 10 may also be called terminal equipment, user equipment (UE), access terminal, subscriber unit, subscriber station, mobile station, remote station, remote terminal, mobile equipment, user terminal, terminal, wireless communication device, user agent, or user device.
  • the first device 10 may be a cellular telephone, a cordless telephone, a personal digital assistant (PDA) device, a handheld device with wireless communication capabilities, a computing device or other processing device connected to a wireless modem, a computer, a laptop computer, handheld communication equipment, handheld computing equipment, satellite wireless equipment, customer premise equipment (CPE), and/or other equipment used to communicate on wireless systems and next-generation communication systems, for example, a mobile terminal in a 5G network or a mobile terminal in a future evolved public land mobile network (PLMN).
  • FIG. 1 is an application scenario of the above device control method.
  • the above application scenario includes a first device 10 , multiple second devices 20 and third devices 30 .
  • the second device 20 may be a smart home appliance, for example, a smart speaker, a smart lighting, a smart socket, a smart TV, a smart refrigerator, and the like.
  • the embodiment of the present application does not specifically limit the form of the above-mentioned second device 20 .
  • the third device 30 may be a cloud device (eg, a cloud server).
  • the first device 10 and the second device 20 may be connected via wireless, wherein the wireless connection may be WIFI or Bluetooth, or other wireless communication methods.
  • the embodiment of the present application does not specifically limit the wireless connection mode between the first device 10 and the second device 20.
  • the first device 10 and the third device 30 can be connected through WIFI.
  • the first device 10 can be connected to a WIFI router, and a connection can be established with the third device 30 through the WIFI router.
  • the first device 10 can also communicate with a base station through a mobile communication network (for example, 4G, 5G, etc.), and establish a connection with the third device 30 through the base station.
  • the embodiment of the present application does not specifically limit the wireless connection mode between the first device 10 and the third device 30.
  • FIG. 2 is a schematic diagram of the software architecture of the first device 10 .
  • the first device 10 includes a hardware layer 11 , a driver layer 12 , a hardware abstraction layer 13 and an application layer 14 .
  • the hardware layer 11 may be used to drive devices (eg, a camera) in the first device 10 to capture images.
  • the hardware layer 11 can drive the camera to take pictures and obtain images.
  • the driver layer 12 can be used to acquire the image taken by the hardware layer 11 and upload the image to the hardware abstraction layer 13 .
  • the hardware abstraction layer 13 can be used to receive the image uploaded by the driver layer 12 and can upload the image to the application layer 14 .
  • the application layer 14 can be used to identify the image uploaded by the hardware abstraction layer 13 and display the control interface of the second device 20 based on the identification result; alternatively, it can send the image uploaded by the hardware abstraction layer 13 to the third device 30, which recognizes the image and sends the recognition result to the first device 10. After receiving the recognition result sent by the third device 30, the first device 10 can display the control interface of the second device 20 based on the recognition result, so as to realize the control of the second device 20.
  • FIG. 3 shows a schematic structural diagram of an electronic device 100 , which may be the above-mentioned first device 10 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural network processor (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory, which avoids repeated access and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous transmitter (universal asynchronous receiver/transmitter, UART) interface, mobile industry processor interface (mobile industry processor interface, MIPI), general-purpose input and output (general-purpose input/output, GPIO) interface, subscriber identity module (subscriber identity module, SIM) interface, and /or universal serial bus (universal serial bus, USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193, and the like through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100 , and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules shown in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 is charging the battery 142 , it can also provide power for electronic devices through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area networks (wireless local area networks, WLAN) (such as wireless fidelity (Wireless Fidelity, Wi-Fi) networks), Bluetooth (bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (infrared, IR), and other technologies.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a Beidou navigation satellite system (beidou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active matrix organic light emitting diode or an active matrix organic light emitting diode (active-matrix organic light emitting diode, AMOLED), flexible light-emitting diode (flex light-emitting diode, FLED), Miniled, MicroLed, Micro-oLed, quantum dot light emitting diodes (quantum dot light emitting diodes, QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • the light is transmitted to the photosensitive element of the camera through the lens, and the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (complementary metal-oxide-semiconductor, CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the NPU is a neural-network (neural-network, NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data created during the use of the electronic device 100 (such as audio data, phonebook, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 can implement audio functions through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • Speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • Electronic device 100 can listen to music through speaker 170A, or listen to hands-free calls.
  • Receiver 170B, also called the "earpiece", is used to convert audio electrical signals into sound signals.
  • the receiver 170B can be placed close to the human ear to receive the voice.
  • The microphone 170C, also called a "mic" or "voice transmitter", is used to convert sound signals into electrical signals. When making a phone call or sending a voice message, the user can speak with the mouth close to the microphone 170C, thereby inputting the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In some other embodiments, the electronic device 100 may be provided with two microphones 170C, which may also implement a noise reduction function in addition to collecting sound signals. In some other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions, etc.
  • the earphone interface 170D is used for connecting wired earphones.
  • the earphone interface 170D can be the USB interface 130, or a 3.5mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a cellular telecommunications industry association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • a capacitive pressure sensor may include at least two parallel plates made of conductive material.
  • the electronic device 100 determines the intensity of pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions.
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the angular velocity of the electronic device 100 around three axes may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, and calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the electronic device 100 through reverse movement, thereby achieving anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip leather case.
  • when the electronic device 100 is a flip phone, the electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D.
  • Based on the detected opening and closing state of the leather case or flip cover, features such as automatic unlocking upon flipping open can be set.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 may measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F for distance measurement to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear to make a call, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in leather case mode, and in pocket mode to automatically unlock and lock the screen.
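The proximity decision described above reduces to a simple threshold test on the reflected-light reading: enough reflected infrared light means an object is near. A minimal sketch, with hypothetical function names and an assumed normalized threshold (the source does not specify the sensor interface or any numeric values):

```python
# Hypothetical sketch of the proximity-light decision: the device emits
# infrared light via an LED and measures the light reflected back into a
# photodiode; a threshold decides whether an object is nearby.
# The 0.5 threshold is an illustrative assumption, not from the source.

def object_nearby(reflected_light: float, threshold: float = 0.5) -> bool:
    """Return True when sufficient reflected infrared light is detected."""
    return reflected_light >= threshold

def should_turn_off_screen(in_call: bool, reflected_light: float) -> bool:
    """Screen off when the user holds the phone to the ear during a call."""
    return in_call and object_nearby(reflected_light)
```

Usage follows the call scenario in the text: during a call, a strong reflected-light reading (phone held to the ear) turns the screen off to save power.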
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access to application locks, take pictures with fingerprints, answer incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to implement a temperature treatment strategy. For example, when the temperature reported by the temperature sensor 180J exceeds the threshold, the electronic device 100 may reduce the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • in some embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to prevent the electronic device 100 from shutting down abnormally due to the low temperature.
  • in some other embodiments, when the temperature is lower still, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
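The temperature treatment strategy in the preceding bullets can be sketched as a set of threshold checks: throttle the processor when hot, heat the battery when cold, and boost the battery output voltage when colder still. The specific threshold values below are illustrative assumptions only; the source gives no numbers:

```python
# Hypothetical sketch of the temperature treatment strategy.
# All threshold values are illustrative assumptions, not from the source.

HIGH_TEMP_C = 45.0       # above this: throttle the nearby processor
LOW_TEMP_C = 0.0         # below this: heat the battery
VERY_LOW_TEMP_C = -10.0  # below this: also boost battery output voltage

def temperature_actions(temp_c: float) -> list[str]:
    """Return the protective actions to take for a reported temperature."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("throttle_processor")   # reduce performance, cut power draw
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")         # avoid abnormal low-temperature shutdown
    if temp_c < VERY_LOW_TEMP_C:
        actions.append("boost_battery_voltage")
    return actions
```

Note the checks are independent rather than exclusive: at a very low temperature the device both heats the battery and boosts its output voltage, matching the cascading thresholds in the text.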
  • the touch sensor 180K is also called “touch device”.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation may be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
  • the bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice. The bone conduction sensor 180M can also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 180M can also be disposed in the earphone, combined into a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • different application scenarios (for example: time reminders, receiving messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be connected and separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • Fig. 4 is a schematic flowchart of an embodiment of a smart device control method provided in the embodiment of the present application, including:
  • Step 401 in response to a detected first user operation, the first device 10 starts a device control shooting mode.
  • the user can operate on the display screen 194 of the first device 10 to enable the device control shooting mode of the first device 10 .
  • the device-controlled shooting mode may be referred to as a smart-sensing shooting mode.
  • the first device 10 may pre-install a first application.
  • the first application may be a system application or a third-party application.
  • the embodiment of the present application does not specifically limit the type of the above-mentioned first application.
  • the first application may be used to provide a smart shooting service, and in specific implementation, the first application may be called a smart remote control application.
  • the user can click the icon of the above-mentioned first application on the desktop of the first device 10 to start the first application, and can enter the device control shooting mode setting page.
  • the above example only shows how the user clicks to open the first application, and does not constitute a limitation to the embodiment of the present application.
  • the above first application can also be opened in other ways, for example, by double-tapping or sliding.
  • the user can enable the device control shooting mode on the device control shooting mode setting page.
  • the user can turn on the switch of the smart remote control to turn on the device to control the shooting mode.
  • a page 500 shown in FIG. 5 a can be obtained, which is a device control shooting mode setting page.
  • the page 500 includes a smart remote control switch 501.
  • the smart remote control switch 501 is in the off state, that is, the device control shooting mode is not turned on at this time.
  • the user can click the smart remote control switch 501 on the page 500, thereby obtaining the page 510 shown in FIG. 5b.
  • the smart remote control switch 501 is in an on state, that is to say, at this moment, the first device 10 enters a device control shooting mode.
  • Step 402 in response to the detected second operation of the user in the device control shooting mode, the first device 10 acquires an image of the second device 20 .
  • the user can return to the desktop in the first device 10 and point the camera 193 of the first device 10 at the object to be photographed (for example, the second device 20 ), so as to obtain an image of the second device 20 .
  • since the first device 10 has turned on the device control shooting mode in step 401, the user does not need to manually open the camera application.
  • The first device 10 can automatically turn on the camera and obtain the image of the second device 20 by shooting, which makes it more convenient for the user to capture images and further improves the user experience.
  • Step 403 the first device 10 acquires the recognition result of the image of the second device 20 .
  • the recognition result may be obtained based on the image of the second device 20 .
  • the above-mentioned manner of obtaining the recognition result may be that the first device 10 performs local recognition on the image of the second device 20; for example, the image recognition algorithm in the first device 10 may be used to recognize the image of the second device 20 locally, from which the recognition result can be obtained.
  • the above method of obtaining the recognition result may also be that the first device 10 uploads the image of the second device 20 to the third device 30, and the image of the second device 20 is recognized by means of the greater computing power of the third device 30, from which the recognition result can be obtained; the third device 30 can then return the recognition result to the first device 10, thereby reducing the computational burden on the first device 10.
  • the above identification result may be the device type of the smart home appliance, for example, the device type may be TV, lighting, air conditioner and so on.
  • the user may have multiple TVs in his home, for example, one in the living room, one in the master bedroom, and one in the second bedroom.
  • when the image of the second device 20 is recognized and the device type of the second device 20 is identified as a TV, since it is unknown which TV it is, it is necessary to further determine which TV the second device 20 is, so that the second device 20 can be controlled.
  • Step 404 the first device 10 acquires a preset device list.
  • the preset device list may include multiple candidate smart home appliances. It can be understood that the above-mentioned smart home appliances can be interconnected through Wi-Fi. Each candidate smart home appliance has a preset device type. Exemplarily, the above-mentioned smart home appliances may establish a local area network in advance.
  • the topology of the local area network can be one of bus, ring, star, and tree, or a combination thereof. The embodiment of the present application does not specifically limit the type of the foregoing local area network.
  • the first device 10 may acquire the aforementioned preset device list.
  • the manners in which the first device 10 obtains the preset device list may include the following two. First, the first device 10 can connect to the local area network through Wi-Fi and obtain the preset device list through the local area network; that is to say, the first device 10 can query which candidate smart home appliances are connected to the local area network. Second, when the above-mentioned smart home appliances join the local area network, the above-mentioned preset device list can be sent to the third device 30. The first device 10 may send a device list request to the third device 30 to request the device list of candidate smart home appliances. After receiving the device list request from the first device 10, the third device 30 may send the device list of candidate smart home appliances to the first device 10.
  • if any candidate smart home appliance exits the local area network, for example, the smart home appliance goes offline, the smart home appliance no longer appears in the preset device list.
  • conversely, when a smart home appliance joins the local area network, the smart home appliance becomes a candidate smart home appliance and appears in the above-mentioned preset device list.
  • the above-mentioned preset device list may be synchronized with the third device 30 .
  • Step 405 the first device 10 determines, in the preset device list, the number of target devices having the same device type as that in the recognition result.
  • the first device 10 can match the identified device type with the device type of each candidate smart home appliance in the preset device list, so as to obtain the number of target devices in the local area network that are of the same type as the second device 20.
  • Exemplarily, it is assumed that there are multiple televisions in the user's home, for example, one in the living room, one in the master bedroom, and one in the second bedroom.
  • If the device type of the second device 20 is identified as a TV, the number of target devices (TVs) can be determined to be 3 (one in the living room, one in the master bedroom, and one in the second bedroom).
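Step 405 amounts to filtering the preset device list by the recognized device type and counting the matches. A minimal sketch, with a hypothetical list structure; the device names follow the TV example above:

```python
# Sketch of step 405: match the recognized device type against the preset
# device list of candidate smart home appliances and count the targets.
# The dict-based list structure is an illustrative assumption.

preset_device_list = [
    {"name": "living room TV", "type": "TV"},
    {"name": "master bedroom TV", "type": "TV"},
    {"name": "second bedroom TV", "type": "TV"},
    {"name": "living room light", "type": "lighting"},
    {"name": "bedroom air conditioner", "type": "air conditioner"},
]

def match_target_devices(recognized_type: str, device_list):
    """Return the candidate devices whose preset type equals the recognized type."""
    return [d for d in device_list if d["type"] == recognized_type]

targets = match_target_devices("TV", preset_device_list)
print(len(targets))  # prints 3: living room, master bedroom, second bedroom
```

If the count is 1, the control interface can be shown directly; if it is greater than 1, a further disambiguation step (such as the RSSI-based priority described later) is needed.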
  • Step 406 the first device 10 displays the control interface of the second device 20 .
  • the second device 20 may be determined among the target devices, and a control interface of the second device 20 may be displayed.
  • the above control interface can be realized by control cards, and each control card can correspond to one smart home appliance, that is, each control card can be used to control one smart home appliance.
  • each control card can include a function menu and controls used to control the corresponding smart home appliance.
  • the above manners of displaying the control interface may include but not limited to the following two.
  • the first device 10 may first determine the priorities of multiple target devices and display a device list of the target devices on the desktop of the first device 10 in order of priority, where the device list may include multiple target devices. After the user selects one of the devices in the device list, the interface of the first device 10 may display the corresponding control interface for controlling the second device 20 .
  • after the first device 10 obtains the number of target devices, it can compare the number with a preset first threshold to determine whether to directly display the device list of the target devices on the interface of the first device 10 .
  • the preset first threshold above can be set to 2.
  • if the number of target devices is less than 2, that is, if the first device 10 matches only one target device, the first device 10 can directly display the control interface on its interface.
  • if the number of target devices is greater than or equal to 2, the first device 10 can further determine the priorities of the above-mentioned multiple target devices and display the device list in order of priority from high to low.
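The comparison against the first threshold can be sketched as a small branch; the value 2 follows the embodiment, while the function name and return strings are illustrative.

```python
FIRST_THRESHOLD = 2  # preset first threshold from the embodiment

def decide_display(num_targets):
    """Decide whether to show the single control interface directly
    or a prioritized device list of the target devices."""
    if num_targets < FIRST_THRESHOLD:
        return "control_interface"  # only one match: display it directly
    return "device_list"            # multiple matches: sort by priority first

print(decide_display(1))  # control_interface
print(decide_display(3))  # device_list
```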
  • the above priorities may be used to distinguish the display order of the target devices, for example, target devices with higher priorities may be displayed preferentially in the above device list.
  • the above-mentioned method of determining the priority of the target device can be realized through the received signal strength indication (Received Signal Strength Indication, RSSI), where the RSSI is used to characterize the received Bluetooth signal strength between the first device 10 and the target device.
  • RSSI: Received Signal Strength Indication
  • the first device 10 may obtain the RSSI of each target device and compare the RSSIs of all target devices. If the RSSI differences between the target devices are obvious, the priority can be determined according to the RSSI value of each target device, and sorting can be performed according to the priority. It can be understood that the RSSI difference may be determined as follows: obtain the largest RSSI and the second largest RSSI, and calculate the difference between them; if the difference is greater than or equal to a preset second threshold, it can be determined that the RSSI difference between the target devices is obvious; if the difference is smaller than the preset second threshold, it can be determined that the RSSI difference between the target devices is not obvious.
  • the priority of the target device can be determined according to the RSSI of each target device. For example, the target device corresponding to the highest RSSI (for convenience of description, hereinafter called "the first target device") can have the highest priority, the target device corresponding to the second highest RSSI (hereinafter called "the second target device") can have the next highest priority, and so on.
  • the device list including the first target device and the second target device may be displayed on the interface of the first device 10 according to the order of the priority.
  • the above example only exemplarily shows the scene of the first target device and the second target device, but is not limited to displaying only the first target device and the second target device.
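The RSSI-based ordering described above can be sketched as follows; the threshold value, device names, and the convention of returning `None` when the difference is not obvious (so a caller can fall back to usage habits) are all illustrative assumptions.

```python
SECOND_THRESHOLD = 10  # preset second threshold (dB); illustrative value

def rssi_priority(targets):
    """Order target devices by RSSI from strong to weak when the gap
    between the largest and second-largest RSSI is obvious; otherwise
    return None so the caller can use another criterion."""
    ranked = sorted(targets, key=lambda t: t["rssi"], reverse=True)
    if len(ranked) < 2:
        return ranked
    if ranked[0]["rssi"] - ranked[1]["rssi"] >= SECOND_THRESHOLD:
        return ranked  # difference is obvious: RSSI order holds
    return None        # difference not obvious: fall back to usage habits

targets = [{"name": "living room TV", "rssi": -40},
           {"name": "bedroom TV", "rssi": -70}]
ranked = rssi_priority(targets)
print(ranked[0]["name"])  # living room TV becomes the first target device
```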
  • the display method of the device list of the target device may be in the form of a pop-up window, and the embodiment of the present application does not specifically limit the display method of the device list.
  • FIG. 6a shows the interface before the first device 10 displays the device list of the target devices.
  • after the first device 10 compares the RSSIs, it is determined that the RSSI value of the first target device is much greater than that of the second target device, for example, the RSSI difference between the first target device and the second target device is greater than the preset second threshold. That is to say, the RSSI difference between the first target device and the second target device is obvious, and the priority of the first target device is higher than that of the second target device.
  • the first device 10 may display an interface 600 as shown in FIG. 6b.
  • the interface 600 may include a device list 601 containing a first target device 6011 and a second target device 6012 . Because the priority of the first target device 6011 is higher than that of the second target device 6012, according to people's usage habit of reading from left to right and top to bottom, the position of the first target device 6011 in the device list 601 can be to the left of the second target device 6012. Optionally, following the top-to-bottom usage habit, the position of the first target device 6011 in the device list 601 may be above the second target device 6012. It can be understood that displaying the priority of the target device through its display position in the device list is only an exemplary illustration and does not constitute a limitation on the embodiment of the present application; in some embodiments, the priority of the target device can also be reflected through other display methods.
  • the user can operate on the first target device 6011 or the second target device 6012 in the device list 601, for example by clicking or sliding on the first target device 6011 or the second target device 6012, so that the corresponding control interface can be displayed.
  • taking the user clicking on the first target device 6011 as an example, an interface 610 as shown in FIG. 6c can be obtained.
  • as shown in FIG. 6c, the interface 610 may include a control interface that can be used to control the first target device 6011 (that is, the second device 20 ).
  • the priority of the target device can be determined according to the user's historical usage habits, where the user's historical usage habits can be statistical information characterizing the user's use of the target devices (e.g., the first target device 6011 and the second target device 6012 ).
  • the statistical information may include which smart home appliance is used by the user at which moment, and the statistical information may be statistical information obtained by counting user data within a period (for example, one month, half a year, or one year, etc.).
  • a user prediction model can be obtained through pre-training, and the user prediction model can be used to predict the probability of a user using a certain target device at a specific moment.
  • the current moment can be input into the above-mentioned user prediction model to obtain the probability that the user uses each smart home appliance in the home at the current moment; then, the first device 10 can filter the usage probabilities of all the above-mentioned smart home appliances to obtain the usage probability of the target devices corresponding to the currently identified device type.
  • the current time and device type may also be input into the above-mentioned user prediction model, so as to obtain the probability that the user uses the target device corresponding to the above-mentioned device type at the current time.
  • the determination of the usage probability of the target device according to the time is only an exemplary description, and does not constitute a limitation to the embodiment of the present application.
  • the current user's identity can also be input into the user prediction model to comprehensively determine the probability that the current user uses the target device. Different users have different probabilities of using the same target device at the same time; for example, the probability of an elderly person at home turning on the radio in the morning is higher than that of a child, and the probability of a young or middle-aged person at home turning on the TV at night is higher than that of an elderly person. Making a comprehensive decision from the user identification and the time can improve the accuracy of the prediction.
  • the priority of the target device may be determined according to the usage probability corresponding to each target device.
  • the target device with the highest usage probability has the highest priority
  • the target device with the second highest usage probability has the second highest priority
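As a toy stand-in for the user prediction model, a simple frequency estimate of P(device | time, user) from usage logs can illustrate how the usage probabilities yield the priority order; the log layout, names, and the frequency-counting approach are assumptions (the embodiment describes a trained model, typically a neural network).

```python
from collections import Counter

def train_usage_model(history):
    """Hypothetical stand-in for the user prediction model: estimate
    P(device | hour, user) from logs collected over a period."""
    counts = Counter((h["hour"], h["user"], h["device"]) for h in history)
    totals = Counter((h["hour"], h["user"]) for h in history)

    def predict(hour, user, device):
        total = totals[(hour, user)]
        return counts[(hour, user, device)] / total if total else 0.0

    return predict

history = [
    {"hour": 20, "user": "adult", "device": "living room TV"},
    {"hour": 20, "user": "adult", "device": "living room TV"},
    {"hour": 20, "user": "adult", "device": "bedroom TV"},
]
predict = train_usage_model(history)

# Priority order: target devices sorted by descending usage probability.
ranked = sorted(["living room TV", "bedroom TV"],
                key=lambda d: predict(20, "adult", d), reverse=True)
print(ranked[0])  # living room TV (higher usage probability)
```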
  • the user prediction model is usually trained using a neural network, that is, the above user prediction model is usually a neural network model.
  • the first device 10 may also preset the mapping relationship between the time and/or the user identifier and the target device, so that the target device that the user may use at the current moment may be determined according to the time and/or the user identifier.
  • for example, the elderly at home usually turn on the radio at 7 a.m., and the young and middle-aged at home usually turn on the TV at 8 p.m.
  • by looking up the mapping relationship, the target device corresponding to the current moment and/or user can be output directly, thereby reducing time delay and system consumption.
  • the above mapping relationship may be a one-to-one mapping, for example, a time and a user identifier may be mapped to a target device.
  • the above-mentioned mapping relationship may be a one-to-many mapping, for example, a moment and a user identifier may be mapped to multiple target devices, wherein each mapped target device may have a corresponding usage probability.
  • for example, the middle-aged and young people at home turn on the TV at 8:00 p.m., where the probability of turning on the TV in the living room is 80% and the probability of turning on the TV in the bedroom is 20%.
  • the priority of the target device may be determined according to the usage probability of the target device, for example, a target device with a higher usage probability has a higher priority.
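A one-to-many preset mapping of this kind reduces to a table lookup; the keys, probabilities, and function names below are hypothetical, and the 80%/20% values follow the example above.

```python
# Hypothetical preset mapping from (hour, user identifier) to target
# devices with usage probabilities; a table lookup avoids running the
# prediction model, reducing time delay and system consumption.
MAPPING = {
    (7, "elderly"): [("radio", 1.0)],
    (20, "adult"): [("living room TV", 0.8), ("bedroom TV", 0.2)],
}

def lookup_targets(hour, user):
    """Return mapped target devices, highest usage probability first."""
    entries = MAPPING.get((hour, user), [])
    return [name for name, _ in sorted(entries, key=lambda e: e[1], reverse=True)]

print(lookup_targets(20, "adult"))  # ['living room TV', 'bedroom TV']
```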
  • the device list of the target device may be further displayed on the interface of the first device 10 in order of priority.
  • the control interface corresponding to the second device 20 may be displayed on the interface of the first device 10 .
  • the first device 10 may first determine the priorities of multiple target devices and display the control interface of the target device with the highest priority on the interface of the first device 10 ; the control interface of the target device with the highest priority can be regarded as the control interface of the second device 20 and used to control the second device 20 .
  • after the first device 10 obtains the number of target devices, it can compare the number with the preset first threshold to determine whether to directly display the control interface of the second device 20 on the interface of the first device 10 .
  • the preset first threshold above can be set to 2.
  • if the number of target devices is less than 2, that is, if the first device 10 matches only one target device, the first device 10 can directly display the control interface of that target device on its interface, and this control interface can be used to control the second device 20 .
  • if the number of target devices is greater than or equal to 2, the first device 10 may further determine the priorities of the above-mentioned multiple target devices and display the control interface of the target device with the highest priority on the interface of the first device 10, thereby facilitating the user's operation and improving the control efficiency of the smart home appliance.
  • the above method of determining the priority of the target device may be implemented through RSSI, or may be based on the user's historical usage habits.
  • for the specific process of determining the priority of the target device based on the RSSI and the user's historical usage habits, reference may be made to Method 1, which will not be repeated here.
  • step 401 to step 406 are all optional steps; this application only provides one feasible embodiment, which may also include more or fewer steps than step 401 to step 406, and this application is not limited thereto.
  • FIG. 7 is a schematic structural diagram of an embodiment of the device control device of the present application.
  • the above device control apparatus 70 is applied to the first device and may include: a shooting module 71 , an acquisition module 72 , a matching module 73 and a display module 74 ; wherein,
  • a photographing module 71 configured to photograph an image of the second device
  • An acquisition module 72 configured to acquire a device type of the second device, where the device type of the second device is determined according to the image of the second device;
  • the matching module 73 is configured to match the device type of the second device with the first device list and determine a plurality of target devices of the same device type as the second device, where the first device list includes a plurality of candidate devices, each candidate device having a preset device type;
  • the display module 74 is configured to acquire the signal strength between the first device and each target device, and the signal strength is used to determine the second device from the target devices; and display the control interface of the second device.
  • the acquisition module 72 is further configured to perform image recognition on the image of the second device, and determine the device type of the second device according to the recognition result.
  • the acquisition module 72 is also used to send the image of the second device to the third device, where the image of the second device is used by the third device to determine the device type of the second device, and to receive the device type of the second device sent by the third device.
  • the first device is installed with the first application, and the device control apparatus 70 further includes:
  • the starting module 75 is used for the first application to start the device control shooting mode in response to the detected first operation of the user;
  • the above-mentioned photographing module 71 is also used for the first application to photograph and obtain an image of the second device in response to the detected second operation of the user in the device control photographing mode.
  • the display module 74 is further configured to display a second device list based on signal strength; wherein, the second device list includes part or all of multiple target devices;
  • a control interface of the second device is displayed.
  • the display module 74 is further configured to determine the target device corresponding to the highest signal strength as the second device, and display a control interface of the second device.
  • the display module 74 is further configured to obtain the first target device corresponding to the highest signal strength, and the second target device corresponding to the second highest signal strength;
  • the first target device and the second target device are sequentially displayed.
  • the display module 74 is further configured to: if the difference between the signal strength of the first target device and the signal strength of the second target device is greater than or equal to a preset signal strength threshold, then based on the signal strength Display the first target device and the second target device in order from strong to weak.
  • the display module 74 is further configured to obtain the first target device corresponding to the highest signal strength, and the second target device corresponding to the second highest signal strength;
  • the first target device and the second target device are displayed according to the user's historical usage habits, where the user's historical usage habits are used to characterize the statistical information of the user's use of the first target device and the second target device.
  • the display module 74 is further configured to determine the usage probabilities respectively corresponding to the first target device and the second target device according to the user's historical usage habits;
  • the first target device and the second target device are sequentially displayed based on the descending order of usage probability.
  • the first device is a mobile terminal.
  • the second device is a smart home appliance.
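The four modules of the device control apparatus 70 can be composed in a minimal sketch; the class name, method names, and the hard-coded recognition result are all illustrative placeholders, not the apparatus's actual implementation.

```python
# Hypothetical composition of the device control apparatus 70: shooting,
# acquisition, matching, and display modules as methods of one class.

class DeviceControlApparatus:
    def shoot(self):                     # shooting module 71
        return {"image": "second-device.jpg"}

    def acquire_type(self, image):       # acquisition module 72
        return "tv"                      # placeholder recognition result

    def match(self, device_type, first_device_list):  # matching module 73
        return [d for d in first_device_list if d["type"] == device_type]

    def display(self, targets, rssi):    # display module 74
        best = max(targets, key=lambda d: rssi[d["name"]])
        return f"control interface: {best['name']}"

apparatus = DeviceControlApparatus()
image = apparatus.shoot()
targets = apparatus.match(apparatus.acquire_type(image),
                          [{"name": "living room TV", "type": "tv"},
                           {"name": "bedroom TV", "type": "tv"}])
result = apparatus.display(targets, {"living room TV": -40, "bedroom TV": -70})
print(result)  # control interface: living room TV
```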
  • the device control device 70 provided by the embodiment shown in FIG. 7 can be used to implement the technical solutions of the method embodiments shown in FIGS. 1-6 of this application. For its realization principles and technical effects, please refer to the relevant descriptions in the method embodiments.
  • each step of the above method or each module above can be completed by an integrated logic circuit of hardware in the processor element or an instruction in the form of software.
  • the above modules may be one or more integrated circuits configured to implement the above method, for example: one or more application-specific integrated circuits (Application Specific Integrated Circuit; hereinafter referred to as: ASIC), or one or more digital signal processors (Digital Signal Processor; hereinafter referred to as: DSP), or one or more field programmable gate arrays (Field Programmable Gate Array; hereinafter referred to as: FPGA), etc.
  • ASIC: Application Specific Integrated Circuit
  • DSP: Digital Signal Processor
  • FPGA: Field Programmable Gate Array
  • these modules can be integrated together and implemented in the form of a System-On-a-Chip (hereinafter referred to as SOC).
  • SOC: System-On-a-Chip
  • the interface connection relationship among the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the above-mentioned electronic devices include corresponding hardware structures and/or software modules for performing each function.
  • the embodiments of the present application can be implemented in the form of hardware or a combination of hardware and computer software in combination with the example units and algorithm steps described in the embodiments disclosed herein. Whether a certain function is executed by hardware or computer software drives hardware depends on the specific application and design constraints of the technical solution. Professionals and technicians may use different methods to implement the described functions for each specific application, but such implementation should not be regarded as exceeding the scope of the embodiments of the present application.
  • the embodiment of the present application may divide the above-mentioned electronic equipment into functional modules according to the above-mentioned method examples.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. It should be noted that the division of modules in the embodiment of the present application is schematic, and is only a logical function division, and there may be other division methods in actual implementation.
  • Each functional unit in each embodiment of the embodiment of the present application may be integrated into one processing unit, or each unit may physically exist separately, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • the integrated unit is realized in the form of a software function unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the technical solution of the embodiment of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, server, or network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: flash memory, mobile hard disk, read-only memory, random access memory, magnetic disk or optical disk, and other various media capable of storing program codes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Quality & Reliability (AREA)
  • Electromagnetism (AREA)
  • Telephone Function (AREA)

Abstract

This application relates to the field of computer technology, and embodiments thereof provide a device control method, an electronic device and a storage medium. The method comprises: photographing an image of a second device; obtaining a device type of the second device, the device type of the second device being determined according to the image of the second device; matching the device type of the second device against a first device list and determining a plurality of target devices of the same device type as the second device, the first device list comprising a plurality of candidate devices, the plurality of candidate devices comprising the second device, and each candidate device having a preset device type; obtaining a signal strength between the first device and each target device, the signal strengths being used to determine the second device among the target devices; and displaying a control interface of the second device. A method described in an embodiment of this application can improve the control efficiency of a smart home appliance.
PCT/CN2022/106218 2021-07-29 2022-07-18 Procédé de commande de dispositif, dispositif électronique et support de stockage WO2023005706A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110861878.4 2021-07-29
CN202110861878.4A CN115701032A (zh) 2021-07-29 2021-07-29 设备控制方法、电子设备及存储介质

Publications (1)

Publication Number Publication Date
WO2023005706A1 true WO2023005706A1 (fr) 2023-02-02

Family

ID=85086261

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/106218 WO2023005706A1 (fr) 2021-07-29 2022-07-18 Procédé de commande de dispositif, dispositif électronique et support de stockage

Country Status (2)

Country Link
CN (1) CN115701032A (fr)
WO (1) WO2023005706A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117132036A (zh) * 2023-02-17 2023-11-28 荣耀终端有限公司 一种物料配送方法及配送系统

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160127209A1 (en) * 2014-11-01 2016-05-05 Samsung Electronics Co., Ltd. Method and system for generating a signal strength map
CN107682237A (zh) * 2017-09-14 2018-02-09 珠海格力电器股份有限公司 一种通过移动终端对家电进行控制的方法、移动终端及存储装置
CN108370492A (zh) * 2017-01-20 2018-08-03 华为技术有限公司 一种室内定位的方法和设备
CN109410943A (zh) * 2018-12-10 2019-03-01 珠海格力电器股份有限公司 设备的语音控制方法、系统和智能终端
CN110308660A (zh) * 2019-06-06 2019-10-08 美的集团股份有限公司 智能设备控制方法及装置
CN110908340A (zh) * 2018-09-14 2020-03-24 珠海格力电器股份有限公司 智能家居的控制方法及装置
CN112671620A (zh) * 2020-12-21 2021-04-16 珠海格力电器股份有限公司 一种设备控制方法、装置、存储介质及移动终端

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190171689A1 (en) * 2017-12-05 2019-06-06 Google Llc Optimizing item display on graphical user interfaces
CN110262264B (zh) * 2019-06-24 2021-06-22 珠海格力电器股份有限公司 简化用户操作的家居设备控制方法、装置及家居设备

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160127209A1 (en) * 2014-11-01 2016-05-05 Samsung Electronics Co., Ltd. Method and system for generating a signal strength map
CN108370492A (zh) * 2017-01-20 2018-08-03 华为技术有限公司 一种室内定位的方法和设备
CN107682237A (zh) * 2017-09-14 2018-02-09 珠海格力电器股份有限公司 一种通过移动终端对家电进行控制的方法、移动终端及存储装置
CN110908340A (zh) * 2018-09-14 2020-03-24 珠海格力电器股份有限公司 智能家居的控制方法及装置
CN109410943A (zh) * 2018-12-10 2019-03-01 珠海格力电器股份有限公司 设备的语音控制方法、系统和智能终端
CN110308660A (zh) * 2019-06-06 2019-10-08 美的集团股份有限公司 智能设备控制方法及装置
CN112671620A (zh) * 2020-12-21 2021-04-16 珠海格力电器股份有限公司 一种设备控制方法、装置、存储介质及移动终端

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117132036A (zh) * 2023-02-17 2023-11-28 荣耀终端有限公司 一种物料配送方法及配送系统

Also Published As

Publication number Publication date
CN115701032A (zh) 2023-02-07

Similar Documents

Publication Publication Date Title
CN112351322B (zh) 一种通过遥控器实现一碰投屏的终端设备、方法以及系统
CN111262975B (zh) 亮屏控制方法、电子设备、计算机可读存储介质和程序产品
CN110730114B (zh) 一种网络配置信息的配置方法及设备
CN112492193B (zh) 一种回调流的处理方法及设备
EP4250075A1 (fr) Procédé de partage de contenu, dispositif électronique et support de stockage
CN111835907A (zh) 一种跨电子设备转接服务的方法、设备以及系统
WO2022156555A1 (fr) Procédé de réglage de luminosité d'écran, appareil et dispositif terminal
CN114422340A (zh) 日志上报方法、电子设备及存储介质
CN114125789A (zh) 通信方法、终端设备及存储介质
US20230230343A1 (en) Image Processing Method, Electronic Device, Image Processing System, and Chip System
WO2023005706A1 (fr) Procédé de commande de dispositif, dispositif électronique et support de stockage
CN113676339B (zh) 组播方法、装置、终端设备及计算机可读存储介质
CN114554012A (zh) 来电接听方法、电子设备及存储介质
CN114521878A (zh) 睡眠评估方法、电子设备及存储介质
WO2020078267A1 (fr) Procédé et dispositif de traitement de données vocales dans un processus de traduction en ligne
WO2022135144A1 (fr) Procédé d'affichage auto-adaptatif, dispositif électronique et support de stockage
CN113467904B (zh) 确定协同模式的方法、装置、电子设备和可读存储介质
CN111885768B (zh) 调节光源的方法、电子设备和系统
WO2023005882A1 (fr) Procédé de photographie, procédé d'apprentissage de paramètre de photographie, dispositif électronique et support de stockage
WO2024055881A1 (fr) Procédé de synchronisation d'horloge, dispositif électronique, système, et support de stockage
WO2023020420A1 (fr) Procédé d'affichage de volume, dispositif électronique et support de stockage
WO2022105670A1 (fr) Procédé d'affichage et terminal
WO2024001735A1 (fr) Procédé de connexion de réseau, dispositif électronique et support de stockage
WO2024093614A1 (fr) Procédé et système d'entrée de dispositif, dispositif électronique et support de stockage
CN114691066A (zh) 一种应用的显示方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22848311

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE