CN114091006A - Device connection method and electronic device - Google Patents


Info

Publication number
CN114091006A
CN114091006A (application CN202010858224.1A)
Authority
CN
China
Prior art keywords
electronic device
user
facial
coordinate system
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010858224.1A
Other languages
Chinese (zh)
Inventor
熊刘冬
Current Assignee
Huawei Device Co Ltd
Original Assignee
Huawei Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Device Co Ltd filed Critical Huawei Device Co Ltd
Priority to CN202010858224.1A
Priority to PCT/CN2021/113948 (WO2022042460A1)
Publication of CN114091006A


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/44: Program or device authentication
    • G06F 21/31: User authentication
    • G06F 21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

A device connection method and an electronic device, relating to the field of communications technologies. The method includes: sending a positioning request to at least one second electronic device to determine first position information of the second electronic device in a coordinate system of the first electronic device; acquiring a facial image of a user and calculating the spatial coordinates of a facial preset point of the user in the coordinate system of the first electronic device; identifying facial orientation information of the user from the facial image; determining a first target device among the second electronic devices based on the facial orientation information of the user, the spatial coordinates of the facial preset point, and the first position information; and initiating a connection request to the first target device to establish a connection with the first target device. The connection method provided by the application makes it easier for the user to connect to the device to be controlled.

Description

Device connection method and electronic device
Technical Field
The present application relates to the field of communications technologies, and in particular, to a device connection method and an electronic device.
Background
Operations such as projecting a screen to a large-screen device (for example, a TV) or connecting a sound box to play music are cumbersome: the user needs to tap the screen-projection or connection control, wait for the device list to refresh, identify which entry is the target device, and tap that entry, and may still occasionally select the wrong device.
Disclosure of Invention
The present application provides a device connection method, an electronic device, and a computer-readable storage medium, so that a communication connection with a target device can be established according to the orientation of a user's face.
In a first aspect, the present application provides a device connection method, applied to a first electronic device, including:
sending a positioning request to at least one second electronic device to determine first position information of the second electronic device in a coordinate system of the first electronic device;
acquiring a facial image of a user, and calculating the spatial coordinates of a facial preset point of the user in the coordinate system of the first electronic device;
identifying facial orientation information of a user from the facial image;
determining a first target device in the second electronic device based on the face orientation information of the user, the spatial coordinates of the face preset point and the first position information;
and initiating a connection request to the first target device to establish connection with the first target device.
Further, sending a positioning request to at least one second electronic device to determine first position information of the second electronic device in a coordinate system of the first electronic device, includes:
sending a positioning request to at least one second electronic device to acquire an alignment angle and a distance between the first electronic device and the second electronic device;
based on the alignment angle and the distance, spatial coordinates of the second electronic device in a coordinate system of the first electronic device are determined.
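The claims do not fix an angle convention for the alignment angle. As one hedged illustration only, if the alignment angle is taken to be an azimuth/elevation pair (an assumption, not stated in the application), the conversion from angle and distance to spatial coordinates could be sketched as:

```python
import math

def locate_device(azimuth_deg, elevation_deg, distance_m):
    """Illustrative sketch: convert an assumed azimuth/elevation
    alignment angle and a measured distance into Cartesian
    coordinates in the first device's coordinate system.
    Convention assumed here: y points straight ahead of the first
    device, x to its right, z upward."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.sin(az)
    y = distance_m * math.cos(el) * math.cos(az)
    z = distance_m * math.sin(el)
    return (x, y, z)
```

Any consistent axis convention would work equally well; the function name and parameters are hypothetical, not taken from the application.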
Further, identifying facial orientation information of the user from the facial image includes:
the face image of the user is input into a preset recognition model to obtain the face orientation of the user in the coordinate system of the first electronic device.
Further, calculating spatial coordinates of the user's facial preset points, comprising:
acquiring a 3D model of a face image of a user;
determining a facial preset point of the user and the coordinates of the facial preset point in a coordinate system of the 3D model;
and converting the coordinates of the face preset points in the 3D model coordinate system into the space coordinates of the space coordinate system of the first electronic equipment.
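A common way to realize the coordinate conversion in the steps above is a rigid transform; the rotation and translation relating the 3D-model frame to the device frame would come from the camera calibration and face-fitting stage, which the application leaves unspecified. A minimal sketch, with hypothetical names:

```python
import numpy as np

def model_to_device(point_model, rotation, translation):
    """Map a facial preset point from the 3D-model coordinate system
    into the first device's coordinate system via a rigid transform:
    p_device = R @ p_model + t."""
    R = np.asarray(rotation, dtype=float)
    t = np.asarray(translation, dtype=float)
    return R @ np.asarray(point_model, dtype=float) + t
```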
Further, determining a first target device in the second electronic device based on the face orientation information of the user, the spatial coordinates of the facial preset point and the first position information, includes:
establishing an indication vector based on the face orientation information of the user and the spatial coordinates of the face preset points;
based on the first location information, a distance between the second electronic device and the indication vector is calculated to determine a first target device in the second electronic device.
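The distance test described above can be read as a point-to-ray distance computation: the indication vector is a ray from the facial preset point along the face orientation, and the candidate whose position lies closest to that ray is chosen. A sketch under that reading (function names are illustrative, not from the application):

```python
import numpy as np

def distance_to_ray(origin, direction, point):
    """Perpendicular distance from a device position `point` to the
    ray starting at `origin` (the facial preset point) along
    `direction` (the face orientation)."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    v = np.asarray(point, dtype=float) - np.asarray(origin, dtype=float)
    # Remove the component of v along the ray; what remains is the
    # perpendicular offset.
    return float(np.linalg.norm(v - np.dot(v, d) * d))

def pick_target(origin, direction, device_positions):
    """Choose the candidate device closest to the indication vector."""
    return min(device_positions,
               key=lambda p: distance_to_ray(origin, direction, p))
```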
Further, sending a positioning request to at least one second electronic device to determine first position information of the second electronic device in a coordinate system of the first electronic device, further comprising:
sending a positioning request to at least one second electronic device, so that the second electronic device determines second position information of the first electronic device in a coordinate system of the second electronic device according to the positioning request;
before determining the first target device in the second electronic device based on the face orientation information of the user, the spatial coordinates of the face preset point and the first position information, the method further includes:
receiving screen orientation information of the second electronic device, wherein the screen orientation information is determined by the second electronic device based on the second location information;
screening the second electronic equipment based on the screen orientation information;
alternatively,
receiving positioning information from the second electronic device, and judging whether a screen of the second electronic device faces the first electronic device or not based on the positioning information;
and screening the second electronic equipment based on the judgment result.
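Under one reading of the screening step above, a candidate is kept only if its screen normal points back toward the first electronic device within some angular tolerance. The threshold angle below is an assumed parameter, not given in the application:

```python
import math
import numpy as np

def screen_faces_first_device(screen_normal, device_to_first,
                              max_angle_deg=90.0):
    """Illustrative filter: True if the candidate's screen normal is
    within max_angle_deg of the vector from the candidate toward the
    first electronic device."""
    n = np.asarray(screen_normal, dtype=float)
    v = np.asarray(device_to_first, dtype=float)
    cos_a = np.dot(n, v) / (np.linalg.norm(n) * np.linalg.norm(v))
    angle = math.degrees(math.acos(float(np.clip(cos_a, -1.0, 1.0))))
    return angle <= max_angle_deg
```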
Further, before sending the positioning request to at least one second electronic device, the method further includes:
detecting a first instruction for connecting with a second electronic device;
in response to the detected first instruction, enabling a connection mode to send the positioning request.
Further, after initiating a connection request to the first target device to establish a connection with the first target device, the method further includes:
acquiring a detection range based on the indication vector and a preset angle;
detecting whether a second target device except the first target device is included in the detection range;
if a second target device except the first target device exists, detecting a second instruction for connecting with the second target device;
and responding to the second instruction, so that the first electronic equipment establishes connection with the second target equipment.
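The detection range built from the indication vector and a preset angle can be read as a cone test around the ray. This sketch assumes the preset angle is the cone's half-angle (an interpretation, not stated in the application):

```python
import math
import numpy as np

def in_detection_range(origin, direction, point, half_angle_deg):
    """True if `point` lies inside the cone whose apex is `origin`,
    whose axis is the indication vector `direction`, and whose
    half-angle is the preset angle."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    v = np.asarray(point, dtype=float) - np.asarray(origin, dtype=float)
    norm_v = np.linalg.norm(v)
    if norm_v == 0:
        return True  # the apex itself is trivially inside
    cos_a = np.dot(v, d) / norm_v
    angle = math.degrees(math.acos(float(np.clip(cos_a, -1.0, 1.0))))
    return angle <= half_angle_deg
```

A second target device would then be any candidate, other than the first target device, for which this test returns True.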
In a second aspect, the present application further provides an electronic device, including a processor and a storage device, where the storage device stores an application program, and when the application program is executed by the processor, the electronic device executes the following steps:
sending a positioning request to at least one second electronic device to determine first position information of the second electronic device in a coordinate system of the first electronic device;
acquiring a facial image of a user, and calculating the spatial coordinates of a facial preset point of the user in the coordinate system of the first electronic device;
identifying facial orientation information of a user from the facial image;
determining a first target device in the second electronic device based on the face orientation information of the user, the spatial coordinates of the face preset point and the first position information;
and initiating a connection request to the first target device to establish connection with the first target device.
Further, the application program, when executed by the processor, causes the electronic device to perform the step of sending a positioning request to at least one second electronic device for determining first position information of the second electronic device in a coordinate system of the first electronic device, comprising the steps of:
sending a positioning request to at least one second electronic device to acquire an alignment angle and a distance between the first electronic device and the second electronic device;
based on the alignment angle and the distance, spatial coordinates of the second electronic device in a coordinate system of the first electronic device are determined.
Further, the application program, when executed by the processor, causes the electronic device to perform the step of identifying facial orientation information of the user from the facial image, comprising the steps of:
the face image of the user is input into a preset recognition model to obtain the face orientation of the user in the coordinate system of the first electronic device.
Further, the application program, when executed by the processor, causes the electronic device to perform the step of calculating spatial coordinates of the user's facial preset points, comprising the steps of:
acquiring a 3D model of a face image of a user;
determining a facial preset point of the user and the coordinates of the facial preset point in a coordinate system of the 3D model;
and converting the coordinates of the face preset points in the 3D model coordinate system into the space coordinates of the space coordinate system of the first electronic equipment.
Further, the application program, when executed by the processor, causes the electronic device to perform the step of determining the first target device in the second electronic device based on the facial orientation information of the user, the spatial coordinates of the facial preset points and the first location information, comprising the steps of:
establishing an indication vector based on the face orientation information of the user and the spatial coordinates of the face preset points;
based on the first location information, a distance between the second electronic device and the indication vector is calculated to determine a first target device in the second electronic device.
Further, the application program, when executed by the processor, causes the electronic device to perform the step of sending a positioning request to at least one second electronic device for determining first position information of the second electronic device in a coordinate system of the first electronic device, further comprising the steps of:
sending a positioning request to at least one second electronic device, so that the second electronic device determines second position information of the first electronic device in a coordinate system of the second electronic device according to the positioning request;
when executed by the processor, the application program causes the electronic device to perform the following steps before the step of determining the first target device in the second electronic device based on the face orientation information of the user, the spatial coordinates of the facial preset point and the first position information:
receiving screen orientation information of the second electronic device, wherein the screen orientation information is determined by the second electronic device based on the second location information;
screening the second electronic equipment based on the screen orientation information;
alternatively,
receiving positioning information from the second electronic device, and judging whether a screen of the second electronic device faces the first electronic device or not based on the positioning information;
and screening the second electronic equipment based on the judgment result.
Further, the application program, when executed by the processor, causes the electronic device to perform the following steps before the step of sending the positioning request to the at least one second electronic device:
detecting a first instruction for connecting with a second electronic device;
in response to the detected first instruction, enabling a connection mode to send the positioning request.
Further, the application program, when executed by the processor, causes the electronic device to perform the following steps after the step of initiating a connection request to the first target device to establish a connection with the first target device:
acquiring a detection range based on the indication vector and a preset angle;
detecting whether a second target device except the first target device is included in the detection range;
if a second target device except the first target device exists, detecting a second instruction for connecting with the second target device;
and responding to the second instruction, so that the first electronic equipment establishes connection with the second target equipment.
In a third aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when run on a computer, causes the computer to perform the method according to the first aspect.
In a fourth aspect, the present application provides a computer program which, when executed by a computer, performs the method according to the first aspect.
In a possible design, the program in the fourth aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
Fig. 1 is a schematic structural diagram of a first electronic device according to an embodiment of the present application;
Fig. 2 is a schematic structural diagram of a second electronic device according to an embodiment of the present application;
Fig. 3 is a diagram of a bluetooth protocol framework provided by an embodiment of the present application;
Figs. 4A-4D are a set of schematic diagrams illustrating the measurement principle of the alignment angle according to an embodiment of the present application;
Fig. 5 is a flowchart of a device connection method according to an embodiment of the present application;
Fig. 6 is a flowchart of a device connection method according to an embodiment of the present application;
Fig. 7 is an exemplary diagram of an application scenario of a device connection method according to an embodiment of the present application;
Fig. 8 is an exemplary diagram of a screen orientation of a second electronic device according to an embodiment of the present application;
Fig. 9 is an exemplary diagram of a detection range according to an embodiment of the present application;
Fig. 10 is an exemplary diagram of another detection range according to an embodiment of the present application;
Fig. 11 is a flowchart of another device connection method according to an embodiment of the present application;
Fig. 12 is an exemplary diagram of an application scenario of the device connection method shown in Fig. 11 according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association between associated objects, indicating that three relationships are possible; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, "a plurality" means two or more unless otherwise specified.
Operations such as projecting a screen to a large-screen device (for example, a TV) or connecting a sound box to play music are cumbersome: the user needs to tap the screen-projection or connection control, wait for the device list to refresh, identify which entry is the target device, and tap that entry, and may still occasionally select the wrong device.
Fig. 1 illustrates the structure of a first electronic device 100.
the first electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the first electronic device 100. In other embodiments of the present application, the first electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the first electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the first electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the first electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the first electronic device 100, to transmit data between the first electronic device 100 and a peripheral device, or to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and does not limit the structure of the first electronic device 100. In other embodiments of the present application, the first electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the first electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the first electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the first electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the first electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the first electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the first electronic device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the first electronic device 100 can communicate with networks and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou satellite navigation system (BDS), a quasi-zenith satellite system (QZSS) and/or a Satellite Based Augmentation System (SBAS), and it should be noted that the number of antennas shown in the drawings is only exemplary, and more antennas may be provided, such as 3, 4 or more antennas.
The first electronic device 100 implements the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the first electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The first electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the first electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the first electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The first electronic device 100 may support one or more video codecs. In this way, the first electronic device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, it processes input information quickly and can also continuously learn by itself. The NPU may implement applications such as intelligent cognition of the first electronic device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the first electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, a phone book, etc.) created during the use of the first electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications and data processing of the first electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The first electronic device 100 can implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The first electronic device 100 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the first electronic device 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The first electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the first electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the first electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and so on.
The headphone interface 170D is used to connect a wired headphone. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensor 180A, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, and the like. A capacitive pressure sensor may include at least two parallel plates of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the first electronic device 100 determines the intensity of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the first electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A. The first electronic device 100 may also calculate the touch position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
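The intensity-to-instruction mapping described above can be sketched as follows; the threshold value, function name, and instruction strings are hypothetical, since the embodiment specifies no concrete numbers:

```python
# Hypothetical normalized touch intensity in [0, 1]; the patent gives no units.
FIRST_PRESSURE_THRESHOLD = 0.5

def sms_icon_action(touch_intensity: float) -> str:
    """Map the touch intensity on the short message (SMS) application icon
    to an operation instruction, mirroring the example in the text:
    a light press views messages, a firm press creates a new message."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"
    # Intensity greater than or equal to the first pressure threshold.
    return "new_sms"
```
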
The gyro sensor 180B may be used to determine the motion attitude of the first electronic device 100. In some embodiments, the angular velocity of the first electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the first electronic device 100, calculates a distance to be compensated for the lens module according to the shake angle, and allows the lens to counteract the shake of the first electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the first electronic device 100 calculates altitude, aiding positioning and navigation from the barometric pressure value measured by the barometric pressure sensor 180C.
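The embodiment does not specify how altitude is derived from the measured pressure; a common approach, shown here only as an illustrative assumption, is the international standard barometric formula:

```python
def altitude_from_pressure(p_hpa: float, p0_hpa: float = 1013.25) -> float:
    """Approximate altitude in meters from barometric pressure (hPa),
    using the international standard barometric formula with sea-level
    reference pressure p0_hpa. This is an illustrative model only; the
    patent does not state which conversion the device uses."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```

At the reference pressure the formula returns 0 m; a reading of about 900 hPa corresponds to roughly 1 km of altitude, which is the kind of value the device could feed into positioning and navigation.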
The magnetic sensor 180D includes a Hall sensor. The first electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the first electronic device 100 is a flip phone, the first electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flip-open according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the first electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the first electronic device 100 is stationary. The acceleration sensor may also be used to recognize the orientation of the electronic device, in applications such as landscape/portrait switching and pedometers.
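A minimal sketch of orientation recognition from the stationary gravity reading, under the assumption that the gravity components along the device's x and y axes are simply compared (the patent does not detail the algorithm; the function name and convention are illustrative):

```python
def orientation_from_gravity(ax: float, ay: float) -> str:
    """Infer portrait vs. landscape orientation from the gravity
    components (m/s^2) measured on the device's x and y axes while
    the device is roughly stationary. If gravity lies mostly along
    the y axis, the device is held upright (portrait); mostly along
    the x axis, it is held sideways (landscape)."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

A real implementation would add hysteresis so the screen does not flicker between orientations near the 45° boundary.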
A distance sensor 180F for measuring a distance. The first electronic device 100 may measure the distance by infrared or laser. In some embodiments, shooting a scene, the first electronic device 100 may utilize the distance sensor 180F to range to achieve fast focus.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The first electronic device 100 emits infrared light to the outside through the light emitting diode and uses the photodiode to detect infrared light reflected from a nearby object. When sufficient reflected light is detected, the first electronic device 100 may determine that there is an object nearby; when insufficient reflected light is detected, it may determine that there is no object nearby. The first electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the first electronic device 100 close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in holster mode or pocket mode to automatically unlock and lock the screen.
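The reflected-light decision described above amounts to a simple threshold test; the threshold, units, and function names below are illustrative assumptions, not values from the patent:

```python
def object_nearby(reflected_light: float, threshold: float) -> bool:
    """Decide whether an object is near the device from the measured
    reflected infrared intensity: enough reflected light means an
    object is near; too little means there is none."""
    return reflected_light >= threshold

def screen_should_turn_off(in_call: bool, reflected_light: float,
                           threshold: float) -> bool:
    """Turn the screen off only when the device is in a call AND an
    object (the user's ear) is detected close to the sensor."""
    return in_call and object_nearby(reflected_light, threshold)
```
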
The ambient light sensor 180L is used to sense the ambient light level. The first electronic device 100 may adaptively adjust the brightness of the display screen 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the first electronic device 100 is in a pocket, so as to prevent accidental touch.
The fingerprint sensor 180H is used to collect a fingerprint. The first electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the first electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the first electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the first electronic device 100 heats the battery 142 to avoid an abnormal shutdown caused by the low temperature. In other embodiments, when the temperature is lower than a further threshold, the first electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by the low temperature.
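The three-threshold temperature-processing strategy can be sketched as follows; all threshold values are hypothetical, as the patent names none:

```python
def thermal_policy(temp_c: float,
                   high: float = 45.0,
                   low: float = 0.0,
                   very_low: float = -10.0) -> list:
    """Sketch of the temperature-processing strategy described in the
    text: throttle the nearby processor above a high threshold, heat
    the battery below a low threshold, and additionally boost the
    battery output voltage below a still lower threshold."""
    actions = []
    if temp_c > high:
        actions.append("throttle_cpu")       # thermal protection
    if temp_c < low:
        actions.append("heat_battery")       # avoid cold shutdown
    if temp_c < very_low:
        actions.append("boost_battery_voltage")
    return actions
```
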
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the first electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a bone that vibrates as the human vocal part speaks. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in an earphone, integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the bone vibration signal acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The first electronic device 100 may receive a key input, and generate a key signal input related to user setting and function control of the first electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the first electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The first electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The first electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the first electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card may be embedded in the first electronic device 100 and cannot be separated from the first electronic device 100.
Fig. 2 schematically illustrates a structural diagram of a second electronic device 200 according to an embodiment of the present invention.
This embodiment is described in detail by taking the second electronic device 200 as an example. It should be understood that the second electronic device 200 shown in fig. 2 is only one example; the second electronic device 200 may have more or fewer components than shown in fig. 2, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
As shown in fig. 2, the second electronic device 200 may include: a processor 201, a memory 202, a wireless communication processing module 203, an antenna 204, a power switch 205, a wired LAN communication processing module 206, an HDMI communication processing module 207, a USB communication processing module 208, a display screen 209, and an audio module 210. Wherein: the processor 201 is operable to read and execute computer-readable instructions. In particular implementations, the processor 201 may mainly include a controller, an arithmetic unit, and registers. The controller is mainly responsible for instruction decoding and for issuing control signals for the operations corresponding to the instructions. The registers are mainly responsible for temporarily storing register operands, intermediate operation results, and the like during instruction execution. In a specific implementation, the hardware architecture of the processor 201 may be an application specific integrated circuit (ASIC) architecture, a MIPS architecture, an ARM architecture, an NP architecture, or the like.
In some embodiments, the processor 201 may be configured to parse a signal received by the wireless communication module 203 and/or the wired LAN communication processing module 206, such as a positioning request broadcast by the first electronic device 100, a request sent by the first electronic device 100 to establish a Bluetooth communication connection, and so on. The processor 201 may be used to perform corresponding processing operations according to the parsing result, such as responses to the first input and the second input of the user, and so on.
In some embodiments, the processor 201 may also be configured to generate signals, such as bluetooth broadcast signals, beacon signals, and signals for feeding back status information (e.g., standby, power-on, etc.), which are sent out by the wireless communication module 203 and/or the wired LAN communication processing module 206.
A memory 202 is coupled to the processor 201 for storing various software programs and/or sets of instructions. In particular implementations, memory 202 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. The memory 202 may store an operating system, such as an embedded operating system like uCOS, VxWorks, RTLinux, etc. The memory 202 may also store a communication program that may be used for the first electronic device 100, one or more servers, or accessory devices to communicate.
The wireless communication module 203 may include one or more of a Bluetooth communication module 203A, a WLAN communication module 203B, and an infrared communication module 203C. The Bluetooth communication module 203A may include a classic Bluetooth (BT) module and a Bluetooth Low Energy (BLE) module.
In some embodiments, one or more of the Bluetooth communication module 203A, the WLAN communication module 203B, and the infrared communication module 203C may listen for signals transmitted by other devices (e.g., the first electronic device 100), such as a positioning request signal, and send response signals, such as a positioning response, so that the other devices (e.g., the first electronic device 100) may discover the second electronic device 200, establish a wireless communication connection with it, and communicate with it through one or more wireless communication technologies among Bluetooth, WLAN, or infrared.
In other embodiments, one or more of the Bluetooth communication module 203A, the WLAN communication module 203B, and the infrared communication module 203C may also transmit signals, such as broadcast Bluetooth signals and beacon signals, so that other devices (e.g., the first electronic device 100) may discover the second electronic device 200, establish a wireless communication connection with it, and communicate with it through one or more wireless communication technologies in Bluetooth or WLAN.
The wireless communication module 203 may also include a cellular mobile communication module (not shown). The cellular mobile communication processing module may communicate with other devices, such as servers, via cellular mobile communication technology.
The wireless communication function of the second electronic device 200 may be realized by the antenna 204, the wireless communication module 203, the modem processor, and the like.
Antenna 204 may be used to transmit and receive electromagnetic wave signals. Each antenna in the second electronic device 200 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna of the WLAN communication module 203B may be multiplexed as the antenna of the bluetooth communication module 203A. In other embodiments, the antenna may be used in conjunction with a tuning switch.
In some embodiments, the Bluetooth communication module 203A may have one or more antennas; when the Bluetooth communication module 203A has 2 or more antennas, the first electronic device 100 can measure the angle of arrival (AoA) or angle of departure (AoD) of the Bluetooth signal transmitted by the second electronic device 200.
The power switch 205 may be used to control the power supply of the power source to the second electronic device 200.
The wired LAN communication processing module 206 is operable to communicate with other devices in the same LAN through a wired LAN, and is also operable to connect to a WAN through a wired LAN, and to communicate with devices in the WAN.
The HDMI communication processing module 207 can be used to communicate with other devices through an HDMI interface (not shown).
The display screen 209 may be used to display images, video, and the like. The display screen 209 may be a liquid crystal display (LCD), an organic light-emitting diode (OLED) display screen, an active-matrix organic light-emitting diode (AMOLED) display screen, a flexible light-emitting diode (FLED) display screen, a quantum dot light-emitting diode (QLED) display screen, or the like.
The audio module 210 is configured to output an audio signal through the audio output interface, so that the second electronic device 200 supports audio playback. The audio module may also be configured to receive audio data via the audio input interface. The second electronic device 200 may be a media playing device such as a television.
In some embodiments, the second electronic device 200 may also include a serial interface such as an RS-232 interface. The serial interface can be connected to other devices, such as audio play-out devices like a sound box, so that the display and the audio play-out devices can cooperatively play audio and video.
It is to be understood that the structure illustrated in fig. 2 does not constitute a specific limitation of the second electronic device 200. In other embodiments of the invention, the second electronic device 200 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Fig. 3 is a Bluetooth protocol framework diagram according to an embodiment of the present invention, which includes, but is not limited to, a Host protocol stack, an HCI (Host Controller Interface), and a Controller. For the Bluetooth protocol framework used by the first electronic device and the second electronic device in the method according to the embodiment of the present invention, reference may be made to this part of the content. The Host protocol stack defines a plurality of applications (profiles) and core protocols (protocols) in the Bluetooth framework; each profile defines its corresponding message format and application rules, and a profile is a Bluetooth service (application). To implement the interconnection and interworking of different devices on different platforms, the Bluetooth protocol provides specifications for various possible and generally meaningful application scenarios, such as A2DP (advanced audio distribution profile) and HFP (hands-free profile). The core protocols include, but are not limited to, the basic Bluetooth Service Discovery Protocol (SDP), the Logical Link Control and Adaptation Protocol (L2CAP), and the like. The core protocols are essential in the Bluetooth protocol stack.
The HCI provides the upper-layer protocols with a unified interface into the link manager and a unified mode of entering the baseband. Several transport layers exist between the host core protocol stack and the controller; these transport layers are transparent and carry out the task of data transmission. The Bluetooth Special Interest Group (SIG) specifies four physical bus modes for connecting hardware, i.e., four HCI transport layers: USB, RS232, UART, and PC Card.
The controller defines the underlying hardware, including radio frequency (RF), baseband (BB), and link management (LM). The RF layer implements the filtering and transmission of the data bit stream via microwaves in the license-free 2.4 GHz ISM band, and mainly defines the conditions that a Bluetooth transceiver must meet to operate normally in this band. The baseband is responsible for frequency hopping and for the transmission of Bluetooth data and information frames. Link management is responsible for establishing and tearing down links and for security control. The LM (link manager) layer is the link-management-layer protocol of the Bluetooth protocol stack; it translates upper-layer HCI commands into operations acceptable to the baseband, establishes asynchronous connection-less (ACL) links and synchronous connection-oriented (SCO) links, and puts the Bluetooth device into power-saving operating modes. The LC (link control) layer is responsible for responding to upper-layer LM commands (e.g., LM commands that establish a transport link for packets or maintain a link) during the transmission of a batch of packets.
The method according to the embodiment of the present invention is implemented by the wireless communication module 160 of the first electronic device 100 shown in fig. 1, and may specifically be executed by a bluetooth module or a bluetooth chip.
Fig. 4A to 4D are schematic diagrams of a Bluetooth direction finding function according to an embodiment of the present invention, and particularly relate to the principle of measuring and calculating the angle of arrival (AoA) and the angle of departure (AoD) of a BLE signal, which are used to calculate the alignment angle between the first electronic device and the second electronic device.
The following describes a principle of measuring an angle of arrival (AoA) of a BLE signal according to an embodiment of the present invention.
Fig. 4A-4B are schematic diagrams illustrating calculation of the Bluetooth angle of arrival (AoA) according to an embodiment of the present invention. As shown in fig. 4A, a Transmitter 410 may transmit an AoA broadcast packet (a constant tone extension (CTE) packet) through BLE; the AoA CTE packet is a continuously modulated sequence that carries no check bits and is not covered by a cyclic redundancy check (CRC) or a message integrity code (MIC). The Host of the Receiver 420 issues, through an HCI command, the antenna sequence information used in the measurement. After the measurement is started, the controller 421 switches to the designated antennas in turn according to the antenna sequence information issued by the Host to perform I&Q sampling, and reports the sampling information to the Host through the HCI command; the Host then calculates the relative angle.
The transmitting side 410 may transmit the AoA CTE packet through a single antenna under control of the controller 411. The receiving side 420 may control the RF switch 422 through the controller 421 and switch to the designated antennas in a certain order for reception; the receiving side 420 has at least two antennas 423. In this configuration, the receiver 420 measures the angle of arrival (AoA) of the BLE signal from the transmitter 410. In some embodiments, to improve the accuracy of the arrival-angle estimate, more antennas may be provided, for example three, and the array may be arranged in a straight line, a ring, a sphere, and the like, which is not limited herein.
As shown in fig. 4B, assume the receiving side 420 has two antennas (antenna A and antenna B) spaced a distance d apart, and the received BLE signal is a planar electromagnetic wave. Because antenna A and antenna B receive the BLE signal at slightly different times, the receiver 420 may compare the phases of the BLE signal as received by antenna A and antenna B and determine the phase difference between them.
The angle of arrival (AoA) of the BLE signal can then be calculated by the following formula (1):
θ = arccos(ψλ / (2πd))    (1)
where θ is the angle of arrival (AoA) of the BLE signal, ψ is the phase difference between the BLE signals received by antenna A and antenna B, λ is the wavelength of the BLE signal, and d is the distance between antenna A and antenna B. The angle θ may be taken as the alignment angle between the first electronic device and the second electronic device; that is, the alignment angle between the first electronic device and the second electronic device may be calculated by formula (1).
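As an illustrative sketch (not part of the claimed method), formula (1) can be evaluated as follows in Python; the function name, the example wavelength (about 0.125 m at 2.4 GHz) and the half-wavelength antenna spacing are assumptions chosen for the example:

```python
import math

def angle_of_arrival(psi, wavelength, d):
    """Angle of arrival theta (radians) per formula (1):
    theta = arccos(psi * wavelength / (2 * pi * d)),
    where psi is the measured phase difference between antenna A and antenna B."""
    x = psi * wavelength / (2 * math.pi * d)
    x = max(-1.0, min(1.0, x))  # clamp against measurement noise
    return math.acos(x)

# Example: 2.4 GHz BLE (wavelength ~0.125 m), antennas half a wavelength apart.
# A zero phase difference corresponds to a signal arriving broadside (pi/2 rad).
theta = angle_of_arrival(psi=0.0, wavelength=0.125, d=0.0625)
```

The same computation applies to formula (2) for the angle of departure; only the roles of transmitter and receiver differ.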
In some possible embodiments, the transmitting party 410 may be the second electronic device 200 and the receiving party 420 may be the first electronic device 100. The first electronic device 100 has at least two antennas for transmitting/receiving BLE signals, and the second electronic device 200 has at least one antenna for transmitting/receiving BLE signals. The first electronic device 100 may determine the alignment angle of the first electronic device 100 relative to the second electronic device 200 by the AoA principle for measuring BLE signals shown in fig. 4A and 4B described above.
In some possible embodiments, the transmitting party 410 may be the first electronic device 100, the receiving party 420 may be the second electronic device 200, and the first electronic device 100 has at least one antenna for transmitting/receiving BLE signals and the second electronic device 200 has at least two antennas for transmitting/receiving BLE signals. The second electronic device 200 may determine the angle of alignment of the first electronic device 100 relative to the second electronic device 200 by the AoA principle described above for measuring BLE signals shown in figures 4A and 4B.
The principle of measuring the angle of departure (AoD) of the BLE signal according to the embodiment of the present invention is described below.
Fig. 4C-4D are schematic diagrams illustrating calculation of the Bluetooth angle of departure (AoD) according to an embodiment of the present invention. As shown in fig. 4D, a Transmitter 430 may transmit an AoD broadcast packet (constant tone extension (CTE) packet) over BLE; the AoD CTE packet is a continuously modulated sequence that carries no Cyclic Redundancy Check (CRC) and no Message Integrity Code (MIC). The Host of the transmitting side 430 issues the antenna sequence information used in the measurement through an HCI command. After the measurement starts, the controller 441 of the receiving party 440 performs I/Q sampling, through a single antenna, of the BLE signals sent from each antenna of the transmitting party 430 according to the antenna sequence information issued by the Host of the transmitting party 430, reports the samples to the Host through an HCI command, and the Host calculates the relative angle.
The receiver 440 may control a single antenna 442 through the controller 441 to receive the AoD CTE packet transmitted by the transmitter 430. The transmitting side 430 may control the RF switch 432 via the controller 431 to switch to the designated antennas in a certain order to transmit the AoD CTE packet; the transmitting party 430 has at least two antennas 433. In this configuration, the angle of departure (AoD) of the BLE signal of the transmitting party 430 is measured by the receiving party 440. In some embodiments, to improve the accuracy of the departure-angle estimate, more antennas may be provided, and the array may be arranged in a straight line, a ring, a sphere, and the like, which is not limited herein.
As shown in fig. 4C, assume the transmitting party 430 has two antennas (antenna A and antenna B) spaced a distance d apart, and the transmitted BLE signal is a planar electromagnetic wave. Because antenna A and antenna B transmit the BLE signal at slightly different times, the receiving party 440 may compare the phases of the BLE signals it receives from antenna A and antenna B of the transmitting party 430 and determine the phase difference between them.
The angle of departure (AoD) of the BLE signal can then be calculated by the following formula (2):
θ = arccos(ψλ / (2πd))    (2)
where θ is the angle of departure (AoD) of the BLE signal, ψ is the phase difference between the BLE signals transmitted by antenna A and antenna B, λ is the wavelength of the BLE signal, and d is the distance between antenna A and antenna B. The angle θ may be taken as the alignment angle between the first electronic device and the second electronic device; that is, the alignment angle between the first electronic device and the second electronic device may be calculated by formula (2).
In some possible embodiments, the transmitting party 430 may be the second electronic device 200 and the receiving party 440 may be the first electronic device 100. The first electronic device 100 has at least two antennas for transmitting/receiving BLE signals, and the second electronic device 200 has at least one antenna for transmitting/receiving BLE signals. The first electronic device 100 may determine the alignment angle of the first electronic device 100 relative to the second electronic device 200 by the AoD principle of measuring BLE signals as illustrated in figures 4C and 4D described above.
In some possible embodiments, the transmitting party 430 may be the first electronic device 100, and the receiving party 440 may be the second electronic device 200, where the first electronic device 100 has at least one antenna for transmitting/receiving BLE signals, and the second electronic device 200 has at least two antennas for transmitting/receiving BLE signals. The second electronic device 200 may determine the alignment angle of the first electronic device 100 relative to the second electronic device 200 by the AoD principle of measuring BLE signals as illustrated in figures 4C and 4D described above.
Fig. 5 is a flowchart of a device connection method, which can be applied to the first electronic device 100, according to an embodiment of the present invention. The device connection method comprises the following steps:
Step 502, sending a positioning request to at least one second electronic device to determine first position information of the second electronic device in a coordinate system of the first electronic device.
The first location information includes coordinates of the second electronic device in a coordinate system of the first electronic device.
Step 504, acquiring a facial image of a user, and calculating a spatial coordinate of the facial preset point of the user in a coordinate system of the first electronic device;
step 506, identifying face bearing information of the user from the face image, specifically, the face bearing information includes a face orientation of the user.
Step 508, determining a first target device in the second electronic device based on the face orientation information of the user, the spatial coordinates of the face preset point and the first position information;
step 510, initiating a connection request to the first target device to establish a connection with the first target device.
The device connection method provided by the application confirms which device the user is looking at by placing the user's face position, face orientation and the connectable peripheral devices in a single coordinate system, so that the connection can then be made conveniently.
Fig. 6 is a flowchart of a device connection method, which can be applied to the first electronic device 100, according to an embodiment of the present invention. The device connection method comprises the following steps:
step 602, a first electronic device detects a first instruction for connecting with a second electronic device.
The second electronic device includes, but is not limited to, a smartphone with a Bluetooth module, a Bluetooth speaker, a screen projection device, and the like; specifically, these devices correspond to the second electronic device 200 described above and include a Bluetooth module supporting Bluetooth 5.1 or later.
The second electronic device comprises a target device, and the target device is a device to be connected or controlled by a user. For example, the user wants to control a screen projection device, which is a target device, through the first electronic device 100.
For example, refer to fig. 7, which is an exemplary diagram of an application scenario of the device connection method according to an embodiment of the present application. When the user is facing TV 1/speaker 1, the user most likely wants to connect to or control TV 1/speaker 1, so TV 1/speaker 1 is the target device.
The first instruction is used for turning on the bluetooth module of the first electronic device 100, and specifically, the first instruction may include an instruction input by a user, and an input manner of the instruction may include screen input, voice input, and the like.
Step 604, the first electronic device, in response to the first instruction, opens a connection mode to send the positioning request.
When the first instruction is detected, the first electronic device 100 sends a positioning request to at least one second electronic device in response to the first instruction.
In one possible implementation, the first electronic device 100 may send a positioning request to the second electronic device 200 by broadcasting periodically (for example, at a period of 1 second). Or the first electronic device 100 sends a positioning request after detecting the first instruction.
Step 606, the second electronic device sends data for positioning to the first electronic device.
The data may include a CTE data packet, data for calculating a distance between the first electronic device and the second electronic device, data for indicating whether the second electronic device is oriented toward the first electronic device, an alignment angle between the second electronic device and the first electronic device, and so on. In particular, the type of data is associated with the positioning request described above.
Step 608, determining first position information of the second electronic device in a coordinate system of the first electronic device.
In one embodiment, the data for positioning is a CTE data packet sent by the second electronic device to the first electronic device.
Specifically, in the embodiment of the present invention, the second electronic device 200 responds to the positioning request of the first electronic device 100 by sending it a specially tailored direction-finding signal: the second electronic device 200 sends a CTE packet to the first electronic device 100, which the first electronic device 100 uses to calculate the alignment angle of the second electronic device 200 with respect to the first electronic device 100.
In some embodiments of the present invention, under the BLE-signal AoA measurement principle illustrated in figures 4A-4B above, the first electronic device 100 may be the receiver of the BLE signal and the second electronic device 200 the sender. After the first electronic device 100 sends the positioning request and it is confirmed by the second electronic device 200, the second electronic device 200 may send a CTE broadcast packet to the first electronic device 100; the first electronic device 100 may switch to the designated antennas in a certain order to receive the CTE broadcast packet and calculate the AoA of the BLE signal transmitted by the second electronic device 200 according to formula (1) above, that is, the alignment angle of the second electronic device 200 with respect to the first electronic device 100.
In some embodiments of the present invention, under the BLE-signal AoD measurement principle illustrated in figures 4C-4D above, the first electronic device 100 may be the receiver of the BLE signal and the second electronic device 200 the sender. After the first electronic device 100 sends the positioning request and it is confirmed by the second electronic device 200, the second electronic device 200 may switch among the designated antennas in a certain order to send the CTE broadcast packet to the first electronic device 100; the first electronic device 100 may receive the CTE broadcast packet and calculate the AoD of the BLE signal transmitted by the second electronic device 200 according to formula (2) above, that is, the alignment angle of the second electronic device 200 with respect to the first electronic device 100.
In one embodiment, the data for positioning may be directly the alignment angle of the second electronic device 200 with respect to the first electronic device 100.
Specifically, under the BLE-signal AoA measurement principle shown in fig. 4A-4B above, the first electronic device 100 may be the sender of the BLE signal and the second electronic device 200 the receiver. After the first electronic device 100 sends the positioning request and it is confirmed by the second electronic device 200, the first electronic device 100 may send the CTE broadcast packet to the second electronic device 200; the second electronic device 200 may switch to the designated antennas in a certain order to receive the CTE broadcast packet and calculate the AoA of the BLE signal transmitted by the first electronic device 100 according to formula (1) above, that is, the alignment angle of the first electronic device 100 with respect to the second electronic device 200. The alignment angle is then transmitted to the first electronic device 100 by the second electronic device 200.
In some embodiments of the present invention, under the BLE-signal AoD measurement principle illustrated in fig. 4C-4D above, the first electronic device 100 may be the sender of the BLE signal and the second electronic device 200 the receiver. After the first electronic device 100 sends the positioning request and it is confirmed by the second electronic device 200, the first electronic device 100 may switch among the designated antennas in a certain order to send the CTE broadcast packet to the second electronic device 200; the second electronic device 200 may receive the CTE broadcast packet and calculate the AoD of the BLE signal transmitted by the first electronic device 100 according to formula (2) above, that is, the alignment angle of the first electronic device 100 with respect to the second electronic device 200. The alignment angle is then transmitted to the first electronic device 100 by the second electronic device 200.
In another embodiment of the present invention, the data sent by the second electronic device to the first electronic device in step 606 includes other data used for ranging, i.e. for determining the distance between the first electronic device and the second electronic device; for example, the data packet includes the RSSI. Specifically, the RSSI is determined by the second electronic device and then sent to the first electronic device. The first electronic device calculates the distance according to the formula d = 10^((|RSSI| − A)/(10 × n)), where d is the calculated distance (unit: m), RSSI is the received signal strength, A is the received signal strength when the transmitting end and the receiving end are 1 meter apart, and n is an environmental attenuation factor. It should be noted that the first electronic device and the second electronic device may also calculate the distance in other manners; the method illustrated in this embodiment does not limit the manner of calculating the distance.
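A minimal sketch of the ranging formula above; the calibration constant A (the RSSI magnitude at 1 m) and the attenuation factor n used as defaults here are placeholder values that would have to be measured for a real environment:

```python
def rssi_distance(rssi, a=45.0, n=2.0):
    """Estimated distance d (meters) per d = 10^((|RSSI| - A) / (10 * n)).
    a is |RSSI| at 1 m separation and n is the environmental attenuation
    factor; both defaults are assumed calibration values."""
    return 10 ** ((abs(rssi) - a) / (10 * n))

# With the assumed calibration, an RSSI of -45 dBm maps to 1 m
# and -65 dBm maps to 10 m.
```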
In one embodiment, the alignment angle and the distance between the first electronic device 100 and the second electronic device 200 may be calculated by the first electronic device 100, or calculated by the second electronic device 200 and then transmitted to the first electronic device 100.
In one embodiment, based on the distance information and the above-described alignment angle, coordinates of the second electronic device 200 in a spatial coordinate system of the first electronic device 100 may be determined.
In one embodiment, the first position information includes coordinates of the second electronic device 200 in a spatial coordinate system of the first electronic device 100.
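The step of turning an alignment angle plus a distance into coordinates can be sketched as follows; this is a planar (2-D) simplification, and the convention that the angle is measured from the first device's x-axis is an assumption of the sketch:

```python
import math

def device_coordinates(distance, alignment_angle):
    """Place the second device in the first device's coordinate system
    from a measured distance (m) and alignment angle (radians).
    2-D sketch; the axis convention is an assumption."""
    x = distance * math.cos(alignment_angle)
    y = distance * math.sin(alignment_angle)
    return (x, y)
```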
Step 610, acquiring a facial image of the user and calculating the spatial coordinates of the user's facial preset point in the coordinate system of the first electronic device; this step may be performed simultaneously with sending the positioning request in step 604.
Specifically, the facial image of the user is captured by a 3D camera; 3D cameras include ToF, binocular, structured-light and similar cameras, all of which can locate the face coordinates in the coordinate system of the first electronic device.
In one example, the 3D camera performs 3D modeling of the human face when taking a picture and identifies the 3D-model coordinates of the face. Taking the center of each eye as representative of the face position, the coordinates of each eye's center point in the 3D model can be obtained directly; by converting between the 3D-model coordinate system and the coordinate system of the first electronic device, the coordinates of the face position in the coordinate system of the first electronic device can be obtained.
Specifically, taking a structured-light 3D camera as an example, the camera includes an infrared projection module, a color camera, and a processor dedicated to calculating depth. Structured light, a special kind of light source, is generally classified into three types: discrete spots, stripes, and coded structured light. In operation, a specifically coded spot pattern is projected from the infrared projection module onto the user's face; the face reflects the infrared light, the module receives the reflected light and forms an image, and from the distortion of the spot pattern the distance of each point of the user's face from the camera plane is calculated by the depth processor, yielding the depth of the user's face. The image of the user's face is then modeled in 3D to form a 3D model. The facial preset point of the user is determined in the coordinate system of the 3D model, and the coordinates of the facial preset point in the 3D-model coordinate system are converted into coordinates in the spatial coordinate system of the first electronic device.
This conversion step is performed only when the coordinate system of the 3D camera differs from that of the first electronic device; if the two coordinate systems are the same, no conversion is required.
Step 612, identifying facial orientation information of the user from the facial image.
The face orientation information specifically includes a face orientation. This step may be performed simultaneously when the positioning request is sent in step 604.
In one embodiment, the acquired facial image of the user may be input into a preset image recognition model, based on which the facial orientation of the user is determined; the facial orientation is expressed in the coordinate system of the first electronic device. It should be noted that this step may further include recognizing the user's face to determine whether the user is the actual operator. For example, by creating an image database of operators, it can be detected whether the current operator is a person recorded in the image database; if not, the current operator is not a user of the first electronic device, and step 612 and the following steps need not be executed.
Step 614, receiving screen orientation information of the second electronic device, where the screen orientation information is determined by the second electronic device based on the second position information, and screening the second electronic devices based on the screen orientation information.
Specifically, the second position information includes an alignment angle between the first electronic device and the second electronic device, and specifically, how to determine the alignment angle is shown above, which is not described herein again. The second electronic device may also calculate the coordinates of the first electronic device in the second electronic device coordinate system according to the above-described principle of calculating the alignment angle.
In one embodiment, the step 606 of sending the data for positioning to the first electronic device by the second electronic device may further include screen orientation information of the second electronic device, and the screen orientation information may be determined by the second electronic device and then sent to the first electronic device.
In one embodiment, the second electronic device determines the screen orientation information of the second electronic device according to the orientation of a preset coordinate axis in a spatial coordinate system of the second electronic device under the condition that the alignment angle between the first electronic device and the second electronic device is known.
Fig. 8 is an exemplary diagram of determining the screen orientation of a second electronic device in an embodiment of the device connection method of the present application.
In the spatial coordinate system of the second electronic device 200, the plane of the device screen defines the X/Y axes, and the Z axis points out of the front of the screen. Knowing the alignment angle and the distance, the second electronic device can locate the position coordinates of the first electronic device in this coordinate system and determine whether its screen faces the first electronic device by checking whether the Z component of those coordinates is greater than 0.
Specifically, if the Z component is greater than zero, the screen faces the first electronic device; if it is less than or equal to 0, the screen of the second electronic device does not face the first electronic device. In fig. 7, the Z component of the first electronic device's position in TV1's coordinate system is greater than 0, while in TV2's coordinate system it is less than 0; i.e., TV1 faces the first electronic device and TV2 does not. Since TV2 does not face the first electronic device, TV2 can be marked directly as a non-target device, reducing the subsequent processing load.
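The Z-component test described here amounts to a one-line check; the tuple layout of the position argument is an assumption of this sketch:

```python
def screen_faces_device(first_device_pos):
    """True if the first electronic device lies in front of this screen,
    i.e. the z component of its position in the screen's coordinate system
    (z axis pointing out of the front of the screen) is greater than 0."""
    _, _, z = first_device_pos
    return z > 0
```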
In this embodiment, the second electronic device may directly send the first electronic device a conclusion as to whether its screen faces the first electronic device, so that the first electronic device knows the screen orientation of the second electronic device directly and can filter second electronic devices based on that result. For example, a second electronic device whose screen does not face the first electronic device is unlikely to be a device the user wants to control; it may be deleted or shielded, and no communication connection is sent to it, further improving recognition efficiency. It should be noted that how the second electronic devices are filtered can also be set according to the user's needs.
In one embodiment, the second electronic device may send the component magnitude of the Z-axis to the first electronic device, and then the first electronic device determines whether the screen of the second electronic device faces itself. Specifically, the second electronic device sends positioning information to the first electronic device, and the first electronic device receives the positioning information (Z-axis component size) to determine whether the screen of the second electronic device faces itself, so as to screen the second electronic device.
Step 616, determining a first target device among the second electronic devices based on the facial orientation information of the user, the spatial coordinates of the facial preset point, and the first position information.
In one embodiment, an indication vector is established based on the user's facial orientation information and the spatial coordinates of the facial preset point. The distance between each second electronic device and the indication vector is calculated, and the second electronic device with the smallest distance to the indication vector is determined as the first target device.
Specifically, the first target device is a second electronic device that a user wants to control or connect based on the first electronic device.
Specifically, the first electronic device 100 may establish an indication vector along the face orientation based on the obtained spatial coordinates of the facial preset points of the user, where the spatial coordinates of the facial preset points are located on the indication vector. In particular, it has been shown above that the first electronic device 100 may acquire the spatial coordinates of the facial preset points through the 3D camera described above.
Knowing the spatial coordinates of the second electronic devices and the indication vector in the coordinate system of the first electronic device, the distance between each second electronic device and the indication vector can be calculated, and the first electronic device can designate the second electronic device with the smallest distance as the first target device. Referring to fig. 8, the currently established indication vector L points towards TV/speaker 1 and is closer to TV/speaker 1 than to TV/speaker 2, so TV/speaker 1 is the target device.
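The nearest-to-vector selection can be sketched as a point-to-ray distance followed by taking the minimum; the dictionary shape of the device list and the clamping of points behind the user are assumptions of this sketch:

```python
import math

def distance_to_ray(point, origin, direction):
    """Perpendicular distance from a device position to the ray that starts
    at the facial preset point (origin) and runs along the indication
    vector (direction)."""
    v = [p - o for p, o in zip(point, origin)]
    norm = math.sqrt(sum(c * c for c in direction))
    u = [c / norm for c in direction]                # unit indication vector
    t = max(0.0, sum(a * b for a, b in zip(v, u)))   # clamp points behind the face
    closest = [o + t * c for o, c in zip(origin, u)]
    return math.dist(point, closest)

def pick_target(devices, origin, direction):
    """Return the id of the device whose coordinates lie closest to the ray."""
    return min(devices, key=lambda k: distance_to_ray(devices[k], origin, direction))
```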
Step 618, initiating a request for establishing a communication connection to the target device.
With the target device known, the first electronic device may send a communication connection request directly to it in order to control or communicate with the target device.
It should be noted that the method disclosed herein is executed on devices supporting the Bluetooth 5.1 or later protocol, but it may also be executed over other communication networks that support spatial positioning.
In one embodiment, the method disclosed herein may further comprise the steps of:
A detection range is obtained based on the indication vector and a preset angle; whether a second target device other than the first target device is included in the detection range is detected, and if such a second target device exists, the first target device is switched to the second target device.
The purpose of this step is to determine, after establishing a communication connection with the first target device, whether the connection was made to the wrong device. The detection range represents a range around the user's orientation within which the electronic device the user really wants to control is likely to be found. For example, if after the first electronic device connects to the target device the user finds it is not the device they wanted, the first electronic device may display on screen all second electronic devices within the user's detection range; because the second electronic devices in the current detection range have already been screened, it is easier for the user to select one.
Fig. 9 and fig. 10 illustrate the detection range established by the first electronic device based on the indication vector and the spatial coordinates of the user's facial preset point.
In one embodiment, as shown in fig. 9, the detection range may be established by taking the spatial coordinates of the facial preset point as the vertex and the indication vector L as the axis, opened to a required angle a. For example, with angle a equal to 45 degrees, a cone is formed with the spatial coordinates of the facial preset point as its vertex, and the devices falling within the cone may include the target device the user wants to control. Through this step, the devices the user may want to control can be screened once, and the screening result displayed so that the user can correct the connection, switch to another device, or close the connection.
In another embodiment, as shown in fig. 10, a circle may be drawn with the spatial coordinates of the user's facial preset point as the center and a set distance as the radius r, and then projected along the direction of the indication vector to form the cylinder shown in the figure. For example, with radius r equal to 0.5 m, the devices falling inside the cylinder include the target device the user wants to control. Through this step, the devices the user may want to control can be screened once, and the screening result displayed so that the user can correct the connection, switch to another device, or close the connection.
The spatial range covered by the cylinder or the cone is the detection range.
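Membership in the two detection ranges can be sketched as follows; the interpretation of angle a as a half-angle and the vector layouts are assumptions of this sketch:

```python
import math

def in_cone(point, apex, axis, half_angle):
    """True if point lies inside the cone whose apex is the facial preset
    point, whose axis is the indication vector, and whose half-angle is
    given in radians (per fig. 9)."""
    v = [p - a for p, a in zip(point, apex)]
    vn = math.sqrt(sum(c * c for c in v))
    an = math.sqrt(sum(c * c for c in axis))
    if vn == 0.0:
        return True  # the apex itself counts as inside
    cos_angle = sum(x * y for x, y in zip(v, axis)) / (vn * an)
    return cos_angle >= math.cos(half_angle)

def in_cylinder(point, center, axis, radius):
    """True if point lies inside the cylinder of the given radius whose
    axis runs from the facial preset point along the indication vector
    (per fig. 10)."""
    v = [p - c for p, c in zip(point, center)]
    an = math.sqrt(sum(c * c for c in axis))
    u = [c / an for c in axis]
    t = sum(a * b for a, b in zip(v, u))
    if t < 0.0:
        return False  # behind the user
    closest = [c + t * x for c, x in zip(center, u)]
    return math.dist(point, closest) <= radius
```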
The present application further provides another method for connecting devices, referring to fig. 11, and in particular to a flowchart of the method for connecting devices, where the method is applicable to a first electronic device, and includes:
Step 1102, receiving a positioning request and a facial image of a user from a second electronic device.
Specifically, a user who wants to connect the second electronic device 200 with the first electronic device 100 may trigger sending of a positioning request, which may include an assistance request asking the first electronic device 100 to confirm whether the user is facing it. Specifically, the second electronic device may send a picture of the user's face to the first electronic device 100 for use in subsequent steps to determine whether the user is facing the first electronic device.
Step 1104, identifying the user orientation.
Determining the user orientation may include tracking the user's eyeballs and determining the orientation based on the tracking result; eyeball tracking is a common technique in the art and is not described here.
In another embodiment, an image of the user's face is acquired and input into an existing recognition model to determine the user's face orientation.
In a possible implementation, the first electronic device may capture an image of the user in real time after receiving the positioning request and determine through recognition whether the user's face is oriented toward it. Specifically, the first electronic device may analyze whether the user's facial image is complete; for example, when the user faces away from the first electronic device 100 or faces it sideways, it may be determined that the user is not facing the device.
Step 1106: if the user faces the first electronic device, send feedback information to the second electronic device, where the feedback information is used to instruct the second electronic device to initiate a connection establishment request to the first electronic device.
When the first electronic device detects that the user is facing it, it can determine that the user intends to control it, and may send feedback information to the second electronic device so that the second electronic device initiates a communication connection request to the first electronic device.
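Steps 1102 to 1106 on the first device's side can be sketched as a single request handler. The transport `send(peer, message)`, the message fields, and the `faces_me` predicate are illustrative assumptions, not names from the patent:

```python
def handle_positioning_request(request, send, faces_me):
    """First device's side of the flow: step 1102 receives the positioning
    request carrying the user's facial image; step 1104 checks whether the
    user faces this device; step 1106 sends feedback so the requester
    initiates connection establishment."""
    face_image = request["face_image"]        # step 1102: request carries the image
    if faces_me(face_image):                  # step 1104: orientation check
        # step 1106: feedback instructing the requester to connect
        send(request["sender"], {"type": "feedback", "initiate_connection": True})
        return True
    return False
```

If the user is not facing the device, no feedback is sent and the second electronic device does not initiate a connection to it.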
Referring to fig. 12, which illustrates an example of a scenario in which the method shown in fig. 11 is performed.
Upon receiving the positioning request and the facial image of the user from the second electronic device 200, the TV1 may acquire an image of the user in real time for recognition, or track the user's eyeballs, and determine from the recognition or tracking result whether the user is facing the TV1. When it confirms that the user is facing it, the TV1 sends feedback information to the second electronic device 200 instructing the second electronic device 200 to initiate a connection establishment request to it.
It is to be understood that the above-mentioned terminal and the like include hardware structures and/or software modules corresponding to the respective functions for realizing the above-mentioned functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present embodiments.
In the embodiments of the present application, the terminal and the like may be divided into functional modules according to the above method examples; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of the modules in the embodiments of the present application is schematic and is merely a logical function division; there may be other division manners in actual implementation.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or make a contribution to the prior art, or all or part of the technical solutions may be implemented in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: flash memory, removable hard drive, read only memory, random access memory, magnetic or optical disk, and the like.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (18)

1. A method for connecting devices, applied to a first electronic device, includes:
sending a positioning request to at least one second electronic device to determine first position information of the second electronic device in a coordinate system of the first electronic device;
acquiring a facial image of a user, and calculating the spatial coordinates of the facial preset point of the user in the coordinate system of the first electronic equipment;
identifying facial orientation information of the user from the facial image;
determining a first target device in the second electronic device based on the facial orientation information of the user, the spatial coordinates of the facial preset points and the first position information;
and initiating a connection request to the first target equipment to establish connection with the first target equipment.
2. The method according to claim 1, wherein the sending a positioning request to at least one second electronic device to determine first position information of the second electronic device in a coordinate system of the first electronic device comprises:
sending a positioning request to at least one second electronic device to acquire an alignment angle and a distance between the first electronic device and the second electronic device;
determining spatial coordinates of the second electronic device in a coordinate system of the first electronic device based on the alignment angle and the distance.
3. The method of claim 1, wherein the identifying facial orientation information of the user from the facial image comprises:
inputting the face image of the user into a preset recognition model to obtain the face orientation of the user in the coordinate system of the first electronic device.
4. The method of claim 1, wherein the calculating spatial coordinates of the facial preset points of the user comprises:
obtaining a 3D model of the user's facial image;
determining the user's facial preset points, and determining coordinates of the facial preset points in a coordinate system of the 3D model;
and converting the coordinates of the face preset points in the 3D model coordinate system into the space coordinates of the space coordinate system of the first electronic equipment.
5. The method of claim 1, wherein determining the first target device in the second electronic device based on the facial orientation information of the user, the spatial coordinates of the facial preset point, and the first location information comprises:
establishing an indication vector based on the facial orientation information of the user and the spatial coordinates of the facial preset points;
calculating a distance between the second electronic device and the indication vector based on the first location information to determine the first target device in the second electronic device.
6. The method according to claim 1, wherein the sending a positioning request to at least one second electronic device to determine first location information of the second electronic device in a coordinate system of the first electronic device further comprises:
sending a positioning request to at least one second electronic device, so that the second electronic device determines second position information of the first electronic device in a coordinate system of the second electronic device according to the positioning request;
before the determining the first target device in the second electronic device based on the facial orientation information of the user, the spatial coordinates of the facial preset point, and the first position information, the method further includes:
receiving screen orientation information of the second electronic device, wherein the screen orientation information is determined by the second electronic device based on the second location information;
screening the second electronic equipment based on the screen orientation information;
or,
receiving positioning information from the second electronic device, and judging whether a screen of the second electronic device faces the first electronic device or not based on the positioning information;
and screening the second electronic equipment based on the judgment result.
7. The method according to any one of claims 1 to 6, wherein before sending the location request to the at least one second electronic device, the method further comprises:
detecting a first instruction for connecting with the second electronic equipment;
and opening a connection mode to send the positioning request in response to the detected first instruction.
8. The method according to any one of claims 4 to 7, wherein after initiating the connection request to the first target device to establish a connection with the first target device, the method further comprises:
acquiring a detection range based on the indication vector and a preset angle;
detecting whether a second target device except the first target device is included in the detection range;
if a second target device except the first target device exists, detecting a second instruction for connecting with the second target device;
and responding to the second instruction, so that the first electronic equipment establishes connection with the second target equipment.
9. An electronic device comprising a processor and a storage device, the storage device having an application stored thereon, the application when executed by the processor causing the electronic device to perform the steps of:
sending a positioning request to at least one second electronic device to determine first position information of the second electronic device in a coordinate system of the first electronic device;
acquiring a facial image of a user, and calculating the spatial coordinates of the facial preset point of the user in the coordinate system of the first electronic equipment;
identifying facial orientation information of the user from the facial image;
determining a first target device in the second electronic device based on the facial orientation information of the user, the spatial coordinates of the facial preset points and the first position information;
and initiating a connection request to the first target equipment to establish connection with the first target equipment.
10. The electronic device of claim 9, wherein the application, when executed by the processor, causes the electronic device to perform the step of sending a location request to at least one second electronic device to determine first location information of the second electronic device in a coordinate system of the first electronic device, comprising the steps of:
sending a positioning request to at least one second electronic device to acquire an alignment angle and a distance between the first electronic device and the second electronic device;
determining spatial coordinates of the second electronic device in a coordinate system of the first electronic device based on the alignment angle and the distance.
11. The electronic device of claim 9, wherein the application, when executed by the processor, causes the electronic device to perform the step of identifying facial orientation information of the user from the facial image, comprising the steps of:
inputting the face image of the user into a preset recognition model to obtain the face orientation of the user in the coordinate system of the first electronic device.
12. The electronic device of claim 9, wherein the application program, when executed by the processor, causes the electronic device to perform the step of calculating the spatial coordinates of the user's facial preset points, comprising the steps of:
obtaining a 3D model of the user's facial image;
determining the user's facial preset points, and determining coordinates of the facial preset points in a coordinate system of the 3D model;
and converting the coordinates of the face preset points in the 3D model coordinate system into the space coordinates of the space coordinate system of the first electronic equipment.
13. The electronic device of claim 9, wherein the application, when executed by the processor, causes the electronic device to perform the step of determining the first target device in the second electronic device based on the user's facial orientation information, the spatial coordinates of the facial preset point, and the first location information, comprising the steps of:
establishing an indication vector based on the facial orientation information of the user and the spatial coordinates of the facial preset points;
calculating a distance between the second electronic device and the indication vector based on the first location information to determine the first target device in the second electronic device.
14. The electronic device of claim 9, wherein the application, when executed by the processor, causes the electronic device to perform the step of sending a location request to at least one second electronic device to determine first location information of the second electronic device in a coordinate system of the first electronic device, further comprising the steps of:
sending a positioning request to at least one second electronic device, so that the second electronic device determines second position information of the first electronic device in a coordinate system of the second electronic device according to the positioning request;
the application, when executed by the processor, causes the electronic device to perform the following steps prior to the step of determining the first target device in the second electronic device based on the facial orientation information of the user, the spatial coordinates of the facial preset point, and the first location information:
receiving screen orientation information of the second electronic device, wherein the screen orientation information is determined by the second electronic device based on the second location information;
screening the second electronic equipment based on the screen orientation information;
or,
receiving positioning information from the second electronic device, and judging whether a screen of the second electronic device faces the first electronic device or not based on the positioning information;
and screening the second electronic equipment based on the judgment result.
15. The electronic device of any of claims 9-14, wherein the application, when executed by the processor, causes the electronic device to perform the step of sending a location request to at least one second electronic device, further comprising the steps of:
detecting a first instruction for connecting with the second electronic equipment;
and opening a connection mode to send the positioning request in response to the detected first instruction.
16. The electronic device of any of claims 12-15, wherein the application, when executed by the processor, causes the electronic device to perform the step of initiating a connection request to the first target device to establish a connection with the first target device, further comprising the step of:
acquiring a detection range based on the indication vector and a preset angle;
detecting whether a second target device except the first target device is included in the detection range;
if a second target device except the first target device exists, detecting a second instruction for connecting with the second target device;
and responding to the second instruction, so that the first electronic equipment establishes connection with the second target equipment.
17. A computer-readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the device connection method according to any one of claims 1-8.
18. A computer program product which, when run on a computer, causes the computer to perform the device connection method according to any one of claims 1-8.
CN202010858224.1A 2020-08-24 2020-08-24 Device connection method and electronic device Pending CN114091006A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010858224.1A CN114091006A (en) 2020-08-24 2020-08-24 Device connection method and electronic device
PCT/CN2021/113948 WO2022042460A1 (en) 2020-08-24 2021-08-23 Device connection method, and electronic device

Publications (1)

Publication Number Publication Date
CN114091006A true CN114091006A (en) 2022-02-25

Family

ID=80295536


Also Published As

Publication number Publication date
WO2022042460A1 (en) 2022-03-03

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination