CN114339709A - Wireless communication method and terminal device - Google Patents


Info

Publication number
CN114339709A
Authority
CN
China
Prior art keywords
terminal device
target
terminal
information
equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011063160.2A
Other languages
Chinese (zh)
Inventor
陈刚
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202011063160.2A priority Critical patent/CN114339709A/en
Priority to PCT/CN2021/116120 priority patent/WO2022068513A1/en
Publication of CN114339709A publication Critical patent/CN114339709A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 48/00 Access restriction; Network selection; Access point selection
    • H04W 48/08 Access restriction or access information delivery, e.g. discovery data delivery
    • H04W 48/10 Access restriction or access information delivery, e.g. discovery data delivery using broadcasted information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 48/00 Access restriction; Network selection; Access point selection
    • H04W 48/16 Discovering, processing access restriction or access information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 76/00 Connection management
    • H04W 76/10 Connection setup
    • H04W 76/14 Direct-mode setup
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 76/00 Connection management
    • H04W 76/10 Connection setup
    • H04W 76/15 Setup of multiple wireless link connections
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 8/00 Network data management

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Databases & Information Systems (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Telephone Function (AREA)

Abstract

The application provides a wireless communication method and a terminal device, relating to the field of communications technologies. The method comprises: a first terminal device broadcasts a probe request message carrying filtering information of the first terminal device; if a probe response message carrying filtering information of a second terminal device is received and the filtering information of the first terminal device matches the filtering information of the second terminal device, the first terminal device determines that a wireless fidelity point-to-point connection with the second terminal device is established; and in response to a user's sharing operation on a target file, the first terminal device shares the target file with a target device among the connected terminal devices, where the pointing direction of the target device is opposite to that of the first terminal device. With this technical solution, the Wi-Fi P2P connection process is simple, and a user can quickly share a file with the aligned terminal device through a simple sharing operation, thereby reducing the complexity of data transmission over a P2P connection.

Description

Wireless communication method and terminal device
Technical Field
The present application relates to the field of communications technologies, and in particular, to a wireless communication method and a terminal device.
Background
Wireless fidelity Direct (Wi-Fi Direct) is a point-to-point (peer-to-peer, P2P) connection technology by which terminal devices can discover each other and establish a point-to-point connection without the participation of an Access Point (AP), and can then transmit data between devices over the established P2P connection. However, the current process of data transmission over a P2P connection is generally complicated.
Disclosure of Invention
In view of the above, the present application provides a wireless communication method and a terminal device for reducing the complexity of a data transmission process based on a P2P connection.
In order to achieve the above object, in a first aspect, an embodiment of the present application provides a wireless communication method, applied to a first terminal device, including:
broadcasting a probe request message carrying the filtering information of the first terminal device;
and if a probe response message carrying filtering information of a second terminal device is received and the filtering information of the first terminal device matches the filtering information of the second terminal device, determining that a wireless fidelity point-to-point connection between the first terminal device and the second terminal device is established.
According to this technical solution, after the first terminal device finds, based on the filtering information, a second terminal device that satisfies its filtering condition, it establishes a Wi-Fi P2P connection with the second terminal device. Compared with the standard Wi-Fi direct connection procedure, this connection process is simpler, which reduces the complexity of data transmission over a P2P connection; it also reduces resource requirements and broadens the range of applicable scenarios. In addition, searching for and filtering terminal devices through the filtering information filters out irrelevant terminal devices, improving search efficiency and security.
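As a rough illustration of the matching step described above (the patent does not specify a wire format or a matching rule, so the field names and the "all shared fields must agree" rule below are assumptions):

```python
def filters_match(local: dict, remote: dict) -> bool:
    """Return True when two devices satisfy each other's filtering condition.
    Hypothetical rule: every field both sides supply must agree; with no
    common field there is no criterion to match on, so treat as no match."""
    shared_keys = local.keys() & remote.keys()
    if not shared_keys:
        return False
    return all(local[k] == remote[k] for k in shared_keys)

# The first terminal device broadcasts its filtering information in a probe
# request; on receiving a probe response it checks the peer's filtering
# information against its own before treating the P2P connection as established.
local_filter = {"group": "family", "account": "user_a"}
remote_filter = {"group": "family"}
print(filters_match(local_filter, remote_filter))  # True: the shared field agrees
```

A real implementation would carry these fields in a vendor-specific information element of the probe request/response frames; the dictionary here only stands in for that payload.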
In one possible implementation of the first aspect, the method further comprises: if a probe request message carrying filter information of a third terminal device is received and the filter information of the first terminal device is matched with the filter information of the third terminal device, determining that wireless fidelity point-to-point connection between the first terminal device and the third terminal device is established, and sending a probe response message carrying the filter information of the first terminal device to the third terminal device.
In one possible implementation of the first aspect, the filtering information includes at least one of the following: account information, group information, user input information, and near field communication identification information, where the input information includes text information or voice information. In this implementation, the user can set the required filtering information as needed, providing high flexibility.
In one possible implementation of the first aspect, the method further comprises:
generating a virtual Internet Protocol (IP) address of the first terminal device and a virtual IP address of a target terminal device, where the target terminal device is a terminal device that has established a wireless fidelity point-to-point connection with the first terminal device;
and carrying out data transmission with the target terminal equipment based on the virtual IP address.
By adopting virtual IP addresses, the method remains compatible with the network layer and the applications above it, so that the upper layers do not need to be changed, which reduces development cost.
In a possible implementation of the first aspect, during data transmission, neither a first message sent by the first terminal device to the target terminal device nor a second message received from the target terminal device includes IP address information;
when the first terminal device processes the second message, it adds virtual IP address information to the data packet corresponding to the second message before passing the packet to the network layer, where the destination IP address in the virtual IP address information is the virtual IP address of the first terminal device, and the source IP address is the virtual IP address of the target terminal device that sent the second message.
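A minimal sketch of the address-rewriting step above, assuming the virtual IP addresses have already been generated (a real implementation would prepend an actual IP header to the raw packet below the network layer; the structure and addresses here are illustrative only):

```python
import ipaddress

def add_virtual_ip_info(payload: bytes, local_vip: str, peer_vip: str) -> dict:
    """Wrap an inbound link-layer message (which carries no IP header) in
    virtual IP address information before handing it up to the network layer:
    destination = this device's virtual IP, source = the sending peer's."""
    return {
        "dst_ip": str(ipaddress.ip_address(local_vip)),   # validates the address
        "src_ip": str(ipaddress.ip_address(peer_vip)),
        "payload": payload,
    }

# Second message arrives from the target terminal device without IP info;
# the first terminal device fills in the virtual addresses before delivery.
pkt = add_virtual_ip_info(b"video-frame", "192.168.49.2", "192.168.49.3")
print(pkt["dst_ip"], pkt["src_ip"])
```

Because the virtual addresses are attached only at this boundary, the link between the devices never carries IP headers, yet sockets and other network-layer consumers see ordinary IP traffic.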
In one possible implementation of the first aspect, the method further comprises:
before broadcasting the probe request message carrying the filtering information of the first terminal device, displaying a first interface in response to a user's trigger operation on a video call function;
responding to filtering information setting operation performed by a user on the first interface, and determining filtering information of the first terminal equipment;
after the wireless fidelity point-to-point connection is established, transmitting the collected video image to a target terminal device, wherein the target terminal device is a terminal device which is in wireless fidelity point-to-point connection with the first terminal device;
and receiving and displaying the video image collected by the target terminal equipment.
This implementation conveniently enables video calls among multiple devices. In addition, this video call mode does not depend on base station signals, so video calls can be made in areas with weak or no mobile communication signal, better meeting user needs; moreover, it does not consume mobile data traffic, saving traffic resources.
In a possible implementation of the first aspect, displaying the video images captured by the target terminal devices includes: if there are multiple target terminal devices, displaying the video image captured by one of them in a main window and the video images captured by the others in floating sub-windows, where the main window is larger than the sub-windows.
In a possible implementation of the first aspect, the filtering information includes the pointing direction of the device, the pointing directions of any two terminal devices whose filtering information matches are opposite to each other, and the method further comprises:
sharing a first target file with a target terminal device and/or receiving a second target file shared by the target terminal device, where the target terminal device is a terminal device that has established a wireless fidelity point-to-point connection with the first terminal device. This implementation enables convenient file sharing among multiple devices.
In a second aspect, an embodiment of the present application provides a wireless communication method, applied to a first terminal device, including:
establishing a wireless communication connection with at least one second terminal device;
responding to a sharing operation of a user on a first target file, and broadcasting a first notification message corresponding to the first target file;
receiving response messages sent by the at least one second terminal device, wherein each received response message carries the pointing direction of the corresponding second terminal device;
determining target equipment from the at least one second terminal equipment according to the pointing direction of the first terminal equipment and the pointing directions of the second terminal equipment, wherein the pointing direction of the target equipment is opposite to that of the first terminal equipment;
and if the target equipment is determined, sharing the first target file with the target equipment.
According to this technical solution, the target device is determined automatically through the alignment operation, so the user can quickly share a file with the aligned terminal device through a simple sharing operation, which reduces the complexity of data transmission over a P2P connection and makes file sharing more convenient for the user.
In a possible implementation manner of the second aspect, the sharing operation is a touch operation or a gesture operation.
In one possible implementation of the second aspect, the target device is located within a first preset range around the first terminal device. Therefore, some devices which are too far away from the first terminal device can be filtered, and the accuracy of data sharing is improved.
In a possible implementation of the second aspect, each received response message further carries the device position of the corresponding second terminal device, and determining the target device from the at least one second terminal device according to the pointing direction of the first terminal device and the pointing directions of the second terminal devices includes:
for each received response message, if the device position in the response message is located in a first preset range centered on the device position of the first terminal device, and the pointing direction in the response message is opposite to the pointing direction of the first terminal device, determining the second terminal device sending the response message as the target device.
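The selection rule above can be sketched as follows. The patent only requires "within a first preset range" and "pointing directions opposite"; the distance threshold, the angle tolerance, and the flat 2-D coordinate model below are all assumptions for illustration:

```python
import math

MAX_DISTANCE_M = 0.5        # "first preset range" (assumed value)
ANGLE_TOLERANCE_DEG = 30.0  # allowed error when testing for "opposite" headings

def is_opposite(heading_a: float, heading_b: float,
                tol: float = ANGLE_TOLERANCE_DEG) -> bool:
    """Two pointing directions count as opposite when heading_b is within
    tol degrees of heading_a rotated by 180 degrees."""
    diff = abs((heading_a + 180.0 - heading_b + 180.0) % 360.0 - 180.0)
    return diff <= tol

def pick_targets(my_pos, my_heading, replies):
    """replies: list of (device_id, (x, y), heading_deg) taken from the
    response messages. Keep devices that are close enough to the first
    terminal device and pointed back at it."""
    targets = []
    for dev_id, pos, heading in replies:
        if math.dist(my_pos, pos) <= MAX_DISTANCE_M and is_opposite(my_heading, heading):
            targets.append(dev_id)
    return targets

# The pad is 0.3 m away and aimed back at us; the TV is far across the room.
replies = [("pad", (0.0, 0.3), 180.0), ("tv", (5.0, 5.0), 180.0)]
print(pick_targets((0.0, 0.0), 0.0, replies))  # ['pad']: near and opposite
```

The modular arithmetic in `is_opposite` handles wrap-around at 0/360 degrees, so e.g. headings of 350 and 175 degrees still register as roughly opposite.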
In a possible implementation manner of the second aspect, the sharing the first target file to the target device includes:
sending a second notification message corresponding to the first target file to the target device;
and if a resource request message for requesting the first target file returned by the target equipment is received, transmitting the first target file to the target equipment.
In the foregoing implementation, the second terminal device can notify the first terminal device of whether it wants to receive the first target file by deciding whether to send the resource request message, which improves the flexibility of file sharing.
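The two-message handshake above might proceed as in this toy in-memory sketch. The message names follow the description; the class shape, the "skip files already stored" policy on the receiver, and the call-based transport are assumptions:

```python
class Receiver:
    """Target device: decides whether to request the announced file."""
    def __init__(self, stored_files):
        self.stored = set(stored_files)
        self.received = {}

    def on_second_notification(self, file_name: str) -> bool:
        # Send a resource request only if we do not already have the file.
        return file_name not in self.stored

    def on_file(self, file_name: str, data: bytes):
        self.received[file_name] = data

def share(sender_files: dict, file_name: str, receiver: Receiver):
    """Sender side: send the second notification for the first target file,
    then transmit the file only if a resource request comes back."""
    if receiver.on_second_notification(file_name):
        receiver.on_file(file_name, sender_files[file_name])

r = Receiver(stored_files=["old.jpg"])
share({"new.jpg": b"\x89PNG"}, "new.jpg", r)
print("new.jpg" in r.received)  # True: the receiver lacked it, so it asked
```

If the receiver already holds the file, `on_second_notification` returns False, no resource request is sent, and the transfer is skipped — the flexibility the paragraph above describes.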
In one possible implementation of the second aspect, the method further comprises: and if the target equipment is not determined, prompting that no target equipment exists. Therefore, the user can conveniently know the equipment determination condition, and the user experience is improved.
In a possible implementation of the second aspect, when the target device is located in a second preset range centered on the pointing line of the first terminal device, and the included angle between the pointing direction of the target device and the direction opposite to the pointing direction of the first terminal device is within a preset angle range, the pointing direction of the target device is determined to be opposite to that of the first terminal device.
In this implementation, a certain pointing error is allowed when determining whether the pointing directions of two devices are opposite, which is convenient for the user.
In one possible implementation of the second aspect, the method further comprises:
if a first notification message corresponding to a second target file is received, sending a response message to a second terminal device sending the first notification message, wherein the sent response message carries the pointing direction of the first terminal device;
if a second notification message corresponding to the second target file is received, sending a resource request message for requesting the second target file to a second terminal device sending the second notification message under the condition that the second target file is not stored;
and receiving the second target file.
In a third aspect, an embodiment of the present application provides a wireless communication apparatus, which is applied to a first terminal device, and includes:
a communication module, configured to broadcast a probe request message carrying filtering information of the first terminal device;
and the processing module is used for determining that the wireless fidelity point-to-point connection between the first terminal equipment and the second terminal equipment is established if the communication module receives a probe response message carrying filtering information of the second terminal equipment and the filtering information of the first terminal equipment is matched with the filtering information of the second terminal equipment.
In a possible implementation manner of the third aspect, the processing module is further configured to: if the communication module receives a probe request message carrying filter information of a third terminal device, and the filter information of the first terminal device is matched with the filter information of the third terminal device, it is determined that a wireless fidelity point-to-point connection between the first terminal device and the third terminal device is established, and a probe response message carrying the filter information of the first terminal device is sent to the third terminal device.
In one possible embodiment of the third aspect, the filtering information includes at least one of the following information: account information, group information, input information of a user and near field communication identification information, wherein the input information comprises text information or voice information.
In a possible implementation manner of the third aspect, the processing module is further configured to:
generating a virtual IP address of the first terminal device and a virtual IP address of a target terminal device, wherein the target terminal device is a terminal device which establishes a wireless fidelity point-to-point connection with the first terminal device;
and carrying out data transmission with the target terminal equipment based on the virtual IP address.
In a possible implementation of the third aspect, during data transmission, neither a first message sent by the first terminal device to the target terminal device nor a second message received from the target terminal device includes IP address information;
when the first terminal device processes the second message, it adds virtual IP address information to the data packet corresponding to the second message before passing the packet to the network layer, where the destination IP address in the virtual IP address information is the virtual IP address of the first terminal device, and the source IP address is the virtual IP address of the target terminal device that sent the second message.
In one possible implementation of the third aspect, the apparatus further comprises:
the display module is configured to display a first interface in response to a user's trigger operation on a video call function, before the probe request message carrying the filtering information of the first terminal device is broadcast;
the input module is used for receiving filtering information setting operation performed on the first interface by a user;
the processing module is further configured to: responding to the filtering information setting operation, and determining filtering information of the first terminal equipment;
the communication module is further configured to: after the wireless fidelity point-to-point connection is established, transmitting the collected video image to a target terminal device, wherein the target terminal device is a terminal device which is in wireless fidelity point-to-point connection with the first terminal device;
the display module is further configured to: and displaying the video image acquired by the target terminal equipment after the communication module receives the video image.
In a possible implementation of the third aspect, the display module is specifically configured to: if there are multiple target terminal devices, display the video image captured by one of them in a main window and the video images captured by the others in floating sub-windows, where the main window is larger than the sub-windows.
In a possible implementation manner of the third aspect, the filtering information includes pointing directions of devices, and the pointing directions of any two terminal devices that match the filtering information are opposite to each other, and the communication module is further configured to:
the method comprises the steps of sharing a first target file to target terminal equipment and/or receiving a second target file shared by the target terminal equipment, wherein the target terminal equipment is the terminal equipment which is in wireless fidelity point-to-point connection with the first terminal equipment.
In a fourth aspect, an embodiment of the present application provides a wireless communication apparatus, which is applied to a first terminal device, and includes: communication module, input module and processing module, wherein:
the communication module is configured to: establishing a wireless communication connection with at least one second terminal device;
the input module is used for: receiving sharing operation of a user on a first target file;
the processing module is used for: responding to a sharing operation of a user on a first target file, and broadcasting a first notification message corresponding to the first target file through the communication module;
the communication module is further configured to: receiving response messages sent by the at least one second terminal device, wherein each received response message carries the pointing direction of the corresponding second terminal device;
the processing module is further configured to: determining target equipment from the at least one second terminal device according to the pointing direction of the first terminal device and the pointing directions of the second terminal devices; and sharing the first target file to the target device through the communication module under the condition that the target device is determined, wherein the pointing direction of the target device is opposite to the pointing direction of the first terminal device.
In a possible implementation manner of the fourth aspect, the sharing operation is a touch operation or a gesture operation.
In a possible implementation manner of the fourth aspect, the target device is located within a first preset range around the first terminal device.
In a possible implementation manner of the fourth aspect, each received response message further carries a device location of a corresponding second terminal device, and the processing module is specifically configured to:
for each received response message, if the device position in the response message is located in a first preset range centered on the device position of the first terminal device, and the pointing direction in the response message is opposite to the pointing direction of the first terminal device, determining the second terminal device sending the response message as the target device.
In a possible implementation manner of the fourth aspect, the communication module is specifically configured to:
sending a second notification message corresponding to the first target file to the target device;
and if a resource request message for requesting the first target file returned by the target equipment is received, transmitting the first target file to the target equipment.
In a possible implementation manner of the fourth aspect, the processing module is further configured to: and if the target equipment is not determined, prompting that no target equipment exists.
In a possible implementation of the fourth aspect, the processing module is specifically configured to: determine that the pointing direction of the target device is opposite to that of the first terminal device when the target device is located in a second preset range centered on the pointing line of the first terminal device and the included angle between the pointing direction of the target device and the direction opposite to the pointing direction of the first terminal device is within a preset angle range.
In a possible implementation manner of the fourth aspect, the communication module is further configured to:
if a first notification message corresponding to a second target file is received, sending a response message to a second terminal device sending the first notification message, wherein the sent response message carries the pointing direction of the first terminal device;
if a second notification message corresponding to the second target file is received, sending a resource request message for requesting the second target file to a second terminal device sending the second notification message under the condition that the second target file is not stored;
and receiving the second target file.
In a fifth aspect, an embodiment of the present application provides a terminal device, including: a memory for storing a computer program and a processor; the processor is adapted to perform the method of the first or second aspect when the computer program is invoked.
In a sixth aspect, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method of the first aspect or the second aspect.
In a seventh aspect, an embodiment of the present application provides a computer program product, which, when run on a terminal device, causes the terminal device to execute the method of the first aspect or the second aspect.
In an eighth aspect, an embodiment of the present application provides a chip system, which includes a processor, the processor is coupled with a memory, and the processor executes a computer program stored in the memory to implement the method according to the first aspect or the second aspect. The chip system can be a single chip or a chip module consisting of a plurality of chips.
It is to be understood that the methods described in the first and second aspects above may be combined with each other to form new embodiments. When combined, the second terminal device described in the second aspect may further include the third terminal device described in the first aspect; that is, the second terminal device described in the second aspect may be the target terminal device described in the first aspect. In addition, for the beneficial effects of the third to eighth aspects, reference may be made to the relevant descriptions of the first and second aspects, which are not repeated here.
Drawings
Fig. 1 is a system architecture diagram of a wireless communication method according to an embodiment of the present application;
fig. 2 is a schematic diagram of an application interface provided in an embodiment of the present application;
fig. 3 is a schematic flowchart of a device discovery process provided in an embodiment of the present application;
fig. 4 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 5 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 6 is a flowchart illustrating a wireless communication method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of another application interface provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of yet another application interface provided by an embodiment of the present application;
FIG. 9 is a schematic illustration of some of the application interfaces provided by embodiments of the present application;
FIG. 10 is a schematic flow chart of a filtered search provided by an embodiment of the present application;
fig. 11 is a frame structure diagram of a probe request frame and a probe response frame according to an embodiment of the present application;
FIG. 12 is a schematic diagram of yet another application interface provided by an embodiment of the present application;
FIG. 13 is a schematic illustration of yet another application interface provided by an embodiment of the present application;
fig. 14 is a schematic diagram of a message structure provided in an embodiment of the present application;
FIG. 15 is a schematic illustration of yet another application interface provided by an embodiment of the present application;
FIG. 16 is a schematic illustration of yet another application interface provided by an embodiment of the present application;
fig. 17 is a schematic view of file sharing provided in an embodiment of the present application;
fig. 18 is a schematic view of a file sharing process according to an embodiment of the present application;
fig. 19 is a schematic view of an application scenario of file sharing according to an embodiment of the present application;
fig. 20 is a schematic diagram of address information of each message in file sharing according to an embodiment of the present disclosure;
fig. 21 is a schematic diagram illustrating determining a location range of a target device according to an embodiment of the present application;
fig. 22 is a schematic view of another file sharing method according to an embodiment of the present application;
fig. 23 is a schematic view of another file sharing provided in the embodiment of the present application;
fig. 24 is a schematic structural diagram of a wireless communication device according to an embodiment of the present application;
fig. 25 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application will be described below with reference to the drawings. The terminology used in the description of the embodiments herein is for the purpose of describing particular embodiments herein only and is not intended to be limiting of the application.
First, an application scenario of the embodiment of the present application is described. The technical scheme provided by the embodiment of the application can be applied to various Wi-Fi P2P connection scenes, and for convenience of description, the Wi-Fi direct connection scene is taken as an example in the embodiment for exemplary description.
Fig. 1 is a schematic diagram of the system architecture of a wireless communication method provided in an embodiment of the present application. As shown in fig. 1, the system may include a plurality of terminal devices (three are illustrated), between which Wi-Fi direct connections may be established and through which data may be shared. The terminal device may be the mobile phone 1100, the tablet computer (pad) 1200, or the television 1300 shown in fig. 1, or another terminal device supporting the Wi-Fi direct connection function, such as a large-screen device, a desktop computer, a notebook computer, or a wearable device (not shown).
Specifically, a wireless connection can be established between terminal devices by using the standard Wi-Fi direct connection technology, that is, a group (group) can be established between the terminal devices; in the group, one terminal device acts as the Group Owner (GO), the other terminal devices act as Group Clients (GCs), and a Wi-Fi direct connection is established between the GO and each GC.
The connection establishment process between the GO and any GC may include device discovery (device discovery) and group formation (group formation): the two devices discover each other through the device discovery process, and perform GO negotiation and exchange security configuration information through the group formation process.
Fig. 2 is a schematic view of an application interface provided in an embodiment of the present application, and as shown in fig. 2 (a), a user may click a Wi-Fi direct option 101 in a Wi-Fi setup interface 10 on a mobile phone 1100 to open a Wi-Fi direct interface 20, and start a Wi-Fi direct function. After the Wi-Fi direct function is started, the terminal device may initiate a device discovery process to search for surrounding Wi-Fi direct devices, as shown in (b) of fig. 2, prompt information such as "searching" may be displayed in the available device bar 202 to prompt the user that an available Wi-Fi direct device is being searched. In the Wi-Fi direct interface 20, the user may edit the Wi-Fi direct device name of the home terminal through the device name option 2011 in the my device bar 201, such as the Wi-Fi direct device name "AAA" of the mobile phone 1100 shown in fig. 2.
The Wi-Fi direct connection establishment procedure is described below by taking two terminal devices (a first terminal device and a second terminal device, respectively) as an example.
Fig. 3 is a schematic flowchart of a device discovery process provided in an embodiment of the present application. As shown in fig. 3, the device discovery process may include a scanning stage and a discovery stage.
Scanning stage: after the device discovery process is started, the terminal devices (including the first terminal device and the second terminal device) first enter the scanning stage, in which a terminal device sends probe request (Probe Request) frames on all frequency bands it supports. In this stage, the terminal device does not process probe request frames from other devices, but may receive probe response (Probe Response) frames and thereby discover other terminal devices that are in the listening state.
Discovery stage: after the scanning stage ends, the terminal device enters the discovery stage. In this stage, the terminal device switches back and forth between a listening state (listen state) and a search state (search state).
In the listening state, the terminal device randomly selects one of channels 1, 6 and 11 (i.e. ch1, ch6 and ch11 in the figure, which are called social channels) as its listen channel (listen channel), listens for probe request frames, and replies with probe response frames accordingly. For example, in fig. 3, the listen channel of the first terminal device is 1, and the listen channel of the second terminal device is 6. The terminal device enters the search state after listening for a period of time.
In the search state, the terminal device transmits probe request frames on channels 1, 6, and 11 in turn. When two terminal devices are on the same channel, a frame transmitted by one may be received by the other; for example, in fig. 3, the second terminal device receives, on ch6, the probe request frame transmitted by the first terminal device and replies with a probe response frame to the first terminal device.
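The rendezvous between the listening and search states can be illustrated with a small simulation (a rough sketch for intuition only: state durations, timing, and frame handling are greatly simplified, and the 50/50 state probability is an illustrative assumption, not part of the protocol):

```python
import random

SOCIAL_CHANNELS = (1, 6, 11)  # ch1, ch6, ch11

def step(rng):
    """One time slot for a device: either listening on a randomly chosen
    social channel, or searching (probing one channel of the sweep)."""
    if rng.random() < 0.5:
        return ("listen", rng.choice(SOCIAL_CHANNELS))
    return ("search", rng.choice(SOCIAL_CHANNELS))

def rendezvous(max_slots=1000, seed=0):
    """Run both devices until one device's probe request lands on the
    channel the other device is currently listening on."""
    rng = random.Random(seed)
    for slot in range(1, max_slots + 1):
        a, b = step(rng), step(rng)
        # Discovery: a searcher probes the channel a listener occupies;
        # the listener then replies with a probe response frame.
        if a[0] == "search" and b[0] == "listen" and a[1] == b[1]:
            return slot
        if b[0] == "search" and a[0] == "listen" and b[1] == a[1]:
            return slot
    return None

print(rendezvous())  # the slot in which the two devices find each other
```

Because each slot has a fair chance of one device searching the other's listen channel, discovery typically happens within a handful of slots, which mirrors why the protocol confines listening and searching to the three social channels.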
After the terminal devices complete the exchange of the probe request frame and the probe response frame, each has discovered a Wi-Fi direct device and may store device information of the peer device, including a Media Access Control (MAC) address, a device type (device type), a device name (device name), and the like.
As shown in (c) of fig. 2, once the terminal device finds a Wi-Fi direct device, the interface is updated to display the found device; in the available device column 202, the found "BBB" (device name of the pad 1200) and "CCC" (device name of the television 1300) are displayed. The user may select a Wi-Fi direct device in the interface to establish a connection with it; for example, as shown in (c) and (d) of fig. 2, the user may click "BBB" to connect to it.
After the user selects the Wi-Fi direct device to be connected, the terminal device may perform a group formation procedure (corresponding to "start establishing a group" in the drawing) to construct a group, where the group formation procedure may include a GO negotiation (GON) procedure and a security configuration (provisioning) procedure. The GON procedure is used to negotiate which device acts as the GO; after the GON procedure is executed, the security configuration procedure is performed, which mainly uses the Wi-Fi simple configuration (WSC) protocol to negotiate security configuration information.
After the group formation process is finished, the GC applies for an Internet Protocol (IP) address from the GO, and the GO starts a Dynamic Host Configuration Protocol (DHCP) server to allocate an IP address to the GC. After the IP address is allocated, the terminal device may update the connection state to notify the user that the connection is successful; for example, as shown in (d) of fig. 2, the connection state of "BBB" is updated to "connected". After the group is dissolved, the IP address of the GC may be released.
After the group is established, GO and GC may also invite other terminal devices to join the group through a P2P invitation (P2P invitation) flow; during the device discovery process, the terminal device may also activate a previously established persistent group through the P2P invitation process.
In addition, as shown in (b) and (c) of fig. 2, the user may stop the search by clicking the stop icon 203 in the Wi-Fi direct interface 20 during the search; as shown in (d) of fig. 2, after the search is finished, the user may start the device discovery process again by clicking the search icon 204 in the Wi-Fi direct interface 20.
According to the Wi-Fi direct protocol, the IP address of the GO is 192.168.49.1 by default, that is, the GO in every group has the IP address 192.168.49.1. The IP addresses of the GOs in different groups therefore overlap and may interfere with each other, making the connections of the terminal devices unstable. Fig. 4 is a schematic view of an application scenario provided by an embodiment of the present application. As shown in fig. 4, A1, B1, and C1 form one group, A2, B2, C2, and D2 form another group, A1 and A2 are the GOs of the respective groups, and the default IP addresses of both are 192.168.49.1. In a relatively small space (such as the same subway carriage), the two groups may simultaneously wish to share data such as audio, video or pictures internally. During communication, since the IP addresses of A1 and A2 are the same, the GCs B1, C1, B2, C2 and D2 cannot automatically determine which GO to connect to, and the user cannot intervene manually; moreover, after connecting to the wrong GO, even if that GO leaves, a GC still needs a long time to recover its initial state and reconnect to the new GO. All of this affects the connection stability of the terminal devices.
In addition, the address of a GC is dynamically allocated by the GO, which may also affect the connection stability of the terminal devices. Fig. 5 is a schematic view of another application scenario provided in the embodiment of the present application. As shown in fig. 5, A3, B3, and C3 form a group and share data with each other, where A3 is the GO. During the sharing process, A3 leaves and D3 wishes to join the group; since A3 is the GO, the group is dissolved after A3 leaves, B3 and C3 cannot continue to share data, and D3 cannot acquire an IP address in time and cannot participate in data sharing. B3, C3 and D3 must re-establish a group to continue sharing data.
In addition, the device discovery, negotiation and connection processes in the Wi-Fi direct connection protocol are complex, require excessive resources, and incur large system overhead, which is too heavyweight for thin devices. Moreover, in the device discovery process, all surrounding terminal devices in the device discovery state are scanned by default, so the search efficiency is low and the operation interface corresponding to the search results is not concise; if irrelevant terminal devices are found and connected by mistake, a security risk may also exist.
Therefore, the embodiment of the application provides a lightweight wireless communication method, which mainly simplifies the Wi-Fi direct connection protocol and flow: the group formation and P2P invitation flows are trimmed away, and group management adopts a decentralized mode, that is, the two devices connect as peers and the distinction between the GO and GC roles is cancelled. IP address management can also adopt a decentralized mode: the unified management and allocation of GC IP addresses is cancelled, the IP addresses are virtualized, the virtual IP addresses of the local terminal and the peer terminal are managed locally by each terminal device, and data is forwarded by the data link layer, thereby solving the problem that the automatic connection of the terminal devices is unstable due to the overlapping of GO IP addresses. In addition, in the device discovery process, the terminal devices are searched with filtering, which improves search efficiency and security and optimizes the operation interface. This technical solution is explained in detail below.
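One way to picture the decentralized, locally managed virtual IP addresses is to derive a peer's virtual address deterministically from its MAC address, so that both ends agree without any allocation handshake and frames are still delivered by MAC address at the data link layer. The sketch below is an illustrative assumption only (the address scheme, the derivation rule, and collision handling are not specified by this embodiment):

```python
def virtual_ip(mac: str, prefix: str = "192.168.49.") -> str:
    """Map a MAC address to a virtual IP in a /24 range.

    The virtual IP is never carried over the air; it only labels the peer
    in the local table. (Collision handling is omitted in this sketch.)
    """
    last_octet = int(mac.replace(":", "")[-2:], 16) % 254 + 1  # 1..254
    return prefix + str(last_octet)

class PeerTable:
    """Each terminal manages its own view: virtual IP <-> peer MAC."""
    def __init__(self):
        self.ip_to_mac = {}

    def add_peer(self, mac: str) -> str:
        ip = virtual_ip(mac)
        self.ip_to_mac[ip] = mac
        return ip

    def next_hop(self, dst_ip: str) -> str:
        # The forwarding decision is made from the local table; the frame
        # is sent directly to the peer's MAC at the data link layer.
        return self.ip_to_mac[dst_ip]

table = PeerTable()
ip = table.add_peer("aa:bb:cc:dd:ee:10")
print(ip, table.next_hop(ip))  # 192.168.49.17 aa:bb:cc:dd:ee:10
```

Because every device computes the same mapping independently, there is no GO acting as a DHCP server, and a device leaving the group does not invalidate anyone else's addressing state.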
Fig. 6 is a flowchart illustrating a wireless communication method according to an embodiment of the present application, and as shown in fig. 6, the wireless communication method according to the embodiment may include the following steps:
S110, the first terminal device responds to a connection triggering operation of the user and searches for second terminal devices meeting the filtering condition of the local terminal.
S120, the first terminal device sets the Wi-Fi direct connection state between the first terminal device and the found second terminal devices to the connected state.
Specifically, a user may start the Wi-Fi direct function through a connection triggering operation, where the connection triggering operation may be a triggering operation of a target system function by the user. For example, as shown in fig. 2 (a), the user may click the Wi-Fi direct option in the Wi-Fi setting interface to start the Wi-Fi direct function. As another implementation, a shortcut option may also be provided on the first terminal device; for example, as shown in fig. 7, a Wi-Fi direct icon may be provided in the drop-down notification bar, and the user may start the Wi-Fi direct function by clicking the icon. The Wi-Fi direct function can be started only when the Wi-Fi function is on, so when a connection triggering operation of the user is received while the Wi-Fi function is not on, the terminal device can automatically start the Wi-Fi function and then start the Wi-Fi direct function; alternatively, the user may be prompted, and the Wi-Fi direct function is started after the user authorizes starting the Wi-Fi function.
In order to improve the convenience of the user, in this embodiment, when the user starts a target Application (APP) or a target function, the target APP or the target function may be triggered to automatically start the Wi-Fi direct function. For example: after a user clicks a gallery icon to open a gallery, the gallery automatically starts a Wi-Fi direct connection function; for another example, when a user clicks a video call function and initiates a video call, the video call function automatically starts a Wi-Fi direct connection function, searches for available Wi-Fi direct devices and establishes a connection.
It should be understood that the target application is not limited to the gallery, and may also be a file management APP or other instant messaging APPs, and the like, which is not particularly limited in this embodiment of the application.
Of course, the user may also start the Wi-Fi direct function by inputting a voice command, which is not particularly limited in this embodiment. In addition, the connection triggering operation may also be a click operation on the search icon in the Wi-Fi direct interface after the user clicks the Wi-Fi direct option to open the Wi-Fi direct interface.
After detecting the connection triggering operation of the user, the first terminal device may respond to the operation and enter the device discovery process to perform a filtered search for second terminal devices that meet the filtering condition of the local terminal. The filtering condition may be preset by the system, for example: the first terminal device automatically searches for second terminal devices logged in to the same system account. To improve flexibility for the user, the first terminal device may also provide a filtering condition setting function for the user to set the filtering condition autonomously.
In a specific implementation, as shown in (a) of fig. 8, in addition to the various function options shown in fig. 2, a filter condition option 203 may be included in the Wi-Fi direct interface 20, and a user may click the filter condition option 203 to open the filter condition interface 40; as shown in (b) of fig. 8, various options of filter conditions may be included in the filter condition interface 40, and a user may set a desired filter condition in the filter condition interface 40.
Specifically, as shown in fig. 8 (b), the filtering conditions may include system account filtering, group filtering, password filtering, voice filtering, Near Field Communication (NFC) filtering, and the like, switch controls 401 to 405 corresponding to each filtering condition may be provided in the filtering condition interface 40, and a user may select to turn on one or more filtering conditions through the switch controls, that is, the filtering conditions may include one or more filtering conditions.
For system account filtering, if the system account filtering is started by a user, the first terminal device can automatically search a second terminal device which logs in the same system account with the first terminal device in the searching process. Optionally, in the embodiment of the present application, the filtering is not limited to filtering the system account, and in a specific implementation, other accounts may also be filtered.
For group filtering, if a user establishes a target group in advance, such as a family group (including information related to terminal devices of all members of the family) and an adult group (not including information related to terminal devices held by minors among the family members), the group filtering may be started and a group filtering condition may be added when performing terminal search filtering. As shown in the figure, an editing option 4021 of a group name may be set in a column of the group filtering option for a user to edit a specific condition of group filtering, and optionally, the specific condition of group filtering may also be other group identification information such as a group number, which is not particularly limited in this embodiment of the present application. If the group filtering is started, the first terminal device may automatically search for the second terminal device in the target group selected by the user in the search process. For example, the user sets the group name in the group filter as a family group, and the first terminal device searches for the terminal devices (i.e., the second terminal devices) of the members in the family group.
For password filtering, as shown in the figure, a password editing option 4031 may be set in the column of the password filtering option, and the user may set a password through this option when starting password filtering. If password filtering is started, the first terminal device can automatically search for second terminal devices with the same password in the search process. Optionally, in the embodiment of the present application, the filtering is not limited to passwords; in specific implementations, other text information may also be used for filtering.
For voice filtering, if the user starts it, the first terminal device may automatically acquire voice information input by the user after the Wi-Fi direct function is started, generate filtering information according to the acquired voice information, and then, during the search process, automatically search for second terminal devices carrying the same filtering information in their probe messages. In specific applications of this filtering condition, since the distance between the terminal devices is short, the voice information may be uttered by any user near the terminal devices, and each terminal device can acquire the voice information and generate the filtering information corresponding to the filtering condition from it.
For NFC filtering, after the Wi-Fi direct function of the first terminal device is started, the user can perform a "touch-touch" operation between the first terminal device and the second terminal device to be connected. The first terminal device may collect the NFC information of the second terminal device (for example, an NFC identifier) and then search, during the search process, for the second terminal device corresponding to the collected NFC information. In specific implementation, the first terminal device may generate the filtering information according to the NFC information of the local terminal and the collected NFC information.
It can be understood that, if the user does not turn on any filtering condition, the first terminal device may also perform the procedures of device discovery and negotiation according to the standard Wi-Fi direct protocol to establish the connection.
As another implementation, the first terminal device may provide filtering functions corresponding to the various filtering conditions for target applications to call, and a target application automatically starts at least one filtering function after starting the Wi-Fi direct function. For example, after the user opens the gallery, the gallery starts the system account filtering function while starting the Wi-Fi direct function, and searches for second terminal devices logged in to the same system account. For another example, after the user opens the video call function and inputs voice to initiate a video call, the video call function starts the voice filtering function while starting the Wi-Fi direct function, and performs a filtered search using the voice input by the user as the filtering condition. Alternatively, the target APP or target function may provide start options for the filtering functions, and the user may start the corresponding filtering function by clicking a start option. Taking the video call function as the target function, the following describes an example of the process in which the user starts the Wi-Fi direct function and the filtering function.
Fig. 9 is a schematic view of some application interfaces provided in the embodiment of the present application. As shown in fig. 9 (a), a video call icon 3012 may be provided in the drop-down notification bar, and the user may start the video call function by clicking the icon. After the video call function is started, the Wi-Fi direct function may be started automatically as described above, or the user may be prompted to authorize starting the Wi-Fi direct function and the Wi-Fi function, as shown in (b) of fig. 9. The user may select the start option to start the Wi-Fi direct function and the Wi-Fi function. As shown in (c) of fig. 9, after the Wi-Fi direct function is started, the video call function may provide start options for the filtering functions corresponding to the various filtering conditions: a system account filtering option 001, a group filtering option 002, a password filtering option 003, a voice filtering option 004 and an NFC filtering option 005. For the system account filtering option 001, after the user clicks it, the first terminal device can search for available Wi-Fi direct devices and establish connections; for the group filtering option 002 and the password filtering option 003, after the user selects one, the video call function may provide an input interface for the user to input group information or password information; for the voice filtering option 004 and the NFC filtering option 005, after the user selects one, the first terminal device may prompt the user to input voice information or perform a "touch-touch" operation. Taking password filtering as an example, as shown in (c) and (d) of fig. 9, the user may click the password filtering option 003 and then input the password in the pop-up input interface, where the number of digits of the password may be preset to 4 digits as shown in the figure, or may not be limited.
It can be understood that the video call function may also be presented to the user in the form of APP, and the specific implementation form is not particularly limited in this embodiment.
The following describes the search procedure of the first terminal device and the second terminal device in detail.
As described above, the first terminal device may carry filtering information corresponding to the filtering condition in the probe messages, where the probe messages may include a probe request message and a probe response message; the probe request message may specifically be a probe request frame, and the probe response message may specifically be a probe response frame, that is, the first terminal device may carry the filtering information in the probe request frame and the probe response frame. The filtering information may be information such as the system account, group identifier, password, voice information, or NFC information in the filtering condition; in order to improve the security of data transmission, the filtering information may also be character information generated from such information. Correspondingly, when the first terminal device and the second terminal device search for each other, the method shown in fig. 10 may be used. Fig. 10 is a schematic flow diagram of a filtered search provided in the embodiment of the present application; as shown in fig. 10, the method may include the following steps:
S111, the first terminal device broadcasts a probe request frame carrying the first filtering information.
Specifically, the second terminal device may also set the filtering condition, and the specific setting process is similar to the filtering condition setting process of the first terminal device, and is not described herein again. For convenience of distinction, the filtering information corresponding to the filtering condition in the first terminal device is referred to as first filtering information, and the filtering information corresponding to the filtering condition in the second terminal device is referred to as second filtering information.
After the Wi-Fi direct function is turned on, the first terminal device may broadcast the probe request frame on all channels or the social channel, as described above. In order to facilitate the peer device to know the filtering condition of the first terminal device, the first terminal device may carry the first filtering information in the probe request frame, so that the second terminal device may reply the response message when the filtering condition is satisfied, so as to inform the first terminal device that it meets the filtering condition.
Fig. 11 is a schematic frame structure diagram of the probe request frame and the probe response frame provided in an embodiment of the present application. As shown in fig. 11, the probe request frame may include a MAC header (MAC header), a frame body (frame body), and a Frame Check Sequence (FCS) field, where the MAC header may include frame control information, address information, and the like; the frame body is the data field, carrying the payload; and the FCS is used to check frame data integrity.
The first filtering information may be carried in the frame body. In a specific implementation, a characteristic parameter (character code) field may be added to the frame body to indicate the first filtering information; the characteristic parameter field may be located at any position in the frame body.
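The characteristic parameter field can be sketched as an ID/length/value element appended to the frame body, in the style of an 802.11 information element (the element ID value 0xDD and the exact encoding below are illustrative assumptions; the embodiment only specifies that the field indicates the filtering information and may sit anywhere in the body):

```python
import struct

CHAR_PARAM_ID = 0xDD  # assumption: carried as a vendor-specific-style element

def add_characteristic_parameter(frame_body: bytes, filtering_info: bytes) -> bytes:
    """Append a characteristic parameter element (ID, length, value) to the
    frame body of a probe request or probe response frame."""
    element = struct.pack("BB", CHAR_PARAM_ID, len(filtering_info)) + filtering_info
    return frame_body + element

def find_characteristic_parameter(frame_body: bytes):
    """Walk the ID/length/value elements of a frame body and return the
    filtering information if a characteristic parameter element is present."""
    i = 0
    while i + 2 <= len(frame_body):
        elem_id, length = frame_body[i], frame_body[i + 1]
        value = frame_body[i + 2 : i + 2 + length]
        if elem_id == CHAR_PARAM_ID:
            return value
        i += 2 + length
    return None

body = add_characteristic_parameter(b"", b"acct:1234")
print(find_characteristic_parameter(body))  # b'acct:1234'
```

Because the parser walks ID/length pairs, the element can indeed appear at any position among the other elements of the frame body, as the text states.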
S112, after receiving the probe request frame, the second terminal device judges whether the first filtering information carried in the probe request frame matches the second filtering information, and if so, determines that the first terminal device meets the filtering condition of the local terminal.
The second terminal device may be similar to the first terminal device, respond to a connection trigger operation of the user, and start the Wi-Fi direct connection function, and a specific process is similar to a process in which the first terminal device starts the Wi-Fi direct connection function, and is not described here again.
After the Wi-Fi direct function is started, the second terminal device may listen for probe request frames on its listen channel as described above. If the second terminal device receives a probe request frame, it can extract the first filtering information from the frame and compare it with the filtering information corresponding to the filtering condition of the local terminal (i.e., the second filtering information), so as to determine whether the first terminal device meets the filtering condition of the local terminal. If the first filtering information matches the second filtering information, the first terminal device meets the filtering condition of the second terminal device; if not, the second terminal device may discard the probe request frame and refuse to connect to the first terminal device.
In the specific judgment, for filtering conditions such as account, group, password and NFC, whether the first filtering information is identical to the second filtering information can be compared, and if identical, the two are considered to match; for voice filtering, the similarity between the first filtering information and the second filtering information can be compared, and if it reaches a preset similarity, the two can be considered to match, where the preset similarity can be set according to actual needs.
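The two matching rules described above can be sketched as follows (the similarity measure and the 0.8 threshold are illustrative assumptions; the embodiment only requires that the preset similarity be configurable):

```python
from difflib import SequenceMatcher

VOICE_SIMILARITY_THRESHOLD = 0.8  # the "preset similarity"; value is an assumption

def filtering_info_matches(kind: str, first_info: str, second_info: str) -> bool:
    """Match the first (local) and second (peer) filtering information.

    Account, group, password and NFC filtering require exact agreement;
    voice filtering tolerates small differences via a similarity threshold.
    """
    if kind == "voice":
        similarity = SequenceMatcher(None, first_info, second_info).ratio()
        return similarity >= VOICE_SIMILARITY_THRESHOLD
    return first_info == second_info

print(filtering_info_matches("password", "1234", "1234"))            # True
print(filtering_info_matches("voice", "connect now", "conect now"))  # True
```

Exact comparison is appropriate for deterministic values such as a password or account identifier, while a threshold absorbs the natural variation between two recordings of the same spoken phrase.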
S113, the second terminal device returns a probe response frame carrying the second filtering information to the first terminal device.
If the second terminal device determines that the first filtering information carried in the probe request frame matches the second filtering information, it may reply with a probe response frame to inform the first terminal device that the filtering conditions of the two devices are consistent, that is, that the first terminal device and the second terminal device each meet the filtering condition of the other. Alternatively, the probe response frame may also carry acknowledgement information to inform the first terminal device that the filtering conditions of the two devices are consistent.
In order to improve reliability, the second terminal device may also carry second filtering information in the frame to inform the first terminal device of the local filtering condition.
As shown in fig. 11, the probe response frame may include MAC header, frame body, and FCS fields. Similarly to the first filtering information, the second filtering information may be carried in the frame body of the probe response frame: a characteristic parameter field may be added to the frame body to indicate the second filtering information, and the characteristic parameter field may be located at any position in the frame body.
It can be understood that the probe request frame and the probe response frame may also carry device information such as a device identifier, a device name, a device type, and an MAC address of the local device, so as to be identified by the peer device.
S114, after receiving the probe response frame, the first terminal device judges whether the second filtering information carried in the probe response frame matches the first filtering information, and if so, determines that the second terminal device meets the filtering condition of the local terminal.
Specifically, the first terminal device receives the probe response frame returned by the second terminal device, and may determine that the second terminal device satisfies the local filtering condition.
As described above, in order to improve reliability, the probe response frame may carry second filtering information, and accordingly, the first terminal device may extract the second filtering information, compare the second filtering information with filtering information (i.e., the first filtering information) corresponding to the filtering condition of the local terminal, and determine whether the second terminal device meets the filtering condition of the local terminal based on a comparison result.
Specifically, if the first filtering information is matched with the second filtering information, it is indicated that the second terminal device meets the filtering condition of the first terminal device; if the first filtering information is not matched with the second filtering information, it indicates that the second terminal device is not in accordance with the filtering condition of the first terminal device, and the first terminal device may discard the probe response frame and refuse to connect with the second terminal device. The process of the first terminal device determining whether the first filtering information is matched with the second filtering information is similar to the process of the second terminal device determining the filtering information, and is not repeated here.
In a near field communication scenario, the two communicating parties are in the same local area network and can communicate with each other directly through the physical layer and the data link layer. Therefore, in this embodiment, when data is transmitted between terminal devices, an IP address need not be carried in the data packet, and communication is performed through the physical layer and the data link layer. Correspondingly, the IP address assignment, group formation, and P2P invitation procedures may all be trimmed away. In this way, once the first terminal device finds a second terminal device that meets the filtering condition, the connection is established successfully; the Wi-Fi direct connection state of the found second terminal device can be displayed as the connected state, and similarly, the Wi-Fi direct connection state of the first terminal device can also be displayed as the connected state on the second terminal device. The first terminal device may find one or more second terminal devices.
It is to be understood that "first" and "second" are used only to distinguish the devices; a given terminal device may act as either the first or the second terminal device, that is, each may have the functions of the other. For example, the first terminal device may also receive a probe request frame broadcast by another terminal device (referred to as a third terminal device); if it determines that the first filtering information matches the filtering information of the third terminal device carried in the probe request frame, it may conclude that the third terminal device satisfies its filtering condition and return a probe response frame carrying the first filtering information to the third terminal device. Correspondingly, the third terminal device may determine that the first terminal device satisfies its filtering condition when the filtering information of the third terminal device matches the first filtering information in the probe response frame. That is, any terminal device may establish a Wi-Fi direct connection with another terminal device using the search procedure performed by the first terminal device shown in fig. 10 and/or the search procedure performed by the second terminal device.
Take three terminal devices aaa, bbb, and ccc that search for and connect to one another as an example: assume the filtering conditions of aaa and bbb are the same (both are logged in to the same system account), while ccc is logged in to a different system account. As shown in fig. 12 (a), the user clicks the Wi-Fi direct option on each of the three terminal devices to start the Wi-Fi direct function; as shown in (b) of fig. 12, the three terminal devices start to search for available Wi-Fi direct devices nearby; as shown in (c) of fig. 12, aaa and bbb can find each other and establish a connection, whereas ccc does not satisfy the filtering condition of aaa and bbb, is filtered out during their search, and cannot establish a connection with them.
As shown in fig. 13 (a), the user may also click the file management icon 5 on each of the three terminal devices to open the file management APP, which automatically starts the Wi-Fi direct function; aaa and bbb can then find each other based on the same system account (the filtering information) and establish a connection, while ccc cannot connect to aaa or bbb. As shown in (b) in fig. 13, the user may view the connected Wi-Fi direct devices in the other device column 501 of the file management interface 50 corresponding to the file management icon 5. The user may click the browse icon 502 of the file management interface 50 to open a browse interface for browsing files on the local device and on other devices, and the other device column 501 may be located in that browse interface; other functional controls, such as a search box, may also be provided in the file management interface 50.
To remain compatible with upper layers such as the network layer, the first terminal device and the second terminal device may each locally generate virtual IP addresses for both sides; these virtual IP addresses are valid only locally. During data transmission, a first message sent out need not carry any IP address. After the second terminal device receives the first message, it may add the virtual IP addresses to the data packet passed up to the network layer so that the network layer can identify it, where the destination IP address is the virtual IP address of the second terminal device and the source IP address is the virtual IP address of the first terminal device. Similarly, when the second terminal device returns a second message to the first terminal device, the second message need not carry an IP address; after receiving it, the first terminal device adds the virtual IP addresses for the network layer to identify, where the destination IP address is the virtual IP address of the first terminal device and the source IP address is the virtual IP address of the second terminal device.
Fig. 14 is a schematic diagram of a message structure provided in an embodiment of the present application. As shown in fig. 14 (a), in the standard Wi-Fi direct protocol, the packet header of the network layer includes an IP address, the packet header of the data link layer includes a MAC address, and the Ethernet payload portion includes the IP address and the IP payload. As shown in fig. 14 (b), in this scheme, for a unicast message, the packet corresponding to the network layer includes the virtual IP addresses of the first terminal device and the second terminal device, the header of the packet corresponding to the data link layer includes the MAC address, and the Ethernet payload portion does not include IP address information.
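The receive-side handling described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the dict representation of a packet, and the example 169.254.x.x addresses are assumptions; only the rule that the receiver attaches locally generated virtual IPs before handing the payload to the network layer comes from the text.

```python
def add_virtual_ip_header(payload: bytes, local_vip: str, peer_vip: str) -> dict:
    # The frame arrives with only MAC addressing (no IP header on the wire);
    # the receiver re-attaches the locally generated virtual IPs so that the
    # network layer can identify the packet: destination = this device's
    # virtual IP, source = the peer device's virtual IP.
    return {"dst_ip": local_vip, "src_ip": peer_vip, "payload": payload}
```

The same routine serves both directions: each endpoint inserts its own virtual IP as the destination and the peer's as the source when passing received data upward.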
For a multicast message, the sending terminal device may add a multicast address to its IP header: the destination IP address may be a multicast IP address, and the source IP address may be an agreed virtual IP address. Optionally, the multicast message may instead be handled like the unicast message, with no IP address carried when it is sent. Broadcast messages are handled similarly to multicast messages and are not described again here.
The terminal device may generate the virtual IP address from the MAC address or other device information. For example, after the Wi-Fi direct function is started, the terminal device may generate its local virtual IP address from its own MAC address, and after finding a peer device, generate the peer's virtual IP address from the peer's MAC address.
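One plausible MAC-to-virtual-IP derivation is sketched below. The mapping into the 169.254.0.0/16 link-local range is an assumption for illustration; the patent only requires that the address be generated deterministically from the MAC address (so that both sides derive the same value) and be valid locally.

```python
def virtual_ip_from_mac(mac: str) -> str:
    """Derive a locally valid virtual IP address from a MAC address by
    mapping the last two MAC octets into the 169.254.0.0/16 range
    (an illustrative choice, not fixed by the patent)."""
    octets = [int(part, 16) for part in mac.split(":")]
    # Avoid .0 and .255 in the final octet to keep the address usable.
    host = octets[-1] if 0 < octets[-1] < 255 else 1
    return f"169.254.{octets[-2]}.{host}"
```

Because the derivation is deterministic, the first terminal device can compute the peer's virtual IP from the MAC address learned during the search, without any address-assignment exchange.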
The beneficial effects of this solution are explained below. For the scenario shown in fig. 4, A1, B1, and C1 may set the same filtering condition (for example, enter the same password), so that any two of the three can establish a Wi-Fi direct connection and share resources with each other; similarly, A2, B2, C2, and D2 may also set the same filtering condition and establish Wi-Fi direct connections with one another for resource sharing. The terminal devices communicate via their unique MAC addresses, and the virtual IP addresses are managed locally, which reduces connection errors and effectively improves connection stability between terminal devices.
For the scenario shown in fig. 5, A3, B3, and C3 may set the same filtering condition (for example, logging in to the same system account) and establish connections pairwise. If A3 leaves during sharing and D3 wishes to join, then on the one hand, because the connections among A3, B3, and C3 are peer-to-peer, A3's departure does not affect the connection between B3 and C3; on the other hand, D3 may search for B3 and C3 using the same filtering condition and connect to them. Data can thus continue to be shared among B3, C3, and D3, further improving connection stability between terminal devices.
In addition, this solution simplifies the Wi-Fi direct connection procedure by omitting the group establishment and P2P invitation flows, which reduces the complexity of Wi-Fi direct connection, lowers resource requirements, and broadens the range of application. Moreover, during device discovery, devices are searched and filtered according to the filtering condition, so irrelevant terminal devices are filtered out, improving search efficiency and security; and since only the found terminal devices that meet the filtering condition are displayed on the interface, the operation interface is also optimized.
After the Wi-Fi direct connection is established between the first terminal device and the second terminal device, they can share files, conduct video calls, edit across devices, perform wireless screen projection or remote control, and so on, over the connection. Video calls and file sharing are taken as examples below.
Carrying out video call based on Wi-Fi direct connection:
for example, in areas with a weak or absent mobile communication signal and no wireless local area network, such as suburbs, users cannot otherwise communicate with each other with good quality; in this case, users can communicate through the video call function based on Wi-Fi direct on their terminal devices.
Specifically, each user may establish a video call connection using the video call process shown in fig. 9. Continuing with password filtering as an example, as shown in (a) in fig. 15, after the user inputs the password "1122" and clicks the confirmation option 0031, the first terminal device starts to search for available Wi-Fi direct devices to establish a connection. Assuming the user of the second terminal device inputs the same password "1122" in the same manner, then, as shown in (b) of fig. 15, the first terminal device may establish a connection with the second terminal device and conduct a video call over it, that is, transmit the captured video streams over the established Wi-Fi direct connection.
As illustrated, the video call interface may provide options such as a hang-up option and a switch-to-voice option. When multiple second terminal devices are connected, the video image captured by one of them may be displayed in the main window and the video images of the other second terminal devices in sub-windows, so that the user can switch which video image is displayed in the main window; a sub-window is smaller than the main window and may be displayed floating over it.
For example: the user 1, the user 2 and the user 3 respectively establish video call connection with the terminal equipment of the other party through the password filtering mode, so that for each user, the terminal equipment (namely, the first terminal equipment) held by the user can display video images collected by the terminal equipment (namely, the second terminal equipment) held by the other two users; and one of the video images collected by the two second terminal devices is displayed in the main window, and the other video image is displayed in the sub-window. Assuming that the first terminal device shown in fig. 15 is a terminal device of the user 1, and the first terminal device establishes a video call connection with the terminal device held by the user 2 first, and then establishes a video call connection with the terminal device held by the user 3, as shown in (b) in fig. 15, a video image acquired by the terminal device held by the user 2 may be displayed on the first terminal device in a main window, and a video image acquired by the terminal device held by the user 3 in a sub-window; the user can click the sub-window to switch the video image displayed in the main window to the video image collected by the terminal device held by the user 3.
Of course, the first terminal device may also display a video image (not shown) acquired by the first terminal device in another sub-window, and the user may also click the sub-window to switch the video image displayed in the main window to the video image acquired by the first terminal device; the first terminal device may also display, by default, a video image acquired by the terminal device that has established the connection last in the main window, that is, the first terminal device may also display, by default, a video image acquired by the terminal device held by the user 3 in the main window.
It should be noted that the position and size of the main window and the sub-window in fig. 15 are only an example, and are not intended to limit the present application, and may be set as needed in specific implementation, which is not particularly limited in the present embodiment.
In some scenarios, two users (e.g., user 1 and user 3) may be too far apart to establish a Wi-Fi direct connection between their terminal devices, while another user (e.g., user 2) is close to both of them, so that the terminal device of user 2 can establish a Wi-Fi direct connection with each of their devices. In this case, user 1 and user 3 may communicate through the terminal device of user 2, which serves as an intermediate relay, thereby enabling communication over a longer distance.
This video call mode does not depend on base station signals and allows video calls anytime and anywhere, so it can be used in areas with a weak or absent mobile communication signal to better meet users' needs; in addition, it does not consume mobile data, saving traffic resources.
Sharing files based on Wi-Fi direct connection:
taking the terminal device aaa shown in (b) in fig. 13 as an example, as shown in (a) in fig. 16, the user may select the terminal device bbb to be accessed in the other device column 501; as shown in (b) of fig. 16, after selecting bbb, the user can access, edit, and manage the files on bbb. The accessible files may include the illustrated categories such as recent files, pictures, videos, audio, and documents, and each category may contain multiple folders; for example, as shown in the figure, the video category includes the all-videos, camera, screen-recording, download, WeChat, and QQ folders.
Fig. 17 is a schematic view of file sharing provided in an embodiment of the present application. As shown in fig. 17 (a), the user may select a target picture S to be shared on the first terminal device and click the sharing option 6 to open the sharing interface 60; as shown in fig. 17 (b), the sharing interface 60 may include sharing icons such as a WeChat icon 601, a Wi-Fi direct icon 602, a Bluetooth icon 603, and a QQ icon 604, and the user may select the Wi-Fi direct icon 602 to open the Wi-Fi direct device selection interface 70; as shown in (c) of fig. 17, in the Wi-Fi direct device selection interface 70, the user may select a target device bbb from the connected second terminal devices to share the target picture S with; as shown in (d) of fig. 17, after the user selects the target device bbb, the target device bbb may pop up a dialog box 80 asking whether to receive, and its user may click the reception option 801 to receive the target picture S or the rejection option 802 to decline it.
It can be understood that a user may also share a file from the second terminal device to the first terminal device, and the shared file is not limited to a picture: it may also be a video, audio, document, or other type of file. For ease of description, a picture is used as the example below. In addition, the target file to be shared may also be selected through other operations, such as a click or a long press.
To improve user convenience, an embodiment of the present application provides another file sharing method, which determines the target device mainly through a pointing operation, so that the user can quickly share a file to the terminal device being pointed at through a simple sharing operation.
It can be understood that after the first terminal device and the second terminal device are connected, the two terminals may share the file with each other, and for convenience of description, the file sharing from the first terminal device to the connected second terminal device is described as an example below.
In specific implementation, the first terminal device may detect a touch operation performed by the user on a target file and, in response, share the target file with a target device among the connected second terminal devices, where the pointing direction of the target device is opposite to that of the first terminal device.
Referring to fig. 18, a specific process of the first terminal device sharing the target file with the target device may be shown, where fig. 18 is a schematic view of a file sharing process provided in the embodiment of the present application, and as shown in fig. 18, the process may include the following steps:
s210, the first terminal device responds to the sharing operation of the user on the selected target file and broadcasts a first notification message corresponding to the target file.
Specifically, the sharing operation may be a touch operation or a gesture operation, where the touch operation may be a screen sliding operation, such as a sliding operation on a target area of a screen (e.g., a top of the screen), or other predefined touch operations such as a continuous click operation; the gesture operation may be a slide gesture or other predefined gestures, which are not particularly limited in the embodiments of the present application. The target file may include one or more files that the user may select by clicking and/or long-pressing.
Fig. 19 is a schematic view of an application scenario of file sharing provided in the embodiment of the present application. As shown in fig. 19, the scenario includes a mobile phone a2100, a mobile phone B2200, a large-screen device 2300, and a pad2400, where the mobile phone a2100 represents the first terminal device and has established Wi-Fi direct connections with the mobile phone B2200, the large-screen device 2300, and the pad2400 (the second terminal devices), respectively. As shown in fig. 19, the user may select a file on the mobile phone a2100, point the phone at the pad2400, and then share the file to the pad2400 through a sharing operation.
It can be understood that the user can also align the mobile phone a2100 with the large-screen device 2300 to share files with the large-screen device 2300; moreover, the mobile phone a2100, the mobile phone B2200, the large-screen device 2300, and the pad2400 may mutually discover each other, establish a Wi-Fi direct connection, and the user may also share a file to another connected terminal device through the pad2400 or the mobile phone B2200, where fig. 19 is only exemplarily illustrated by taking the case that the user shares the file to the pad2400 through the mobile phone a 2100. The following describes an interaction process between terminal devices by taking a scenario shown in fig. 19 as an example.
In a specific implementation, after the first terminal device detects the sharing operation of the user, a notification message (referred to as a first notification message) may be broadcast to notify the second terminal devices.
The first notification message may carry a file identifier of the target file, such as a file name (hereinafter, the file name is also used as an example for description), so as to notify the second terminal devices that the first terminal device is to share the target file.
The first notification message may also carry the pointing direction of the first terminal device so that the message can be identified. For example, if the user performs several sharing operations on the same target file in quick succession, each directed at a different second terminal device, the first terminal device broadcasts a corresponding first notification message for each operation, and the messages can be distinguished by the pointing direction of the first terminal device carried in each.
The first notification message may specifically be a broadcast message, and in order to save resources, the first notification message may also be a multicast message. Fig. 20 is a schematic view of address information of each message in file sharing according to an embodiment of the present application, and as shown in fig. 20, a destination MAC address in a first notification message may be a multicast MAC address, and a source MAC address is a MAC address of a mobile phone a.
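The contents of the first notification message can be sketched as follows. The field names, the dict representation, and the specific multicast MAC value are illustrative assumptions; the patent fixes only what the message carries (file identifier, sender pointing direction) and that the destination may be a multicast MAC address while the source is the sender's MAC.

```python
MULTICAST_MAC = "01:00:5e:00:00:01"  # example multicast MAC address (an assumption)

def build_first_notification(src_mac: str, file_name: str, pointing_deg: float) -> dict:
    """Assemble the first notification message broadcast on a sharing operation."""
    return {
        "dst_mac": MULTICAST_MAC,        # multicast rather than broadcast, to save resources
        "src_mac": src_mac,              # e.g. the MAC address of mobile phone a
        "file_name": file_name,          # file identifier of the target file
        "sender_pointing": pointing_deg, # distinguishes successive sharing operations
    }
```

Each second terminal device that receives such a message replies with its own pointing direction (and, optionally, its device location), echoing the file name so the sender can match responses to notifications.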
And S220, the second terminal equipment returns a response message corresponding to the first notification message.
Specifically, after receiving the first notification message, each second terminal device may return a response message to the first terminal device.
The response message may carry a pointing direction of the corresponding second terminal device, so that the first terminal device determines a pointing relationship with the second terminal device. Wherein the pointing direction can be measured by a direction sensor.
The response message may also carry the file name so that the first terminal device can determine which first notification message the second terminal device is replying to. For example, if the user performs several sharing operations in quick succession to share different target files, the first terminal device broadcasts a corresponding first notification message for each; each second terminal device may reply to every first notification message it receives, and the file name in the response tells the first terminal device which notification the reply corresponds to.
In addition, the response message may also carry the pointing direction of the first terminal device to identify the response further. For example, if the user performs several sharing operations on the same target file in quick succession, each directed at a different second terminal device, the first terminal device broadcasts a corresponding first notification message for each; after receiving them, a second terminal device may reply to each, with the response carrying both the file name and the first terminal device's pointing direction, so that the first terminal device knows which notification is being answered.
To improve the accuracy of data sharing, the target device may be required to lie within a preset range (referred to as a first preset range) around the first terminal device, and the response message may therefore also carry the device location of the corresponding second terminal device so that the first terminal device can determine the target device based on location. The preset range may be a spherical region centered on the first terminal device, i.e., the distance between the target device and the first terminal device must not exceed a preset distance; it may also be a cylindrical region centered on the first terminal device, i.e., the position of the target device relative to the first terminal device is limited separately in the horizontal and vertical directions. The specific size of the preset range may be set as needed and is not limited in this embodiment; the device location may be obtained through a Global Positioning System (GPS) module and/or a mapping application.
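The two shapes of the first preset range can be expressed as a simple geometric test. This is a sketch under stated assumptions: the coordinate convention (x, y horizontal, z vertical, in metres) and the threshold values are illustrative, since the patent leaves the size of the range to the implementation.

```python
import math

def in_first_preset_range(first_pos, second_pos, max_dist=10.0,
                          max_horiz=10.0, max_vert=3.0, spherical=True):
    """Check whether second_pos lies within the first preset range around
    first_pos. Spherical: a single distance limit. Cylindrical: separate
    horizontal-radius and vertical-offset limits. Thresholds are assumed."""
    dx, dy, dz = (s - f for f, s in zip(first_pos, second_pos))
    if spherical:
        return math.sqrt(dx * dx + dy * dy + dz * dz) <= max_dist
    # Cylindrical region: limit horizontal radius and vertical offset separately.
    return math.hypot(dx, dy) <= max_horiz and abs(dz) <= max_vert
```

The cylindrical variant matters when devices sit on different floors: a device directly overhead can be within spherical distance yet excluded by the vertical limit.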
As shown in fig. 20, after the mobile phone B2200 receives the first notification message, the destination MAC address in the returned response message is the MAC address of the mobile phone a, and the source MAC address is the MAC address of the mobile phone B; after the large-screen device 2300 receives the first notification message, the destination MAC address in the returned response message is the MAC address of the mobile phone a, and the source MAC address is the MAC address of the large-screen device; after the pad2400 receives the first notification message, the destination MAC address in the returned response message is the MAC address of the mobile phone a, and the source MAC address is the MAC address of the pad.
And S230, after the first terminal equipment receives the response messages returned by the second terminal equipment, determining target equipment from the second terminal equipment according to the pointing direction of the first terminal equipment corresponding to the sharing operation and the pointing direction of the second terminal equipment in the response messages.
Specifically, after receiving the response messages returned by the second terminal devices, the first terminal device may determine its pointing relationship with each second terminal device based on its own pointing direction at the time the sharing operation on the target file was received and the pointing direction carried in each response message, and take the second terminal device whose pointing direction is opposite to its own as the target device. The first terminal device's pointing direction may be recorded when the sharing operation is received; alternatively, if the response message carries the first terminal device's pointing direction, the pointing relationship may be determined directly from that value and the second terminal device's pointing direction in the response message.
If the response message carries the device location of the corresponding second terminal device, then when determining the target device, in addition to the pointing relationship described above, the first terminal device may also determine the positional relationship with each second terminal device from the two device locations, and determine a second terminal device as the target device only when its location lies within the preset range centered on the first terminal device's location and its pointing direction is opposite to that of the first terminal device. To improve the accuracy of the determined positional relationship, it may be corrected using a ranging technique such as Round Trip Time (RTT) ranging.
When determining whether the pointing direction of the second terminal device is opposite to the pointing direction of the first terminal device, in order to improve the accuracy of the determination result, the determination may be performed in combination with the device positions of the first terminal device and the second terminal device. Fig. 21 is a schematic diagram of determining a location range of a target device according to an embodiment of the present application, and as shown in fig. 21, a point O is a device location of a first terminal device, and if a second terminal device is located on a pointing line of the first terminal device and a pointing direction of the second terminal device is consistent with a reverse direction of the pointing direction of the first terminal device, it may be considered that the pointing direction of the second terminal device is opposite to the pointing direction of the first terminal device, where the pointing line of the first terminal device extends along the pointing direction of the first terminal device with the device location of the first terminal device as a starting point.
For user convenience, a certain pointing error may be allowed when determining whether the pointing orientation of the second terminal device is opposite to the pointing orientation of the first terminal device. As shown in fig. 21, if the second terminal device is located in a preset range (referred to as a second preset range) with the pointing line of the first terminal device as a center line, and an orientation angle between the pointing direction and the opposite direction of the pointing direction of the first terminal device is within a preset angle range, the pointing direction of the second terminal device may be considered to be opposite to the pointing direction of the first terminal device. The second preset range may be a cylindrical area or a square columnar area, that is, the position of the target device relative to the pointing line of the first terminal device may be limited in the horizontal and vertical directions. The size of the second preset range and the size of the preset angle range may be set according to the requirement, which is not particularly limited in this embodiment. It should be noted that fig. 21 illustrates the position range of the target device in the horizontal direction as an example, and the position range in the vertical direction is not shown.
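The tolerance-based "opposite pointing" test described above can be sketched as follows. The angles are compass bearings in degrees, and the 15° default tolerance matches the worked example later in the text; the function names and degree-based representation are assumptions for illustration.

```python
def angle_diff(a: float, b: float) -> float:
    """Smallest absolute difference between two compass bearings (degrees)."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def pointing_opposite(first_bearing: float, second_bearing: float,
                      tolerance: float = 15.0) -> bool:
    """The second device counts as pointing opposite to the first if the
    angle between its bearing and the reverse of the first device's bearing
    is within the preset tolerance."""
    reverse = (first_bearing + 180.0) % 360.0
    return angle_diff(second_bearing, reverse) <= tolerance
```

A full target-device decision would combine this angular test with the first and second preset-range checks on the device positions.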
Continuing with fig. 19 as an example, for convenience of explanation, assume that the device positions of the terminal devices are on the same horizontal plane, the preset angle range is 15°, and due north is 0°. As shown in fig. 19, the pointing direction of the mobile phone a2100 is 30° east of due north (i.e., 30°), the pointing direction of the mobile phone B2200 is 35° west of due south (i.e., 215°), the pointing direction of the large-screen device 2300 is due south (i.e., 180°), and the pointing direction of the pad2400 is 40° west of due south (i.e., 220°). The large-screen device 2300 and the pad2400 are located within the first preset range around the mobile phone a2100, while the mobile phone B2200 is outside it; the mobile phone B2200 and the pad2400 are located within the second preset range centered on the pointing line of the mobile phone a2100, while the large-screen device 2300 is outside it. For simplicity, the first preset range is shown only in the upper half of the figure.
Based on the above information, the pad2400 is located within the first preset range around the mobile phone a2100; the reverse of the pointing direction of the mobile phone a2100 is 30° west of due south (i.e., 210°); the azimuth angle (10°) between the pointing direction of the pad2400 (220°) and that reverse direction (210°) is smaller than the preset angle range (15°); and the pad2400 is located within the second preset range centered on the pointing line of the mobile phone a2100. Therefore, the mobile phone a2100 can determine the pad2400 as the target device and share the picture a with it.
An azimuth angle (5 °) between the pointing direction (215 °) of the cell phone B2200 and the reverse direction (210 °) of the pointing direction of the cell phone a2100 is smaller than a preset angle range (15 °), and the cell phone B2200 is located within a second preset range with the pointing line of the cell phone a2100 as the center line, but the cell phone B2200 is located outside the first preset range around the cell phone a2100, and therefore, the cell phone B2200 is not determined as the target device of the cell phone a 2100.
The large-screen device 2300 is located within a first preset range around the cell phone a2100, however, the large-screen device 2300 is located outside a second preset range with the pointing line of the cell phone a2100 as the center line, and an azimuth angle (30 °) between the pointing direction (180 °) of the large-screen device 2300 and the reverse direction (210 °) of the pointing direction of the cell phone a2100 is larger than a preset angle range (15 °), and thus, the large-screen device 2300 is not determined as a target device of the cell phone a 2100.
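The two-condition determination walked through above (the candidate must lie within the first preset range, and its pointing direction must be within the preset angle range of the reverse of the requester's pointing direction) can be sketched as follows. This is a minimal illustration assuming flat 2D coordinates and azimuths in degrees clockwise from due north; the function names and distance threshold are hypothetical, not taken from the patent text.

```python
import math

def angle_diff(a, b):
    """Smallest absolute difference between two azimuths, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def is_target_device(req_pos, req_heading, cand_pos, cand_heading,
                     max_distance=5.0, max_angle=15.0):
    """Return True if the candidate satisfies both preset-range conditions.

    req_pos/cand_pos: (x, y) positions on the same horizontal plane.
    req_heading/cand_heading: pointing directions, degrees clockwise from north.
    max_distance: illustrative radius of the first preset range.
    max_angle: the preset angle range (15 degrees in the fig. 19 example).
    """
    dx = cand_pos[0] - req_pos[0]
    dy = cand_pos[1] - req_pos[1]
    within_first_range = math.hypot(dx, dy) <= max_distance
    reverse = (req_heading + 180) % 360  # reverse of requester's pointing direction
    aligned = angle_diff(cand_heading, reverse) <= max_angle
    return within_first_range and aligned
```

With the fig. 19 numbers (requester heading 30°, reverse 210°), a nearby peer pointing at 220° qualifies (azimuth 10° < 15°), while a nearby peer pointing due south (180°) is rejected (azimuth 30°), matching the walkthrough above.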
It can be understood that the above target device determining process may also be executed in the second terminal device, that is, when the first terminal device sends the first notification message to the second terminal device, the first terminal device may carry the device position and the pointing direction of the first terminal device therein, and after receiving the first notification message, the second terminal device may determine whether the local terminal is the target device according to the location and the pointing direction, and then return the determination result to the first terminal device.
If the target device is not determined, the first terminal device may prompt the user that the target device is absent, where the prompt mode may be a text prompt or a voice prompt.
And S240, if the first terminal device determines the target device, sending a second notification message corresponding to the target file to the target device.
And if the first terminal equipment determines the target equipment, the first terminal equipment can share the target file with the target equipment. In order to save resources, the target device may first check whether the target file is stored locally, and then request resources from the first terminal device if the target file is not stored.
In specific implementation, after the target device is determined, the first terminal device may send a notification message (referred to as a second notification message) to the target device, where the second notification message may carry a file name of the target file to notify the target device of the target file to be shared, so that the target device may perform subsequent viewing operation.
And S250, if the target device receives the second notification message, checking whether the target file is stored or not.
After receiving the second notification message, the target device may check whether the target file is stored locally according to the file name carried in the second notification message.
If the target equipment determines that the target file is locally stored, the target file can be loaded and displayed for a user to view; or prompt information can be displayed to prompt the user that the target file to be shared by the first terminal device is stored locally.
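The local check in steps S250 to S260 (look up the file name carried in the second notification message; display the stored copy if present, otherwise fall back to a resource request) might be sketched as below. The share directory and return values are assumptions for illustration, not the patent's actual message handling.

```python
import os

def handle_second_notification(file_name, share_dir="received_files"):
    """Target-device reaction to a second notification message (S250/S260 sketch).

    Returns ("display", path) when the file is already stored locally,
    or ("request_resource", file_name) when a resource request must be
    sent back to the first terminal device.
    """
    path = os.path.join(share_dir, file_name)
    if os.path.isfile(path):
        return ("display", path)  # load and display the stored copy
    return ("request_resource", file_name)  # proceed to S260
```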
And S260, if the target device determines that the target file is not stored, sending a resource request message corresponding to the target file to the first terminal device.
If the target device determines that the target file is not stored locally, the target device may send a resource request message to the first terminal device to request the first terminal device to transmit the target file. The resource request message may carry a file name of the target file, so that the first terminal device can identify the target file to be sent.
Optionally, the target device may pop up a dialog box, as shown in (d) of fig. 17, to ask the user whether to receive the target file. Alternatively, to facilitate user operation, the target device may not pop up the dialog box; a user who does not want to receive the target file may simply not align the second terminal device with the first terminal device. In this way, the scheme can also be applied directly to terminal devices on which user operation is inconvenient, such as large-screen devices, thereby expanding the applicable range of file sharing and unifying the file sharing schemes among various terminal devices.
And S270, after receiving the resource request message, the first terminal device sends the target file to the target device.
S280, the target device receives the target file.
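The S240 to S280 exchange described above can be walked through with a small in-memory sketch. The class names and message dictionaries are hypothetical; a real implementation would carry these messages over the established Wi-Fi direct connection.

```python
class FirstTerminal:
    """Sender side of the exchange (steps S240 and S270)."""
    def __init__(self, files):
        self.files = files  # {file_name: content}

    def second_notification(self, file_name):
        # S240: notify the target device of the file to be shared
        return {"type": "notify", "file": file_name}

    def handle_resource_request(self, msg):
        # S270: transmit the requested target file
        return {"type": "file", "name": msg["file"], "data": self.files[msg["file"]]}

class TargetDevice:
    """Receiver side of the exchange (steps S250, S260, S280)."""
    def __init__(self, files):
        self.files = dict(files)

    def handle_notification(self, msg):
        # S250/S260: request the resource only if not stored locally
        if msg["file"] in self.files:
            return None  # already stored: load and display it
        return {"type": "resource_request", "file": msg["file"]}

    def receive_file(self, msg):
        # S280: store the received target file
        self.files[msg["name"]] = msg["data"]

sender = FirstTerminal({"picture_a.jpg": b"..."})
target = TargetDevice({})
notify = sender.second_notification("picture_a.jpg")
req = target.handle_notification(notify)
if req is not None:
    target.receive_file(sender.handle_resource_request(req))
```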
Fig. 22 is another schematic view of file sharing provided in an embodiment of the present application, illustrated by taking an example in which the mobile phone a2100 shares the picture a (target file) with the pad2400 (target device). As shown in (a) of fig. 22, after the user opens the picture a, the picture is shared with the aligned pad2400 by a slide-up operation toward the top of the screen; as shown in (b) of fig. 22, after the pad2400 receives the picture a, the picture a may be loaded and displayed for the user to view. Of course, the pad2400 may also display a prompt message to notify the user that the target file has been received.
Fig. 23 is a schematic view of still another file sharing provided in an embodiment of the present application, illustrated by taking an example in which the mobile phone a2100 shares multiple pictures (target files) with the pad2400 (target device). As shown in (a) of fig. 23, after opening the picture selection interface by a long-press operation, the user taps to select the picture a, the picture B, and the picture C, and then shares them with the aligned pad2400 by a slide-up operation toward the top of the screen; as shown in (b) of fig. 23, after the pad2400 receives the picture a, the picture B, and the picture C, the last received picture, such as the picture C, may be loaded and displayed for the user to view. Similarly, the pad2400 may display a prompt to notify the user that the target files have been received.
It will be appreciated by those skilled in the art that the above embodiments are exemplary and not intended to limit the present application. Where possible, the order of execution of one or more of the above steps may be adjusted, or steps may be selectively combined, to arrive at one or more other embodiments. The skilled person may select any combination of the above steps as needed, and any combination that does not depart from the essence of the scheme of the present application falls within its protection scope.
The foregoing describes the process in which the first terminal device shares a target file (referred to as a first target file) with the second terminal device. As noted above, the second terminal device may likewise share a file (referred to as a second target file) with the first terminal device; the sharing process is similar to that of the first terminal device sharing the first target file with the second terminal device and is not repeated here.
As an optional implementation manner, before file sharing is performed between the first terminal device and the second terminal device, a Wi-Fi direct connection may also be established by means of direction filtering. That is, the filtering information carried in the probe message may include the position and pointing direction of the terminal device, and when the first terminal device searches for available Wi-Fi direct devices, it may use the target-device determination method described above to find a second terminal device that meets the local filtering condition. The second terminal device with which the connection is established is then the target device, so after the connection is established, the target-device determination steps (i.e., steps S210 to S230) need not be executed, and the target file can be shared directly with the connected second terminal device.
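The direction-filtering variant described above might look like the following sketch, in which each probe message is assumed to carry the sender's device position and pointing direction, and the searching device keeps only peers that satisfy the same distance-plus-azimuth condition used for target determination. The probe fields and thresholds are assumptions for illustration.

```python
import math

def filter_probes(my_pos, my_heading, probes, max_distance=5.0, max_angle=15.0):
    """Keep only probe senders that qualify under the direction filter.

    probes: list of dicts with "id", "pos" (x, y), and "heading" in degrees
    clockwise from due north (hypothetical probe-message fields).
    """
    reverse = (my_heading + 180) % 360  # reverse of the searcher's pointing direction
    matches = []
    for p in probes:
        dist = math.hypot(p["pos"][0] - my_pos[0], p["pos"][1] - my_pos[1])
        diff = abs(p["heading"] - reverse) % 360
        diff = min(diff, 360 - diff)  # smallest azimuth difference
        if dist <= max_distance and diff <= max_angle:
            matches.append(p["id"])
    return matches
```

A connection would then be established only with a device in the returned list, making the later target-determination steps unnecessary.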
In addition, the file sharing method may also be applied to other wireless communication scenarios, that is, the wireless communication connection established between the first terminal device and the second terminal device is not limited to Wi-Fi direct connection, and may also be bluetooth or other near field communication connection.
According to the wireless communication method, the target device is automatically determined through the alignment operation, so that a user can quickly share files with the aligned terminal device through simple sharing operation, and the convenience of the user in sharing the files can be effectively improved.
Based on the same inventive concept, as an implementation of the foregoing method, an embodiment of the present application provides a wireless communication apparatus, where the apparatus embodiment corresponds to the foregoing method embodiment, and for convenience of reading, details in the foregoing method embodiment are not repeated in this apparatus embodiment one by one, but it should be clear that the apparatus in this embodiment can correspondingly implement all the contents in the foregoing method embodiment.
Fig. 24 is a schematic structural diagram of a wireless communication apparatus according to an embodiment of the present application, where the apparatus may be applied to a terminal device according to the foregoing embodiment. As shown in fig. 24, the apparatus provided in this embodiment may include: an input module 201, a communication module 202, a processing module 203 and a display module 204.
The input module 201 is configured to receive input from a user on a display interface of the terminal device, such as touch input, voice input, or gesture input. The input module 201 supports the terminal device in performing the processes related to receiving user operations in steps S110 and S210 in the above method embodiments, and/or other processes for the technology described herein. The input module may be a touch screen, other hardware, or a combination of hardware and software.
The communication module 202 is used to support the terminal device to perform operations related to message transmission in steps S110, S111 to S114, S210 to S280 in the above-described method embodiments and/or other processes for the techniques described herein.
The processing module 203 is used to support the terminal device to perform the operations related to data processing in steps S110, S120, S111 to S114, S210 to S280 and/or other processes for the techniques described herein in the above-described method embodiments.
The display module 204 is used to support the terminal device to perform the operations related to the interface display in steps S120, S210, S280 in the above method embodiments and/or other processes for the technology described herein.
The apparatus provided in this embodiment may perform the above method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
An embodiment of the present application further provides a terminal device, and fig. 25 is a schematic structural diagram of the terminal device provided in the embodiment of the present application.
The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the terminal device 100. In other embodiments of the present application, terminal device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may serve as the neural center and command center of the terminal device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the terminal device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture function of terminal device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the terminal device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device 100, and may also be used to transmit data between the terminal device 100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other terminal devices, such as AR devices and the like.
It should be understood that the interface connection relationship between the modules according to the embodiment of the present invention is only an exemplary illustration, and does not limit the structure of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the terminal device 100. The charging management module 140 may also supply power to the terminal device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in terminal device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the terminal device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the terminal device 100 can communicate with the network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time division-synchronous code division multiple access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou satellite navigation system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The terminal device 100 implements a display function by the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the terminal device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The terminal device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the terminal device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the terminal device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, a phonebook, etc.) created during use of the terminal device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The terminal device 100 may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The terminal device 100 can listen to music through the speaker 170A, or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the terminal device 100 answers a call or voice information, it is possible to answer a voice by bringing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The terminal device 100 may be provided with at least one microphone 170C. In other embodiments, the terminal device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the terminal device 100 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement a directional recording function.
The headset interface 170D is used to connect a wired headset. The headset interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensor 180A, such as a resistive pressure sensor, an inductive pressure sensor, and a capacitive pressure sensor. The capacitive pressure sensor may include at least two parallel plates made of an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the terminal device 100 determines the pressure intensity from the change in capacitance. When a touch operation acts on the display screen 194, the terminal device 100 detects the intensity of the touch operation based on the pressure sensor 180A, and may also calculate the touch position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but have different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on an SMS application icon, an instruction for viewing an SMS message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction for creating a new SMS message is executed.
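The threshold comparison described above can be sketched in a few lines. This is only an illustration, not code from the patent: the function names and the numeric threshold are assumptions, and a real implementation would read the capacitance from the sensor driver.

```python
# Hypothetical sketch of the pressure-threshold dispatch described above.
# The threshold value and all names are illustrative assumptions.

FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure units (assumed)

def pressure_from_capacitance(c_touch, c_rest, sensitivity=1.0):
    """Estimate touch pressure from the capacitance change between the
    parallel plates (a larger change means a stronger press)."""
    return sensitivity * max(0.0, c_touch - c_rest)

def dispatch_sms_icon_touch(pressure):
    """Map the touch intensity on the SMS icon to an operation instruction."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"       # light press: view the message
    return "create_new_sms"     # firm press: create a new message
```

The same pattern extends to any number of pressure tiers by adding further thresholds.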
The gyro sensor 180B may be used to determine the motion attitude of the terminal device 100. In some embodiments, the angular velocities of the terminal device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for image stabilization during photographing. Illustratively, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the terminal device 100, calculates the distance for which the lens module needs to compensate according to the shake angle, and lets the lens cancel the shake of the terminal device 100 through a reverse movement, thereby achieving image stabilization. The gyro sensor 180B may also be used in navigation and motion-sensing game scenarios.
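The patent does not give the compensation formula, but one plausible small-angle model is that a shake of angle θ shifts the image by roughly f·tan(θ) for a lens of focal length f. The sketch below is a hedged illustration under that assumption:

```python
import math

# Illustrative sketch of the compensation-distance calculation described
# above; the small-angle model (shift ~ f * tan(theta)) is an assumption,
# not a formula taken from the patent.

def lens_compensation_mm(shake_angle_deg, focal_length_mm):
    """Distance the lens module should move (in mm) to cancel a shake of
    the given angle, for a lens of the given focal length."""
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))
```

The lens is then driven by this distance in the direction opposite to the detected shake.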
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the terminal device 100 calculates an altitude from the air pressure measured by the air pressure sensor 180C to assist in positioning and navigation.
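One common way to realize this altitude calculation is the international barometric formula; the patent does not specify which formula is used, so the following is only an illustrative sketch under a standard-atmosphere assumption.

```python
# Hedged sketch: altitude from measured air pressure via the international
# barometric formula. The constants assume the standard atmosphere; the
# patent itself does not specify a formula.

SEA_LEVEL_HPA = 1013.25  # standard sea-level pressure in hPa

def altitude_from_pressure(pressure_hpa, sea_level_hpa=SEA_LEVEL_HPA):
    """Return the approximate altitude in meters for a measured pressure."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```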
The magnetic sensor 180D includes a Hall sensor. The terminal device 100 may detect the opening and closing of a flip leather case using the magnetic sensor 180D. In some embodiments, when the terminal device 100 is a flip phone, the terminal device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flipping open according to the detected open or closed state of the leather case or the flip cover.
The acceleration sensor 180E can detect the magnitude of acceleration of the terminal device 100 in various directions (generally along three axes), and can detect the magnitude and direction of gravity when the terminal device 100 is stationary. It can also be used to recognize the posture of the terminal device, and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure a distance. The terminal device 100 may measure the distance by infrared or laser. In some embodiments, in a shooting scenario, the terminal device 100 may measure the distance using the distance sensor 180F to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The terminal device 100 emits infrared light outward through the light emitting diode and detects infrared reflected light from a nearby object using the photodiode. When sufficient reflected light is detected, the terminal device 100 can determine that there is an object nearby; when insufficient reflected light is detected, the terminal device 100 can determine that there is no object nearby. The terminal device 100 can use the proximity light sensor 180G to detect that the user is holding the terminal device 100 close to the ear for a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in a leather-case mode or a pocket mode to automatically unlock and lock the screen.
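The sufficient/insufficient reflected-light decision above reduces to a threshold test. In the minimal sketch below, the threshold value and function names are assumptions for illustration; a real driver would read the photodiode through the sensor hub.

```python
# Minimal sketch of the reflected-light decision described above. The
# threshold value and all names are assumptions, not from the patent.

REFLECTION_THRESHOLD = 100  # photodiode reading (ADC counts, assumed)

def object_nearby(reflected_light):
    """True when enough infrared reflection is detected, i.e. an object
    is near the terminal device."""
    return reflected_light >= REFLECTION_THRESHOLD

def should_turn_off_screen(in_call, reflected_light):
    """Turn the screen off when the user holds the device to the ear
    during a call, saving power and preventing accidental touches."""
    return in_call and object_nearby(reflected_light)
```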
The ambient light sensor 180L is used to sense the ambient light level. The terminal device 100 may adaptively adjust the brightness of the display screen 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the terminal device 100 is in a pocket, in order to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The terminal device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint-based photographing, fingerprint-based call answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the terminal device 100 executes a temperature processing policy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the terminal device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the terminal device 100 heats the battery 142 to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below a further threshold, the terminal device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
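The tiered policy above can be sketched as a simple threshold table. The numeric thresholds are invented for illustration; the patent only speaks of "a threshold", "another threshold", and "a further threshold".

```python
# Sketch of the tiered temperature policy described above. All numeric
# thresholds are illustrative assumptions, not values from the patent.

HIGH_TEMP_C = 45.0       # throttle the processor above this (assumed)
HEAT_BATTERY_C = 0.0     # heat the battery below this (assumed)
BOOST_VOLTAGE_C = -10.0  # boost the battery output voltage below this (assumed)

def thermal_actions(temp_c):
    """Return the protective actions for a reported temperature, in the
    order the embodiments describe them."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("reduce_cpu_performance")
    if temp_c < HEAT_BATTERY_C:
        actions.append("heat_battery")
    if temp_c < BOOST_VOLTAGE_C:
        actions.append("boost_battery_voltage")
    return actions
```

Note that the two low-temperature measures are not exclusive: at a sufficiently low temperature both battery heating and voltage boosting apply.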
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and together they form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it, and may pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the terminal device 100 at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of a bone mass vibrated by the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, forming a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal, acquired by the bone conduction sensor 180M, of the bone mass vibrated by the vocal part, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The terminal device 100 may receive a key input and generate a key signal input related to user settings and function control of the terminal device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects, and touch operations acting on different areas of the display screen 194 may also correspond to different vibration feedback effects. Different application scenarios (e.g., time reminders, receiving messages, alarm clocks, games) may likewise correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate a charging state and a battery level change, or to indicate a message, a missed call, a notification, and the like.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be connected to or disconnected from the terminal device 100 by inserting it into or pulling it out of the SIM card interface 195. The terminal device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a standard SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time, and the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The terminal device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the terminal device 100 employs an eSIM, namely an embedded SIM card, which may be embedded in the terminal device 100 and cannot be separated from it.
The terminal device provided in this embodiment may execute the foregoing method embodiments; the implementation principles and technical effects are similar, and are not described herein again.
Embodiments of the present application further provide a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the method described in the foregoing method embodiments.
An embodiment of the present application further provides a computer program product which, when run on a terminal device, enables the terminal device to implement the method described in the foregoing method embodiments.
An embodiment of the present application further provides a chip system, which includes a processor coupled to a memory. The processor executes a computer program stored in the memory to implement the method in the foregoing method embodiments. The chip system may be a single chip or a chip module consisting of a plurality of chips.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium, or transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or a data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. The storage medium may include various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/device and method may be implemented in other ways. For example, the above-described apparatus/device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In the description of the present application, "/" indicates an "or" relationship between the associated objects; for example, A/B may indicate A or B. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate the following cases: only A exists, both A and B exist, or only B exists, where A and B may be singular or plural.
Also, in the description of the present application, "a plurality" means two or more than two unless otherwise specified. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of singular or plural items. For example, at least one of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (19)

1. A wireless communication method applied to a first terminal device is characterized by comprising the following steps:
broadcasting a probe request message carrying filtering information of the first terminal device;
and if a probe response message carrying filtering information of a second terminal device is received and the filtering information of the first terminal device matches the filtering information of the second terminal device, determining that a wireless fidelity (Wi-Fi) point-to-point connection between the first terminal device and the second terminal device is established.
2. The method of claim 1, further comprising:
if a probe request message carrying filter information of a third terminal device is received and the filter information of the first terminal device is matched with the filter information of the third terminal device, determining that wireless fidelity point-to-point connection between the first terminal device and the third terminal device is established, and sending a probe response message carrying the filter information of the first terminal device to the third terminal device.
3. The method of claim 1, wherein the filtering information comprises at least one of: account information, group information, user input information, and near field communication identification information, wherein the input information comprises text information or voice information.
4. The method of claim 1, further comprising:
generating a virtual Internet Protocol (IP) address of the first terminal device and a virtual IP address of a target terminal device, wherein the target terminal device is a terminal device which is in wireless fidelity point-to-point connection with the first terminal device;
and carrying out data transmission with the target terminal equipment based on the virtual IP address.
5. The method according to claim 4, wherein during the data transmission, neither the first message sent by the first terminal device to the target terminal device nor the second message received from the target terminal device includes IP address information;
in the process that the first terminal device processes the second message, adding virtual IP address information to a data packet corresponding to the second message transmitted to a network layer, wherein a destination IP address in the virtual IP address information is a virtual IP address of the first terminal device, and a source IP address in the virtual IP address information is a virtual IP address of the target terminal device that sends the second message.
6. The method according to any one of claims 1-5, further comprising:
before broadcasting the probe request message carrying the filtering information of the first terminal device, displaying a first interface in response to a trigger operation performed by a user on a video call function;
responding to filtering information setting operation performed by a user on the first interface, and determining filtering information of the first terminal equipment;
after the wireless fidelity point-to-point connection is established, transmitting the collected video image to a target terminal device, wherein the target terminal device is a terminal device which is in wireless fidelity point-to-point connection with the first terminal device;
and receiving and displaying the video image collected by the target terminal equipment.
7. The method according to claim 6, wherein the displaying the video image captured by the target terminal device comprises:
if there are a plurality of target terminal devices, displaying the video image captured by one of the target terminal devices in a main window, and displaying the video images captured by the other target terminal devices in floating sub-windows, wherein the size of the main window is larger than that of the sub-windows.
8. The method according to any one of claims 1 to 5, wherein the filtering information includes pointing directions of devices, and the pointing directions of any two terminal devices matched with the filtering information are opposite to each other, the method further comprising:
the method comprises the steps of sharing a first target file to target terminal equipment and/or receiving a second target file shared by the target terminal equipment, wherein the target terminal equipment is the terminal equipment which is in wireless fidelity point-to-point connection with the first terminal equipment.
9. A wireless communication method applied to a first terminal device is characterized by comprising the following steps:
establishing a wireless communication connection with at least one second terminal device;
responding to a sharing operation of a user on a first target file, and broadcasting a first notification message corresponding to the first target file;
receiving response messages sent by the at least one second terminal device, wherein each received response message carries the pointing direction of the corresponding second terminal device;
determining a target device from the at least one second terminal device according to the pointing direction of the first terminal device and the pointing directions of the second terminal devices, wherein the pointing direction of the target device is opposite to that of the first terminal device;
and if the target device is determined, sharing the first target file with the target device.
10. The method according to claim 9, wherein the sharing operation is a touch operation or a gesture operation.
11. The method of claim 9, wherein the target device is located within a first predetermined range around the first terminal device.
12. The method according to claim 11, wherein each received reply message further carries a device location of a corresponding second terminal device, and the determining a target device from the at least one second terminal device according to the pointing direction of the first terminal device and the pointing directions of the second terminal devices comprises:
for each received response message, if the device position in the response message is located in a first preset range centered on the device position of the first terminal device, and the pointing direction in the response message is opposite to the pointing direction of the first terminal device, determining the second terminal device sending the response message as the target device.
13. The method of claim 9, wherein the sharing the first target file to the target device comprises:
sending a second notification message corresponding to the first target file to the target device;
and if a resource request message for requesting the first target file returned by the target equipment is received, transmitting the first target file to the target equipment.
14. The method of claim 9, further comprising:
and if no target device is determined, prompting that no target device exists.
15. The method according to claim 9, wherein the pointing direction of the target device is determined to be opposite to the pointing direction of the first terminal device when the target device is located within a second preset range whose center line is the pointing line of the first terminal device and the azimuth angle between the pointing direction of the target device and the reverse of the pointing direction of the first terminal device is within a preset angle range.
16. The method according to any one of claims 9-15, further comprising:
if a first notification message corresponding to a second target file is received, sending a response message to a second terminal device sending the first notification message, wherein the sent response message carries the pointing direction of the first terminal device;
if a second notification message corresponding to the second target file is received, sending a resource request message for requesting the second target file to a second terminal device sending the second notification message under the condition that the second target file is not stored;
and receiving the second target file.
17. A terminal device, comprising: a memory for storing a computer program and a processor; the processor is adapted to perform the method of any of claims 1-16 when the computer program is invoked.
18. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1-16.
19. A chip system, comprising a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the method of any one of claims 1-16.
CN202011063160.2A 2020-09-30 2020-09-30 Wireless communication method and terminal device Pending CN114339709A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011063160.2A CN114339709A (en) 2020-09-30 2020-09-30 Wireless communication method and terminal device
PCT/CN2021/116120 WO2022068513A1 (en) 2020-09-30 2021-09-02 Wireless communication method and terminal device

Publications (1)

Publication Number Publication Date
CN114339709A true CN114339709A (en) 2022-04-12

Family

ID=80949612

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011063160.2A Pending CN114339709A (en) 2020-09-30 2020-09-30 Wireless communication method and terminal device

Country Status (2)

Country Link
CN (1) CN114339709A (en)
WO (1) WO2022068513A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114928900B (en) * 2022-07-18 2022-10-04 荣耀终端有限公司 Method and apparatus for transmission over a WiFi direct connection
CN115002939B (en) * 2022-07-18 2022-10-04 荣耀终端有限公司 Method and device for joining WiFi group
CN116684517B (en) * 2022-09-29 2024-06-14 荣耀终端有限公司 Method and device for sending response message
CN117615466B (en) * 2023-01-04 2024-05-03 广州星际悦动股份有限公司 Connection control method, device, equipment and medium for oral care equipment and terminal

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
CN105682182B (en) * 2014-11-19 2019-05-31 中国移动通信集团公司 A kind of discovery of equipment and equipment connection method, equipment and system
CN111328051B (en) * 2020-02-25 2023-08-29 上海银基信息安全技术股份有限公司 Digital key sharing method and device, electronic equipment and storage medium

Patent Citations (7)

Publication number Priority date Publication date Assignee Title
CN105979307A (en) * 2015-11-06 2016-09-28 乐视致新电子科技(天津)有限公司 One-key connection method and system between mobile terminal and display device
CN108377286A (en) * 2016-10-28 2018-08-07 中兴通讯股份有限公司 A kind of method and device of data transmission
CN106535301A (en) * 2016-12-30 2017-03-22 珠海赛纳打印科技股份有限公司 Method, device and system for establishing communication connection
WO2018121234A1 (en) * 2016-12-30 2018-07-05 珠海赛纳打印科技股份有限公司 Method, device and system for establishing communication connection
WO2020042119A1 (en) * 2018-08-30 2020-03-05 华为技术有限公司 Message transmission method and device
CN111314400A (en) * 2018-12-11 2020-06-19 中兴通讯股份有限公司 Vehicle data processing method and device, computer equipment and storage medium
CN110995665A (en) * 2019-11-15 2020-04-10 北京小米移动软件有限公司 Network distribution method and device, electronic equipment and storage medium

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN114826591A (en) * 2022-05-26 2022-07-29 京东方科技集团股份有限公司 Cross-device data transmission method, system and terminal
WO2023226683A1 (en) * 2022-05-26 2023-11-30 京东方科技集团股份有限公司 Cross-device data transmission method and system, and terminal

Also Published As

Publication number Publication date
WO2022068513A1 (en) 2022-04-07

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination