CN111666055B - Data transmission method and device

Data transmission method and device

Info

Publication number
CN111666055B
CN111666055B
Authority
CN
China
Prior art keywords
terminal device
data type
terminal
cross
display
Prior art date
Legal status
Active
Application number
CN202010333906.0A
Other languages
Chinese (zh)
Other versions
CN111666055A (en)
Inventor
周星辰
杜仲
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Priority to CN202010333906.0A (CN111666055B)
Application filed by Huawei Technologies Co Ltd
Priority to CN202111463198.3A (CN114356198A)
Priority to CN202111463143.2A (CN114356197A)
Publication of CN111666055A
Priority to CN202080100104.3A (CN115516413A)
Priority to US17/920,867 (US20240053879A1)
Priority to PCT/CN2020/142420 (WO2021212922A1)
Priority to EP20932244.5A (EP4130963A4)
Application granted
Publication of CN111666055B
Legal status: Active
Anticipated expiration

Classifications

    • G: Physics
    • G06: Computing; Calculating or Counting
    • G06F: Electric Digital Data Processing
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1454: involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486: Drag-and-drop
    • G06F 3/0487: using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a data transmission method and device. The method includes: when a first terminal device detects a first cross-device transmission operation for an object on the first terminal device, obtaining the data type of the object, where the first cross-device transmission operation is used to initiate a process of transmitting the object to a second terminal device; sending a judgment request carrying the data type to the second terminal device; receiving a judgment result made by the second terminal device based on the data type, where the judgment result indicates whether the object can be transmitted to the second terminal device; and displaying the judgment result. With the data transmission method and device of the embodiments of the application, intuitive visual feedback can be provided to the user according to how well the data type of the object matches the receiving end, erroneous and repeated operations are avoided, and operation efficiency is improved.

Description

Data transmission method and device
Technical Field
The present application relates to the field of communications technologies, and in particular, to a data transmission method and apparatus.
Background
For example, when data is transmitted by dragging between a mobile phone and a computer that run the same operating system, a dedicated application must be used on the computer to log in to the same account as the one logged in on the mobile phone before a photo or file on the mobile phone can be dragged to the computer.
If a single drag operation fails, the user has to drag again and again, which makes the operation cumbersome and inefficient.
Disclosure of Invention
In view of this, a data transmission method, apparatus, terminal device, storage medium, and computer program product are provided, which provide intuitive visual feedback to the user, thereby avoiding erroneous and repeated operations and improving operation efficiency.
In a first aspect, an embodiment of the present application provides a data transmission method. The method is applied to a first terminal device connected to a second terminal device, and includes:
when the first terminal device detects a first cross-device transmission operation for an object on the first terminal device, obtaining the data type of the object, where the first cross-device transmission operation is used to initiate a process of transmitting the object to the second terminal device;
sending a judgment request carrying the data type to the second terminal device;
receiving a judgment result made by the second terminal device based on the data type, where the judgment result indicates whether the object can be transmitted to the second terminal device; and
displaying the judgment result.
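For illustration only, the source-side flow of the first aspect can be sketched in Kotlin as follows. The DeviceLink interface, the request and result types, and the showFeedback callback are hypothetical names introduced to make the sequence concrete; they are not APIs defined by this application.

```kotlin
// Illustrative sketch of the source-side (first terminal device) flow.
// DeviceLink, JudgmentRequest, JudgmentResult and showFeedback are
// hypothetical names used for explanation only.

data class JudgmentRequest(val dataType: String)
data class JudgmentResult(val accepted: Boolean, val reason: String? = null)

interface DeviceLink {
    fun sendJudgmentRequest(request: JudgmentRequest): JudgmentResult
    fun sendObject(payload: ByteArray)
}

fun onCrossDeviceDragDetected(
    objectDataType: String,                 // e.g. "image/jpeg", obtained from the dragged object
    link: DeviceLink,                       // connection to the second terminal device
    showFeedback: (JudgmentResult) -> Unit  // visual feedback shown during the drag
) {
    // 1. Send a judgment request carrying the data type to the second device.
    val result = link.sendJudgmentRequest(JudgmentRequest(objectDataType))
    // 2. Display the judgment result so the user sees whether the drop can succeed.
    showFeedback(result)
}
```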
With reference to the first aspect, in a first possible implementation manner, the judgment request carries a data type field and/or an extended data type field, where the data type field and the extended data type field are used to indicate the data type of the object. According to the data transmission method of this implementation, the draggable content can be expanded: the extended data type field can indicate a custom data type, a data type specific to a particular device or application, or a newly added data type, so that more data types and devices are supported for cross-device data transmission, which facilitates user operation.
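As a minimal sketch of how the two fields might coexist in one judgment request (the field and type names below are assumptions for illustration, not a wire format defined by this application):

```kotlin
// Hypothetical judgment-request payload carrying a standard data type
// field and/or an extended data type field (names are illustrative).
data class TypedJudgmentRequest(
    val dataType: String? = null,          // well-known MIME-style type, e.g. "text/plain"
    val extendedDataType: String? = null   // custom or device/application-specific type
)

// A request may carry either field or both, e.g. a custom note format:
val request = TypedJudgmentRequest(
    dataType = "text/plain",
    extendedDataType = "vendor.notes/structured-note"
)
```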
With reference to the first aspect or the first possible implementation manner of the first aspect, in a second possible implementation manner, the method further includes: sending the object to the second terminal device when the judgment result is that the object can be transmitted to the second terminal device and a second cross-device transmission operation is detected, where the second cross-device transmission operation is used to confirm transmission of the object to the second terminal device.
With reference to the second possible implementation manner of the first aspect, in a third possible implementation manner, sending the object to the second terminal device may specifically be implemented as:
temporarily storing the object locally on the first terminal device;
sending a data transmission request to the second terminal device, where the data transmission request is used to transmit the object;
when a first response signal returned by the second terminal device accepting the data transmission request is received, sending the object according to the first response signal; and
when a second response signal returned by the second terminal device rejecting the data transmission request is received, cancelling the sending of the object.
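The staging-and-response handling of this implementation can be pictured by the following sketch; TransferLink and the Accept/Reject responses are hypothetical stand-ins for the first and second response signals described above:

```kotlin
import java.io.File

// Illustrative sketch: the object is staged locally, then sent or
// discarded depending on the sink's response. Names are hypothetical.

sealed class TransferResponse {
    object Accept : TransferResponse()   // "first response signal"
    object Reject : TransferResponse()   // "second response signal"
}

interface TransferLink {
    fun requestTransfer(name: String, sizeBytes: Long): TransferResponse
    fun send(data: ByteArray)
}

fun stageAndSend(objectBytes: ByteArray, name: String, cacheDir: File, link: TransferLink) {
    // Temporarily store the object locally on the first terminal device.
    val staged = File(cacheDir, name).apply { writeBytes(objectBytes) }

    when (link.requestTransfer(name, staged.length())) {
        TransferResponse.Accept -> link.send(staged.readBytes()) // send on acceptance
        TransferResponse.Reject -> staged.delete()               // cancel sending on rejection
    }
}
```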
With reference to the third possible implementation manner of the first aspect, in a fourth possible implementation manner, after temporarily storing the object locally on the first terminal device, the method further includes:
when a call-out instruction for the temporarily stored object is detected, calling out the temporarily stored object and sending the data transmission request to the second terminal device.
With reference to the third or fourth possible implementation manner of the first aspect, in a fifth possible implementation manner, sending the object according to the first response signal may specifically be implemented as:
sending the object immediately upon receiving the first response signal; or sending the object with a delay after receiving the first response signal.
According to the data transmission methods of the third, fourth, and fifth possible implementation manners, selectable operations for the dragged object can be displayed, so that the user can adjust the transmission process or the processing manner according to actual needs, and the second terminal device can process the object according to actual needs.
With reference to the first aspect or any one of the first to fifth possible implementation manners of the first aspect, in a sixth possible implementation manner, the first terminal device is connected to two or more candidate terminal devices, and the method may further include:
when the first terminal device detects the first cross-device transmission operation for the object on the first terminal device, displaying information of the two or more candidate terminal devices; and
determining the second terminal device corresponding to the first cross-device transmission operation according to the relationship between the stop position of the first cross-device transmission operation and the display positions of the information of the two or more candidate terminal devices.
With reference to the first aspect or any one of the first to fifth possible implementation manners of the first aspect, in a seventh possible implementation manner, the first terminal device is connected to two or more candidate terminal devices, and the method may further include:
determining the second terminal device corresponding to the first cross-device transmission operation according to the positional relationship between the stop position of the first cross-device transmission operation and an edge of the first terminal device.
According to the data transmission methods of the sixth or seventh possible implementation manner, interaction among multiple terminal devices can be achieved, and the terminal devices may be of different types, so that the user can share data among multiple different devices more conveniently.
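The two target-selection strategies above can be illustrated as follows; the Rect and Candidate types and the 5% edge threshold are assumptions made for this sketch only:

```kotlin
// Illustrative sketch of the two target-selection strategies
// (data classes and thresholds are assumptions for explanation).

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}
data class Candidate(val deviceId: String, val infoDisplayArea: Rect)

// Strategy 1: the drop position falls on the displayed info of a candidate device.
fun pickByCandidateInfo(dropX: Float, dropY: Float, candidates: List<Candidate>): String? =
    candidates.firstOrNull { it.infoDisplayArea.contains(dropX, dropY) }?.deviceId

// Strategy 2: the drop position is near a screen edge associated with a device,
// e.g. a tablet known to sit to the right of the phone.
fun pickByScreenEdge(dropX: Float, screenWidth: Float, rightNeighbor: String?, leftNeighbor: String?): String? =
    when {
        dropX > screenWidth * 0.95f -> rightNeighbor
        dropX < screenWidth * 0.05f -> leftNeighbor
        else -> null
    }
```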
With reference to the first aspect or the seventh possible implementation manner of the first aspect, in an eighth possible implementation manner, the method further includes:
sending a display instruction to the second terminal device to instruct the second terminal device to display a first image of the object according to the display instruction and the positional relationship between the second terminal device and the first terminal device, where a second image of the object displayed on the first terminal device and the first image of the object displayed on the second terminal device can be stitched into a complete image of the object; and if it is detected that the area of the second image of the object displayed on the display screen meets a sending condition, sending the object to the second terminal device. According to the data transmission method of this implementation, the drag interaction is presented simply and intuitively, which also helps the user judge the timing of the drag, avoids repeated operations, and simplifies user operation.
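One way to picture the area-based sending condition is the sketch below, which computes how much of the dragged object's image is still visible on the source screen; the 40% threshold is an assumption for illustration, not a value specified by this application:

```kotlin
// Illustrative sketch of the area-based send condition: when most of the
// dragged object's image has left the source screen, the object is sent.

data class DragImage(val widthPx: Int, val heightPx: Int, val left: Int, val top: Int)

fun visibleAreaRatio(image: DragImage, screenWidthPx: Int, screenHeightPx: Int): Float {
    val visibleW = (minOf(image.left + image.widthPx, screenWidthPx) - maxOf(image.left, 0)).coerceAtLeast(0)
    val visibleH = (minOf(image.top + image.heightPx, screenHeightPx) - maxOf(image.top, 0)).coerceAtLeast(0)
    return visibleW * visibleH / (image.widthPx * image.heightPx).toFloat()
}

// Send condition (assumed): less than 40% of the second image remains on the source display.
fun shouldSend(image: DragImage, screenWidthPx: Int, screenHeightPx: Int): Boolean =
    visibleAreaRatio(image, screenWidthPx, screenHeightPx) < 0.4f
```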
With reference to the first aspect or the sixth possible implementation manner of the first aspect, in a ninth possible implementation manner, the method further includes:
sending a display state request to the second terminal device, so that the second terminal device returns the current display interface of the second terminal device in response to the display state request, and displaying the returned display interface.
In this implementation manner, sending the judgment request carrying the data type to the second terminal device may specifically be implemented as:
sending, to the second terminal device, a judgment request carrying the data type and the coordinates of the position at which the first cross-device transmission operation stays within the display interface.
In this implementation manner, receiving the judgment result made by the second terminal device based on the data type may specifically be implemented as:
receiving a judgment result made by the second terminal device based on the data type and the coordinates.
According to the data transmission method of the ninth possible implementation manner, the object can be dragged directly to the target position; compared with the related art, in which further operation is needed at the receiving end to move the object to the target position, user operation can be simplified. Because the judgment result is displayed in real time during the drag, repeated operations can be avoided and operation efficiency improved.
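As an illustration of this coordinate-aware judgment, the following sketch shows a judgment request that carries both the data type and the stay coordinates within the sink's mirrored display interface. The type names and the MirroredSinkLink interface are assumptions made for explanation and are not part of this application.

```kotlin
// Illustrative sketch: the judgment request carries the data type and the
// coordinate at which the drag stays inside the sink's display interface.

data class PositionalJudgmentRequest(
    val dataType: String,
    val x: Int,   // coordinate in the second device's display interface
    val y: Int
)

interface MirroredSinkLink {
    fun requestDisplayState(): ByteArray                    // snapshot of the sink's current interface
    fun judge(request: PositionalJudgmentRequest): Boolean  // can the drop land at that position?
}

fun judgeDropAt(link: MirroredSinkLink, dataType: String, x: Int, y: Int): Boolean {
    link.requestDisplayState()  // show the sink's interface on the source device
    return link.judge(PositionalJudgmentRequest(dataType, x, y))
}
```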
With reference to the second possible implementation manner of the first aspect, in a tenth possible implementation manner, the method further includes:
determining a processing manner in which the second terminal device is to process the object; and
where sending the object to the second terminal device includes:
sending the object and indication information to the second terminal device, where the indication information is used to instruct the second terminal device to process the object in the processing manner.
In a second aspect, an embodiment of the present application provides a data transmission method. The method is applied to a second terminal device connected to a first terminal device, and includes:
receiving a judgment request sent by the first terminal device, where the judgment request carries the data type of an object to be transmitted and is used to request the second terminal device to judge whether an object of the data type can be transmitted to the second terminal device; and
making a judgment result according to the data type, and sending the judgment result to the first terminal device so that the first terminal device displays the judgment result.
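A minimal sketch of the sink-side judgment, assuming the second terminal device keeps a set of data types it can receive (the SinkJudge class and the wildcard matching rule are illustrative assumptions, not a definition from this application):

```kotlin
// Illustrative sink-side judgment: the requested data type is compared
// against the types the second terminal device can receive.

class SinkJudge(private val supportedTypes: Set<String>) {
    fun judge(dataType: String): Boolean =
        supportedTypes.any { supported ->
            supported == dataType ||
                (supported.endsWith("*") && dataType.startsWith(supported.removeSuffix("*")))
        }
}

fun main() {
    val judge = SinkJudge(setOf("image/*", "text/plain"))
    println(judge.judge("image/jpeg"))  // true  -> the object can be transmitted
    println(judge.judge("audio/mpeg"))  // false -> the object cannot be transmitted
}
```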
With reference to the second aspect, in a first possible implementation manner, the method further includes:
when a display state request sent by the first terminal device is received, returning the current display interface of the second terminal device to the first terminal device, so that the first terminal device determines the coordinates of the stop position of the first cross-device transmission operation within the display interface, where the first cross-device transmission operation is an operation used by the first terminal device to initiate a process of transmitting an object to the second terminal device;
where the judgment request in this implementation manner carries the coordinates, and
making the judgment result according to the data type includes:
making the judgment result according to the data type and the coordinates. According to the data transmission method of this implementation, the object can be dragged directly to the target position; compared with the related art, in which further operation is needed at the receiving end to move the object to the target position, user operation can be simplified. Because the judgment result is displayed in real time during the drag, repeated operations can be avoided and operation efficiency improved.
With reference to the first possible implementation manner of the second aspect, in a second possible implementation manner, the method further includes:
when the object sent by the first terminal device is received, processing the object according to one or more of the data type, a local storage state, the application services installed on the second terminal device, and the coordinates.
With reference to the second possible implementation manner of the second aspect, in a third possible implementation manner, processing the object according to one or more of the data type, the local storage state, the application services installed on the second terminal device, and the coordinates may specifically be implemented as:
when the local storage state indicates no storage capability, opening the object according to an application service installed on the second terminal device; and
when the local storage state indicates storage capability, storing the object locally.
With reference to the third possible implementation manner of the second aspect, in a fourth possible implementation manner, when the local storage state indicates storage capability, storing the object locally may specifically be implemented as:
when the local storage state indicates storage capability and the position corresponding to the coordinates in the display interface does not allow the data type to be inserted, storing the object locally and selecting an application program according to the data type to open the object; and
when the local storage state indicates storage capability and the position corresponding to the coordinates in the display interface allows the data type to be inserted, storing the object locally and opening the object in the display interface according to the coordinates.
According to the data transmission method of this implementation, the receiving end can process the object according to local information, which simplifies user operation.
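The branching described in the third and fourth implementation manners above can be summarized by the following illustrative sketch; the handler callbacks are hypothetical names standing in for the sink's actual open, save, and insert operations:

```kotlin
// Illustrative sink-side processing decision. Only the branching mirrors
// the text above; all callbacks are hypothetical.

fun handleReceivedObject(
    data: ByteArray,
    dataType: String,
    hasStorage: Boolean,                       // local storage state
    dropAllowsInsert: Boolean,                 // can the drop coordinate accept this data type?
    openWithInstalledService: (ByteArray, String) -> Unit,
    saveLocally: (ByteArray) -> Unit,
    insertAtDropPosition: (ByteArray) -> Unit,
    openWithChosenApp: (ByteArray, String) -> Unit
) {
    if (!hasStorage) {
        // No storage capability: open directly with an installed application service.
        openWithInstalledService(data, dataType)
        return
    }
    saveLocally(data)                          // storage available: keep a local copy
    if (dropAllowsInsert) {
        insertAtDropPosition(data)             // e.g. insert an image at the drop coordinates
    } else {
        openWithChosenApp(data, dataType)      // pick an application program by data type
    }
}
```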
With reference to the second aspect or the first, second, third, or fourth possible implementation manner of the second aspect, in a fifth possible implementation manner, the method further includes:
when the object sent by the first terminal device is received, if the second terminal device has storage capability, temporarily storing the object locally;
displaying processing options; and
processing the object according to a selection operation on the processing options.
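A compact sketch of this stage-then-ask flow, with hypothetical option names chosen only for illustration:

```kotlin
// Illustrative sketch: the received object is staged locally and the
// user chooses how it should be processed.

enum class ProcessingOption { SAVE, OPEN, INSERT_AT_CURSOR, DISCARD }

fun onObjectReceived(
    stageLocally: () -> Unit,
    showOptions: () -> ProcessingOption,
    applyOption: (ProcessingOption) -> Unit
) {
    stageLocally()                 // temporarily store the received object
    applyOption(showOptions())     // display processing options and act on the user's choice
}
```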
With reference to the second aspect, in a sixth possible implementation manner, the method further includes:
when a display instruction sent by the first terminal device is received, displaying a first image of the object according to the display instruction and the positional relationship between the second terminal device and the first terminal device,
where a second image of the object displayed on the first terminal device and the first image of the object displayed on the second terminal device can be stitched into a complete image of the object. According to the data transmission method of this implementation, the drag interaction is presented simply and intuitively, which also helps the user judge the timing of the drag, avoids repeated operations, and simplifies user operation.
In a third aspect, an embodiment of the present application provides a data transmission apparatus. The apparatus is applied to a first terminal device connected to a second terminal device, and includes:
a first obtaining module, configured to obtain the data type of an object when the first terminal device detects a first cross-device transmission operation for the object on the first terminal device, where the first cross-device transmission operation is used to initiate a process of transmitting the object to the second terminal device;
a first sending module, configured to send a judgment request carrying the data type to the second terminal device;
a first receiving module, configured to receive a judgment result made by the second terminal device based on the data type, where the judgment result indicates whether the object can be transmitted to the second terminal device; and
a first display module, configured to display the judgment result.
With reference to the third aspect, in a first possible implementation manner, the judgment request carries a data type field and/or an extended data type field, and the data type field and the extended data type field are used to indicate the data type of the object.
With reference to the third aspect or the first possible implementation manner of the third aspect, in a second possible implementation manner, the apparatus further includes:
a second sending module, configured to send the object to the second terminal device when the judgment result is that the object can be transmitted to the second terminal device and a second cross-device transmission operation is detected, where the second cross-device transmission operation is used to confirm transmission of the object to the second terminal device.
With reference to the second possible implementation manner of the third aspect, in a third possible implementation manner, the second sending module may include:
a first saving unit, configured to temporarily store the object locally on the first terminal device;
a first sending unit, configured to send a data transmission request to the second terminal device, where the data transmission request is used to transmit the object;
a second sending unit, configured to send the object according to a first response signal when the first response signal, returned by the second terminal device and accepting the data transmission request, is received; and
a cancellation unit, configured to cancel sending the object when a second response signal, returned by the second terminal device and rejecting the data transmission request, is received.
With reference to the third possible implementation manner of the third aspect, in a fourth possible implementation manner, the first sending unit is further configured to: when a call-out instruction for the temporarily stored object is detected, call out the temporarily stored object and send the data transmission request to the second terminal device.
With reference to the third possible implementation manner of the third aspect, in a fifth possible implementation manner, the second sending unit is further configured to: send the object immediately upon receiving the first response signal, or send the object with a delay after receiving the first response signal.
With reference to the third aspect or any one of the first to fifth possible implementation manners of the third aspect, in a sixth possible implementation manner, the first terminal device is connected to two or more candidate terminal devices, and the apparatus further includes:
a second display module, configured to display information of the two or more candidate terminal devices when the first terminal device detects the first cross-device transmission operation for an object on the first terminal device; and
a second determining module, configured to determine the second terminal device corresponding to the first cross-device transmission operation according to the relationship between the stop position of the first cross-device transmission operation and the display positions of the information of the two or more candidate terminal devices.
With reference to the third aspect or any one of the first to fifth possible implementation manners of the third aspect, in a seventh possible implementation manner, the first terminal device is connected to two or more candidate terminal devices, and the apparatus further includes: a third determining module, configured to determine the second terminal device corresponding to the first cross-device transmission operation according to the positional relationship between the stop position of the first cross-device transmission operation and an edge of the first terminal device.
With reference to the third aspect or the seventh possible implementation manner of the third aspect, in an eighth possible implementation manner, the apparatus further includes:
a third sending module, configured to send a display instruction to the second terminal device to instruct the second terminal device to display a first image of the object according to the display instruction and the positional relationship between the second terminal device and the first terminal device,
where a second image of the object displayed on the first terminal device and the first image of the object displayed on the second terminal device can be stitched into a complete image of the object; and
a fourth sending module, configured to send the object to the second terminal device if it is detected that the area of the second image of the object displayed on the display screen meets a sending condition.
With reference to the third aspect or the sixth possible implementation manner of the third aspect, in a ninth possible implementation manner, the apparatus further includes:
a fifth sending module, configured to send a display state request to the second terminal device, so that the second terminal device returns the current display interface of the second terminal device in response to the display state request, and to display the returned display interface;
where the first sending module includes:
a third sending unit, configured to send, to the second terminal device, a judgment request carrying the data type and the coordinates of the position at which the first cross-device transmission operation stays within the display interface; and
the first receiving module includes:
a receiving unit, configured to receive a judgment result made by the second terminal device based on the data type and the coordinates.
With reference to the second possible implementation manner of the third aspect, in a tenth possible implementation manner, the apparatus further includes:
a third determining module, configured to determine the processing manner in which the second terminal device is to process the object;
where the second sending module further includes:
a fourth sending unit, configured to send the object and indication information to the second terminal device, where the indication information is used to instruct the second terminal device to process the object in the processing manner.
In a fourth aspect, an embodiment of the present application provides a data transmission apparatus. The apparatus is applied to a second terminal device connected to a first terminal device, and includes:
a second receiving module, configured to receive a judgment request sent by the first terminal device, where the judgment request carries the data type of an object to be transmitted and is used to request the second terminal device to judge whether an object of the data type can be transmitted to the second terminal device; and
a first judgment module, configured to make a judgment result according to the data type and send the judgment result to the first terminal device so that the first terminal device displays the judgment result.
With reference to the fourth aspect, in a first possible implementation manner, the apparatus further includes:
a sixth sending module, configured to, when a display state request sent by the first terminal device is received, return the current display interface of the second terminal device to the first terminal device, so that the first terminal device determines the coordinates of the stop position of the first cross-device transmission operation within the display interface, where the first cross-device transmission operation is an operation used by the first terminal device to initiate a process of transmitting an object to the second terminal device;
where the judgment request in the fourth aspect carries the coordinates, and the first judgment module includes:
a first judgment unit, configured to make the judgment result according to the data type and the coordinates.
With reference to the first possible implementation manner of the fourth aspect, in a second possible implementation manner, the apparatus further includes:
a first processing module, configured to, when the object sent by the first terminal device is received, process the object according to one or more of the data type, a local storage state, the application services installed on the second terminal device, and the coordinates.
With reference to the second possible implementation manner of the fourth aspect, in a third possible implementation manner, the first processing module includes:
a first processing unit, configured to open the object according to an application service installed on the second terminal device when the local storage state indicates no storage capability; and
a second processing unit, configured to store the object locally when the local storage state indicates storage capability.
With reference to the third possible implementation manner of the fourth aspect, in a fourth possible implementation manner, the second processing unit is further configured to: when the local storage state indicates storage capability and the position corresponding to the coordinates in the display interface does not allow the data type to be inserted, store the object locally and select an application program according to the data type to open the object; and when the local storage state indicates storage capability and the position corresponding to the coordinates in the display interface allows the data type to be inserted, store the object locally and open the object in the display interface according to the coordinates.
With reference to the fourth aspect or the first, second, third or fourth possible implementation manner of the fourth aspect, in a fifth possible implementation manner, the apparatus further includes:
a storage module, configured to temporarily store the object locally if the second terminal device has storage capability when the object sent by the first terminal device is received;
a third display module, configured to display processing options; and
a second processing module, configured to process the object according to a selection operation on the processing options.
With reference to the fourth aspect, in a sixth possible implementation manner, the apparatus further includes:
a fourth display module, configured to display a first image of the object according to a display instruction and the positional relationship between the second terminal device and the first terminal device when the display instruction sent by the first terminal device is received,
where a second image of the object displayed on the first terminal device and the first image of the object displayed on the second terminal device can be stitched into a complete image of the object.
With reference to the first aspect, the multiple implementations of the first aspect, the second aspect, the multiple implementations of the second aspect, the third aspect, the multiple implementations of the third aspect, the fourth aspect, and the multiple implementations of the fourth aspect, in one possible implementation, the first cross-device transmission operation is a drag operation, and the second cross-device transmission operation is a drag release operation.
In a fifth aspect, an embodiment of the present application provides a data transmission apparatus, including: a processor; and a memory for storing processor-executable instructions, where the processor is configured to implement the above method when executing the instructions.
In a sixth aspect, embodiments of the present application provide a non-transitory computer-readable storage medium having stored thereon computer program instructions, wherein the computer program instructions, when executed by a processor, implement the above-described method.
In a seventh aspect, an embodiment of the present application provides a terminal device that can perform the data transmission method of the first aspect or of one or more of the foregoing implementation manners of the first aspect.
With reference to the seventh aspect, in a first possible implementation manner, the terminal device may further perform the data transmission method of the second aspect or of one or more of the implementation manners of the second aspect.
In an eighth aspect, embodiments of the present application provide a computer program product including computer-readable code, or a non-transitory computer-readable storage medium carrying computer-readable code, where when the computer-readable code runs in an electronic device, a processor in the electronic device performs the above method.
Through the data transmission method and device in the embodiments, intuitive visual feedback can be provided to the user according to how well the data type of the object matches the receiving end, erroneous and repeated operations are avoided, and operation efficiency is improved.
Other features and aspects of the present application will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features, and aspects of the application and, together with the description, serve to explain the principles of the application.
Fig. 1 shows a schematic diagram of an application scenario according to an embodiment of the present application.
Fig. 2 shows a schematic structural diagram of an internal system of a terminal device according to an embodiment of the present application.
Fig. 3a and 3b respectively show a flow chart of a data transmission method according to an embodiment of the present application.
Fig. 4 is a schematic diagram illustrating an application scenario of a data transmission method according to an embodiment of the present application.
Fig. 5a and 5b respectively show examples of ways of displaying the determination result according to an embodiment of the present application.
FIG. 6 shows a diagram of fields included in a managed event according to an embodiment of the present application.
Fig. 7 shows a schematic diagram of an application scenario according to an embodiment of the present application.
Fig. 8a and 8b respectively show a flow chart of a data transmission method according to an embodiment of the present application.
FIG. 8c shows an interaction flow diagram of an application scenario according to an embodiment of the present application.
Fig. 9a shows a flow chart of a method of transmitting data according to an embodiment of the present application.
Fig. 9b shows a flow chart of a method of transmitting data according to an embodiment of the present application.
Fig. 9c shows a flow chart of a method of transmitting data according to an embodiment of the present application.
Fig. 9d and 9e respectively show schematic diagrams of a display interface of a terminal device according to an embodiment of the present application.
Fig. 9f shows a schematic diagram of a specific application scenario according to an embodiment of the present application.
Fig. 9g shows a schematic diagram of a display interface of a terminal device according to an embodiment of the present application.
Fig. 10 shows a schematic diagram of an application scenario according to an embodiment of the present application.
Fig. 11a shows a schematic diagram of a display interface of a first terminal device according to an embodiment of the present application.
Fig. 11b shows a schematic diagram of a display interface of a first terminal device according to an embodiment of the present application.
Fig. 11c shows a schematic diagram of a display interface of a first terminal device according to an embodiment of the present application.
Fig. 11d illustrates an application scenario diagram of a display interface of a first terminal device according to an embodiment of the present application.
Fig. 12a shows a flow chart of a method of transmitting data according to an embodiment of the present application.
Fig. 12b shows a flow chart of a method of transmitting data according to an embodiment of the present application.
Fig. 13a shows a schematic diagram of an application scenario according to an embodiment of the present application.
FIG. 13b shows a schematic diagram of a cross-screen display according to an embodiment of the present application.
Fig. 14a shows a flow chart of a method of transmitting data according to an embodiment of the present application.
Fig. 14b shows a flow chart of a method of transmitting data according to an embodiment of the present application.
Fig. 15 shows a schematic diagram of an application scenario according to an embodiment of the present application.
Fig. 16 shows a flow chart of a method of transmitting data according to an embodiment of the present application.
Fig. 17 shows a block diagram of a data transmission apparatus according to an embodiment of the present application.
Fig. 18 shows a block diagram of a data transmission apparatus according to an embodiment of the present application.
Fig. 19 shows a schematic structural diagram of a terminal device according to an embodiment of the present application.
Fig. 20 shows a block diagram of a software configuration of a terminal device according to an embodiment of the present application.
Detailed Description
Various exemplary embodiments, features and aspects of the present application will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used exclusively herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present application. It will be understood by those skilled in the art that the present application may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present application.
To solve the above technical problem, the data transmission method of the present application achieves cross-device data transmission, can be applied to terminal devices, provides visual feedback according to the degree of matching between the object to be transmitted and the receiving device, and improves operation efficiency.
In the embodiments of the present application, a party that initiates data transmission across devices and sends the data may be referred to as a source (source) side, and a party that receives the data may be referred to as a sink (sink) side. It should be noted that a device acting as a source in one pair of relationships may also act as a sink in another pair of relationships, that is, a terminal device may act as a source of another terminal device or as a sink of another terminal device.
The terminal devices involved in the present application (including the source-end device and the sink-end device) may be devices with a wireless connection capability, where the wireless connection capability means being able to connect to other terminal devices through wireless modes such as Wi-Fi and Bluetooth; the terminal device of the present application may also have the capability of communicating through a wired connection. The terminal device of the present application may have a touch screen, a non-touch screen, or no screen at all: a touch-screen device can be controlled by clicking, sliding, and similar gestures performed on the display screen with a finger or a stylus; a non-touch-screen device can be connected to input devices such as a mouse, a keyboard, or a touch panel and controlled through them; and a screenless device may be, for example, a screenless Bluetooth speaker.
For example, the terminal device of the present application may be a smartphone, a netbook, a tablet computer, a notebook computer, a wearable electronic device (such as a smart band or a smart watch), a TV, a virtual reality device, a speaker, an e-ink device, and the like.
Fig. 19 shows a schematic structural diagram of a terminal device according to an embodiment of the present application. Taking the terminal device as a mobile phone as an example, fig. 19 shows a schematic structural diagram of the mobile phone 200.
The mobile phone 200 may include a processor 210, an external memory interface 220, an internal memory 221, a USB interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 251, a wireless communication module 252, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display 294, a SIM card interface 295, and the like. The sensor module 280 may include a gyroscope sensor 280A, an acceleration sensor 280B, a proximity light sensor 280G, a fingerprint sensor 280H, and a touch sensor 280K (of course, the mobile phone 200 may also include other sensors, such as a temperature sensor, a pressure sensor, a distance sensor, a magnetic sensor, an ambient light sensor, an air pressure sensor, a bone conduction sensor, and the like, which are not shown in the figure).
It should be understood that the structure illustrated in this embodiment of the present application does not constitute a specific limitation on the mobile phone 200. In other embodiments of the present application, the mobile phone 200 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units, such as: the processor 210 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. Wherein the controller can be the neural center and the command center of the cell phone 200. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache memory. The memory may hold instructions or data that have just been used or recycled by processor 210. If the processor 210 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 210, thereby increasing the efficiency of the system.
The processor 210 may run the data transmission method provided in the embodiments of the present application, so as to reduce the operation complexity for the user, increase the degree of intelligence of the terminal device, and improve the user experience. The processor 210 may include different devices; for example, when a CPU and a GPU are integrated, the CPU and the GPU may cooperate to execute the data transmission method provided in the embodiments of the present application, for example, part of the algorithm of the data transmission method is executed by the CPU and another part is executed by the GPU, to obtain faster processing.
The display screen 294 is used to display images, video, and the like. The display screen 294 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone 200 may include 1 or N display screens 294, where N is a positive integer greater than 1. The display screen 294 may be used to display information input by or provided to the user as well as various graphical user interfaces (GUIs). For example, the display 294 may display a photo, a video, a web page, or a file. As another example, the display 294 may display a graphical user interface that includes a status bar, a hideable navigation bar, a time and weather widget, and application icons such as a browser icon. The status bar includes the operator name (e.g., China Mobile), the mobile network (e.g., 4G), the time, and the remaining battery level. The navigation bar includes a back key icon, a home key icon, and a forward key icon. Further, it is understood that in some embodiments, the status bar may also include a Bluetooth icon, a Wi-Fi icon, an add-on icon, and the like. It is also understood that in other embodiments, the graphical user interface may include a Dock bar, and the Dock bar may include icons of commonly used applications. When the processor 210 detects a touch event of a user's finger (or a stylus, etc.) on an application icon, in response to the touch event, the user interface of the application corresponding to the application icon is opened and displayed on the display 294.
In the embodiment of the present application, the display screen 294 may be an integrated flexible display screen, or a spliced display screen formed by two rigid screens and a flexible screen located between the two rigid screens may be adopted.
After the processor 210 runs the data transmission method provided in the embodiment of the present application, the terminal device may establish a connection with another terminal device through the antenna 1, the antenna 2, or the USB interface, transmit data according to the data transmission method provided in the embodiment of the present application, and control the display screen 294 to display a corresponding graphical user interface.
The cameras 293 (a front camera or a rear camera, or one camera that can serve as both a front and a rear camera) are used for capturing still images or video. Generally, the camera 293 may include a lens group and an image sensor, where the lens group includes a plurality of lenses (convex or concave) for collecting the optical signal reflected by the object to be photographed and transferring the collected optical signal to the image sensor, and the image sensor generates an original image of the object to be photographed according to the optical signal.
Internal memory 221 may be used to store computer-executable program code, including instructions. The processor 210 executes various functional applications and data processing of the cellular phone 200 by executing instructions stored in the internal memory 221. The internal memory 221 may include a program storage area and a data storage area. Wherein the storage program area may store an operating system, codes of application programs (such as a camera application, a WeChat application, etc.), and the like. The data storage area can store data (such as images, videos and the like acquired by a camera application) and the like created in the use process of the mobile phone 200.
The internal memory 221 may further store one or more computer programs 1310 corresponding to the data transmission method provided by the embodiments of the present application. The one or more computer programs 1310 are stored in the memory 221 and configured to be executed by the one or more processors 210, and include instructions that can be used to perform the steps in the embodiments corresponding to Fig. 3a, 3b, 8a, 8b, 9a to 9c, 12a, 12b, 14a, 14b, or 16. The computer programs 1310 may include a first obtaining module 61, a first sending module 62, a first receiving module 63, and a first display module 64. The first obtaining module 61 is configured to obtain the data type of an object when the first terminal device detects a first cross-device transmission operation for the object on the first terminal device; the first sending module 62 is configured to send a judgment request carrying the data type to the second terminal device; the first receiving module 63 is configured to receive a judgment result made by the second terminal device based on the data type, where the judgment result indicates whether the object can be transmitted to the second terminal device; and the first display module 64 is configured to display the judgment result. When the code of the data transmission method stored in the internal memory 221 is executed by the processor 210, the processor 210 may control the display screen to display the judgment result.
In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
Of course, the code of the data transmission method provided by the embodiments of the present application may also be stored in an external memory. In this case, the processor 210 may execute the code of the data transmission method stored in the external memory through the external memory interface 220.
The function of the sensor module 280 is described below.
The gyro sensor 280A may be used to determine the motion attitude of the cellular phone 200. In some embodiments, the angular velocity of the cell phone 200 about three axes (i.e., x, y, and z axes) may be determined by the gyro sensor 280A. I.e., the gyro sensor 280A may be used to detect the current state of motion of the handset 200, such as shaking or standing still.
When the display screen in the embodiment of the present application is a foldable screen, the gyro sensor 280A may be used to detect a folding or unfolding operation acting on the display screen 294. The gyro sensor 280A may report the detected folding operation or unfolding operation as an event to the processor 210 to determine the folded state or unfolded state of the display screen 294.
The acceleration sensor 280B can detect the magnitude of acceleration of the mobile phone 200 in various directions (typically along three axes). That is, the acceleration sensor 280B may be used to detect the current motion state of the mobile phone 200, such as shaking or standing still. When the display screen in the embodiment of the present application is a foldable screen, the acceleration sensor 280B may be used to detect a folding or unfolding operation acting on the display screen 294. The acceleration sensor 280B may report the detected folding operation or unfolding operation as an event to the processor 210 to determine the folded state or unfolded state of the display screen 294.
The proximity light sensor 280G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The mobile phone emits infrared light outwards through the light emitting diode. The handset uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the handset. When insufficient reflected light is detected, the handset can determine that there are no objects near the handset. When the display screen in the embodiment of the present application is a foldable display screen, the proximity optical sensor 280G may be disposed on a first screen of the foldable display screen 294, and the proximity optical sensor 280G may detect a folding angle or an unfolding angle of the first screen and the second screen according to an optical path difference of the infrared signal.
The gyro sensor 280A (or the acceleration sensor 280B) may transmit the detected motion state information (such as an angular velocity) to the processor 210. The processor 210 determines whether the mobile phone 200 is currently in the hand-held state or the tripod state (for example, when the angular velocity is not 0, it indicates that the mobile phone 200 is in the hand-held state) based on the motion state information.
The fingerprint sensor 280H is used to collect a fingerprint. The mobile phone 200 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The touch sensor 280K is also referred to as a "touch panel". The touch sensor 280K may be disposed on the display screen 294, and the touch sensor 280K and the display screen 294 form a touch screen, which is also called a "touch screen". The touch sensor 280K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display screen 294. In other embodiments, the touch sensor 280K can be disposed on the surface of the mobile phone 200 at a different location than the display 294.
Illustratively, the display 294 of the cell phone 200 displays a home interface that includes icons for a plurality of applications (e.g., a camera application, a WeChat application, etc.). The user clicks an icon of the camera application in the main interface through the touch sensor 280K, and the processor 210 is triggered to start the camera application and open the camera 293. Display screen 294 displays an interface, such as a viewfinder interface, for a camera application.
The wireless communication function of the mobile phone 200 can be implemented by the antenna 1, the antenna 2, the mobile communication module 251, the wireless communication module 252, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 200 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 251 can provide a solution including 2G/3G/4G/5G wireless communication applied to the handset 200. The mobile communication module 251 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 251 can receive electromagnetic waves from the antenna 1, and filter, amplify, etc. the received electromagnetic waves, and transmit the electromagnetic waves to the modem processor for demodulation. The mobile communication module 251 can also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 251 may be disposed in the processor 210. In some embodiments, at least some of the functional modules of the mobile communication module 251 may be disposed in the same device as at least some of the modules of the processor 210. In this embodiment, the mobile communication module 251 may also be used for information interaction with other terminal devices.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 270A, the receiver 270B, etc.) or displays images or video through the display screen 294. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 251 or other functional modules, independent of the processor 210.
The wireless communication module 252 may provide solutions for wireless communication applied to the mobile phone 200, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 252 may be one or more devices that integrate at least one communication processing module. The wireless communication module 252 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 252 may also receive a signal to be transmitted from the processor 210, perform frequency modulation on the signal, amplify the signal, and convert it into electromagnetic waves via the antenna 2 for radiation. In this embodiment, the wireless communication module 252 is configured to transmit data to and from other terminal devices under the control of the processor 210. For example, when executing the data transmission method provided in this embodiment, the processor 210 may control the wireless communication module 252 to send a determination request to another terminal device and receive the determination result made by the other terminal device based on the request, where the determination result indicates whether the data to be transmitted can be transmitted to that device. The processor 210 may then control the display screen 294 to display the determination result, providing visual feedback for the user, avoiding erroneous and repeated operations, and improving operation efficiency.
In addition, the mobile phone 200 can implement an audio function through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the earphone interface 270D, and the application processor. Such as music playing, recording, etc. The handset 200 may receive key 290 inputs, generating key signal inputs relating to user settings and function control of the handset 200. The cell phone 200 can generate a vibration alert (e.g., an incoming call vibration alert) using the motor 291. The indicator 292 in the mobile phone 200 may be an indicator light, and may be used to indicate a charging status, a power change, or an indication message, a missed call, a notification, or the like. The SIM card interface 295 in the handset 200 is used to connect a SIM card. The SIM card can be attached to and detached from the mobile phone 200 by being inserted into the SIM card interface 295 or being pulled out from the SIM card interface 295.
It should be understood that in practical applications, the mobile phone 200 may include more or fewer components than those shown in fig. 19, may combine two or more components, or may have a different configuration of components; the embodiment of the present application is not limited thereto. The illustrated handset 200 is merely an example. The components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
The software system of the terminal device may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of a terminal device.
Fig. 20 is a block diagram of a software configuration of a terminal device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 20, the application packages may include phone, camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 20, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, capture the screen, and the like. The window manager may also be configured to detect whether a cross-device transmission operation, such as a drag operation, exists according to an embodiment of the present disclosure.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The telephone manager is used for providing a communication function of the terminal equipment. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, for example notifications of download completion or message alerts. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or in the form of a dialog window on the screen. For example, text information is prompted in the status bar, a prompt tone sounds, the terminal device vibrates, or an indicator light blinks.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is a function which needs to be called by java language, and the other part is a core library of android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files, and performs functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
Fig. 1 shows a schematic diagram of an application scenario according to an embodiment of the present application. As shown in fig. 1, a first terminal device and a second terminal device may establish a wireless connection and perform pairing through Bluetooth, a hotspot, Wi-Fi, and the like, or may be connected through a wired connection, for example, a mobile phone and a notebook computer connected through a USB data cable; the specific connection manner is not limited in the present application. Those skilled in the art can understand that both the first terminal device and the second terminal device may be a source end or a sink end; in the embodiment of the present application, for convenience of clearly explaining the technical solution and unless otherwise specified, the first terminal device is the source end and the second terminal device is the sink end.
Fig. 2 is a schematic diagram illustrating an internal system structure of a terminal device according to an embodiment of the present application, and as shown in fig. 2, a system of the terminal device may include an application service layer, an application Framework layer (Framework in fig. 2 is an example), and a cross-device transmission service system that implements a cross-device data transmission function, for example, a drag service system in fig. 2 is an example of the cross-device transmission service system.
In this embodiment, the application Framework layer may include an InputDispatcher, a View system, and WmS (Window Manager Service). The InputDispatcher in the Framework layer is responsible for receiving user operations and distributing them to each window for processing. WmS, together with the View system, can manage the displayed windows as part of the window manager. WmS can respond to a user operation in a window, such as a drag operation, a touch-screen operation, or a click operation, and can judge, in response to the user operation, whether it is a cross-device transmission operation (e.g., a drag operation). A cross-device transmission operation refers to an operation for transmitting an object on one terminal device to another terminal device; the terminal device that detects the operation acts as the source end. The cross-device transmission operation can be a preset operation or a series of operations: WmS compares the user operation with the preset operation and, when the user operation conforms to the preset operation, determines that the user operation is a cross-device transmission operation. If WmS determines that the user operation is a cross-device transmission operation, it may generate a cross-device transmission event and send it to the drag service system. The drag service system may be a service added in the system runtime library layer for implementing cross-device data transmission based on various types of operations such as drag; it can monitor cross-device transmission events of the Framework layer and send them to the drag service system of another connected terminal device.
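The following is a minimal sketch, not the patent's actual framework code, of how a monitored user operation might be classified as a cross-device transmission operation by comparing it against a preset operation; all class names, field names, and the edge-region rule are assumptions made for illustration.

// Hypothetical sketch: classifying a monitored user operation as a
// cross-device transfer by comparing it against a preset operation.
public final class CrossDeviceTransferDetector {

    /** Hypothetical description of a monitored pointer gesture. */
    public static final class UserOperation {
        final boolean longPressOnObject;   // the object was long-pressed / grabbed
        final float releaseX;              // where the gesture stopped (px)
        final float screenWidth;           // width of the display (px)

        UserOperation(boolean longPressOnObject, float releaseX, float screenWidth) {
            this.longPressOnObject = longPressOnObject;
            this.releaseX = releaseX;
            this.screenWidth = screenWidth;
        }
    }

    // Assumed preset rule: a drag that starts on an object and ends inside an
    // edge region of the screen counts as a cross-device transmission operation.
    private static final float EDGE_REGION_PX = 48f;

    public boolean isCrossDeviceTransfer(UserOperation op) {
        boolean endsAtEdge = op.releaseX <= EDGE_REGION_PX
                || op.releaseX >= op.screenWidth - EDGE_REGION_PX;
        return op.longPressOnObject && endsAtEdge;
    }
}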
An application service in the application service layer may register listening for a cross-device transmission operation (e.g., drag listening) at WmS of the Framework layer. After the listening is registered, if the Framework layer of the terminal device detects in a window a cross-device transmission event (e.g., a drag event) directed at the application service, or the drag service system of one terminal receives a cross-device transmission event sent by the drag service system of another terminal, the Framework layer may send the cross-device transmission event to the application service, and the application service may determine whether the object (data) of the cross-device transmission event can be accepted. A cross-device transmission event directed at an application service may refer to a user dragging an object on one terminal to an application service, or to a position within the application service, on another terminal. The Framework layer of the terminal device detecting such an event in a window may refer to the device acting as the receiving end detecting, in a local display window after receiving the object sent by the source end, an operation (such as a drag of the object) for dragging the object into the application service or a position within it; at this point, the Framework layer of the receiving end may send the cross-device transmission event to the application service, and the application service may determine whether the object (data) of the event can be received.
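As a rough illustration, an application on Android can register for drag events through the standard View.OnDragListener mechanism; the cross-device drag listening registered at WmS described above is assumed to follow a similar callback pattern, so the sketch below shows only the standard framework usage, not the patent's extension.

import android.view.DragEvent;
import android.view.View;

// Illustration only: standard Android drag listening on a view.
final class DropTargetSetup {
    static void register(View dropTarget) {
        dropTarget.setOnDragListener((v, event) -> {
            switch (event.getAction()) {
                case DragEvent.ACTION_DRAG_STARTED:
                    // Decide here whether this target can accept the dragged MIME types.
                    return event.getClipDescription() != null
                            && event.getClipDescription().hasMimeType("text/*");
                case DragEvent.ACTION_DROP:
                    // The object (data) of the transfer event arrives here.
                    CharSequence text = event.getClipData().getItemAt(0).getText();
                    return text != null;
                default:
                    return true;
            }
        });
    }
}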
In one possible implementation, the drag service system in FIG. 2 may include a first interface and a second interface. The first interface may be used for communication with a Framework layer, the second interface may be used for communication with a communication component of the terminal device, and the communication component of the terminal device may be a wireless communication component, such as an antenna, or may be a hardware interface for implementing wired communication, which is not limited in this application.
The first interface and the second interface may be implemented in the form of software interfaces, for example, the first interface and the second interface may be implemented in the form of callback functions, and other functions of the drag service system may also be implemented in the form of software. For example, a callback function for monitoring a cross-device transmission operation may be registered in a Framework layer, that is, a pointer for calling a drag service system (drag service function) may be registered in the Framework layer, and when the Framework layer monitors the cross-device transmission operation, a cross-device transmission event may be generated, thereby triggering the call of the drag service system (drag service function). The dragging service system can generate a judgment request according to the cross-device transmission event of the Framework layer, and send the judgment request to another terminal device through the second interface and the communication assembly. After receiving the judgment request, the dragging service system of the other terminal device can send the judgment request to the Framework layer of the other terminal device through the first interface.
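A hedged sketch of the two software interfaces described above is given below: the first interface faces the Framework layer and the second faces the communication component. All interface names, class names, and the request encoding are illustrative assumptions, not actual framework APIs.

// Hypothetical sketch of the drag service system's two interfaces.
public final class DragServiceSystem {

    /** First interface: called back by the Framework layer on a cross-device event. */
    public interface FrameworkCallback {
        void onCrossDeviceTransferEvent(String mimeType);
    }

    /** Second interface: hands a serialized request to the communication component. */
    public interface Transport {
        void send(byte[] payload);
    }

    private final Transport transport;

    public DragServiceSystem(Transport transport) {
        this.transport = transport;
    }

    /** Registered with the Framework layer so it can invoke the drag service. */
    public FrameworkCallback frameworkCallback() {
        return mimeType -> {
            // Build a judgment request carrying the data type and forward it
            // to the peer device through the second interface.
            byte[] request = ("JUDGE:" + mimeType).getBytes();
            transport.send(request);
        };
    }
}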
It should be noted that fig. 2 only shows an internal structure in the Framework layer of one terminal device, and the other terminal device also has the same structure, which is only not shown.
The operating system installed in the terminal device can be Android, iOS, Windows, macOS, Linux, or another system, which is not limited in the present application. For different systems, an application program providing the drag service can be developed according to the above method to support cross-device data transmission by dragging.
As can be seen from the above examples, data can be transmitted between two terminals by improving the system of the terminal device or by adding a new application service. On this basis, the data transmission method of the present application is introduced below. Fig. 3a and 3b respectively show flowcharts of the data transmission method according to an embodiment of the present application. The data transmission method shown in fig. 3a may be applied to the first terminal device in fig. 1, that is, to the source terminal device, and the transmission method may include:
step S100, when a first terminal device detects a first cross-device transmission operation for an object on the first terminal device, acquiring a data type of the object, wherein the first cross-device transmission operation is used for initiating a process of transmitting the object to a second terminal device;
step S101, sending a judgment request carrying the data type to the second terminal equipment,
step S102, receiving a judgment result made by the second terminal device based on the data type, wherein the judgment result represents whether the object can be transmitted to the second terminal device;
and step S103, displaying the judgment result.
The method shown in fig. 3b may be applied to the second terminal device in fig. 1, that is, may be applied to the receiving end device, and in this embodiment, the data transmission method may include:
step S201, receiving a judgment request sent by a first terminal device, wherein the judgment request carries a data type of an object to be transmitted, and the judgment request is used for requesting a second terminal device to judge whether the object of the data type can be transmitted to the second terminal device;
step S202, a judgment result is made according to the data type, and the judgment result is sent to the first terminal device, so that the first terminal device displays the judgment result.
In step S100, the first cross-device transmission operation is used to initiate the process of transmitting the object to the second terminal device, and may refer to the drag service system being invoked through the first cross-device transmission operation. That is, the user may trigger a dragstart event through the first cross-device transmission operation on the display screen of the first terminal device, the drag data (object) may be specified through setData() in the dragstart event, and triggering the dragstart event may invoke the drag service system to initiate the process of transmitting the object to the second terminal device.
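For illustration, on Android the drag data (object) and its data type can be attached when the drag starts, roughly analogous to specifying drag data through setData() in a dragstart event as described above. The sketch below uses the standard framework drag API only; it is not the patent's cross-device extension.

import android.content.ClipData;
import android.view.View;

// Illustration only: attaching the dragged object and its type at drag start.
final class DragInitiator {
    static void startDrag(View draggedView, String objectText) {
        ClipData dragData = ClipData.newPlainText("object", objectText);
        // The ClipDescription inside dragData carries the MIME type
        // ("text/plain" here); a drag service could read it when building
        // the judgment request.
        draggedView.startDragAndDrop(dragData,
                new View.DragShadowBuilder(draggedView),
                /* myLocalState= */ null,
                /* flags= */ 0);
    }
}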
As shown in fig. 2, the Framework layer of the first terminal device may monitor user operations. An operation capable of invoking the drag service system may be preset, as the preset operation, on the first terminal device and the second terminal device. The first terminal device may compare the monitored user operation with the preset operation; when the user operation matches the preset operation, the first terminal device may determine that there is a first cross-device transmission operation for a certain object, which is equivalent to detecting the first cross-device transmission operation for the object on the first terminal device, and may invoke the drag service system.
In a possible implementation manner, the first cross-device transmission operation may refer to a process from an initiation operation to an operation stop, and when the position of the operation stop is within a preset area of the first terminal device (for example, an edge of a display screen, or the like), it indicates that cross-device transmission is to be performed, and the drag service system may be invoked.
In a possible implementation manner, the first cross-device transmission operation may be a drag operation, and for convenience of description, the drag operation is described as an example below, but the application is not limited thereto, and the corresponding first cross-device operation may also be different for different terminal device types (e.g., a touch screen device or a non-touch screen device) or different objects.
For example, for a touch-screen device, the preset operation may be that the user long-presses an object with a finger or palm and drags it to the edge of the screen, or taps the object with a stylus and immediately slides it to the edge of the screen (the sliding direction is not limited and may be downward, upward, leftward, rightward, or another direction). Alternatively, after a text is selected, the controls called up by long-pressing the selected text may include a "cross-device transmission" control; when this control is triggered, the drag service system may be invoked, that is, the preset operation may be the triggering of the "cross-device transmission" control, and so on. For a non-touch-screen device, the preset operation may be moving the mouse onto an object, pressing and holding the left mouse button and dragging the object to the edge of the display screen, or selecting the object with the touch pad and dragging it to the edge of the screen, and the like. It should be noted that the above possible ways in which the first terminal device detects the first cross-device transmission operation are only some examples of the present application, and the present application is not limited thereto.
The object in the present application can be a document, a folder, a text, a picture, audio, video, a link, and the like. The first terminal device may identify the type of the data by its extension; for example, the data type of the object may be acquired by identifying the extension of the object. Taking video as an example, if the extension of the object is a video extension, such as ".avi" or ".mov", the data type of the object can be identified as video; if the extension of the object is an audio extension, such as ".wav", ".mp4", or ".mp3", the type of the object can be identified as audio.
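A minimal sketch of extension-based type identification is given below; the extension-to-type table is an illustrative assumption, not a list defined by the patent.

import java.util.Locale;
import java.util.Map;

// Sketch: identify the data type of an object from its file extension.
final class DataTypeResolver {
    private static final Map<String, String> EXTENSION_TO_TYPE = Map.of(
            "avi", "video",
            "mov", "video",
            "wav", "audio",
            "mp3", "audio",
            "jpg", "picture",
            "png", "picture",
            "txt", "text",
            "docx", "document");

    static String resolve(String fileName) {
        int dot = fileName.lastIndexOf('.');
        if (dot < 0 || dot == fileName.length() - 1) {
            return "unknown";
        }
        String ext = fileName.substring(dot + 1).toLowerCase(Locale.ROOT);
        return EXTENSION_TO_TYPE.getOrDefault(ext, "unknown");
    }
}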
In a possible implementation manner, the drag data specified by setData() in the dragstart event may include two parts of information: the data type and the data value, that is, the data type of the object and the data corresponding to the object. After acquiring the data type of the object, the first terminal device may add it to setData() of the dragstart event. When the dragstart event is triggered and the drag service system is invoked, the drag service system may, according to the data type specified in the dragstart event, carry the data type of the object in the determination request sent to the second terminal.
Fig. 4 is a schematic diagram illustrating an application scenario of a data transmission method according to an embodiment of the present application. As shown in fig. 4, the device 1 and the device 2 are connected in a wireless pairing manner, the device 2 and the device 3 are connected in a wireless pairing manner or in a wired manner, and the device 1 and the device 3 are connected in a wireless or wired manner. The device 1, the device 2, and the device 3 may perform cross-device data transmission in the embodiment of the present application.
In a possible implementation manner, the device 1 and the device 2 may be touch-screen terminal devices, such as smart phones or tablet computers, and a plurality of application programs may be installed on both of them. For example, both the device 1 and the device 2 shown in fig. 4 may have an application A and an application B installed. The application A may be a "rich media" application such as a social, email, or browser application, whose content may include text, pictures, documents, links, and the like. The application B may be a file-management application such as a file manager or a gallery, whose content mainly includes pictures, videos, audio, and documents, without separate text content. The device 3 may be a non-touch-screen device, such as a notebook computer, on which a program A and a storage folder may be installed; the program A may be an application program such as office software or a browser in a Windows/Mac system. Fig. 4 illustrates some exemplary data types and possible presentation forms of objects, which may be located in an application or in file management; the application is not limited to the types and presentation forms of objects shown in fig. 4.
For step S101, as shown in fig. 2, the first terminal device may send a determination request to the drag service system of the second terminal device through the invoked drag service system, where the determination request includes the data type of the object. Accordingly, in step S201, the second terminal device receives the determination request sent by the first terminal device, where the determination request carries the data type of the object to be transmitted and is used to request the second terminal device to determine whether an object of this data type can be transmitted to it.
For step S202, in a possible implementation manner, the second terminal device may determine, according to the data type of the object and the type of the second terminal device, whether the object can be transmitted to it; for each type of second terminal device, the data types that are allowed or not allowed to be transmitted may be preset. For example, if the object is a text file or a folder and the second terminal device is a Bluetooth speaker, the determination result may be that the object cannot be transmitted to the second terminal device; if the object is an audio file or a video file with audio and the second terminal device is a Bluetooth speaker, the determination result may be that the object can be transmitted to the second terminal device.
For step S202, in a possible implementation manner, the second terminal device may further determine whether to transmit to the second terminal device according to the data type of the object and the application service installed in the second terminal device. The application service installed on the second terminal device may refer to an application program or a software program installed on the second terminal device, for example, an APP installed on a smart phone, a software program installed on a notebook computer, and the like.
The above is merely an example of two implementations that determine whether to transmit to the second terminal device according to the data type of the object, and the application is not limited thereto.
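The sketch below combines the two example checks above on the receiving end: a per-device-type allowlist and a lookup of the data types handled by locally installed application services. The device categories and type lists are assumed values for illustration only.

import java.util.Map;
import java.util.Set;

// Sketch: receiver-side judgment based on device type and installed services.
final class ReceiverJudgment {
    // Data types a given device type is allowed to receive (illustrative values).
    private static final Map<String, Set<String>> ALLOWED_BY_DEVICE = Map.of(
            "bluetooth_speaker", Set.of("audio", "video"),
            "smartphone", Set.of("audio", "video", "picture", "text", "document", "link"));

    static boolean canReceive(String deviceType, String dataType,
                              Set<String> typesHandledByInstalledApps) {
        boolean deviceAllows = ALLOWED_BY_DEVICE
                .getOrDefault(deviceType, Set.of())
                .contains(dataType);
        boolean someAppHandlesIt = typesHandledByInstalledApps.contains(dataType);
        return deviceAllows && someAppHandlesIt;
    }
}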
After the second terminal device makes the judgment result, it can send the result to the drag service system of the first terminal device through its own drag service system. After receiving the judgment result, the drag service system of the first terminal device can send it to the Framework layer, and the Framework layer controls the display screen of the first terminal device to display the judgment result.
There may be various display manners of the determination result, and fig. 5a and 5b respectively show examples of a manner of displaying the determination result according to an embodiment of the present application. Fig. 5a shows schematic diagrams of display interfaces of two example first terminal devices, and fig. 5b shows schematic diagrams of display interfaces of two other example first terminal devices. The edge area of the display screen is illustrated in fig. 5a and 5b in dashed lines.
For example, as shown in fig. 5a, taking mouse dragging as an example, a small icon may be displayed near the mouse pointer on the display screen. Different colors of the icon may represent different determination results: for example, green indicates that transmission to the second terminal device is possible, and red indicates that it is not. Alternatively, as shown in fig. 5a, different shapes of the icon may represent different determination results: a check mark "√" may indicate that transmission to the second terminal device is possible, a cross "×" may indicate that it is not, and so on.
As shown in fig. 5b, taking a finger operation as an example, a prompt message may be displayed near an icon (e.g., a thumbnail) of the object, so as to display the determination result, for example, a prompt message "XX is unable to receive" or "XX is acceptable" may be displayed at the upper right corner of the icon of the object to respectively represent different determination results, where "XX" may represent identification information of the second terminal device, and this way may also be used in an example of a mouse operation.
The displayed judgment result may further include the reason for the judgment result, such as "device B cannot receive .avi format files", so that the user can more intuitively know the reason and take corresponding measures.
It should be noted that the above manner of displaying the determination result is only some examples of the present application, and the present application is not limited in any manner as long as the determination result can be intuitively displayed to the user.
Through the data transmission method of the embodiment of the application, visual feedback can be provided for a user according to the data type of the object and the matching degree of the receiving end, misoperation and repeated operation are avoided, and the operation efficiency is improved.
In the related art, only contents such as text, documents, pictures, and files can be dragged, so the draggable content is limited. To solve this technical problem, the present application adds extended fields to the DragEvent for expanding the draggable content. Fig. 6 shows a diagram of the fields included in a drag event according to an embodiment of the present application. As shown in fig. 6, the fields label and mimeType enclosed by the dashed lines are fields extended on the clipDescription object: label is of String type and can carry extended actions such as "open", and mimeType is the data type of the object. Through these two extended fields, both the dragging of an object and the actions performed on the dragged object can be extended.
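A hedged sketch of the two extended fields is shown below. The class name and structure are illustrative; the patent only specifies that a String label (carrying an action such as "open") and a mimeType (the object's data type) are added to the clip description carried by the drag event.

// Illustrative shape of the extended description carried with a drag event.
final class ExtendedClipDescription {
    /** Extended action label, e.g. "open"; may be empty for the default action. */
    final String label;
    /** Extended data type of the dragged object, e.g. "Weblinks". */
    final String mimeType;

    ExtendedClipDescription(String label, String mimeType) {
        this.label = label;
        this.mimeType = mimeType;
    }

    boolean hasDefaultAction() {
        return label == null || label.isEmpty();
    }
}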
The "action" on the left side of FIG. 6 to which the dragvent event is directly linked refers to a series of drag actions such as drag, dragstart, draginter, dragnd, drop, etc.; x and y can represent coordinates of a position where a drag action stays, and clipdata can refer to some attributes of dragged data, such as a MIME type of a clip object, text contained in a clipData item object, webpage text, Uri or Intent data; result may be a field that records the returned determination. It should be noted that fig. 6 only shows one way of extending the field in the present application, and the present application is not limited to this, and the field may be extended on other suitable objects or locations on the managed event.
Therefore, in a possible implementation manner, the determination request may carry a data type field and/or an extended data type field, where the data type field and the extended data type field are used to indicate a data type of the object. In this way, when the drag service system of the first terminal device sends the determination request to the second terminal device, the second terminal device may determine whether the object to be moved to the second terminal device by the first terminal device can be processed according to the data type field and/or the extended data type field, and obtain the determination result.
According to the data transmission method of the above embodiment of the application, the draggable content can be expanded, and the expanded data type field can indicate a custom data type, a data type suitable for a specific device or application, or a new data type, so that more data types and devices are suitable for the data transmission across the devices, and the operation of a user is facilitated.
For example, dragging of a public article on some social platforms is not possible in the related art. According to the embodiment of the present application, a data type "Weblinks" may be added to an extended field of the DragEvent, and the corresponding action may be to open the link in a browser or in the application corresponding to the social platform; different actions may be represented by different character strings. In one example, for the data type "Weblinks", if no action is set (for example, the label field is empty), the default may be to open the link with the browser.
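Below is an illustration of the default behaviour described for the "Weblinks" extended type: if the label field is empty, the link is handed to the browser. The "Weblinks" type itself is an extension defined by this application rather than a standard Android MIME type, and the handler class and action-dispatch details are assumptions.

import android.content.Context;
import android.content.Intent;
import android.net.Uri;

// Sketch: default handling of the assumed "Weblinks" extended data type.
final class WeblinksHandler {
    static void handle(Context context, String label, String url) {
        if (label == null || label.isEmpty()) {
            // Default action: hand the link to the browser via a VIEW intent.
            Intent view = new Intent(Intent.ACTION_VIEW, Uri.parse(url));
            view.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            context.startActivity(view);
        } else {
            // A non-empty label would select another action; dispatching for
            // such actions is application-defined and omitted here.
        }
    }
}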
Fig. 7 is a schematic diagram of an application scenario according to an embodiment of the present application. As shown in fig. 7, assume that the first terminal device and the second terminal device are both smartphones, and the first terminal device opens an article with a link in an application A (the link may be the object). The preset operation set for such an object may be a large-area touch followed by a slide to one side; when the first terminal device detects a large-area touch on the currently displayed page followed by a slide to one side, it may determine that cross-device transmission is to be performed for the article with the link opened in the application A. For example, as shown in fig. 7, when the user touches the screen with the whole palm or most of the palm and slides to the right, a dragstart event is triggered, and the first terminal device may send a determination request to the second terminal device, where the determination request carries the extended data type field: the data type is "Weblinks" and the label field is empty. The second terminal device receives the determination request; since the second terminal device is a smartphone with a browser installed locally, the determination result may be that the object can be transmitted to the second terminal device. The second terminal device can send this result to the first terminal device, which displays it on the display screen. After the user sees the determination result, if the drag gesture is released, that is, the first terminal device detects a drop operation, the link and the extended data type field can be sent to the second terminal device. The second terminal device recognizes from the extended data type that the data type of the object is "Weblinks" and the label field is empty, and therefore opens the object in the default manner, that is, opens the link in the browser, as shown in fig. 7.
Fig. 8a and 8b respectively show flowcharts of a data transmission method according to an embodiment of the present application. The method shown in fig. 8a may be applied to a first terminal device; as shown in fig. 8a, the transmission method of this embodiment may include:
step S800, when detecting a first cross-device transmission operation for an object on a first terminal device, the first terminal device acquires a data type of the object, wherein the first cross-device transmission operation is used for initiating a process of transmitting the object to a second terminal device;
step S801, sending a judgment request carrying the data type to the second terminal device,
step S802, receiving a judgment result made by the second terminal device based on the data type, wherein the judgment result represents whether the object can be transmitted to the second terminal device;
step S803, displaying the determination result;
step S804, when the determination result is that the object can be transmitted to a second terminal device and a second cross-device transmission operation is detected, sending the object to the second terminal device, where the second cross-device transmission operation is used to confirm that the object is transmitted to the second terminal device for display.
For steps S800 to S803, reference may be made to the description of steps S100 to S103 in the embodiment portion corresponding to fig. 3a, and details are not repeated.
For step S804, the second cross-device transmission operation corresponds to the first cross-device transmission operation, for example, the first cross-device transmission operation is a drag operation, and then the second cross-device transmission operation may be a drag release operation. Taking a non-touch screen device as an example, if the first cross-device transmission operation is that a finger clicks a left mouse button to be not released and dragged, the second cross-device transmission operation may be that the left mouse button is released, or, taking a touch screen device as an example, if the first cross-device transmission operation is that a finger or a touch pen holds and drags an object, the second cross-device transmission operation may be that the finger or the touch pen leaves the touch screen.
Sending the object to the second terminal device may refer to sending one or more of data corresponding to the object, a data type, indication information of a processing manner of the object, and the like to the second terminal device, and the second terminal device may process the data according to the processing manner. In one possible implementation manner, as described above, the indication information of the processing manner of the object may also be carried in the dragstart event.
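A hypothetical shape of the payload sent to the receiving end on drag release is sketched below: the object's data, its data type, and optional indication information about how the receiver should process it. The field names are illustrative, not defined by the patent.

// Illustrative payload carried when the object is sent to the receiving end.
final class TransferPayload {
    final byte[] data;              // data corresponding to the object
    final String dataType;          // e.g. "picture", "Weblinks"
    final String processingHint;    // e.g. "open", "store"; null if unspecified

    TransferPayload(byte[] data, String dataType, String processingHint) {
        this.data = data;
        this.dataType = dataType;
        this.processingHint = processingHint;
    }
}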
The transmission method shown in fig. 8b may be applied to a second terminal device, and as shown in fig. 8b, the transmission method of this embodiment may include:
step S301, receiving a judgment request sent by a first terminal device, wherein the judgment request carries a data type of an object to be moved, and the judgment request is used for requesting the second terminal device to judge whether the object of the data type can be transmitted to the second terminal device;
step S302, making a judgment result according to the data type, and sending the judgment result to the first terminal equipment so that the first terminal equipment displays the judgment result;
step S303, when receiving the object sent by the first terminal device, processing the object according to one or more of the data type, the local storage state, and the application service installed by the second terminal device.
For steps S301 to S302, reference may be made to the description of steps S201 to S202 of the embodiment corresponding to fig. 3b, which is not repeated herein.
For step S303, taking the local storage state as an example: when the local storage state indicates that the second terminal device does not have storage capability, the second terminal device may open the object with an application service installed on it; when the local storage state indicates that the second terminal device has storage capability, the second terminal device may store the object locally.
In a possible implementation manner, when the second terminal device does not have storage capability, it may select an application service installed on it to open the object according to the data type of the object, or it may display selectable options of the installed application services and open the object with the application service selected by the user.
In a possible implementation manner, when the second terminal device has storage capability, in addition to storing the object locally, it can process the object according to the operation specified by the first terminal device. As described above, when the first terminal device sends the object to the second terminal device, it may also send indication information of the processing manner of the object, so that the second terminal device can process the object accordingly. If the first terminal device does not specify a processing manner for the object, the second terminal device may select an installed application service to open the object according to the data type of the object, or, as described above, open the object with the application service selected by the user.
It should be noted that the above are merely examples of some processing ways for the object shown in the present application, and the present application is not limited in any way.
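One further illustrative sketch of the receiver-side decision just described is given below; the interface, method names, and the "open" hint value are assumptions for illustration.

// Sketch: process a received object according to storage state and any hint.
final class ReceiverProcessor {
    interface Actions {
        void storeLocally(byte[] data);
        void openWithAppFor(String dataType, byte[] data);
    }

    static void process(byte[] data, String dataType, String processingHint,
                        boolean hasStorageCapability, Actions actions) {
        if (!hasStorageCapability) {
            // No storage capability: open the object with a suitable application.
            actions.openWithAppFor(dataType, data);
            return;
        }
        // Storage available: store the object, then honour any processing hint
        // specified by the sending end.
        actions.storeLocally(data);
        if ("open".equals(processingHint)) {
            actions.openWithAppFor(dataType, data);
        }
    }
}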
FIG. 8c shows an interaction flow diagram of an application scenario according to an embodiment of the present application. As shown in fig. 8c, device A is an example of a first terminal device and device B is an example of a second terminal device. The Framework layer of device A detects a drag operation for a local object, generates a dragstart event, and invokes the drag service system of device A. The drag service system of device A may generate a determination request according to the data type of the object specified by the dragstart event; the determination request may be a request for determining whether dragging is allowed. The drag service system of device A sends the determination request to the drag service system of device B, which sends it to the WindowManager on device B; the WindowManager then sends it to WmS, and WmS may send the determination request to the corresponding application service (APP). The application service makes the determination result in response to the request, and the result is returned to device A step by step; device A may display the determination result. If the determination result is that dragging is allowed, and the Framework layer of device A detects a drag release operation, it may send the object to the drag service system of device A, which transmits the object to the drag service system of device B. After receiving the object, the drag service system of device B may store the object and deliver it to the Framework layer and the application service layer through a notifyDragAndDrop message for processing.
According to the data transmission method of this embodiment, user operations can be simplified: not only can changing the display position of an object, copying it, and transmitting it be realized through simple actions such as dragging, but corresponding processing can also be performed directly at the receiving end according to the data type of the object. For example, in addition to storing the object, the receiving end can choose an application service to open it, or open it with the currently opened application service, without further operation by the user at the receiving end, which greatly simplifies the user's operation.
Fig. 9a shows a flow chart of a method of transmitting data according to an embodiment of the present application. Fig. 9a is a flowchart illustrating an example of a process included in the step "transmitting the object to the second terminal device" in step S804, and as shown in fig. 9a, the step "transmitting the object to the second terminal device" may include:
step S8041, temporarily storing the object in the local of the first terminal equipment;
step S8042, sending a data transmission request to the second terminal device, where the data transmission request is used to transmit the object;
step S8043, when receiving a first response signal returned by the second terminal device to accept the data transmission request, sending the object according to the first response signal;
step S8044, when receiving a second response signal returned by the second terminal device and not accepting the data transmission request, revoking sending the object.
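A hedged sketch of steps S8041 to S8044 on the sending end follows: the object is staged locally, a data transmission request is sent, and the object is either sent (possibly after a delay) or the send is revoked depending on the response. The interface, method names, and response encoding are assumptions for illustration.

// Sketch: sender-side handling of the data transmission request and responses.
final class SenderSession {
    interface Peer {
        void sendTransmissionRequest();
        void sendObject(byte[] object, long delayMillis);
    }

    private byte[] stagedObject;   // temporarily stored locally (step S8041)
    private final Peer peer;

    SenderSession(Peer peer) {
        this.peer = peer;
    }

    void startTransfer(byte[] object) {
        this.stagedObject = object;        // S8041: stage the object locally
        peer.sendTransmissionRequest();    // S8042: ask the receiver
    }

    /** Called when the receiver's response arrives. */
    void onResponse(boolean accepted, long delayMillis) {
        if (accepted) {
            peer.sendObject(stagedObject, delayMillis);   // S8043: send, possibly delayed
        } else {
            // S8044: revoke sending; the object stays staged for later recall.
        }
    }
}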
Fig. 9b shows a flow chart of a method of transmitting data according to an embodiment of the present application. The transmission method of fig. 9b may be applied to a second terminal device, and the transmission method of this embodiment may include:
step S204, when a data transmission request sent by the first terminal equipment is received, processing options are displayed;
step S205, determining a response signal in response to the data transmission request according to the selection operation for the processing option, and sending the response signal to the first terminal device.
That is to say, when the first terminal device determines that the object can be transmitted to the second terminal device and detects the second cross-device transmission operation, the first terminal device may first temporarily store the object in the local area of the first terminal device, send a data transmission request to the second terminal device, and process the object according to a response signal, which is returned by the second terminal device and is directed to the data transmission request.
When receiving the data transmission request sent by the first terminal device, the second terminal device displays processing options, where the processing options may refer to processing options for an object, and for example, the processing options may be multiple selectable controls displayed in a control form, or prompt information of different selectable gestures.
In one example, the second terminal device displays a plurality of selectable controls in the form of controls, such as "undo", "delay sending", "receive", and so forth. In another example, the second terminal device displays processing options with selectable prompt information of different gestures, for example, a gesture of sliding left is displayed on the display screen to indicate that the transmission is cancelled, a gesture of sliding right is displayed to indicate that the transmission is received, and a gesture of sliding up or down is displayed to delay the transmission.
In step S205, the response signal in response to the data transmission request may include a first response signal and a second response signal. The first response signal indicates that the data transmission request is accepted, for example when the user selects "receive" or "delay sending", or slides to the right or upwards; it means that the second terminal device allows the object corresponding to the data transmission request to be transmitted. In this case, the second terminal device determines that the response to the data transmission request is the first response signal and sends it to the first terminal device. The second response signal indicates that the data transmission request is not accepted, that is, the second terminal device does not allow the first terminal device to send the object corresponding to the request, for example when the user selects "undo" or slides to the left. In this case, the second terminal device may determine, according to the user's selection operation on the processing options, that the response is the second response signal and send it to the first terminal device.
In step S8043, when the first terminal device receives the first response signal, it sends the object according to the first response signal; for example, it may send the object directly or with a delay. The delay time of a delayed send may be preset by the first terminal device, or may be specified by the first response signal fed back by the second terminal device. For example, when the second terminal device displays the processing options, the "delay sending" option may also provide the user with a choice of delay times or an input box for the delay time; the second terminal device may generate the first response signal according to the delay time selected or entered by the user and send it to the first terminal device, and the first terminal device may delay sending the object according to the delay time carried in the first response signal.
In step S8044, when the first terminal device receives the second response signal, it revokes sending the object. After sending is revoked, the object may be kept (temporarily stored) in the foreground of the first terminal device for a certain time. The duration may be preset by the first terminal device, or may be adjusted in real time according to the amount of data kept in the foreground; for example, when the kept data amount exceeds a set threshold, the oldest kept data may be deleted, and so on.
In a possible implementation manner, after revoking the sending object, the method may further include:
and when a call-out instruction for the temporarily stored object is detected, calling out the temporarily stored object, and sending the data transmission request to the second terminal equipment.
This step is only needed when an object has been temporarily stored locally in the first terminal device; the case after sending the object is revoked is merely one example, and the specific scenario is not limited thereto.
That is, after sending the object is revoked, if the user wants to continue sending the object, the temporarily stored object may be called out by a call-out instruction. The call-out instruction is generated when the first terminal device detects a call-out operation of the user, which may be a preset operation, for example sliding from one side of the screen toward the middle. The first terminal device may then resend the data transmission request to the second terminal device; for the subsequent process, refer to the description above.
According to the data transmission method of the above embodiment, the selectable operation for the dragged object can be displayed on the display screen of the receiving end device, so that the user can adjust the transmission process or the processing mode according to the actual requirement, and the second terminal device can process the object according to the actual requirement.
In a possible implementation, the user may also use the above process to temporarily store a plurality of objects in the foreground of the first terminal device, for example sending a first object to the second terminal device and then sending a second object, with both objects temporarily stored in the foreground of the first terminal device. In this case, if the second terminal device receives a plurality of data transmission requests consecutively, the processing options it provides may differ from those provided for a single request: for example, "receive all" and "delay all" may be provided. The second terminal device may also display thumbnails of the objects, with a selection control near each thumbnail, so that the user can select which objects to receive and then choose "receive", "delay receive", or "undo", and so on. The second terminal device may process the objects according to the detected selection operation.
Fig. 9c shows a flow chart of a method of transmitting data according to an embodiment of the present application. The transmission method of fig. 9c may be applied to a second terminal device, and the transmission method of this embodiment may include:
step S206, when the object sent by the first terminal equipment is received, if the second terminal equipment has the storage capacity, the object is temporarily stored locally;
step S207, displaying processing options;
and step S208, processing the object according to the selection operation aiming at the processing option.
That is, in step S804, after the first terminal device sends the object to the second terminal device, when the second terminal device receives the object, it may first determine whether it has storage capability locally; if so, the object may be temporarily stored locally, for example, in the foreground. The displayed processing options and the process of processing the object according to the selection operation for the processing options may then refer to the description of steps S204 to S205 above, and are not described in detail.
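For illustration only, the receiving-side flow of steps S206 to S208 might be organized as below. The option names mirror the examples given in the text ("receive", "delay receive", "undo"); all method names and the storage check are assumptions.

```java
// Illustrative receiver-side handling of steps S206 to S208; all names are placeholders.
class IncomingObjectHandler {
    enum ProcessingOption { RECEIVE, DELAY_RECEIVE, UNDO }

    void onObjectReceived(byte[] payload) {
        if (hasLocalStorageCapability()) {     // step S206: check storage capability first
            stashInForeground(payload);        // temporarily store locally, e.g. in the foreground
            showProcessingOptions();           // step S207: display the processing options
        }
    }

    void onOptionSelected(ProcessingOption option, byte[] payload) {   // step S208
        switch (option) {
            case RECEIVE:       persist(payload); break;               // store the object locally
            case DELAY_RECEIVE: keepStashed(payload); break;           // keep it for later processing
            case UNDO:          discard(payload); break;               // cancel reception
        }
    }

    boolean hasLocalStorageCapability() { return true; }   // assumption: e.g. free-space check
    void stashInForeground(byte[] p)    { }
    void showProcessingOptions()        { }
    void persist(byte[] p)              { }
    void keepStashed(byte[] p)          { }
    void discard(byte[] p)              { }
}
```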
In a possible implementation manner, after the object is temporarily stored locally, if the second terminal device detects a drag operation for the temporarily stored object, it may determine whether the position at which the drag operation stays is a draggable position, generate a determination result, and display the determination result. For example, the second terminal device may determine whether the position is a draggable position according to the coordinates of the position at which the drag operation stays in the current display interface and/or the data type of the object.
In an example, if the current display interface of the second terminal device is a desktop or a folder, and the drag operation stays at a blank position on the desktop or on a folder, it is determined that the object can be dragged to that position. If the drag operation is released, the second terminal device may directly store the object locally, for example, in a system disk or a folder, and may also select an application service according to the data type of the object to open the object, or open the object with the application service selected by the user as described above.
In another example, if the drag operation stays on the shortcut icon of an application service, the second terminal device may determine whether the position is a draggable position according to the data type of the object and the application service. For example, if the application service can open objects of the data type, it may be determined that the object can be dragged there, and if the drag operation is released, the second terminal device may open the object with the application service. If the application service cannot open objects of the data type, it may be determined that the object cannot be dragged to that position; if the drag operation is released, the object may be stored locally, and an application service installed on the second terminal device may be selected according to the data type of the object to open the object, or, as described above, the object may be opened with the application service selected by the user.
In another example, if the second terminal device currently has an application service open, the second terminal device may determine whether the position is a draggable position according to the position at which the drag operation stays in the current display interface of the application service and the data type of the object.
In a possible implementation manner, the application services of the first terminal device and the second terminal device may register drag listening in the Framework layer in advance, for example, register drag listening in the WmS of the Framework layer. In this way, when detecting a drag event or receiving a drag event sent by another terminal device, the Framework layer of the first terminal device or the second terminal device may send the drag event to the corresponding application service, and the application service may determine, according to the drag event, whether the object can be dragged.
As shown in fig. 2, the Framework layer of the second terminal device may generate a DRAG_LOCATION event from the captured coordinates of the position and the data type of the object, and send the DRAG_LOCATION event to the corresponding application service, such as the application service currently opened by the second terminal device. The application service may determine, according to the coordinates and the data type, whether an object of the data type can be dragged into the position corresponding to the coordinates, and, if it can, display the determination result in the current display interface of the second terminal device; a specific display manner may refer to the examples in fig. 5a or fig. 5b.
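On an Android-style system, an application service could register such drag listening with the standard View.OnDragListener and react to the DRAG_LOCATION action, roughly as sketched below. This is only an illustration of the mechanism described above, not the Framework-layer code of the embodiment; isInsertablePosition, showDropHint, openDroppedObject and the "image/*" type are assumptions.

```java
import android.content.ClipDescription;
import android.view.DragEvent;
import android.view.View;

// Illustrative application-service side of the drag listening described above.
class DragListenerRegistration {

    void attach(View target) {
        target.setOnDragListener((View v, DragEvent event) -> {
            switch (event.getAction()) {
                case DragEvent.ACTION_DRAG_STARTED: {
                    // Accept the drag only if the advertised data type is one this service can open.
                    ClipDescription desc = event.getClipDescription();
                    return desc != null && desc.hasMimeType("image/*");
                }
                case DragEvent.ACTION_DRAG_LOCATION: {
                    // Coordinates of where the drag currently stays inside this view.
                    boolean droppable = isInsertablePosition(v, event.getX(), event.getY());
                    showDropHint(v, droppable);   // display the determination result in the interface
                    return true;
                }
                case DragEvent.ACTION_DROP: {
                    return openDroppedObject(v, event);
                }
                default:
                    return true;
            }
        });
    }

    boolean isInsertablePosition(View v, float x, float y) { return true; }   // assumption
    void showDropHint(View v, boolean droppable)           { }                // assumption
    boolean openDroppedObject(View v, DragEvent event)     { return true; }   // assumption
}
```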
Fig. 9d and 9e respectively show schematic diagrams of a display interface of a terminal device according to an embodiment of the present application. As shown in fig. 9d, when detecting the second cross-device transmission operation, the first terminal device may send a data transmission request to the second terminal device after storing the object locally, or may directly send the object to the second terminal device, and the second terminal device may display processing options. The second terminal device (e.g., a smart phone) displays a desktop or an open folder, and the user may select a processing manner (e.g., undo, postpone, etc.) on the second terminal device, or may continue to drag the object. If the second terminal device detects that the object is dragged to a blank position, it may directly store the object locally, for example, in a system disk or a folder, or may select an application service to open the object according to the data type of the object, or, as described above, open the object with the application service selected by the user. The fourth display screen in fig. 9d shows a scene of an opened object (a picture).
As shown in fig. 9e, the second terminal device displays a scene in which an application service is open. If it captures that the drag operation of the user stays at a certain position in the current display interface of the application service, the second terminal device may determine whether that position is a draggable position according to the coordinates of the position and the data type of the object. As shown in fig. 9e, if a drag release operation is detected and the position at which the drag operation stayed before the release can accept the dragged object, the second terminal device may open the object at that position; as shown in the fourth display screen in fig. 9e, the object is opened in a dialog interface of the instant messaging application service.
Fig. 9f is a schematic diagram illustrating a specific application scenario according to an embodiment of the application. As shown in fig. 9f, the second terminal device opens a mail service according to a user operation to create a new mail. The second terminal device receives a data transmission request sent by the first terminal device, or temporarily stores an object sent by the first terminal device locally, and displays processing options including "cancel", "drag out from XX", a thumbnail of the object, and the like on the display interface. When the user drags the object (for example, the thumbnail of the dragged object) on the display interface of the second terminal device, whether an object of the data type is allowed to be dragged into the position is displayed in real time according to the position at which the drag operation stays in the new mail and the data type of the object.
If the user selects to cancel the dragging, the object is temporarily stored in the foreground of the second terminal device or in the foreground of the first terminal device. Taking temporary storage in the foreground of the second terminal device as an example, after the second terminal device detects the drag cancellation operation, the object may still be temporarily stored in the foreground, and the second terminal device may retract the processing options, as shown in fig. 9g. If the user wants to process the object again, the processing options can be pulled out with a sliding operation, and the second terminal device processes the object according to the selection operation for the processing options. According to the data transmission method of the above embodiment, selectable operations for the dragged object can be displayed on the display screen of the receiving-end device, so that the second terminal device can process the object according to actual requirements.
In a possible implementation manner, the first terminal device is connected with two or more candidate terminal devices, where the connection may include a wired connection and/or a wireless connection, which is not limited in this application. Fig. 10 shows a schematic diagram of an application scenario according to an embodiment of the present application. As shown in fig. 10, the first terminal device is wirelessly connected with three candidate terminal devices (candidate terminal device 1, candidate terminal device 2, and candidate terminal device 3). In this embodiment, the data transmission method of the present application may further include: determining the second terminal device from among the two or more candidate terminal devices.
In a possible implementation manner, when the first terminal device detects a first cross-device transmission operation for an object local to the first terminal device, the second terminal device corresponding to the first cross-device transmission operation is determined from the two or more candidate terminal devices according to the stay position of the first cross-device transmission operation.
In a possible implementation manner, terminal devices corresponding to different positions on the display screen of the first terminal device may be preset, and dragging to a corresponding position indicates that the user wants to drag the object into the terminal device corresponding to that position. Taking fig. 4 as an example, for device 2, during pairing the left area of the display screen of device 2 may be set to correspond to device 1 and the right area to correspond to device 2. The left area or the right area is only one example of the present application, and the present application is not limited thereto; it may also be, for example, an upper area or a lower area. The size of each area may be set according to actual circumstances; for example, if corresponding terminal devices are set only for the left area and the right area, the left 1/3 of the display screen may be set as the left area and the right 1/3 as the right area, as in the sketch below.
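A minimal sketch of such a region-to-device mapping is shown below, assuming the left-1/3 and right-1/3 areas mentioned above; the device identifiers are placeholders.

```java
// Illustrative mapping from the stay position of the drag to a preset candidate device,
// using the left 1/3 and right 1/3 areas of the first terminal device's screen.
class RegionDeviceMapper {
    /** Returns the identifier of the target device, or null if no device is selected. */
    String mapToDevice(float stayX, int screenWidthPx) {
        if (stayX < screenWidthPx / 3.0f) {
            return "device-1";                     // placeholder id paired with the left area
        }
        if (stayX > screenWidthPx * 2.0f / 3.0f) {
            return "device-2";                     // placeholder id paired with the right area
        }
        return null;                               // middle area: no receiving-end device
    }
}
```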
In a possible implementation manner, when detecting a first cross-device transmission operation for an object local to the first terminal device, the first terminal device may also display, on the display screen, information of the two or more candidate terminal devices, for example, identification information of the second terminal device. The display may take the form of "icons and/or characters" at fixed positions on the display screen, where the icons may include icons representing the candidate terminal devices, icons of application services installed on the candidate terminal devices, and the like, and the characters may be identification information of the candidate terminal devices, options of processing manners for the object, and the like.
Fig. 11a shows a schematic diagram of a display interface of a first terminal device according to an embodiment of the present application. Fig. 11a shows an example of the first terminal device displaying information of two or more candidate terminal devices. As shown in fig. 11a, when a first cross-device transmission operation for an object local to the first terminal device is detected, the first terminal device may display, at the right edge of the display screen, drawers into which objects can be dragged (an example of icons representing the candidate terminal devices), where each drawer may correspond to one candidate terminal device, for example, devices 1 to 4, and the device identifier of the corresponding candidate terminal device may be displayed on the drawer. Any edge position of the first terminal device may display such a drag-in drawer; the illustration in fig. 11a is only an example and does not limit the application in any way.
In a possible implementation manner, the first cross-device transmission operation may refer to the process from the initiating operation to the operation staying. The first terminal device may display the drag-in drawers at the right edge of the screen when detecting the initiating operation, and may also display them at the screen edge where the first cross-device transmission operation stays when detecting that the operation stays and the stay position is an edge of the screen.
In a possible implementation manner, determining, according to a staying position of the first cross-device transmission operation, a second terminal device corresponding to the first cross-device transmission operation from the more than two candidate terminal devices may include:
and determining a second terminal device corresponding to the first cross-device transmission operation according to the relation between the stop position of the first cross-device transmission operation and the display positions of the information of the more than two candidate terminal devices.
The first cross-device transmission operation may generate a thumbnail corresponding to the object during the movement, and the thumbnail may move following the finger or the mouse, as shown by the small finger icon in fig. 11a. Therefore, the first terminal device may determine the second terminal device corresponding to the first cross-device transmission operation according to the relationship between the stay position of the thumbnail of the object when the first cross-device transmission operation stays and the display positions of the information of the two or more candidate terminal devices. The information of a candidate terminal device may be the drawer, the device ID, and the like as described above.
When the first cross-device transmission operation stays, the relationship between the stay position of the thumbnail of the object and the display positions of the information of the two or more candidate terminal devices may be one of the following: the stay position of the thumbnail of the object does not coincide with the display position of the information of any candidate terminal device; or the stay position of the thumbnail of the object at least partially coincides with the display position of the information of one candidate terminal device, for example, the stay position of the thumbnail lies completely within, or partially overlaps, the display position of the information of that candidate terminal device; or the stay position of the thumbnail of the object coincides with the display positions of the information of two or more candidate terminal devices.
For the case where the stay position of the thumbnail of the object does not coincide with the display positions of the information of the two or more candidate terminal devices, the first terminal device may determine that the second terminal device is not currently selected, that is, no receiving-end device is selected, as shown in fig. 11a.
Fig. 11b is a schematic diagram of a display interface of a first terminal device according to an embodiment of the present application. Fig. 11b shows an example of the case where the stay position of the thumbnail of the object at least partially coincides with the display position of the information of one candidate terminal device (device 2); in fig. 11b, the thumbnail partially coincides with the information of the second candidate terminal device displayed at the right edge, and the first terminal device may determine that this candidate terminal device is the second terminal device.
For the case where the stay position of the thumbnail of the object coincides with the display positions of the information of two or more candidate terminal devices, the first terminal device may determine the candidate terminal device whose information display position coincides most with the stay position of the thumbnail as the second terminal device. Determining the candidate terminal device with the most overlap as the second terminal device is only one example of the present application, and the present application is not limited thereto. Alternatively, the first terminal device may prompt the user to continue moving until the second terminal device can be determined.
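The "most overlap" rule could be computed as in the sketch below; the rectangle representation of the thumbnail and of each candidate's displayed information, and the tie handling, are assumptions.

```java
import java.util.Map;

// Illustrative selection of the second terminal device: pick the candidate whose displayed
// information overlaps most with the stay position of the object's thumbnail.
class TargetSelector {
    static final class Box {
        final int left, top, right, bottom;
        Box(int l, int t, int r, int b) { left = l; top = t; right = r; bottom = b; }
        long overlapArea(Box o) {
            long w = Math.max(0, Math.min(right, o.right) - Math.max(left, o.left));
            long h = Math.max(0, Math.min(bottom, o.bottom) - Math.max(top, o.top));
            return w * h;
        }
    }

    /** Returns the id of the candidate with the largest overlap, or null if there is no overlap. */
    String pickTarget(Box thumbnail, Map<String, Box> candidateInfoPositions) {
        String best = null;
        long bestArea = 0;
        for (Map.Entry<String, Box> e : candidateInfoPositions.entrySet()) {
            long area = thumbnail.overlapArea(e.getValue());
            if (area > bestArea) { bestArea = area; best = e.getKey(); }
        }
        return best;   // null means the receiving-end device is not selected yet
    }
}
```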
Fig. 11c shows a schematic diagram of a display interface of a first terminal device according to an embodiment of the present application. Fig. 11d illustrates an application scenario diagram of a display interface of a first terminal device according to an embodiment of the present application.
In a possible implementation manner, the first terminal device may determine the second terminal device and the processing manner of the second terminal device for the object according to a detected trigger instruction for the information of the two or more candidate terminal devices. In this implementation, sending the object to the second terminal device may include: sending the object and indication information to the second terminal device, where the indication information is used to instruct the second terminal device to process the object in the processing manner. After receiving the object and the indication information, the second terminal device may process the object according to the processing manner.
Fig. 11c shows an example of the first terminal device displaying information of two or more candidate terminal devices. The first terminal device may determine the second terminal device according to a selection operation, where the selection operation may be the user tapping the icon of a candidate terminal device with a finger or a stylus, or clicking the icon with a mouse. After the second terminal device is determined according to the selection operation, the selectable processing manners of the second terminal device for the object may be displayed, as in the middle small window in fig. 11c: the selected object is displayed in the upper part of the window, and the number of selected objects may also be displayed (that is, a plurality of objects may be selected); information of the candidate terminal devices is displayed in the middle of the window, in the form of device icons and characters; and the application programs installed on the selected candidate terminal device (e.g., application 1 and application 2 on device 1) and the options of the processing manner for the object (e.g., store, open, copy, etc.) are displayed in the lower part of the window. The first terminal device may determine the processing manner of the second terminal device for the object according to the selection operation, and therefore, when sending the object to the second terminal device, the first terminal device may also send indication information instructing the second terminal device to process the object in that processing manner.
As shown in fig. 11c or fig. 11d, the user selects, in the small window, the second terminal device, the application service installed on the second terminal device, and the processing manner (save, open, etc.) in which the second terminal device processes the object. In this way, the first terminal device can determine the second terminal device and the processing manner of the second terminal device for the object.
When there is only one candidate terminal device, which serves as the second terminal device, options of the processing manner of the second terminal device for the object may be displayed on the first terminal device, and the processing manner may be determined by a selection operation or the like for those options on the first terminal device.
According to the data transmission method of the embodiment, interaction among a plurality of terminal devices can be realized, and the types of the terminal devices can be different, so that a user can more conveniently share data among a plurality of different devices.
For the example of displaying drawers shown in fig. 11a and 11b, it should be noted that when the first terminal device sends an object to the second terminal device, the temporary storage of the object may be temporary storage corresponding to the drawer (that is, corresponding to the candidate terminal device). For example, the drawer icon is associated with a cache space, and when the first terminal device detects that the user performs a drag release action on a drawer, the object corresponding to the drag release action may be stored in the associated cache space.
In a possible implementation manner, after the first terminal device temporarily stores the object, options of the processing manner of the second terminal device for the object may be further displayed for the user to select, as described above. The processing manner for the object may be determined through a selection operation for the options and the like on the first terminal device, and a data transmission request is then sent to the second terminal device, where the data transmission request may also carry the processing manner. In this way, selecting the processing manner for the object can be realized more flexibly.
If the first terminal device sends a data transmission request to the second terminal device and the second terminal device returns a second response signal (as shown in fig. 9a and 9b), the first terminal device cancels sending the object when receiving the second response signal; alternatively, the drawer may be retracted in a manner similar to that shown in fig. 9g, with the object still temporarily stored in the cache space. If the user wants to send the object again, the drawer can be pulled out with a sliding operation and the object dragged again for transmission.
In a possible implementation manner, determining, according to a staying position of the first cross-device transmission operation, a second terminal device corresponding to the first cross-device transmission operation from the more than two candidate terminal devices may further include:
and determining the second terminal device corresponding to the first cross-device transmission operation according to the positional relationship between the stay position of the first cross-device transmission operation and the edges of the first terminal device.
In a possible implementation manner, according to the positional relationship between the first terminal device and the connected candidate terminal devices, a correspondence between different edges of the first terminal device and the candidate terminal devices may be preset, so that the second terminal device corresponding to the first cross-device transmission operation may be determined according to the positional relationship between the stay position of the first cross-device transmission operation and the edges of the first terminal device.
Alternatively, in another possible implementation manner, a position sensor may be installed in the terminal device of the present application to sense the positional relationship with other terminal devices. When the first terminal device captures that the first cross-device transmission operation stays at an edge of the first terminal device, the sensor may be enabled to sense the candidate terminal device placed at that edge, and that candidate terminal device is taken as the second terminal device.
Fig. 12a shows a flowchart of a data transmission method according to an embodiment of the present application, where the method shown in fig. 12a may be applied to a first terminal device, and the transmission method of this embodiment may further include, on the basis of steps S100 to S103:
step S105, sending a display instruction to the second terminal device to instruct the second terminal device to display a first image of the object according to the display instruction and the positional relationship between the second terminal device and the first terminal device,
wherein a second image of the object displayed on the first terminal device and the first image of the object displayed on the second terminal device can be pieced together into the complete image of the object.
Fig. 12b shows a flowchart of a data transmission method according to an embodiment of the present application, where the method shown in fig. 12b may be applied to a second terminal device, and on the basis of steps S201 to S202 shown in fig. 3b, the transmission method of this embodiment may further include:
step S209, when receiving a display instruction sent by the first terminal device, displaying a first image of the object corresponding to the data according to the display instruction and the positional relationship between the second terminal device and the first terminal device,
wherein a second image of the object displayed on the first terminal device and the first image of the object displayed on the second terminal device can be pieced together into the complete image of the object.
The first image and the second image may each be a part of the thumbnail of the object, and the complete image of the object may be the complete thumbnail of the object. That is to say, in this embodiment, a partial image of the thumbnail corresponding to the object may be displayed according to the positional relationship between the first terminal device and the second terminal device. For example, if the second terminal device is on the right side of the first terminal device, then after the first terminal device determines the second terminal device, it may send a display instruction to the second terminal device, where the display instruction may carry the second image of the object currently displayed by the first terminal device, so that the second terminal device may display, on its display screen and according to the positional relationship between the second terminal device and the first terminal device, the first image obtained by removing the second image from the complete image of the object.
In one possible implementation manner, displaying the first image according to the positional relationship between the second terminal device and the first terminal device may mean displaying the first image on the side of the display screen of the second terminal device that is close to the first terminal device. For example, in the example described above, the second terminal device is on the right side of the first terminal device, so the first image may be displayed on the left side of the display screen of the second terminal device.
The side of the display screen of the second terminal device that is close to the first terminal device may be set in advance according to an initial positional relationship between the second terminal device and the first terminal device, or may follow the positional relationship sensed in real time. For example, if the positional relationship is preset such that the second terminal device is located on the right side of the first terminal device, then when the second terminal device receives the display instruction, it may display the first image on the left side even if its actual position is not on the right side of the first terminal device. For another example, as described above, both the first terminal device and the second terminal device may be provided with a position sensor, and the second terminal device may sense its positional relationship with the first terminal device in real time; when the second terminal device receives the display instruction, if it senses that the first terminal device is located on its left side, the first image may be displayed on the left side of the display screen.
If the first terminal device is connected to one second terminal device, or after the second terminal device is determined from two or more candidate terminal devices, the first terminal device may send the display instruction to the second terminal device. The display instruction may carry the second image of the object displayed by the first terminal device, so that the second terminal device may display, on its display screen and according to the display instruction and the positional relationship between the second terminal device and the first terminal device, the first image obtained by removing the second image from the complete image of the object.
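Assuming the second terminal device sits to the right of the first one, the cross-screen split of the thumbnail could be derived from how far the thumbnail has been dragged past the shared edge, as sketched below (all widths in pixels; names are illustrative).

```java
// Illustrative computation of the cross-screen split of the object's thumbnail,
// assuming the second terminal device is placed to the right of the first one.
class SplitThumbnail {
    final int secondImageWidth;   // part still shown on the first terminal device
    final int firstImageWidth;    // part shown at the left edge of the second terminal device

    SplitThumbnail(int thumbnailWidth, int thumbnailLeftX, int firstScreenWidth) {
        int overflow = Math.max(0, (thumbnailLeftX + thumbnailWidth) - firstScreenWidth);
        firstImageWidth = Math.min(thumbnailWidth, overflow);
        secondImageWidth = thumbnailWidth - firstImageWidth;
        // Together the two parts piece into the complete thumbnail of the object.
    }
}
```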
In a possible implementation manner, "sending the object to the second terminal device" in step S104 may further include: sending the object to the second terminal device if it is detected that the area of the second image of the object displayed on the display screen satisfies a sending condition.
The sending condition may be that the object is sent when the ratio between the area of the second image and the area of the complete image satisfies a certain proportion; if the ratio does not satisfy that proportion, the object may not be sent to the second terminal device even when the second cross-device transmission operation is detected. For example, if the ratio of the area of the second image to the area of the complete image is smaller than a proportion threshold, the object is sent to the second terminal device when the second cross-device transmission operation is detected; if the ratio is not smaller than the proportion threshold, the object is not sent when the second cross-device transmission operation is detected. The proportion threshold may be preset, for example 50%, as in the sketch below.
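A minimal sketch of this sending condition, with the 50% example value as the preset threshold:

```java
// Illustrative sending condition: send only while the part of the image still shown on the
// first terminal device is smaller than the proportion threshold of the complete image.
class SendCondition {
    static final double PROPORTION_THRESHOLD = 0.5;   // preset value, 50% in the example

    static boolean met(long secondImageArea, long completeImageArea) {
        return (double) secondImageArea / completeImageArea < PROPORTION_THRESHOLD;
    }
}
```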
Fig. 13a is a schematic diagram of an application scenario according to an embodiment of the present application, and fig. 13b is a schematic diagram of a cross-screen display according to an embodiment of the present application. As shown in fig. 13a, different edge positions of the first terminal device may correspond to different candidate terminal devices: the left edge of the first terminal device corresponds to candidate terminal device 1, the right edge corresponds to candidate terminal device 2, and the upper edge corresponds to candidate terminal device 3. In this way, as described above, the second terminal device corresponding to the first cross-device transmission operation may be determined according to the positional relationship between the stay position of the first cross-device transmission operation and the edges of the first terminal device.
As shown in fig. 13a, if the stay position of the first cross-device transmission operation is the right edge of the first terminal device, candidate terminal device 2 may be taken as the second terminal device. At this time, the first terminal device may send a display instruction to the second terminal device, where the display instruction may carry the second image of the object displayed by the first terminal device, so that the second terminal device may display, on its display screen and according to the display instruction and the positional relationship between the two devices, the first image obtained by removing the second image from the complete image of the object; as shown in fig. 13a, the first image is displayed at the left edge of candidate terminal device 2. If the first terminal device detects the drag release action and, at this time, the area of the second image displayed on the first terminal device is smaller than half of the complete image, that is, the ratio of the area of the second image to the area of the complete image is smaller than the proportion threshold, the first terminal device may send the object to the second terminal device. The specific process of sending the object to the second terminal device may refer to the part described in step S804 and the embodiments shown in fig. 9a to 9c, and is not described in detail here.
In the specific application scenario shown in fig. 13b, the second terminal device currently has an application service open; in the example of fig. 13b, the second terminal device has opened a chat window of an instant messaging application. When the first terminal device detects that the first cross-device transmission operation stays at the right edge of its display screen, it may send a display instruction to the second terminal device, and the second terminal device displays the first image of the object at the left edge of its display screen.
In a possible implementation manner, the display instruction sent by the first terminal device may also carry the data type of the object, and the second terminal device may determine whether the object can be received according to information such as the data type, the currently opened application service, the application services installed on the second terminal device, and the local storage state of the second terminal device, and feed the determination result back to the first terminal device in real time.
When the first terminal device detects the second cross-device transmission operation, if the ratio of the second image of the object displayed by the first terminal device to the complete image is smaller than the proportion threshold and the determination result indicates that the object can be transmitted to the second terminal device, the first terminal device may send the object to the second terminal device. After receiving the object, if the currently opened application service can open the object, the second terminal device may directly open the object in that application service; as shown in fig. 13b, the second terminal device may open the object (e.g., a picture) in the instant messaging application and, according to further operations of the user, send the picture or store it locally. If the currently opened application service cannot open the object, the second terminal device may select an application service to open the object according to the data type of the object. The second terminal device may also store the object.
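On an Android-style second terminal device, selecting an application service to open the object according to its data type could rely on a standard ACTION_VIEW chooser, roughly as below; the URI and MIME type are placeholders, and the approach itself is only an illustrative assumption, not a required implementation of the embodiment.

```java
import android.content.Context;
import android.content.Intent;
import android.net.Uri;

// Illustrative way for the second terminal device to pick an application service to open the
// received object according to its data type (MIME type).
class OpenByDataType {
    void open(Context context, Uri localCopyOfObject, String mimeType) {
        Intent view = new Intent(Intent.ACTION_VIEW);
        view.setDataAndType(localCopyOfObject, mimeType);        // e.g. "image/jpeg"
        view.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
        // Let the user choose among installed application services that can open this data type.
        context.startActivity(Intent.createChooser(view, "Open with"));
    }
}
```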
According to the data transmission method of this embodiment, the drag interaction process is displayed simply and intuitively; meanwhile, the user can judge the timing of the drag, repeated operations are avoided, and user operation is simplified.
Fig. 14a shows a flow chart of a method of transmitting data according to an embodiment of the present application. The transmission method shown in fig. 14a may be applied to the first terminal device, and the transmission method of this embodiment may further include, in addition to steps S100 to S103:
step S106, sending a display state request to the second terminal device, so that the second terminal device returns its current display interface in response to the display state request, and displaying the returned display interface.
Fig. 14b shows a flow chart of a method of transmitting data according to an embodiment of the present application. The transmission method shown in fig. 14b may be applied to the second terminal device, and the transmission method of this embodiment may further include, in addition to steps S201 to S202:
step S210, when receiving the display state request sent by the first terminal device, returning the current display interface of the second terminal device to the first terminal device, so that the first terminal device determines the coordinates of the position where the first cross-device transmission operation stays in the display interface.
The first cross-device transmission operation is an operation used by the first terminal device to initiate a process of transmitting the object to the second terminal device.
The display state request is used to request the second terminal device to return its current display interface. When the second terminal device receives the display state request, it may take a screenshot of the current display interface and return the screenshot image to the first terminal device, or it may directly project its screen onto the display interface of the first terminal device. The present application does not limit the specific way in which the current display interface of the second terminal device is returned.
The transmission method of this embodiment may be combined with the examples shown in fig. 3a and 3b, or with the examples shown in fig. 11a and 11b. That is, if the first terminal device is connected to one second terminal device, or after the second terminal device is determined from two or more candidate terminal devices, the first terminal device may send the display state request to the second terminal device.
After receiving the current display interface returned by the second terminal device, the first terminal device may continue to monitor the first cross-device transmission operation. In a possible implementation manner, step S101 may include: sending a determination request to the second terminal device, where the determination request includes the data type and the coordinates of the position where the first cross-device transmission operation stays in the display interface. That is, the coordinates of the position where the first cross-device transmission operation stays in the display interface may be carried in the determination request received by the second terminal device, and making the determination result according to the data type in step S202 may include: making the determination result according to the data type and the coordinates.
For example, if the application service has registered drag listening, the second terminal device may forward the determination request sent by the first terminal device to the application service, and the application service determines, according to the data type and the position of the coordinates, whether the data can be transmitted to the second terminal device. The second terminal device may send the determination result made according to the data type and the coordinates to the first terminal device, and step S102 may include: receiving the determination result made by the second terminal device based on the data type and the coordinates. The display manner of the determination result may refer to the above embodiments and is not described in detail.
In a possible implementation manner, the second terminal device may further make the determination result according to the local storage state. When the local storage state indicates storage capability, the second terminal device determines that the object can be transmitted to the second terminal device. When the local storage state indicates no storage capability, the determination result may be made according to the data type and the position coordinates.
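Putting the last two paragraphs together, the receiving side's determination could be sketched as follows; the two helper predicates stand in for the local storage check and the application service's registered drag listening, and are assumptions.

```java
// Illustrative determination on the second terminal device: with local storage capability the
// object can always be accepted; otherwise the data type and stay position decide.
class TransferJudge {
    boolean canTransmit(String mimeType, int stayX, int stayY) {
        if (hasStorageCapability()) {
            return true;                                   // can at least be stored locally
        }
        return positionAcceptsType(stayX, stayY, mimeType);
    }

    boolean hasStorageCapability() { return true; }        // assumption: local storage state check
    boolean positionAcceptsType(int x, int y, String mime) {
        return false;                                      // assumption: registered drag listening
    }
}
```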
Fig. 15 shows a schematic diagram of an application scenario according to an embodiment of the present application. As shown in fig. 15, the transmission method of the present application is described with reference to fig. 14a, 14b, and 11b. As shown in fig. 11b, the stay position of the thumbnail of the object partially coincides with the display position of the information of one candidate terminal device, and that candidate terminal device is selected as the second terminal device. The first terminal device may send a display state request to the second terminal device; when the second terminal device receives the display state request, it may take a screenshot of its current display interface, obtain a screenshot image, and return the screenshot image to the first terminal device, and the first terminal device displays the screenshot image of the second terminal device on its display screen.
The first terminal device continues to monitor the first cross-device transmission operation and sends a determination request to the second terminal device, where the determination request carries the data type of the object and the coordinates of the position where the first cross-device transmission operation stays in the screenshot image of the display interface of the second terminal device. For example, in fig. 15, a finger drags the thumbnail of the object on the first terminal device so that it stays in the message input box, or in the dialog content display box, of the dialog interface of the instant messaging application in the screenshot image of the second terminal device; the first terminal device may obtain the coordinates of the stay position, carry the coordinates in the determination request, and send it to the second terminal device. The second terminal device receives the determination request and may make the determination result according to the data type and the coordinates. For example, the drag service system of the second terminal device receives the determination request and issues it to the Framework layer; the Framework layer generates a DRAG_LOCATION event and sends it to the application service; the application service may determine whether the drag can be performed according to the coordinates and the data type, and the determination result is returned step by step (as in the example of returning the determination result shown in fig. 8c) to the first terminal device and displayed in the current display interface of the first terminal device, where a specific display manner may refer to the examples in fig. 5a or fig. 5b.
If the object in this embodiment is a picture or a piece of text, the picture is allowed to be dragged into the dialog box of the instant messaging application, or the text is allowed to be dragged into the input box, and the determination result is displayed on the display interface of the first terminal device accordingly. When the user releases the drag operation, the first terminal device may send the object to the second terminal device upon detecting the release, and the second terminal device may directly open the object at the corresponding position; for example, as shown in fig. 15, the second terminal device displays the object (e.g., a picture) in the dialog box.
According to the data transmission method of this embodiment, the object can be dragged directly to the target position; compared with the related art, in which further operations are needed at the receiving end to drag the object to the target position, user operation can be simplified. The determination result is displayed in real time during the drag, which can avoid repeated operations and improve operation efficiency.
Based on the foregoing implementation manners of fig. 14a and 14b, fig. 16 shows a flowchart of a data transmission method according to an embodiment of the present application. The method shown in fig. 16 may be applied to a second terminal device, and as shown in fig. 16, the method may further include: step S211, when receiving the object sent by the first terminal device, processing the object according to one or more of the data type, the local storage state, the application services installed on the second terminal device, and the position coordinates.
In one possible implementation, step S211 may include:
when the local storage state indicates that the second terminal device does not have storage capability, opening the object according to an application service installed on the second terminal device;
and when the local storage state indicates storage capability, storing the object locally.
In a possible implementation manner, when the local storage state indicates storage capability, storing the object locally may include:
when the local storage state indicates storage capability and the position of the position coordinates in the display interface does not allow the data type to be inserted, storing the object locally and selecting an application program according to the data type to open the object;
and when the local storage state indicates storage capability and the position of the position coordinates in the display interface allows the data type to be inserted, storing the object locally and opening the object in the display interface according to the position coordinates.
The specific process of opening the object according to an application service installed on the second terminal device may refer to the foregoing description. Whether the position of the position coordinates in the display interface allows the data type to be inserted may be determined according to whether drag listening for that data type has been registered at the position; the specific process may refer to the description above, and a sketch of the resulting branching is given below.
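A sketch of the branching of step S211 under the above rules; every method name is a placeholder for the behaviour described in the text.

```java
// Illustrative branching of step S211 on the second terminal device.
class ReceivedObjectProcessor {
    void process(byte[] object, String mimeType, int x, int y) {
        if (!hasStorageCapability()) {
            openWithInstalledService(object, mimeType);       // no storage: open directly
        } else if (!positionAllowsInsert(x, y, mimeType)) {
            storeLocally(object);
            openWithServiceChosenByType(object, mimeType);    // store, then pick an app by data type
        } else {
            storeLocally(object);
            openAtPosition(object, x, y);                     // store, then open at the stay position
        }
    }

    boolean hasStorageCapability()                       { return true; }   // assumption
    boolean positionAllowsInsert(int x, int y, String m) { return false; }  // assumption
    void openWithInstalledService(byte[] o, String m)    { }
    void openWithServiceChosenByType(byte[] o, String m) { }
    void storeLocally(byte[] o)                          { }
    void openAtPosition(byte[] o, int x, int y)          { }
}
```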
According to the data transmission method of this embodiment, the object can be dragged directly to the target position; compared with the related art, in which further operations are needed at the receiving end to drag the object to the target position, user operation can be simplified, and the receiving end can process the object according to local information, which further simplifies user operation.
The application also provides a data transmission device, which is applied to a first terminal device, wherein the first terminal device is connected with a second terminal device, such as the application scenario shown in fig. 1. Fig. 17 is a block diagram illustrating a transmission apparatus of data according to an embodiment of the present application, and as shown in fig. 17, the apparatus includes:
a first obtaining module 61, configured to, when the first terminal device detects a first cross-device transmission operation for an object on the first terminal device, obtain a data type of the object, where the first cross-device transmission operation is used to initiate a process of transmitting the object to a second terminal device;
a first sending module 62, configured to send a determination request carrying the data type to the second terminal device,
a first receiving module 63, configured to receive a determination result made by the second terminal device based on the data type, where the determination result indicates whether the object can be transmitted to the second terminal device;
and a first display module 64, configured to display the determination result.
According to the data transmission apparatus of the embodiment of the application, visual feedback can be provided for the user according to the data type of the object and its degree of matching with the receiving end, so that misoperation and repeated operation are avoided and operation efficiency is improved.
In a possible implementation manner, the determination request carries a data type field and/or an extended data type field, and the data type field and the extended data type field are used for indicating the data type of the object.
In one possible implementation, the apparatus further includes:
and a second sending module, configured to send the object to a second terminal device when the determination result is that the object can be transmitted to the second terminal device and a second cross-device transmission operation is detected, where the second cross-device transmission operation is used to confirm that the object is transmitted to the second terminal device.
In one possible implementation manner, the second sending module includes:
the first storage unit is used for temporarily storing the object in the local of the first terminal equipment;
a first sending unit, configured to send a data transmission request to the second terminal device, where the data transmission request is used to transmit the object;
and a second sending unit, configured to send the object according to the first response signal when receiving the first response signal returned by the second terminal device indicating acceptance of the data transmission request.
In a possible implementation manner, the first sending unit is further configured to, when detecting a call-out instruction for the temporarily stored object, call out the temporarily stored object, and send the data transmission request to the second terminal device.
In a possible implementation manner, the second sending unit is further configured to:
upon receiving the first response signal, directly transmitting the object;
or delaying the transmission of the object upon receiving the first response signal.
In a possible implementation manner, the second sending module further includes:
and a canceling unit, configured to cancel sending the object when receiving a second response signal returned by the second terminal device indicating that the data transmission request is not accepted.
In one possible implementation, the first terminal device is connected to more than two candidate terminal devices,
the device further comprises:
a second display module, configured to display information of the two or more candidate terminal devices when the first terminal device detects a first cross-device transmission operation for an object local to the first terminal device;
and a second determining module, configured to determine the second terminal device corresponding to the first cross-device transmission operation according to the relationship between the stay position of the first cross-device transmission operation and the display positions of the information of the two or more candidate terminal devices.
In a possible implementation manner, the apparatus further includes a third determining module, configured to determine, according to the positional relationship between the stay position of the first cross-device transmission operation and the edges of the first terminal device, the second terminal device corresponding to the first cross-device transmission operation. In one possible implementation, the apparatus further includes:
a third sending module, configured to send a display instruction to the second terminal device to instruct the second terminal device to display the first image of the object according to the display instruction and a positional relationship between the second terminal device and the first terminal device,
the second image of the object displayed on the first terminal device and the first image of the object displayed on the second terminal device can be pieced together into a complete image of the object.
In a possible implementation manner, the apparatus further includes a fourth sending module, configured to send the object to the second terminal device if it is detected that the area of the second image of the object displayed on the display screen satisfies the sending condition.
In one possible implementation, the apparatus further includes:
a fifth sending module, configured to send a display state request to the second terminal device, so that the second terminal device returns its current display interface in response to the display state request, and to display the returned display interface;
the first transmitting module includes:
a third sending unit, configured to send a determination request to the second terminal device, where the determination request carries the data type and a coordinate of a location where the first cross-device transmission operation stays in the display interface,
the first receiving module includes:
a receiving unit, configured to receive a determination result made by the second terminal device based on the data type and the coordinate.
In one possible implementation, the apparatus further includes:
a third determining module, configured to determine the processing manner of the second terminal device for the object;
the second sending module further comprises:
and a fourth sending unit, configured to send the object and indication information to the second terminal device, where the indication information is used to instruct the second terminal device to process the object in the processing manner.
In one possible implementation manner, the first cross-device transmission operation is a drag operation.
In one possible implementation, the second cross-device transmission operation is a drag release operation.
In a possible implementation manner, the first terminal device is a touch screen device or a non-touch screen device, and the second terminal device is a touch screen device, a non-touch screen device or a non-screen device.
The application also provides a data transmission device, which is applied to a second terminal device, and is characterized in that the second terminal device is connected with a first terminal device, such as the application scenario shown in fig. 1. Fig. 18 shows a block diagram of a transmission apparatus of data according to an embodiment of the present application, as shown in fig. 18,
the device comprises:
a second receiving module 71, configured to receive a determination request sent by a first terminal device, where the determination request carries a data type of an object to be transmitted, and the determination request is used to request the second terminal device to determine whether the object of the data type can be transmitted to the second terminal device;
the first determining module 72 is configured to make a determination result according to the data type, and send the determination result to the first terminal device, so that the first terminal device displays the determination result.
In one possible implementation, the apparatus further includes:
a sixth sending module, configured to, when receiving a display state request sent by a first terminal device, return a current display interface of the second terminal device to the first terminal device, so that the first terminal device determines coordinates of a position where a first cross-device transmission operation stays in the display interface, where the first cross-device transmission operation is an operation in which the first terminal device is used to initiate a process of transmitting the object to the second terminal device.
In one possible implementation, the determination request carries the coordinates,
the first judging module comprises:
and the first judgment unit is used for making a judgment result according to the data type and the coordinate.
In one possible implementation, the apparatus further includes:
and a first processing module, configured to, when the object sent by the first terminal device is received, process the object according to one or more of the data type, the local storage state, the application services installed on the second terminal device, and the coordinates.
In one possible implementation, the first processing module includes:
a first processing unit, configured to open the object according to an application service installed on the second terminal device when the local storage state indicates no storage capability;
and a second processing unit, configured to store the object locally when the local storage state indicates storage capability.
In a possible implementation manner, the second processing unit is further configured to, when the local storage state indicates storage capability and the position corresponding to the coordinates in the display interface does not allow the data type to be inserted, store the object locally and select an application program according to the data type to open the object;
and, when the local storage state indicates storage capability and the position corresponding to the coordinates in the display interface allows the data type to be inserted, store the object locally and open the object in the display interface according to the coordinates.
In one possible implementation, the apparatus further includes:
a storage module, configured to temporarily store the object locally if the second terminal device has storage capability when receiving the object sent by the first terminal device;
the third display module is used for displaying the processing options;
and the second processing module is used for processing the object according to the selection operation aiming at the processing option.
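The "temporarily store, then let the user choose" path could look roughly like the sketch below. The option labels and handler names are assumptions made for illustration, not behavior mandated by this application.

```kotlin
// Illustrative sketch: temporarily store the received object, present processing
// options, then act on the user's selection.

sealed interface ProcessingOption { val label: String }
object SaveOption : ProcessingOption { override val label = "Save" }
object OpenOption : ProcessingOption { override val label = "Open" }
object DiscardOption : ProcessingOption { override val label = "Discard" }

class ReceivedObjectHandler(private val tempStore: MutableList<String> = mutableListOf()) {

    // Storage module: temporarily keep the object locally when capacity allows.
    fun onObjectReceived(objectId: String): List<ProcessingOption> {
        tempStore.add(objectId)
        // Third display module: present the processing options to the user.
        return listOf(SaveOption, OpenOption, DiscardOption)
    }

    // Second processing module: act on the user's selection.
    fun onOptionSelected(objectId: String, option: ProcessingOption): String = when (option) {
        SaveOption -> "persisted $objectId"
        OpenOption -> "opened $objectId"
        DiscardOption -> { tempStore.remove(objectId); "discarded $objectId" }
    }
}

fun main() {
    val handler = ReceivedObjectHandler()
    println(handler.onObjectReceived("doc-1").map { it.label })   // [Save, Open, Discard]
    println(handler.onOptionSelected("doc-1", OpenOption))
}
```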
In one possible implementation, the apparatus further includes:
a fourth display module, configured to, when a display instruction sent by the first terminal device is received, display a first image of the object according to the display instruction and the positional relationship between the second terminal device and the first terminal device,
where the second image of the object displayed on the first terminal device and the first image of the object displayed on the second terminal device can be spliced into a complete image of the object.
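A rough sketch of how the object's preview image could be split across the shared edge is shown below, assuming the second device sits to the right of the first. The crossed-fraction parameter and the rectangle arithmetic are illustrative assumptions only.

```kotlin
// Sketch: split the object's preview across two adjacent devices so that the
// two partial images together form the complete image of the object.

data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

// crossedFraction: how much of the dragged image has moved past the first
// device's right edge (0.0 = none, 1.0 = fully on the second device).
fun splitPreview(imageWidth: Int, imageHeight: Int, crossedFraction: Double): Pair<Rect, Rect> {
    val split = (imageWidth * (1.0 - crossedFraction)).toInt().coerceIn(0, imageWidth)
    val onFirstDevice = Rect(0, 0, split, imageHeight)            // second image (left part)
    val onSecondDevice = Rect(split, 0, imageWidth, imageHeight)  // first image (right part)
    return onFirstDevice to onSecondDevice
}

fun main() {
    val (first, second) = splitPreview(imageWidth = 400, imageHeight = 300, crossedFraction = 0.25)
    println("first device shows $first")   // left 300 px of the preview
    println("second device shows $second") // right 100 px of the preview
    // When the area remaining on the first device falls below a send threshold,
    // the first device could transmit the object itself.
}
```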
In a possible implementation manner, the first terminal device is a touch screen device or a non-touch screen device, and the second terminal device is a touch screen device, a non-touch screen device or a non-screen device.
Embodiments of the present application provide a terminal device, which may perform the data transmission method in one or more of the foregoing implementation manners.
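For orientation only, the sender-side flow of the method (drag detected, determination requested, result displayed, object sent on drag release) could be sketched as follows. The Transport interface and all message shapes are assumptions introduced for this sketch and are not APIs defined by the present application.

```kotlin
// End-to-end sketch of the sender-side flow on the first terminal device.

interface Transport {
    fun requestJudgement(dataType: String): Boolean   // determination request/result
    fun sendObject(objectId: String)                   // actual object transfer
}

class CrossDeviceDragSession(private val transport: Transport) {
    private var accepted = false

    // First cross-device transmission operation (drag): obtain the data type,
    // ask the target device for a determination result, then display it.
    fun onDragToTarget(objectId: String, dataType: String): String {
        accepted = transport.requestJudgement(dataType)
        return if (accepted) "'$objectId' can be transmitted" else "'$objectId' cannot be transmitted"
    }

    // Second cross-device transmission operation (drag release): send only if accepted.
    fun onDragRelease(objectId: String) {
        if (accepted) transport.sendObject(objectId)
    }
}

fun main() {
    val fakeTransport = object : Transport {
        override fun requestJudgement(dataType: String) = dataType.startsWith("image/")
        override fun sendObject(objectId: String) = println("sending $objectId")
    }
    val session = CrossDeviceDragSession(fakeTransport)
    println(session.onDragToTarget("photo-1", "image/png"))
    session.onDragRelease("photo-1")
}
```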
The present application may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present application.
Embodiments of the present application provide a computer program product including computer-readable code, or a non-transitory computer-readable storage medium carrying computer-readable code, where, when the computer-readable code runs in a processor of an electronic device, the processor in the electronic device performs the above method.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present application may be assembler instructions, instruction set architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a field-programmable gate array (FPGA), or a programmable logic array (PLA), can be personalized by utilizing state information of the computer-readable program instructions, and the electronic circuitry can execute the computer-readable program instructions to implement aspects of the present application.
Various aspects of the present application are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described the embodiments of the present application, the foregoing description is exemplary, is not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terms used herein were chosen to best explain the principles of the embodiments, the practical application, or technical improvements over technologies in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (32)

1. A data transmission method applied to a first terminal device, characterized in that the first terminal device is connected to two or more candidate terminal devices, the two or more candidate terminal devices comprising a second terminal device, and the method comprises the following steps:
when the first terminal device detects a first cross-device transmission operation for an object on the first terminal device, obtaining the data type of the object, displaying information of the two or more candidate terminal devices, and determining a second terminal device corresponding to the first cross-device transmission operation according to the relationship between the stop position of the first cross-device transmission operation and the display positions of the information of the two or more candidate terminal devices, wherein the first cross-device transmission operation is used for initiating a process of transmitting the object to the second terminal device;
or, determining, by the first terminal device, a second terminal device corresponding to the first cross-device transmission operation according to the positional relationship between the stop position of the first cross-device transmission operation and an edge of the first terminal device, and obtaining, by the first terminal device, the data type of an object on the first terminal device when the first cross-device transmission operation for the object is detected;
sending a judgment request carrying the data type to the second terminal device,
receiving a judgment result made by the second terminal device based on the data type, wherein the judgment result represents whether the object can be transmitted to the second terminal device;
displaying the judgment result;
when the judgment result is that the object can be transmitted to a second terminal device and a second cross-device transmission operation is detected, sending the object to the second terminal device, wherein the second cross-device transmission operation is used for confirming that the object is transmitted to the second terminal device;
the first cross-device transmission operation is a drag operation, and the second cross-device transmission operation is a drag release operation.
2. The method according to claim 1, wherein the determination request carries a data type field and/or an extended data type field, and the data type field and the extended data type field are used for indicating a data type of the object.
3. The method of claim 1, wherein sending the object to the second terminal device comprises:
temporarily storing the object locally at the first terminal device;
sending a data transmission request to the second terminal equipment, wherein the data transmission request is used for transmitting the object;
and when a first response signal that is returned by the second terminal device and accepts the data transmission request is received, sending the object according to the first response signal.
4. The method of claim 3, wherein after the object is temporarily stored locally at the first terminal device, further comprising:
when a call-out instruction for the temporarily stored object is detected, calling out the temporarily stored object and sending the data transmission request to the second terminal device.
5. The method of claim 3 or 4, wherein transmitting the object according to the first response signal comprises:
upon receiving the first response signal, directly transmitting the object;
or delaying the transmission of the object upon receiving the first response signal.
6. The method of claim 1, further comprising:
sending a display instruction to the second terminal device to instruct the second terminal device to display a first image of the object according to the display instruction and a positional relationship between the second terminal device and the first terminal device,
the second image of the object displayed on the first terminal device and the first image of the object displayed on the second terminal device can be spliced into a complete image of the object;
and if the area of the second image of the object displayed on the display screen is detected to meet the sending condition, sending the object to the second terminal equipment.
7. The method of claim 1, further comprising:
sending a display state request to the second terminal device, so that the second terminal device returns the current display interface of the second terminal device in response to the display state request, and displaying the display interface;
sending a judgment request carrying the data type to the second terminal device, including:
sending a judgment request to the second terminal device, wherein the judgment request carries the data type and the coordinate of the position where the first cross-device transmission operation stays in the display interface,
receiving a judgment result made by the second terminal device based on the data type comprises:
receiving a judgment result made by the second terminal device based on the data type and the coordinates.
8. The method of claim 1, further comprising:
determining a processing mode of the second terminal equipment to the object;
sending the object to the second terminal device, including:
and sending the object and indication information to the second terminal equipment, wherein the indication information is used for indicating the second terminal equipment to process the object in the processing mode.
9. A data transmission method is applied to a second terminal device, and is characterized in that the second terminal device is connected with a first terminal device, and the method comprises the following steps:
receiving a judgment request sent by a first terminal device, wherein the judgment request carries the data type of an object to be transmitted, and the judgment request is used for requesting a second terminal device to judge whether the object of the data type can be transmitted to the second terminal device;
making a judgment result according to the data type, and sending the judgment result to the first terminal equipment so that the first terminal equipment displays the judgment result;
the second terminal device is determined by the first terminal device according to a relationship between a stop position of the first cross-device transmission operation and display positions of the information of the two or more candidate terminal devices, wherein the first cross-device transmission operation is used for initiating a process of transmitting the object to the second terminal device;
or the second terminal device is determined by the first terminal device according to the position relationship between the stop position of the first cross-device transmission operation and the edge of the first terminal device.
10. The method of claim 9, further comprising:
when a display state request sent by a first terminal device is received, returning a current display interface of a second terminal device to the first terminal device so that the first terminal device determines coordinates of a position where a first cross-device transmission operation stays in the display interface, wherein the first cross-device transmission operation is an operation used by the first terminal device for initiating a process of transmitting the object to the second terminal device;
wherein the judgment request carries the coordinates,
making a judgment result according to the data type, comprising:
and making a judgment result according to the data type and the coordinate.
11. The method of claim 10, further comprising:
and when the object sent by the first terminal equipment is received, processing the object according to one or more of the data type, the local storage state, the application service installed by the second terminal equipment and the coordinates.
12. The method of claim 11, wherein processing the object according to one or more of the data type, a local storage state, an application service installed by a second terminal device, and the coordinates comprises:
when the local storage state indicates that no storage capability is available, opening the object according to an application service installed on the second terminal device;
and when the local storage state indicates that storage capability is available, storing the object locally.
13. The method of claim 12, wherein storing the object locally when the local storage state indicates that storage capability is available comprises:
when the local storage state indicates that storage capability is available and the position corresponding to the coordinates in the display interface does not allow the data type to be inserted, storing the object locally and selecting an application program according to the data type to open the object;
and when the local storage state indicates that storage capability is available and the position corresponding to the coordinates in the display interface allows the data type to be inserted, storing the object locally and opening the object in the display interface according to the coordinates.
14. The method according to any one of claims 9-12, further comprising:
when the object sent by the first terminal device is received, if the second terminal device has storage capability, temporarily storing the object locally;
displaying the processing options;
and processing the object according to the selection operation aiming at the processing option.
15. The method of claim 9, further comprising:
when a display instruction sent by the first terminal equipment is received, displaying a first image of the object according to the display instruction and the position relation between the second terminal equipment and the first terminal equipment,
the second image of the object displayed on the first terminal device and the first image of the object displayed on the second terminal device can be pieced together into a complete image of the object.
16. An apparatus for transmitting data, the apparatus being applied to a first terminal device, wherein the first terminal device is connected to two or more candidate terminal devices, and wherein the two or more candidate terminal devices include a second terminal device, the apparatus comprising:
a first obtaining module, configured to obtain the data type of an object when the first terminal device detects a first cross-device transmission operation for the object on the first terminal device; a second display module, configured to display the information of the two or more candidate terminal devices; and a second determining module, configured to determine a second terminal device corresponding to the first cross-device transmission operation according to the relationship between the stop position of the first cross-device transmission operation and the display positions of the information of the two or more candidate terminal devices, wherein the first cross-device transmission operation is used to initiate a process of transmitting the object to the second terminal device;
or, a third determining module, configured to determine a second terminal device corresponding to the first cross-device transmission operation according to the positional relationship between the stop position of the first cross-device transmission operation and an edge of the first terminal device, and a first obtaining module, configured to obtain the data type of an object on the first terminal device when the first cross-device transmission operation for the object is detected;
a first sending module, configured to send a determination request carrying the data type to the second terminal device,
a first receiving module, configured to receive a determination result made by the second terminal device based on the data type, where the determination result indicates whether the object can be transmitted to the second terminal device;
the first display module is used for displaying the judgment result;
a second sending module, configured to send the object to a second terminal device when the determination result is that the object can be transmitted to the second terminal device and a second cross-device transmission operation is detected, where the second cross-device transmission operation is used to confirm that the object is transmitted to the second terminal device;
the first cross-device transmission operation is a drag operation, and the second cross-device transmission operation is a drag release operation.
17. The apparatus according to claim 16, wherein the determination request carries a data type field and/or an extended data type field, and the data type field and the extended data type field are used to indicate a data type of the object.
18. The apparatus of claim 16, wherein the second sending module comprises:
a first storage unit, configured to temporarily store the object locally at the first terminal device;
a first sending unit, configured to send a data transmission request to the second terminal device, where the data transmission request is used to transmit the object;
and a second sending unit, configured to send the object according to the first response signal when a first response signal that is returned by the second terminal device and accepts the data transmission request is received.
19. The apparatus of claim 18, wherein the first sending unit is further configured to: when a call-out instruction for the temporarily stored object is detected, call out the temporarily stored object and send the data transmission request to the second terminal device.
20. The apparatus according to claim 18 or 19, wherein the second sending unit is further configured to:
upon receiving the first response signal, directly transmitting the object;
or delaying the transmission of the object upon receiving the first response signal.
21. The apparatus of claim 16, further comprising:
a third sending module, configured to send a display instruction to the second terminal device to instruct the second terminal device to display the first image of the object according to the display instruction and a positional relationship between the second terminal device and the first terminal device,
the second image of the object displayed on the first terminal device and the first image of the object displayed on the second terminal device can be spliced into a complete image of the object;
and the fourth sending module is used for sending the object to the second terminal equipment if the area of the second image of the object displayed on the display screen is detected to meet the sending condition.
22. The apparatus of claim 16, further comprising:
a fifth sending module, configured to send a display state request to the second terminal device, so that the second terminal device returns the current display interface of the second terminal device in response to the display state request, and the display interface is displayed;
the first transmitting module includes:
a third sending unit, configured to send a determination request to the second terminal device, where the determination request carries the data type and a coordinate of a location where the first cross-device transmission operation stays in the display interface,
the first receiving module includes:
a receiving unit, configured to receive a determination result made by the second terminal device based on the data type and the coordinate.
23. The apparatus of claim 16, further comprising:
the third determining module is used for determining the processing mode of the second terminal equipment to the object;
the second sending module further comprises:
and a fourth sending unit, configured to send the object and indication information to the second terminal device, where the indication information is used to instruct the second terminal device to process the object in the processing manner.
24. A data transmission device, which is applied to a second terminal device, wherein the second terminal device is connected with a first terminal device, the device comprising:
a second receiving module, configured to receive a determination request sent by a first terminal device, where the determination request carries a data type of an object to be transmitted, and the determination request is used to request the second terminal device to determine whether the object of the data type can be transmitted to the second terminal device;
the first judgment module is used for making a judgment result according to the data type and sending the judgment result to the first terminal equipment so as to enable the first terminal equipment to display the judgment result;
the second terminal device is determined by the first terminal device according to a relationship between a stop position of the first cross-device transmission operation and display positions of the information of the two or more candidate terminal devices, wherein the first cross-device transmission operation is used for initiating a process of transmitting the object to the second terminal device;
or the second terminal device is determined by the first terminal device according to the position relationship between the stop position of the first cross-device transmission operation and the edge of the first terminal device.
25. The apparatus of claim 24, further comprising:
a sixth sending module, configured to, when receiving a display state request sent by a first terminal device, return a current display interface of a second terminal device to the first terminal device, so that the first terminal device determines coordinates of a position where a first cross-device transmission operation stays in the display interface, where the first cross-device transmission operation is an operation in which the first terminal device is configured to initiate a process of transmitting the object to the second terminal device;
wherein the judgment request carries the coordinates,
the first judging module comprises:
and the first judgment unit is used for making a judgment result according to the data type and the coordinate.
26. The apparatus of claim 25, further comprising:
and the first processing module is used for processing the object according to one or more of the data type, the local storage state, the application service installed by the second terminal equipment and the coordinates when the object sent by the first terminal equipment is received.
27. The apparatus of claim 26, wherein the first processing module comprises:
a first processing unit, configured to open the object according to an application service installed on the second terminal device when the local storage state indicates that no storage capability is available;
and a second processing unit, configured to store the object locally when the local storage state indicates that storage capability is available.
28. The apparatus of claim 27, wherein the second processing unit is further configured to: when the local storage state indicates that storage capability is available and the position corresponding to the coordinates in the display interface does not allow the data type to be inserted, store the object locally and select an application program according to the data type to open the object;
and when the local storage state indicates that storage capability is available and the position corresponding to the coordinates in the display interface allows the data type to be inserted, store the object locally and open the object in the display interface according to the coordinates.
29. The apparatus of any one of claims 25-28, further comprising:
the storage module is used for temporarily storing the object locally if the second terminal device has storage capability when receiving the object sent by the first terminal device;
the third display module is used for displaying the processing options;
and the second processing module is used for processing the object according to the selection operation aiming at the processing option.
30. The apparatus of claim 25, further comprising:
a fourth display module, configured to display the first image of the object according to the display instruction and the positional relationship between the second terminal device and the first terminal device when receiving the display instruction sent by the first terminal device,
the second image of the object displayed on the first terminal device and the first image of the object displayed on the second terminal device can be pieced together into a complete image of the object.
31. An apparatus for transmitting data, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to carry out the instructions when executing the method of any one of claims 1 to 8 or to carry out the method of any one of claims 9 to 15.
32. A non-transitory computer readable storage medium having stored thereon computer program instructions, wherein the computer program instructions, when executed by a processor, implement the method of any one of claims 1-8 or the method of any one of claims 9-15.
CN202010333906.0A 2020-04-24 2020-04-24 Data transmission method and device Active CN111666055B (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
CN202111463198.3A CN114356198A (en) 2020-04-24 2020-04-24 Data transmission method and device
CN202111463143.2A CN114356197A (en) 2020-04-24 2020-04-24 Data transmission method and device
CN202010333906.0A CN111666055B (en) 2020-04-24 2020-04-24 Data transmission method and device
CN202080100104.3A CN115516413A (en) 2020-04-24 2020-12-31 Object dragging method and device
US17/920,867 US20240053879A1 (en) 2020-04-24 2020-12-31 Object Drag Method and Device
PCT/CN2020/142420 WO2021212922A1 (en) 2020-04-24 2020-12-31 Object dragging method and device
EP20932244.5A EP4130963A4 (en) 2020-04-24 2020-12-31 Object dragging method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010333906.0A CN111666055B (en) 2020-04-24 2020-04-24 Data transmission method and device

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202111463198.3A Division CN114356198A (en) 2020-04-24 2020-04-24 Data transmission method and device
CN202111463143.2A Division CN114356197A (en) 2020-04-24 2020-04-24 Data transmission method and device

Publications (2)

Publication Number Publication Date
CN111666055A CN111666055A (en) 2020-09-15
CN111666055B true CN111666055B (en) 2021-12-14

Family

ID=72382836

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202111463143.2A Pending CN114356197A (en) 2020-04-24 2020-04-24 Data transmission method and device
CN202111463198.3A Pending CN114356198A (en) 2020-04-24 2020-04-24 Data transmission method and device
CN202010333906.0A Active CN111666055B (en) 2020-04-24 2020-04-24 Data transmission method and device

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN202111463143.2A Pending CN114356197A (en) 2020-04-24 2020-04-24 Data transmission method and device
CN202111463198.3A Pending CN114356198A (en) 2020-04-24 2020-04-24 Data transmission method and device

Country Status (1)

Country Link
CN (3) CN114356197A (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114756151A (en) * 2020-12-25 2022-07-15 华为技术有限公司 Interface element display method and equipment
CN114356197A (en) * 2020-04-24 2022-04-15 华为技术有限公司 Data transmission method and device
CN114201128A (en) 2020-09-02 2022-03-18 华为技术有限公司 Display method and device
US20240053879A1 (en) * 2020-04-24 2024-02-15 Huawei Technologies Co., Ltd. Object Drag Method and Device
CN114328423A (en) * 2020-09-30 2022-04-12 华为技术有限公司 File sharing method and electronic equipment
WO2022109883A1 (en) * 2020-11-25 2022-06-02 京东方科技集团股份有限公司 Screen projection interaction method, screen projection system and terminal device
CN113766303B (en) * 2021-05-08 2023-04-28 北京字节跳动网络技术有限公司 Multi-screen interaction method, device, equipment and storage medium
CN115033319A (en) * 2021-06-08 2022-09-09 华为技术有限公司 Distributed display method and terminal of application interface
CN114760291B (en) * 2022-06-14 2022-09-13 深圳乐播科技有限公司 File processing method and device
CN115665711A (en) * 2022-10-26 2023-01-31 昆山联滔电子有限公司 Wireless transmission system and method

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103279288A (en) * 2013-05-31 2013-09-04 北京小米科技有限责任公司 Method, device and terminal units for transmitting data
CN104349110A (en) * 2014-09-24 2015-02-11 招商银行股份有限公司 Terminal remote interaction method and system
CN105892851A (en) * 2016-03-29 2016-08-24 北京金山安全软件有限公司 Visual resource transmission method and device and electronic equipment
CN106844063A (en) * 2016-12-30 2017-06-13 深圳市优博讯科技股份有限公司 Cross-platform data processing method, system and cross-platform data shared system
CN107124690A (en) * 2017-03-31 2017-09-01 上海掌门科技有限公司 A kind of method carried out data transmission between intelligent watch and mobile phone
CN107222936A (en) * 2017-06-26 2017-09-29 广东欧珀移动通信有限公司 A kind of data processing method, device and terminal
CN108123826A (en) * 2017-09-25 2018-06-05 珠海许继芝电网自动化有限公司 A kind of interactive system and method for transregional data
CN108718439A (en) * 2018-05-22 2018-10-30 北京硬壳科技有限公司 Data transmission method and device
CN108874713A (en) * 2018-05-31 2018-11-23 联想(北京)有限公司 A kind of information processing method and device
CN109690967A (en) * 2016-09-15 2019-04-26 高通股份有限公司 Wireless orientation based on antenna sector is shared

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101495176B1 (en) * 2008-08-12 2015-03-02 엘지전자 주식회사 Mobile terminal and method for transmission information thereof
CN102687117B (en) * 2012-02-06 2014-09-03 华为技术有限公司 Method and device for data transmission
CN102945131B (en) * 2012-09-29 2016-05-25 华为技术有限公司 A kind of data transmission method and device based on orientation
KR20160047273A (en) * 2014-10-22 2016-05-02 엘지전자 주식회사 Watch type terminal
CN106406127B (en) * 2015-07-31 2019-08-13 腾讯科技(深圳)有限公司 The generation method and generating means of the control interface of internet of things equipment
CN105635948A (en) * 2015-12-31 2016-06-01 上海创功通讯技术有限公司 Data sending method and data sending module
CN105681441B (en) * 2016-01-29 2019-06-28 腾讯科技(深圳)有限公司 Data transmission method and device
CN107491469B (en) * 2016-06-11 2020-11-24 苹果公司 Intelligent task discovery
CN114356197A (en) * 2020-04-24 2022-04-15 华为技术有限公司 Data transmission method and device

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103279288A (en) * 2013-05-31 2013-09-04 北京小米科技有限责任公司 Method, device and terminal units for transmitting data
CN104349110A (en) * 2014-09-24 2015-02-11 招商银行股份有限公司 Terminal remote interaction method and system
CN105892851A (en) * 2016-03-29 2016-08-24 北京金山安全软件有限公司 Visual resource transmission method and device and electronic equipment
CN109690967A (en) * 2016-09-15 2019-04-26 高通股份有限公司 Wireless orientation based on antenna sector is shared
CN106844063A (en) * 2016-12-30 2017-06-13 深圳市优博讯科技股份有限公司 Cross-platform data processing method, system and cross-platform data shared system
CN107124690A (en) * 2017-03-31 2017-09-01 上海掌门科技有限公司 A kind of method carried out data transmission between intelligent watch and mobile phone
CN107222936A (en) * 2017-06-26 2017-09-29 广东欧珀移动通信有限公司 A kind of data processing method, device and terminal
CN108123826A (en) * 2017-09-25 2018-06-05 珠海许继芝电网自动化有限公司 A kind of interactive system and method for transregional data
CN108718439A (en) * 2018-05-22 2018-10-30 北京硬壳科技有限公司 Data transmission method and device
CN108874713A (en) * 2018-05-31 2018-11-23 联想(北京)有限公司 A kind of information processing method and device

Also Published As

Publication number Publication date
CN111666055A (en) 2020-09-15
CN114356197A (en) 2022-04-15
CN114356198A (en) 2022-04-15

Similar Documents

Publication Publication Date Title
CN111666055B (en) Data transmission method and device
EP4024193A1 (en) Data transmission method and related devices
EP4177725A1 (en) Cross-device object dragging method and device
US11922005B2 (en) Screen capture method and related device
WO2021057868A1 (en) Interface switching method and electronic device
US11921987B2 (en) System navigation bar display method, system navigation bar control method, graphical user interface, and electronic device
US20220308753A1 (en) Split-Screen Method and Electronic Device
KR102481065B1 (en) Application function implementation method and electronic device
EP4184308A1 (en) Data transmission method and device
WO2022028494A1 (en) Multi-device data collaboration method and electronic device
CN111225108A (en) Communication terminal and card display method of negative screen interface
EP4280058A1 (en) Information display method and electronic device
EP3951589A1 (en) View display method and electronic device
CN111274564A (en) Communication terminal and application unlocking method in split screen mode
US20230236714A1 (en) Cross-Device Desktop Management Method, First Electronic Device, and Second Electronic Device
US20190260871A1 (en) Electronic device and method of executing application
WO2023221946A1 (en) Information transfer method and electronic device
WO2022194005A1 (en) Control method and system for synchronous display across devices
CN114520867B (en) Camera control method based on distributed control and terminal equipment
CN116954409A (en) Application display method and device and storage medium
CN115114607A (en) Sharing authorization method, device and storage medium
CN113467961A (en) Copy and paste method, electronic equipment and system
CN114513760B (en) Font library synchronization method, device and storage medium
WO2022121751A1 (en) Camera control method and apparatus, and storage medium
CN117472220A (en) Operation identification method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant