CN114860142A - Dragging processing method and device - Google Patents

Dragging processing method and device

Info

Publication number
CN114860142A
Authority
CN
China
Prior art keywords
application
control
data structure
electronic device
content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110074033.0A
Other languages
Chinese (zh)
Other versions
CN114860142B (en)
Inventor
张飞雨
赵金龙
张帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110074033.0A
Priority to PCT/CN2021/137666 (WO2022156427A1)
Publication of CN114860142A
Application granted
Publication of CN114860142B
Active legal status
Anticipated expiration legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486 Drag-and-drop
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/10 File systems; File servers
    • G06F 16/13 File access structures, e.g. distributed indices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a drag processing method and device that enable a non-native control of application software to respond to a drag function. The method includes: an electronic device obtains, according to a drag operation that moves the content of a first control of a first application to a second application, first information corresponding to the first control, and encapsulates the first information in a first data structure; the electronic device determines whether the second application supports the first data structure; if the second application supports the first data structure, the electronic device sends the first data structure to the second application; and if the second application does not support the first data structure, the electronic device converts the first data structure into a second data structure supported by the second application and sends the second data structure to the second application.

Description

Dragging processing method and device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a drag processing method and device.
Background
Mobile terminals on the market today generally support a drag function. For example, a user can change the position of an icon on the desktop of a mobile terminal by dragging it. In some scenarios, however, the non-native controls of application software cannot respond to the drag function provided by the operating system.
During development of an operating system, substantial adaptation work is done between native controls and the operating system, giving native controls high compatibility with it; native controls can therefore respond to the drag function that the operating system provides. Because no such adaptation work is done between non-native controls and the operating system, non-native controls cannot respond to that drag function.
How to ensure that the non-native controls of application software can respond to the drag function provided by the operating system has therefore become a technical problem that urgently needs to be solved.
Disclosure of Invention
The present application provides a drag processing method and device so that a non-native control of application software can respond to a drag function.
In a first aspect, the present application provides a drag processing method, including: an electronic device obtains, according to a drag operation that moves the content of a first control of a first application to a second application, first information corresponding to the first control, and encapsulates the first information in a first data structure; the electronic device determines whether the second application supports the first data structure; if the second application supports the first data structure, the electronic device sends the first data structure to the second application; and if the second application does not support the first data structure, the electronic device converts the first data structure into a second data structure and sends the second data structure to the second application, where the second data structure is a data structure supported by the second application.
In the first aspect, if the second control of the second application supports the first data structure, the second control is a native control supported by the operating system of the electronic device. The electronic device can then send the first data structure to the second application; the second control obtains the first text information from the first data structure and loads it. If the second control of the second application does not support the first data structure, the second control is a non-native control that the operating system does not support. In that case the electronic device converts the first data structure into a second data structure that the second control supports, and then sends the second data structure to the second application; the second control obtains the first text information from the second data structure and loads it. In this way, the second control of the second application can respond to the drag function.
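The decision logic of the first aspect can be sketched as follows. This is an illustrative model only: the class and method names are invented for the sketch (they come from neither the patent nor the Android SDK), and a real implementation would live inside the operating system's drag framework.

```java
// Sketch of the first aspect's dispatch step: forward the ClipData-style
// payload to a native target, or convert it first for a non-native target.
final class DragDispatcher {
    // Hypothetical view of the receiving application.
    interface TargetApp {
        boolean supports(String structureType);   // e.g. "ClipData"
        void receive(Object payload);
    }

    static String dispatch(Object clipPayload, TargetApp target,
                           java.util.function.Function<Object, Object> convert) {
        if (target.supports("ClipData")) {
            target.receive(clipPayload);          // native control: send as-is
            return "sent-original";
        }
        target.receive(convert.apply(clipPayload)); // non-native: convert first
        return "sent-converted";
    }
}
```

The conversion function is passed in because the patent leaves the concrete second data structure open; in the Android-flavoured implementations described later, it would wrap the ClipData content in an Intent.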
In one possible implementation of the first aspect, the electronic device receives a drag operation that moves content of a first control of a first application to a second application.
In one possible implementation manner of the first aspect, the drag operation that moves the content of the first control of the first application to the second application includes: moving the content of the first control of the first application onto a second control of the second application.
In one possible implementation manner of the first aspect, the drag operation that moves the content of the first control of the first application to the second application includes: moving the content of the first control of the first application onto an icon of the second application.
In a possible implementation manner of the first aspect, after the electronic device receives the operation that moves the content of the first control of the first application onto the icon of the second application, the method further includes: the electronic device displays one or more floating icons corresponding to the second application; the electronic device receives an operation in which the user selects a first floating icon from the one or more floating icons, where the first floating icon corresponds to a first interface that has a second control; and the determining, by the electronic device, whether the second application supports the first data structure includes: determining whether the second control supports the first data structure.
After the user moves the content of the first control of the first application onto the icon of the second application, the electronic device displays one or more floating icons corresponding to the second application, each of which corresponds to an interface of the second application that has a second control. When the electronic device receives the user's selection of the first floating icon, this indicates that the user wants to move the content of the first control onto the second control on the first interface corresponding to that floating icon, and the electronic device can then continue with the subsequent steps. In this way, after the user moves the content of the first control of the first application onto the icon of the second application, the second application can respond to the drag function regardless of whether its second control supports the first data structure.
In one possible implementation manner of the first aspect, the electronic device has a first application and a second application installed therein.
In one possible implementation manner of the first aspect, the drag operation that moves the content of the first control of the first application to the second application includes: moving the content of the first control of the first application onto a second control or an icon of the second application in a collaboration window, where the collaboration window is a screen-projection window of an external device on the electronic device, and the second application is installed on the external device.
In a possible implementation manner of the first aspect, the determining, by the electronic device, whether the second application supports the first data structure includes: the electronic device sends query information to the external device; the electronic device receives response information sent by the external device in response to the query information; and the electronic device determines, according to the response information, whether the second application supports the first data structure.
After the electronic device generates the first data structure, it sends query information to the external device so that the external device can determine whether the second control of the second application supports the first data structure. The external device then sends response information to the electronic device so that the electronic device learns whether the second control supports the first data structure. If the response information carries the first identification information, the second control supports the first data structure, and the electronic device sends the first data structure to the external device. If the response information carries the second identification information, the second control does not support the first data structure; the electronic device then converts the first data structure into the second data structure supported by the second application and sends the second data structure to the external device, so that the second control of the second application can respond to the drag function.
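The query/response exchange above can be modeled in a few lines. The concrete identification values and method names are assumptions made for the sketch; the patent does not fix a wire format for the query information or the response information.

```java
// Minimal model of the cross-device support query: the external device answers
// with one of two identification values, and the electronic device chooses
// what to transmit based on that answer.
final class SupportQuery {
    static final int SUPPORTED = 1;      // stands in for "first identification information"
    static final int UNSUPPORTED = 2;    // stands in for "second identification information"

    // External-device side: report whether the second control accepts ClipData.
    static int answer(boolean controlSupportsClipData) {
        return controlSupportsClipData ? SUPPORTED : UNSUPPORTED;
    }

    // Electronic-device side: decide what to send according to the response.
    static String decide(int response) {
        return response == SUPPORTED ? "send ClipData" : "convert to Intent, then send";
    }
}
```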
In a possible implementation manner of the first aspect, the content of the first control of the first application includes a first text, and the first information corresponding to the first control includes text information of the first text in the first control; or the content of the first control of the first application comprises a first picture, and the first information corresponding to the first control comprises a storage path of the first picture in the first control.
In one possible implementation manner of the first aspect, the first data structure includes ClipData, and the second data structure includes Intent.
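On Android, ClipData and Intent are the concrete candidates for the two structures. The conversion step can be sketched with simplified stand-in classes; note that `ClipDataModel` and `IntentModel` below are plain models invented for illustration, not the Android SDK types, and the field mapping is an assumption.

```java
// Simplified stand-ins for the two data structures named in the patent.
final class ClipDataModel {
    final String text;  // text content of the dragged control, if any
    final String uri;   // storage path of a dragged picture, if any
    ClipDataModel(String text, String uri) { this.text = text; this.uri = uri; }
}

final class IntentModel {
    final java.util.Map<String, String> extras = new java.util.HashMap<>();
}

final class StructureConverter {
    // Wrap the ClipData-style payload in an Intent-style structure that the
    // non-native second control is assumed to accept.
    static IntentModel toIntent(ClipDataModel clip) {
        IntentModel intent = new IntentModel();
        if (clip.text != null) intent.extras.put("text", clip.text);
        if (clip.uri != null)  intent.extras.put("uri", clip.uri);
        return intent;
    }
}
```

The two branches mirror the two content cases of the first aspect: dragged text carries the text information, and a dragged picture carries only its storage path rather than the image bytes.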
In a second aspect, the present application provides a drag processing method, including: a first device receives a drag operation input by a user, where the drag operation moves the content of a first control of a first application onto a second control or an icon of a second application in a collaboration window, and the collaboration window is a screen-projection window of a second device on the first device; the first device obtains first information corresponding to the first control; the first device encapsulates the first information in a first data structure; and the first device sends the first data structure to the second device.
In a third aspect, the present application provides a drag processing method, including: a second device receives a first data structure sent by a first device, where the first data structure includes first information; the second device determines whether a second control supports the first data structure; if the second control supports the first data structure, the second device sends the first data structure to the second control of a second application, where the second application is installed in the second device; and if the second control does not support the first data structure, the second device converts the first data structure into a second data structure and sends the second data structure to the second control of the second application, where the second data structure is a data structure supported by the second control.
In the second aspect and the third aspect, if the second control of the second application supports the first data structure, the second control is a control supported by the operating system of the second device; the second device can send the first data structure to the second application, and the second control obtains the first text information from the first data structure and loads it. If the second control does not support the first data structure, the second control is a non-native control; the second device converts the first data structure into a second data structure supported by the second control and then sends the second data structure to the second application, so that the second control obtains the first text information from the second data structure and loads it. In this way, the second control of the second application can respond to the drag function.
In a fourth aspect, the present application provides a drag processing method, including: a first device receives a drag operation input by a user, where the drag operation moves the content of a first control of a first application onto a second control or an icon of a second application in a collaboration window, and the collaboration window is a screen-projection window of a second device on the first device; the first device obtains first information corresponding to the first control; the first device encapsulates the first information in a first data structure; the first device sends a first query request to the second device, where the first query request instructs the second device to determine whether the second control supports the first data structure; the first device receives first response information sent by the second device, where the first response information indicates whether the second control supports the first data structure; if the second control supports the first data structure, the first device sends the first data structure to the second control of the second application in the second device; and if the second control does not support the first data structure, the first device converts the first data structure into a second data structure and sends the second data structure to the second control of the second application in the second device, where the second data structure is a data structure that the second control can identify.
In a fifth aspect, the present application provides a drag processing method, including: a second device receives a first query request sent by a first device, where the first query request instructs the second device to determine whether a second control of a second application supports a first data structure, and the first data structure includes first information; the second device determines whether the second control supports the first data structure; if the second control supports the first data structure, the second device sends first response information to the first device and receives the first data structure sent by the first device, where the first response information indicates that the second control can identify the first data structure; or, if the second control does not support the first data structure, the second device sends the first response information to the first device and receives a second data structure sent by the first device, where the first response information indicates that the second control cannot identify the first data structure, and the second data structure is a data structure that the second control can identify.
In the fourth and fifth aspects, after the first device generates the first data structure, it sends a first query request to the second device so that the second device can determine whether the second control of the second application supports the first data structure. The second device then sends the first response information to the first device so that the first device learns whether the second control supports the first data structure. If the first response information carries the first identification information, the second control supports the first data structure, and the first device sends the first data structure to the second device. If the first response information carries the second identification information, the second control does not support the first data structure; the first device then converts the first data structure into the second data structure supported by the second application and sends the second data structure to the second device, so that the second control of the second application can respond to the drag function.
In a sixth aspect, the present application provides an electronic device including one or more processors and a memory configured to store instructions. The processors execute the instructions to cause the electronic device to perform the following operations: obtaining, according to a drag operation that moves the content of a first control of a first application to a second application, first information corresponding to the first control, and encapsulating the first information in a first data structure; determining whether the second application supports the first data structure; if the second application supports the first data structure, sending the first data structure to the second application; and if the second application does not support the first data structure, converting the first data structure into a second data structure and sending the second data structure to the second application, where the second data structure is a data structure supported by the second application.
In one possible implementation manner of the sixth aspect, the processor is further configured to receive a drag operation for moving the content of the first control of the first application to the second application.
In a possible implementation manner of the sixth aspect, the processor is specifically configured to move content of a first control of the first application onto a second control of the second application.
In one possible implementation of the sixth aspect, the processor is specifically configured to move content of a first control of the first application onto an icon of the second application.
In a possible implementation manner of the sixth aspect, the processor is specifically configured to: display one or more floating icons corresponding to the second application; receive an operation in which the user selects a first floating icon from the one or more floating icons, where the first floating icon corresponds to a first interface that has a second control; and determine whether the second control supports the first data structure.
In one possible implementation manner of the sixth aspect, the electronic device has a first application and a second application installed therein.
In a possible implementation manner of the sixth aspect, the processor is specifically configured to move the content of the first control of the first application onto a second control or an icon of the second application in a collaboration window, where the collaboration window is a screen-projection window of an external device on the electronic device, and the second application is installed on the external device.
In a possible implementation manner of the sixth aspect, the processor is specifically configured to: send query information to an external device; receive response information sent by the external device in response to the query information; and determine, according to the response information, whether the second application supports the first data structure.
In a possible implementation manner of the sixth aspect, the content of the first control of the first application includes a first text, and the first information corresponding to the first control includes text information of the first text in the first control; or the content of the first control of the first application comprises a first picture, and the first information corresponding to the first control comprises a storage path of the first picture in the first control.
In one possible implementation manner of the sixth aspect, the first data structure includes ClipData, and the second data structure includes Intent.
In a seventh aspect, embodiments of the present application provide a computer-readable storage medium storing software code that, when read and executed by one or more processors, can perform the method of the first aspect, any possible implementation manner of the first aspect, the second aspect, the third aspect, the fourth aspect, or the fifth aspect.
Drawings
Fig. 1 is a schematic diagram of the desktop of the tablet computer 100;
Fig. 2 is a schematic split-screen view of the tablet computer 100;
Fig. 3 is a schematic diagram of a user dragging the floating icon 107;
Fig. 4 is a schematic diagram of the second control 106 on the interface of the memo 102 loading the picture 104;
Fig. 5 is a schematic diagram of an internal processing flow of the tablet computer 100;
Fig. 6 is a schematic split-screen view of another tablet computer 100;
Fig. 7 is a schematic diagram of an internal processing flow of another tablet computer 100;
Fig. 8 is a flowchart of a drag processing method provided in the present application;
Fig. 9 is a schematic diagram of the desktop of another tablet computer 100;
Fig. 10 is a schematic split-screen view of the tablet computer 200;
Fig. 11 is a schematic diagram of a user dragging the floating icon 207;
Fig. 12 is a schematic diagram of the text box 206 on the interface of the chat 203 loading the text information 204;
Fig. 13 is a schematic diagram of an internal processing flow of the tablet computer 200;
Fig. 14 is a flowchart of another drag processing method provided in the present application;
Fig. 15 is a flowchart of yet another drag processing method provided in the present application;
Fig. 16 is a schematic diagram of the smartphone 400 projecting its screen to the tablet computer 300;
Fig. 17 is a schematic diagram of the tablet computer 300 simultaneously displaying the interface of the memo 301 and the collaboration window 302;
Fig. 18 is a schematic diagram of a user dragging the floating icon 305;
Fig. 19 is a schematic diagram of the text box 402 of the chat 401 loading the text message 303;
Fig. 20 is a schematic diagram of interaction between the tablet computer 300 and the smartphone 400;
Fig. 21 is a flowchart of another drag processing method provided in the present application;
Fig. 22 is a flowchart of yet another drag processing method provided in the present application;
Fig. 23 is a schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following briefly describes, with reference to the examples shown in fig. 1 to 5, a scenario in which a picture 104 in a first control 105 of a gallery 101 is dragged into a second control 106 of a memo 102. The first control 105 and the second control 106 are both native controls supported by the operating system 108 of the tablet computer 100.
Referring to fig. 1, fig. 1 is a schematic diagram of the desktop of the tablet computer 100. In the example shown in fig. 1, the operating system of the tablet computer 100 may be HarmonyOS, Android, or Apple's mobile operating system (iOS). Of course, the operating system of the tablet computer 100 is not limited to those listed above and may be another type of operating system. Multiple applications, such as a gallery 101, a memo 102, and a chat 103, are installed on the tablet computer 100.
In the scenario shown in fig. 1, the user may open the gallery 101, input a split-screen instruction to the tablet computer 100, and open the memo 102, so that the tablet computer 100 displays the interface of the gallery 101 and the interface of the memo 102 at the same time.
Referring to fig. 2, fig. 2 is a schematic split-screen view of the tablet computer 100. After the tablet computer 100 displays the interface of the gallery 101 and the interface of the memo 102 in split screen, the user can long-press the picture 104 in the first control 105 on the interface of the gallery 101 with a finger 500.
Referring to fig. 3, fig. 3 is a schematic diagram of the user dragging the floating icon 107. After the tablet computer 100 detects the long-press operation input by the user's finger 500 on the picture 104 in the first control 105, the tablet computer 100 obtains the storage path of the picture 104 through the first control 105 and encapsulates that storage path in data structure A. Meanwhile, the tablet computer 100 may draw a floating icon 107 from the picture 104, and the floating icon 107 moves along with the user's finger 500. The user's finger 500 may then drag the floating icon 107 onto the second control 106 on the interface of the memo 102.
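The drag-start step can be sketched as follows. The classes here are simplified stand-ins invented for illustration (not SDK types or names from the patent): on long-press, the storage path is wrapped in "data structure A" and a floating icon is created that tracks the finger.

```java
// Illustrative model of the drag-start behaviour in fig. 3.
final class DragStart {
    // Stand-in for "data structure A" that carries the picture's storage path.
    static final class DataStructureA {
        final String storagePath;
        DataStructureA(String p) { storagePath = p; }
    }

    // Stand-in for the floating icon drawn from the picture.
    static final class FloatingIcon {
        int x, y;
        void follow(int fingerX, int fingerY) { x = fingerX; y = fingerY; } // tracks the finger
    }

    // On long-press, encapsulate the picture's storage path in data structure A.
    static DataStructureA onLongPress(String picturePath) {
        return new DataStructureA(picturePath);
    }
}
```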
In one possible implementation, if the operating system of the tablet computer 100 is Android, data structure A may be a ClipData object, and the ClipData object includes the storage path of the picture 104, namely "file://pad/images/jump.jpg".
Illustratively, the ClipData object includes the following fields:
CharSequence mText = "";
String mHtmlText = "";
Intent mIntent = null;
Uri mUri = "file://pad/images/jump.jpg";
As can be seen from the above example, the ClipData object includes the storage path of the picture 104.
Referring to fig. 4, fig. 4 is a schematic diagram illustrating the second control 106 on the interface of the memo 102 loading the picture 104. After the user drags the hover icon 107 over the second control 106 on the interface of the memo 102 using the finger 500, the user's finger 500 leaves the tablet 100. When the tablet computer 100 monitors that the position where the finger 500 of the user leaves is the second control 106 on the interface of the memo 102, it indicates that the user wishes to drag the picture 104 in the first control 105 on the interface of the gallery 101 to the second control 106 on the interface of the memo 102, and the tablet computer 100 deletes the floating icon 107, and then sends the data structure a to the memo 102.
Since the second control 106 on the interface of the memo 102 is a native control supported by the operating system, the second control 106 can automatically parse the storage path of the picture 104 in the data structure a. The memo 102 then takes the picture 104 according to the storage path of the picture 104 and displays the picture 104 within the second control 106 on the interface of the memo 102.
It should be noted that, in the examples shown in fig. 2 to 4, the first control 105 and the second control 106 are both controls that exist objectively but cannot be seen by the user, so the first control 105 and the second control 106 are both represented by dashed boxes.
Referring to fig. 5, fig. 5 is a schematic diagram of the internal processing flow of the tablet computer 100 in the examples shown in fig. 1 to 4.
In the example shown in fig. 5, which may be combined with the examples shown in fig. 1 to 4, after the user drags the picture 104 in the first control 105 on the interface of the gallery 101 to the top of the second control 106 on the interface of the memo 102 and releases his/her hand, the operating system 108 of the tablet 100 obtains the storage path of the picture 104 through the first control 105 of the gallery 101. Then, the operating system 108 of the tablet 100 encapsulates the storage path of the picture 104 in the data structure a, and sends the data structure a to the memo 102. Since the second control 106 on the interface of the memo 102 is a native control supported by the operating system 108 of the tablet 100, the second control 106 on the interface of the memo 102 can automatically parse the storage path of the picture 104 in the data structure a, and then the memo 102 acquires the picture 104 in the memory 109 through the storage path of the picture 104, and displays the picture 104 in the second control 106 on the interface of the memo 102. Thus, the second control 106 in the memo 102 may respond to the drag function.
Referring to fig. 1, 6 and 7, fig. 6 is a schematic view of a split screen of another tablet computer 100, and fig. 7 is a schematic view of an internal processing flow of another tablet computer 100. The examples shown in fig. 1, 6 and 7 primarily describe a scenario in which a picture 104 within a first control 105 in gallery 101 is dragged to a third control 110 of chat 103. The first control 105 is a native control supported by the operating system 108 of the tablet pc 100, and the third control 110 is not a native control supported by the operating system 108 of the tablet pc 100.
In the example shown in fig. 1, 6 and 7, the user needs to input a split screen instruction to the tablet computer 100, so that the tablet computer 100 displays the interface of the gallery 101 and the interface of the chat 103 at the same time. Then, after the user drags the picture 104 in the first control 105 on the interface of the gallery 101 to the top of the third control 110 on the interface of the chat 103 and releases his/her hand, the operating system 108 of the tablet pc 100 acquires the storage path of the picture 104 through the first control 105. Then, the operating system 108 of the tablet 100 encapsulates the storage path of the picture 104 in the data structure a, and sends the data structure a to the chat 103. Since the third control 110 on the interface of the chat 103 is not a native control supported by the operating system 108 of the tablet computer 100, the third control 110 on the interface of the chat 103 cannot automatically parse the storage path of the picture 104 in the data structure a, and then the third control 110 on the interface of the chat 103 cannot display the picture 104. Thus, the third control 110 in chat 103 is unable to respond to the drag function.
As can be appreciated from the examples shown in fig. 1-7, since the second control 106 on the interface of the memo 102 is a native control supported by the operating system 108 of the tablet 100, the second control 106 in the memo 102 may respond to the drag function. Since the third control 110 on the interface of the chat 103 is not a native control supported by the operating system 108 of the tablet computer 100, the third control 110 in the chat 103 cannot respond to the drag function. Thus, the non-native controls of the application software cannot respond to the drag function.
Referring to fig. 8, fig. 8 is a flowchart of a drag processing method provided in the present application. The drag processing method shown in fig. 8 may make the non-native control of the application software respond to the drag function, and the method shown in fig. 8 includes steps S101 to S106.
S101, the electronic equipment receives a drag operation input by a user.
The electronic device may be, for example, a smartphone or a tablet computer, and a first application and a second application are installed in the electronic device.
In S101, the drag operation refers to moving the content of the first control of the first application onto the second control of the second application. The content of the first control of the first application includes first text, and the first information corresponding to the first control includes textual information of the first text within the first control.
Specifically, the first control is a first text box, and the second control is a second text box. For example, when the operating system of the electronic device is an android system, both the first control and the second control are TextView.
S102, the electronic equipment acquires first text information in the first control.
S103, the electronic equipment encapsulates the first text information in the first data structure body.
For example, when the operating system of the electronic device is an android system, then the first data structure may be a clipdata object.
S104, the electronic equipment judges whether the second control of the second application supports the first data structure body. If yes, go to step S105; otherwise, step S106 is executed.
And S105, the electronic equipment sends the first data structure body to the second application.
S106, the electronic device converts the first data structure into a second data structure and sends the second data structure to a second application.
And the second data structure is a data structure supported by a second control of the second application.
For example, when the operating system of the electronic device is an android system, then the second data structure may be an intent object.
In the embodiment shown in fig. 8, if the second control of the second application supports the first data structure, which indicates that the second control of the second application is a native control supported by an operating system of the electronic device, the electronic device may send the first data structure to the second application, so that the second control of the second application may obtain the first text information in the first data structure, and the second control may load the first text information. If the second control of the second application does not support the first data structure, which indicates that the second control of the second application is a non-native control that is not supported by the operating system of the electronic device, the electronic device may convert the first data structure into a second data structure that is supported by the second control, and then send the second data structure to the second application, so that the second control of the second application may obtain the first text information in the second data structure, and the second control may load the first text information, so that the second control of the second application may respond to the drag function.
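The branch described above can be sketched as follows. DataStructureA and DataStructureB are hypothetical plain-Java stand-ins for the platform structures (on Android they would correspond to a clipdata object and an intent object), and the support check is modeled as a flag; this is an illustrative model, not the real framework code.

```java
// Minimal sketch of steps S102-S106 for dragged text. DataStructureA and
// DataStructureB are hypothetical stand-ins for the platform structures
// (clipdata-like and intent-like); the support check is modeled as a flag.
public class DragPipeline {
    public static class DataStructureA {          // carries the first text information
        public final String text;
        public DataStructureA(String text) { this.text = text; }
    }

    public static class DataStructureB {          // structure a non-native control supports
        public final String extraText;
        public DataStructureB(String text) { this.extraText = text; }
    }

    // S106: convert structure A into structure B, keeping the payload.
    public static DataStructureB convert(DataStructureA a) {
        return new DataStructureB(a.text);
    }

    // S102-S106 end to end: package the text, then hand A as-is to a native
    // control, or the converted B to a non-native one.
    public static Object deliver(String draggedText, boolean controlSupportsA) {
        DataStructureA a = new DataStructureA(draggedText);   // S102 + S103
        if (controlSupportsA) {                               // S104
            return a;                                         // S105
        }
        return convert(a);                                    // S106
    }

    public static void main(String[] args) {
        // A non-native second control receives the converted structure B.
        System.out.println(deliver("member account number: 12345", false)
                instanceof DataStructureB);
    }
}
```

The second application never sees a structure it cannot parse: the conversion happens before the payload is sent, which is what lets the non-native control respond to the drag function.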
In an implementable embodiment shown in fig. 8, if the drag operation in S101 refers to moving the content of the first control of the first application onto the icon of the second application, then after S101, before S102, the method shown in fig. 8 may further comprise the steps of: the electronic equipment displays one or more floating icons corresponding to the second application, and receives an operation that a user selects a first floating icon from the one or more floating icons, wherein the first floating icon corresponds to a first interface with a second control, and the first interface is an interface of the second application.
After the user moves the content of the first control of the first application to the icon of the second application, the electronic device displays one or more floating icons corresponding to the second application, and the one or more floating icons respectively correspond to one or more interfaces with the second control of the second application. After the electronic device receives an operation of selecting the first floating icon by the user, it indicates that the user wants to move the content of the first control of the first application to the second control on the first interface corresponding to the first floating icon, and then the electronic device may continue to execute steps S102 to S106. Therefore, after the user moves the content of the first control of the first application onto the icon of the second application, the second application can respond to the drag function regardless of whether the second control of the second application supports the first data structure.
When the operating system of the electronic device is an android system, the operating system of the electronic device queries whether an interface with a second control exists in the second application by using a Package Manager Service (PMS).
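The interface query above can be modeled as a lookup from an application to the interfaces of that application that contain a suitable second control, with one floating icon shown per interface. The registry and names below are illustrative; on a real Android device this lookup would go through the package manager service rather than a local map.

```java
import java.util.List;
import java.util.Map;

// Hypothetical model of the interface query: given the second application,
// return its interfaces that contain a second control able to receive the
// drop. The registry here is illustrative; the real lookup on Android goes
// through the package manager service.
public class InterfaceLookup {
    // App name -> interfaces of that app containing a suitable second control.
    private static final Map<String, List<String>> REGISTRY = Map.of(
            "memo", List.of("NoteEditorInterface", "TodoEditorInterface"),
            "chat", List.of("ConversationInterface"));

    public static List<String> interfacesWithSecondControl(String app) {
        return REGISTRY.getOrDefault(app, List.of());
    }

    public static void main(String[] args) {
        // Two candidate interfaces -> the device would show two floating icons.
        System.out.println(interfacesWithSecondControl("memo").size());
    }
}
```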
The drag processing method shown in fig. 8 will be described below by way of examples shown in fig. 9 to 13.
Referring to fig. 9, fig. 9 is a schematic diagram of a desktop of another tablet computer 200. In the example shown in fig. 9, the operating system 208 of the tablet computer 200 may be a Hongmeng system (HarmonyOS), an Android system (Android OS), or an Apple mobile operating system (iOS). Of course, the operating system 208 of the tablet computer 200 is not limited to the above-mentioned operating systems, and may be another type of operating system. The tablet computer 200 is internally provided with a plurality of application software such as a gallery 201, a memo 202 and a chat 203.
In the scenario shown in fig. 9, the user may open the memo 202, input a screen splitting instruction to the tablet computer 200, and open the chat 203, so that the tablet computer 200 displays the interface of the memo 202 and the interface of the chat 203 at the same time.
Referring to fig. 10, fig. 10 is a schematic view of a split screen of the tablet pc 200. After the tablet computer 200 displays the interface of the memo 202 and the interface of the chat 203 in a split screen, the user can use the finger 500 to long-press the text information 204 in the text box 205 on the interface of the memo 202. The specific content of the text information 204 is "member account number: 12345".
Referring to fig. 11, fig. 11 is a schematic diagram illustrating a user dragging the floating icon 207. After the tablet computer 200 monitors the long-press operation of the finger 500 of the user on the text information 204 in the text box 205, the tablet computer 200 acquires the text information 204 in the text box 205 and encapsulates the text information 204 in the data structure a. Meanwhile, the tablet computer 200 draws the floating icon 207 according to the text information 204, and the floating icon 207 can move along with the finger 500 of the user. The user's finger 500 may then drag the floating icon 207 over the text box 206 on the interface of the chat 203, and then the user's finger 500 leaves the floating icon 207.
Referring to fig. 12, fig. 12 is a schematic diagram illustrating a text box 206 on an interface of a chat 203 loading a text message 204. After the user's finger 500 leaves the hover icon 207, the tablet computer 200 determines whether the text box 206 of the chat 203 supports data structure a. Since the text box 206 of the chat 203 is a non-native control, the text box 206 of the chat 203 does not support the data structure a, and then the tablet computer 200 may convert the data structure a into the data structure B supported by the text box 206, and send the data structure B to the chat 203, so that the text box 206 of the chat 203 may obtain the text information 204 in the data structure B, and the text box 206 may load the text information 204, so that the text box 206 of the chat 203 may respond to the drag function.
The following illustrates how the electronic device converts the data structure a into the data structure B. Assume that the data structure a is a clipdata object, the specific content of the text information 204 is "member account number: 12345", and the data structure B is an intent object.
Illustratively, the clipdata object includes the following:
CharSequence mText=“member account number: 12345”;
String mHtmlText=“”;
Intent mIntent=null;
Uri mUri=“”;
The electronic device acquires the text information 204 "member account number: 12345" in the clipdata object, and encapsulates the text information 204 "member account number: 12345" in an intent object, where the intent object includes the following contents:
Intent:
Action=Intent.ACTION_SEND;
Type=“text/plain”;
EXTRA_TEXT=“member account number: 12345”;
The clipdata object can only be recognized by a native control supported by the operating system 208 of the tablet computer 200, and the text box 206 of the chat 203 does not belong to the native controls supported by the operating system 208 of the tablet computer 200, so the clipdata object is not supported by the text box 206 of the chat 203. Based on this, the tablet computer 200 may convert the clipdata object into an intent object supported by the text box 206 of the chat 203, and then send the intent object to the chat 203, so that the text box 206 of the chat 203 may acquire the text information 204 "member account number: 12345" in the intent object, and the text box 206 loads the text information 204, thereby enabling the text box 206 of the chat 203 to respond to the drag function.
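The field mapping in the two listings above can be sketched with plain-Java stand-ins: ClipDataLike mirrors the clipdata fields (mText, mHtmlText, mUri) and IntentLike mirrors the intent fields (action, type, extras). These are illustrative models, not the real android.content classes.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative stand-ins for the two listings above; not the real
// android.content.ClipData / android.content.Intent classes.
public class TextDragConversion {
    public static class ClipDataLike {
        public CharSequence mText = "";
        public String mHtmlText = "";
        public String mUri = "";
    }

    public static class IntentLike {
        public String action;
        public String type;
        public final Map<String, String> extras = new HashMap<>();
    }

    // Copy the dragged text out of the clipdata-like structure into an
    // intent-like structure, matching the field mapping shown in the text.
    public static IntentLike toIntent(ClipDataLike clip) {
        IntentLike intent = new IntentLike();
        intent.action = "Intent.ACTION_SEND";
        intent.type = "text/plain";
        intent.extras.put("EXTRA_TEXT", clip.mText.toString());
        return intent;
    }

    public static void main(String[] args) {
        ClipDataLike clip = new ClipDataLike();
        clip.mText = "member account number: 12345";
        System.out.println(toIntent(clip).extras.get("EXTRA_TEXT"));
    }
}
```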
Referring to fig. 13, fig. 13 is a schematic diagram of the internal processing flow of the tablet pc 200 in the examples shown in fig. 9 to 12.
In the example shown in fig. 13, which can be combined with the examples shown in fig. 9 to 12, after the user drags the text information 204 in the text box 205 on the interface of the memo 202 over the text box 206 on the interface of the chat 203 and releases his/her hand, the operating system 208 of the tablet computer 200 determines whether the text box 206 of the chat 203 supports the data structure a. Since the text box 206 of the chat 203 is a non-native control, the text box 206 of the chat 203 does not support the data structure a, and then the tablet computer 200 may convert the data structure a into the data structure B supported by the text box 206, and send the data structure B to the chat 203, so that the text box 206 of the chat 203 may obtain the text information 204 in the data structure B, and the text box 206 may load the text information 204, so that the text box 206 of the chat 203 may respond to the drag function.
Referring to fig. 14, fig. 14 is a flowchart of another drag processing method provided in the present application. The drag processing method shown in fig. 14 may make the non-native control of the application software respond to the drag function, and the method shown in fig. 14 includes steps S201 to S206.
S201, the electronic equipment receives a dragging operation input by a user.
The electronic device may be, for example, a smartphone or a tablet computer, and a first application and a second application are installed in the electronic device.
In S201, the drag operation refers to moving the content of the first control of the first application onto the second control of the second application. The content of a first control of the first application comprises a first picture, and first information corresponding to the first control comprises a storage path of the first picture in the first control.
Specifically, the first control is a first image frame, and the second control is a second image frame. For example, when the operating system of the electronic device is an android system, both the first control and the second control are ImageView.
S202, the electronic equipment acquires a storage path of a first picture in the first control.
S203, the electronic device packages the storage path of the first picture in the first data structure body.
For example, when the operating system of the electronic device is an android system, then the first data structure may be a clipdata object.
S204, the electronic equipment judges whether the second control of the second application supports the first data structure body. If yes, go to step S205; otherwise, step S206 is executed.
S205, the electronic equipment sends the first data structure body to the second application.
S206, the electronic device converts the first data structure into a second data structure and sends the second data structure to a second application.
And the second data structure is a data structure which can be supported by the second control of the second application.
In the embodiment shown in fig. 14, if the second control of the second application supports the first data structure, which indicates that the second control of the second application is a native control supported by an operating system of the electronic device, the electronic device may send the first data structure to the second application, so that the second control of the second application may obtain a storage path of the first picture in the first data structure, and the second control may load the first picture through the storage path of the first picture. If the second control of the second application does not support the first data structure, which indicates that the second control of the second application is a non-native control that is not supported by the operating system of the electronic device, the electronic device may convert the first data structure into a second data structure that is supported by the second control, and then send the second data structure to the second application, so that the second control of the second application may obtain a storage path of the first picture in the second data structure, and the second control may load the first picture through the storage path of the first picture, so that the second control of the second application can respond to a drag function.
For example, when the operating system of the electronic device is an android system, then the second data structure may be an intent object.
The following illustrates how the electronic device converts the first data structure into the second data structure. Assume that the first data structure is a clipdata object, the storage path of the first picture is "file://pad/images/jump.jpg", and the second data structure is an intent object.
Illustratively, the clipdata object includes the following:
CharSequence mText=“”;
String mHtmlText=“”;
Intent mIntent=null;
Uri mUri=“file://pad/images/jump.jpg”;
The electronic device acquires the storage path "file://pad/images/jump.jpg" of the first picture in the clipdata object, and encapsulates the storage path "file://pad/images/jump.jpg" of the first picture in an intent object, where the intent object includes the following contents:
Intent:
Action=Intent.ACTION_SEND;
Type=“image/*”;
EXTRA_STREAM=“file://pad/images/jump.jpg”;
The clipdata object can only be recognized by a native control supported by the operating system of the electronic device, and if the second control of the second application does not belong to the native controls supported by the operating system of the electronic device, the clipdata object is not supported by the second control. Based on this, the electronic device may convert the clipdata object into an intent object supported by the second control, and then send the intent object to the second application, so that the second control of the second application may acquire the storage path "file://pad/images/jump.jpg" of the first picture in the intent object, and the second control may load the first picture through the storage path "file://pad/images/jump.jpg", so that the second control of the second application can respond to the drag function.
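The picture case differs from the text case only in which fields carry the payload: the storage path travels in mUri on the clipdata side and in EXTRA_STREAM on the intent side, with the MIME type set to image/*. The sketch below uses the same illustrative stand-in classes as before, not the real android.content classes.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative model of the picture conversion shown in the listings above:
// mUri (clipdata side) maps to EXTRA_STREAM (intent side), type is image/*.
public class ImageDragConversion {
    public static class ClipDataLike {
        public String mUri = "";              // storage path of the first picture
    }

    public static class IntentLike {
        public String action;
        public String type;
        public final Map<String, String> extras = new HashMap<>();
    }

    public static IntentLike toIntent(ClipDataLike clip) {
        IntentLike intent = new IntentLike();
        intent.action = "Intent.ACTION_SEND";
        intent.type = "image/*";
        intent.extras.put("EXTRA_STREAM", clip.mUri);
        return intent;
    }

    public static void main(String[] args) {
        ClipDataLike clip = new ClipDataLike();
        clip.mUri = "file://pad/images/jump.jpg";
        System.out.println(toIntent(clip).extras.get("EXTRA_STREAM"));
    }
}
```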
Referring to fig. 15, fig. 15 is a flowchart of another drag processing method provided in the present application. The drag processing method shown in fig. 15 may make the non-native control of the application software respond to the drag function, and the method shown in fig. 15 includes steps S301 to S307.
S301, the first device receives a drag operation input by a user.
The first device is internally provided with a first application, the second device is internally provided with a second application, and a first communication connection is established between the first device and the second device.
Before S301, the second device sends a screen-projection instruction to the first device. After the first device agrees to the screen-projection instruction, the second device sends the display desktop of the second device to the first device. The first device generates a collaboration window and displays the display desktop of the second device in the collaboration window.
In S301, the drag operation refers to moving the content of the first control of the first application onto a second control or icon of a second application within the collaborative window. And the collaboration window is a screen projection window of the second equipment on the first equipment.
Specifically, the first control is a first text box, and the second control is a second text box.
For example, when the operating system of the electronic device is an android system, both the first control and the second control are TextView.
S302, the first device obtains first text information in the first control.
S303, the first device packages the first text information in the first data structure body.
Wherein the first data structure includes first text information.
For example, when the operating system of the electronic device is an android system, then the first data structure may be a clipdata object.
S304, the first device sends the first data structure body to the second device.
S305, the second device judges whether the second control of the second application supports the first data structure body. If yes, go to step S306; otherwise, step S307 is executed.
S306, the second device sends the first data structure body to the second application.
And S307, the second device converts the first data structure body into a second data structure body and sends the second data structure body to the second application.
Wherein the second data structure is a data structure supported by the second application.
For example, when the operating system of the electronic device is an android system, then the second data structure may be an intent object.
In the embodiment shown in fig. 15, if the second control of the second application supports the first data structure, which indicates that the second control of the second application is a control supported by an operating system of the second device, the second device may send the first data structure to the second application, so that the second control of the second application may obtain the first text information in the first data structure, and the second control may load the first text information. If the second control of the second application does not support the first data structure, which indicates that the second control of the second application is a non-native control, the second device may convert the first data structure into a second data structure supported by the second control, and then send the second data structure to the second application, so that the second control of the second application may obtain the first text information in the second data structure, and the second control may load the first text information, so that the second control of the second application may respond to the drag function.
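The division of work in steps S301 to S307 can be sketched as follows: the first device only packages and forwards structure A, while the support check and the conversion happen on the second device. The "wire" encoding (a tagged string) is an illustrative stand-in for the first communication connection, not a real protocol.

```java
// Sketch of the cross-device split in S301-S307. The tagged-string wire
// format is illustrative; structure A travels unchanged to the second device,
// which decides whether to convert it into structure B.
public class CrossDeviceDrag {
    // First device, S302-S304: package the text and put it on the connection.
    public static String firstDeviceSend(String draggedText) {
        return "A|" + draggedText;
    }

    // Second device, S305-S307: hand structure A to a native control as-is,
    // or convert it to structure B for a non-native one.
    public static String secondDeviceDeliver(String wire, boolean controlSupportsA) {
        String text = wire.substring(2);          // unwrap structure A
        return (controlSupportsA ? "A|" : "B|") + text;
    }

    public static void main(String[] args) {
        String wire = firstDeviceSend("member account number: 12345");
        System.out.println(secondDeviceDeliver(wire, false)); // B|member account number: 12345
    }
}
```

Keeping the conversion on the receiving side means the first device does not need to know anything about the controls of the second device's applications.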
The drag processing method shown in fig. 15 will be described below by way of examples shown in fig. 16 to 20.
Referring to fig. 16, fig. 16 is a schematic diagram illustrating the smart phone 400 being projected to the tablet pc 300. The tablet computer 300 is internally provided with application software such as a memo 301, the smart phone 400 is internally provided with application software such as a chat 401, and a wireless communication connection is established between the tablet computer 300 and the smart phone 400.
In the scenario shown in fig. 16, the smartphone 400 sends a screen-projection instruction to the tablet pc 300 through a wireless communication connection. After the tablet pc 300 agrees to the screen projection instruction, the smart phone 400 sends the display desktop of the smart phone 400 to the tablet pc 300. The tablet pc 300 generates the collaboration window 302 and displays the display desktop of the smartphone 400 in the collaboration window 302. The user may then open a memo 301 on tablet 300 and open a chat 401 on smartphone 400.
Referring to fig. 17, fig. 17 is a schematic view of the tablet pc 300 simultaneously displaying the interface of the memo 301 and the collaboration window 302. After the tablet computer 300 simultaneously displays the interface of the memo 301 and the collaboration window 302, the user can use the finger 500 to long-press the text information 303 in the text box 304 on the interface of the memo 301. The specific content of the text information 303 is "member account number: 12345".
Referring to fig. 18, fig. 18 is a schematic diagram illustrating a user dragging and dropping the floating icon 305. After the tablet pc 300 monitors the long press operation input by the finger 500 of the user on the text message 303 of the text box 304, the tablet pc 300 acquires the text message 303 in the text box 304 and encapsulates the text message 303 in the data structure a. Meanwhile, the tablet pc 300 also draws the floating icon 305 according to the text message 303, and the floating icon 305 can move along with the finger 500 of the user. The user's finger 500 may then drag the hover icon 305 over the text box 402 of the chat 401 in the collaboration window 302. The user's finger 500 then leaves the floating icon 305.
Referring to fig. 19, fig. 19 is a schematic diagram illustrating a text box 402 of a chat 401 loading text information 303. After the user's finger 500 leaves the floating icon 305, the tablet pc 300 sends the data structure a to the smartphone 400. After the smartphone 400 receives the data structure a sent by the tablet pc 300, the smartphone 400 determines whether the text box 402 of the chat 401 supports the data structure a. Since the text box 402 of the chat 401 is a non-native control, the text box 402 of the chat 401 does not support the data structure a, the smartphone 400 can convert the data structure a into the data structure B supported by the text box 402, and the smartphone 400 sends the data structure B to the chat 401, so that the text box 402 of the chat 401 can acquire the text information 303 in the data structure B, and the text box 402 loads the text information 303, so that the text box 402 of the chat 401 can respond to a drag function.
Referring to fig. 20, fig. 20 is a schematic diagram illustrating interaction between the tablet pc 300 and the smart phone 400, and fig. 20 is an internal processing flow of the tablet pc 300 and an internal processing flow of the smart phone 400 shown in fig. 16 to 19.
In the example shown in fig. 20, which may be combined with the examples shown in fig. 16 to fig. 19, after the user drags the text information 303 in the text box 304 of the memo 301 to the top of the text box 402 of the chat 401 in the collaboration window 302 and releases his hand, the operating system 306 of the tablet pc 300 acquires the text information 303 in the text box 304 of the memo 301, encapsulates the text information 303 in the data structure a, and sends the data structure a to the communication module 307. The communication module 307 of the tablet pc 300 sends the data structure a to the communication module 404 of the smart phone 400, and the communication module 404 of the smart phone 400 sends the data structure a to the operating system 403 of the smart phone 400. The operating system 403 of the smartphone 400 determines whether the text box 402 of the chat 401 supports data structure a. Since the text box 402 of the chat 401 is a non-native control, the text box 402 of the chat 401 does not support the data structure a, the operating system 403 of the smartphone 400 converts the data structure a into the data structure B supported by the text box 402, and then sends the data structure B to the chat 401, so that the text box 402 of the chat 401 can acquire the text information 303 in the data structure B, and the text box 402 loads the text information 303, so that the text box 402 of the chat 401 can respond to the drag function.
Referring to fig. 21, fig. 21 is a flowchart of another drag processing method provided by the present application. The drag processing method shown in fig. 21 may make the non-native control of the application software respond to the drag function, and the method shown in fig. 21 includes steps S401 to S407.
S401, the first device receives a dragging operation input by a user.
The first device is internally provided with a first application, the second device is internally provided with a second application, and a first communication connection is established between the first device and the second device.
Before S401, the second device sends a screen-projection instruction to the first device. After the first device agrees to the screen-projection instruction, the second device sends the display desktop of the second device to the first device. The first device generates a collaboration window and displays the display desktop of the second device in the collaboration window.
In S401, the drag operation refers to moving the content of the first control of the first application onto a second control or icon of a second application within the collaborative window. And the collaboration window is a screen projection window of the second equipment on the first equipment.
Specifically, the first control is a first image frame, and the second control is a second image frame. For example, when the operating system of the electronic device is an android system, both the first control and the second control are ImageView.
S402, the first device obtains a storage path of a first picture in the first control.
S403, the first device packages the storage path of the first picture in the first data structure body.
The first data structure body comprises a storage path of the first picture.
For example, when the operating system of the electronic device is the Android system, the first data structure body may be a ClipData object.
S404, the first device sends the first data structure body to the second device.
S405, the second device determines whether the second control of the second application supports the first data structure body. If yes, step S406 is executed; otherwise, step S407 is executed.
S406, the second device sends the first data structure body to the second application.
S407, the second device converts the first data structure body into a second data structure body and sends the second data structure body to the second application.
Wherein the second data structure is a data structure supported by the second application.
For example, when the operating system of the electronic device is the Android system, the second data structure body may be an Intent object.
In the embodiment shown in fig. 21, if the second control of the second application supports the first data structure body, the second control is a control supported by the operating system of the second device. The second device then sends the first data structure body to the second application, and the second control obtains the storage path of the first picture from the first data structure body and loads the first picture according to that path. If the second control of the second application does not support the first data structure body, the second control is a non-native control. The second device then converts the first data structure body into a second data structure body supported by the second control and sends the second data structure body to the second application, so that the second control can obtain the storage path of the first picture from the second data structure body and load the first picture according to that path, enabling the second control of the second application to respond to the drag function.
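Steps S401 to S407 can be summarized as a minimal sketch. The classes below are hypothetical stand-ins (FirstStructure and SecondStructure model ClipData-like and Intent-like containers; they are not real Android types), intended only to show where the conversion happens on the second device.

```python
# Hypothetical sketch of steps S401-S407: the first device encapsulates the
# picture's storage path in a first data structure, and the second device
# converts it only when the second control does not support that structure.

class FirstStructure:                          # stand-in for a ClipData-style object
    def __init__(self, path):
        self.path = path

class SecondStructure:                         # stand-in for an Intent-style object
    def __init__(self, path):
        self.path = path

class SecondDevice:
    def __init__(self, control_supports_first):
        self.control_supports_first = control_supports_first
        self.received = None

    def on_receive(self, first):               # S405: check support
        if self.control_supports_first:        # S406: forward unchanged
            self.received = first
        else:                                  # S407: convert, then forward
            self.received = SecondStructure(first.path)

def drag_picture(storage_path, second_device):
    first = FirstStructure(storage_path)       # S402-S403: package the path
    second_device.on_receive(first)            # S404: send to the second device
    return second_device.received

dev = SecondDevice(control_supports_first=False)
result = drag_picture("/storage/pictures/img001.jpg", dev)
```

Either way the second application ends up holding a structure it supports, from which the second control reads the storage path and loads the picture.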
Referring to fig. 22, fig. 22 is a flowchart of another drag processing method provided by this application. The method shown in fig. 22 enables a non-native control of application software to respond to the drag function, and includes steps S501 to S507.
S501, the first device receives a dragging operation input by a user.
The first device is internally provided with a first application, the second device is internally provided with a second application, and a first communication connection is established between the first device and the second device.
Before S501, the second device sends a screen-projection instruction to the first device. After the first device accepts the screen-projection instruction, the second device sends its display desktop to the first device, and the first device generates a collaboration window and displays the display desktop of the second device in that window.
In S501, the drag operation refers to moving the content of the first control of the first application onto a second control or icon of the second application within the collaboration window, where the collaboration window is a screen-projection window of the second device on the first device.
Specifically, the first control is a first text box and the second control is a second text box. For example, when the operating system of the electronic device is the Android system, both the first control and the second control are TextView controls.
S502, the first device obtains first text information in the first control.
S503, the first device packages the first text information in the first data structure body.
Wherein the first data structure includes first text information.
For example, when the operating system of the electronic device is the Android system, the first data structure body may be a ClipData object.
S504, the first device sends a first query request to the second device.
The first query request instructs the second device to determine whether the second control of the second application supports the first data structure body.
S505, the second device determines whether the second control of the second application supports the first data structure body.
S506, the second device sends the first response information to the first device.
The first response information is generated by the second device according to the determination result in S505. If the second control of the second application supports the first data structure body, the first response information includes first identification information, which indicates that the second control supports the first data structure body. If the second control of the second application does not support the first data structure body, the first response information includes second identification information, which indicates that the second control does not support the first data structure body.
S507, the first device sends the first data structure body or the second data structure body to the second device according to the content of the first response information.
When the content of the first response information is the first identification information, the second control of the second application supports the first data structure body, and the first device sends the first data structure body to the second device. When the content of the first response information is the second identification information, the second control of the second application does not support the first data structure body, so the first device converts the first data structure body into the second data structure body and sends the second data structure body to the second device. The second data structure body is a data structure supported by the second application.
For example, when the operating system of the electronic device is the Android system, the second data structure body may be an Intent object.
In the embodiment shown in fig. 22, after the first device generates the first data structure body, it sends a first query request to the second device, and the second device determines whether the second control of the second application supports the first data structure body. The second device then sends the first response information to the first device, so that the first device knows whether the second control supports the first data structure body. When the content of the first response information is the first identification information, the first device sends the first data structure body to the second device. When the content of the first response information is the second identification information, the first device converts the first data structure body into the second data structure body supported by the second application and sends the second data structure body to the second device, so that the second control of the second application can respond to the drag function.
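The difference from fig. 21 is that here the sender performs the conversion after a query/response exchange. The sketch below models that protocol; the identification values, class names, and method names are all illustrative stand-ins, not real APIs.

```python
# Sketch of the fig. 22 variant: the first device queries the second device
# (S504-S506) and performs the conversion itself (S507) when the response
# carries the "not supported" identification information.

SUPPORTED, NOT_SUPPORTED = "first_id", "second_id"   # identification info

class FirstStructure:                      # stand-in for a ClipData-style object
    def __init__(self, text):
        self.text = text

class SecondStructure:                     # stand-in for an Intent-style object
    def __init__(self, text):
        self.text = text

class SecondDevice:
    def __init__(self, supports_first):
        self.supports_first = supports_first
        self.received = None

    def answer_query(self):                # S505-S506: build the response
        return SUPPORTED if self.supports_first else NOT_SUPPORTED

    def receive(self, payload):
        self.received = payload

def send_text(text, second_device):
    first = FirstStructure(text)           # S502-S503: package the text
    if second_device.answer_query() == SUPPORTED:
        second_device.receive(first)       # S507: send first structure
    else:                                  # S507: convert on the sender side
        second_device.receive(SecondStructure(first.text))
    return second_device.received

dev = SecondDevice(supports_first=False)
payload = send_text("first text information", dev)
```

Moving the conversion to the sender spares the second device from transcoding a payload its control cannot parse, at the cost of one extra round trip for the query.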
Referring to fig. 23, fig. 23 is a schematic diagram of an electronic device according to an embodiment of the present application. The electronic device shown in fig. 23 includes a processor 61 and a memory 62.
In the embodiment shown in fig. 23, the processor 61 is configured to execute instructions stored in the memory 62 to cause the electronic device to perform the following operations: acquiring, according to a drag operation of moving the content of a first control of a first application to a second application, first information corresponding to the first control, and encapsulating the first information in a first data structure body; determining whether the second application supports the first data structure body; if the second application supports the first data structure body, sending the first data structure body to the second application; and if the second application does not support the first data structure body, converting the first data structure body into a second data structure body and sending the second data structure body to the second application, where the second data structure body is a data structure supported by the second application.
The processor 61 is one or more CPUs. Optionally, the CPU is a single-core CPU or a multi-core CPU.
The memory 62 includes, but is not limited to, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), flash memory, optical memory, or the like. The memory 62 stores the code of the operating system.
Optionally, the electronic device further includes a bus 63. The processor 61 and the memory 62 may be connected to each other through the bus 63, or in other manners.
The embodiments in this specification are described in a progressive manner; for the same or similar parts among the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the others. In particular, the system embodiment is described briefly because it is substantially similar to the method embodiment; for relevant details, refer to the description of the method embodiment.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from its scope. Provided that such modifications and variations fall within the scope of the claims, the present application is intended to encompass them as well.

Claims (21)

1. A drag processing method, the method comprising:
according to the dragging operation of moving the content of a first control of a first application to a second application, electronic equipment acquires first information corresponding to the first control, and the first information is packaged in a first data structure body;
the electronic equipment judges whether the second application supports the first data structure body;
if the second application supports the first data structure, the electronic equipment sends the first data structure to the second application;
if the second application does not support the first data structure, the electronic device converts the first data structure into a second data structure, and sends the second data structure to the second application, where the second data structure is a data structure supported by the second application.
2. The drag processing method according to claim 1, wherein before the electronic device acquires the first information corresponding to the first control, the method further comprises:
the electronic equipment receives the dragging operation of moving the content of the first control of the first application to the second application.
3. The drag processing method according to claim 2, wherein the drag operation of moving the content of the first control of the first application to the second application includes:
and moving the content of the first control of the first application to the second control of the second application.
4. The drag processing method according to claim 2, wherein the drag operation of moving the content of the first control of the first application to the second application includes:
and moving the content of the first control of the first application to the icon of the second application.
5. The drag processing method according to claim 4, characterized in that: after the electronic device receives a drag operation that moves content of a first control of a first application onto an icon of a second application, the method further comprises:
the electronic equipment displays one or more floating icons corresponding to the second application;
the electronic equipment receives an operation of selecting a first floating icon from the one or more floating icons by a user, wherein the first floating icon corresponds to a first interface with a second control;
the electronic device determining whether the second application supports the first data structure includes:
and the electronic equipment judges whether the second control supports the first data structure body.
6. The drag processing method according to any one of claims 1 to 5, wherein the first application and the second application are installed in the electronic device.
7. The drag processing method according to claim 2, wherein the drag operation of moving the content of the first control of the first application to the second application includes:
and moving the content of the first control of the first application to a second control or icon of a second application in a collaborative window, wherein the collaborative window is a screen projection window of an external device on the electronic device, and the second application is installed on the external device.
8. The drag processing method according to claim 7, wherein the determining, by the electronic device, whether the first data structure is supported by the second application comprises:
the electronic equipment sends query information to the external equipment;
the electronic equipment receives response information sent by the external equipment in response to the query information;
and the electronic equipment judges whether the second application supports the first data structure body or not according to the response information.
9. The drag processing method according to any one of claims 1 to 8, characterized in that:
the content of a first control of the first application comprises first text, and first information corresponding to the first control comprises text information of the first text in the first control; or,
the content of a first control of the first application comprises a first picture, and first information corresponding to the first control comprises a storage path of the first picture in the first control.
10. The drag processing method according to any one of claims 1 to 9, wherein the first data structure comprises ClipData, and the second data structure comprises Intent.
11. An electronic device, comprising one or more processors and memory to store instructions;
the processor is configured to execute the instructions to cause the electronic device to:
according to the dragging operation of moving the content of a first control of a first application to a second application, first information corresponding to the first control is obtained, and the first information is packaged in a first data structure body; determining whether the second application supports the first data structure; if the second application supports the first data structure body, the first data structure body is sent to the second application; and if the second application does not support the first data structure, converting the first data structure into a second data structure, and sending the second data structure to the second application, wherein the second data structure is a data structure supported by the second application.
12. The electronic device of claim 11, wherein the processor executes the instructions to further cause the electronic device to perform, before acquiring the first information corresponding to the first control:
receiving a drag operation of moving the content of the first control of the first application to the second application.
13. The electronic device of claim 12, wherein the drag operation to move the content of the first control of the first application to the second application comprises: and moving the content of the first control of the first application to the second control of the second application.
14. The electronic device of claim 12, wherein the drag operation to move the content of the first control of the first application to the second application comprises:
and moving the content of the first control of the first application to the icon of the second application.
15. The electronic device of claim 14, wherein the processor executes the instructions to further cause the electronic device, after receiving a drag operation to move content of a first control of a first application onto an icon of a second application, to further perform:
displaying one or more floating icons corresponding to the second application; receiving an operation of a user for selecting a first floating icon from one or more floating icons, wherein the first floating icon corresponds to a first interface with a second control; and judging whether the second control supports the first data structure body.
16. The electronic device according to any one of claims 11 to 15, wherein the first application and the second application are installed in the electronic device.
17. The electronic device of claim 12, wherein the drag operation to move the content of the first control of the first application to the second application comprises:
and moving the content of the first control of the first application to a second control or icon of a second application in a collaborative window, wherein the collaborative window is a screen projection window of an external device on the electronic device, and the second application is installed on the external device.
18. The electronic device of claim 17, wherein the determining whether the second application supports the first data structure comprises:
sending query information to the external device; receiving response information sent by the external equipment in response to the query information; and judging whether the second application supports the first data structure body or not according to the response information.
19. The electronic device of any of claims 11-18, wherein:
the content of a first control of the first application comprises first text, and first information corresponding to the first control comprises text information of the first text in the first control; or,
the content of a first control of the first application comprises a first picture, and first information corresponding to the first control comprises a storage path of the first picture in the first control.
20. The electronic device of any of claims 11-19, wherein the first data structure comprises ClipData and the second data structure comprises Intent.
21. A computer storage medium storing computer software instructions for an electronic device, the instructions comprising a program designed to perform the method of any one of claims 1-10.
CN202110074033.0A 2021-01-20 2021-01-20 Drag processing method and device Active CN114860142B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110074033.0A CN114860142B (en) 2021-01-20 2021-01-20 Drag processing method and device
PCT/CN2021/137666 WO2022156427A1 (en) 2021-01-20 2021-12-14 Dragging processing method and apparatus


Publications (2)

Publication Number Publication Date
CN114860142A true CN114860142A (en) 2022-08-05
CN114860142B CN114860142B (en) 2024-06-04

Family

ID=82548454

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110074033.0A Active CN114860142B (en) 2021-01-20 2021-01-20 Drag processing method and device

Country Status (2)

Country Link
CN (1) CN114860142B (en)
WO (1) WO2022156427A1 (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6061058A (en) * 1993-03-03 2000-05-09 Apple Computer, Inc. Method and apparatus for transferring data by type according to data types available
US20010018715A1 (en) * 1993-03-03 2001-08-30 Stern Mark Ludwig Method and apparatus for improved interaction with an application program according to data types and actions performed by the application program
CN102541426A (en) * 2010-12-31 2012-07-04 联想(北京)有限公司 Electronic equipment and object processing method thereof
CN105912191A (en) * 2016-03-31 2016-08-31 北京奇虎科技有限公司 Method and device for realizing interaction between applications of terminal equipment
CN106681711A (en) * 2016-11-30 2017-05-17 维沃移动通信有限公司 Method for content sharing under split screen mode and mobile terminal
CN111095215A (en) * 2017-09-26 2020-05-01 谷歌有限责任公司 Inter-application delivery format specific data objects
CN111158543A (en) * 2019-12-24 2020-05-15 华为技术有限公司 File processing method, electronic equipment, system and storage medium


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116700554A (en) * 2022-10-24 2023-09-05 荣耀终端有限公司 Information display method, electronic device and readable storage medium
CN116700554B (en) * 2022-10-24 2024-05-24 荣耀终端有限公司 Information display method, electronic device and readable storage medium

Also Published As

Publication number Publication date
WO2022156427A1 (en) 2022-07-28
CN114860142B (en) 2024-06-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant