WO2023065957A1 - Cross-device dragging method, electronic device, and storage medium - Google Patents

Cross-device dragging method, electronic device, and storage medium

Info

Publication number
WO2023065957A1
Authority
WO
WIPO (PCT)
Prior art keywords
application
dragging
terminal device
interface
identifier
Prior art date
Application number
PCT/CN2022/120658
Other languages
English (en)
French (fr)
Inventor
周星辰
魏曦
胡潇艺
王海军
孙晓康
范振华
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to EP22882568.3A (published as EP4390645A1)
Publication of WO2023065957A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on GUI based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484 Interaction techniques based on GUI for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0486 Drag-and-drop
    • G06F 3/0487 Interaction techniques based on GUI using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on GUI using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Definitions

  • the embodiments of the present application relate to the field of electronic devices, and in particular, to a cross-device dragging method, an electronic device, and a storage medium.
  • the user can drag an object from terminal device 1 to terminal device 2 (that is, cross-device drag and drop), and open or save it on terminal device 2.
  • the dragged objects may include: files (such as documents, pictures, music, videos, etc.), text/literal content, application icons, widgets/components (widgets), and the like.
  • the terminal device 1 may be called a source end or a drag-out end,
  • the terminal device 2 may be called a sink end or a drag-in end.
  • when cross-device dragging is implemented between the source end and the receiving end, the receiving end needs to have a definite drag-in application: the user needs to open the drag-in application on the receiving end in advance, and only then can the user drag an object from the source end into the drag-in application on the receiving end.
  • the drag-in application can save or open the dragged object.
  • when the user does not open the drag-in application on the receiving end in advance, the user can also drag the object from the source end to the desktop of the receiving end, and the receiving end saves the dragged object to a local default storage path (such as the file manager) or opens it with a default application (such as a browser).
  • Embodiments of the present application provide a cross-device dragging method, an electronic device, and a storage medium.
  • the receiving end can respond to the dragging of the object more intelligently.
  • the embodiment of the present application provides a cross-device dragging method, applied to a first terminal device; the method includes: the first terminal device displays a first interface; the first terminal device receives a first operation, where the first operation is an operation of dragging a first object from the display interface of a second terminal device to the first interface; in response to the first operation, the first terminal device displays a first window; the first window includes application identifiers corresponding to one or more applications and/or service identifiers corresponding to one or more services.
  • the method is applicable to the scene of dragging and dropping an object between any two terminal devices (such as a first terminal device and a second terminal device). Take the user dragging an object from the second terminal device to the first terminal device as an example.
  • the first terminal device can recommend a first window for the user, and the first window includes application identifiers corresponding to one or more applications and/or service identifiers corresponding to one or more services.
  • the user can quickly select the application or service with which to open the object according to the application identifiers or service identifiers displayed in the first window.
  • in this way, the dragging procedure is simpler, and the response of the first terminal device to the dragged object is more intelligent.
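  • as a rough, hedged illustration of this recommendation step (not the implementation claimed by this application), the following self-contained Kotlin sketch builds a "first window" from the applications and services that can accept an object of the dragged type; the names CrossDeviceDragEvent, AppEntry, FirstWindow, and buildFirstWindow are all hypothetical.

```kotlin
// A minimal, self-contained sketch of the recommendation step described above; the types
// CrossDeviceDragEvent, AppEntry, and FirstWindow are hypothetical illustrations.
data class CrossDeviceDragEvent(val mimeType: String)

data class AppEntry(
    val appName: String,
    val supportedMimeTypes: Set<String>,   // e.g. "image/*", "application/pdf"
    val services: List<String>             // e.g. "Send to chat", "Post moment"
)

data class FirstWindow(val applicationIdentifiers: List<String>, val serviceIdentifiers: List<String>)

// When the dragged object enters the first interface, build the first window from the
// applications (and their services) that can accept an object of this type.
fun buildFirstWindow(event: CrossDeviceDragEvent, installedApps: List<AppEntry>): FirstWindow {
    val matching = installedApps.filter { app ->
        app.supportedMimeTypes.any { pattern ->
            pattern == "*/*" || event.mimeType.startsWith(pattern.removeSuffix("*"))
        }
    }
    return FirstWindow(
        applicationIdentifiers = matching.map { it.appName },
        serviceIdentifiers = matching.flatMap { app -> app.services.map { "${app.appName}: $it" } }
    )
}
```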
  • the application identifier included in the first window is related to the type of the first object.
  • the method further includes: the first terminal device obtains the type of the first object; the first terminal device determines, according to the type of the first object, the application identifiers included in the first window.
  • the first terminal device determines the application identifiers included in the first window according to the type of the first object, including: the first terminal device determines, from all applications installed on the first terminal device and according to the type of the first object, the applications that support dragging in the first object; the first terminal device uses the application identifiers of the applications that support dragging in the first object as the application identifiers included in the first window.
  • the first terminal device may determine the type of the first object according to the first object included in the first message, or the first message may separately include a field for indicating the type of the first object, and the first terminal device may determine the type of the first object according to this field.
  • the first terminal device may select applications that can open the first object from all applications installed on the first terminal device according to the type of the first object, and display application identifiers of these applications that can open the first object in the first window.
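  • on Android-based devices, one plausible way to approximate "applications installed on the device that can open an object of a given type" is to resolve an ACTION_VIEW intent for the object's MIME type; this is only an illustrative assumption about how such filtering could be done, not the mechanism defined by this application.

```kotlin
import android.content.Context
import android.content.Intent
import android.content.pm.PackageManager
import android.content.pm.ResolveInfo

// Returns the activities that declare they can view content of the given MIME type,
// e.g. "image/png" or "application/pdf"; their icons/labels could populate the first window.
fun appsThatCanOpen(context: Context, mimeType: String): List<ResolveInfo> {
    val viewIntent = Intent(Intent.ACTION_VIEW).apply { type = mimeType }
    return context.packageManager.queryIntentActivities(viewIntent, PackageManager.MATCH_DEFAULT_ONLY)
}
```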
  • the application identifier included in the first window is a preset application identifier.
  • the user or the service manufacturer of the first terminal device may pre-configure which applications' identifiers are displayed in the first window.
  • the application identifiers included in the first window are application identifiers of all applications installed on the first terminal device.
  • the first window may include application icons of the N applications installed on the first terminal device.
  • the method further includes: the first terminal device receives an operation of dragging the first object to a first application identifier; the first application identifier is one of the application identifiers included in the first window; in response to the operation of dragging the first object to the first application identifier, when the application corresponding to the first application identifier is an application that supports dragging in the first object, the first terminal device displays a second window, and the second window includes service identifiers of one or more services included in that application; or, when the application corresponding to the first application identifier is an application that does not support dragging in the first object, the first terminal device displays first prompt information, and the first prompt information is used to prompt that the application corresponding to the first application identifier does not support dragging in the first object.
  • the first terminal device displays the second window so that the user can directly drag the first object onto one of the services of the application corresponding to the first application identifier and open it with that service.
  • the first terminal device displays first prompt information, which may prompt that the application corresponding to the first application identifier cannot open the first object.
  • the service identifier included in the second window is related to the type of the first object.
  • the method further includes: the first terminal device acquires the type of the first object; the first terminal device determines the service identifiers included in the second window according to the type of the first object and all the services included in the application corresponding to the first application identifier.
  • the first terminal device determines the service identifiers included in the second window according to the type of the first object and all the services included in the application corresponding to the first application identifier, including: the first terminal device determines, from all the services included in the application corresponding to the first application identifier and according to the type of the first object, the services that support dragging in the first object; the first terminal device uses the service identifiers of the services that support dragging in the first object as the service identifiers included in the second window.
  • the first terminal device may determine the type of the first object according to the first object included in the first message, or the first message may include a separate field for indicating the type of the first object, and the first terminal device may determine the type of the first object according to this field.
  • the first terminal device can select, from all the services included in the application corresponding to the first application identifier and according to the type of the first object, the services that can open the first object, and display the service identifiers of those services in the second window.
  • the service identifier included in the second window is a preset service identifier.
  • the user or the service manufacturer of the first terminal device may pre-configure which service identifiers are included in the second window corresponding to each application. That is, the service identifier included in the second window is fixed.
  • the service identifier included in the second window is the service identifier of all services included in the application corresponding to the first application identifier.
  • the second window corresponding to the application corresponding to the first application identifier may include the service identifiers of the aforementioned N services.
  • the first interface includes an interface that supports drag-in and an interface that does not support drag-in; the displaying of the first window by the first terminal device includes: when the first interface is an interface that does not support drag-in, the first terminal device displays the first window.
  • the interfaces that do not support drag-in may include system desktop (desktop for short), system pop-up windows, application interfaces that do not support drag-in, and the like.
  • the interface supporting drag-in may include an application interface supporting drag-in.
  • the application interfaces that support drag-in may be chat interfaces of some chat applications installed on the mobile phone.
  • the displaying of the first window by the first terminal device includes: after the first object is dragged to the first interface, when the dragging operation on the first object has not ended and the dwell time of the pointer in the first interface reaches a first duration, and/or when the dragging operation on the first object has not ended and the sliding distance of the pointer in the first interface is greater than a first threshold, the first terminal device displays the first window.
  • the first terminal device may determine that the user intends to find a certain application with which to open the first object, and at this time the first terminal device may display the first window.
  • the further interaction action is: the user has not ended the drag operation on the first object, and the dragging/sliding distance of the pointer in the display interface of the receiving end is greater than the first threshold; the first terminal device may then determine that the user intends to find a certain application with which to open the first object, and at this time the first terminal device may display the first window.
  • the further interaction action is: the user has not ended the drag operation on the first object, and the pointer (such as a mouse pointer) stays in the first interface for a certain duration; the first terminal device can then determine that the user intends to find a certain application with which to open the first object, and at this time the first terminal device can display the first window.
  • the first terminal device may determine the user's intention according to further interaction actions of the user after the first object is dragged to the first interface, and determine whether to display the first window according to the user's intention.
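  • as a hedged sketch of this intent check (the thresholds and the showFirstWindow callback are assumed values for illustration, not values taken from this application), the dwell-time and slide-distance conditions could be evaluated with Android's standard drag listener:

```kotlin
import android.view.DragEvent
import android.view.View
import kotlin.math.hypot

// Sketch of the "dwell time or slide distance" intent check using Android's drag listener.
class DragIntentDetector(
    private val dwellMillis: Long = 500L,         // assumed "first duration"
    private val slideThresholdPx: Float = 200f,   // assumed "first threshold"
    private val showFirstWindow: () -> Unit
) : View.OnDragListener {

    private var enteredAtMillis = 0L
    private var startX = 0f
    private var startY = 0f
    private var hasStart = false
    private var windowShown = false

    override fun onDrag(view: View, event: DragEvent): Boolean {
        when (event.action) {
            DragEvent.ACTION_DRAG_ENTERED -> {
                enteredAtMillis = System.currentTimeMillis()
                hasStart = false
                windowShown = false
            }
            DragEvent.ACTION_DRAG_LOCATION -> {
                if (!hasStart) { startX = event.x; startY = event.y; hasStart = true }
                val dwelledLongEnough = System.currentTimeMillis() - enteredAtMillis >= dwellMillis
                val slidFarEnough = hypot(event.x - startX, event.y - startY) >= slideThresholdPx
                if (!windowShown && (dwelledLongEnough || slidFarEnough)) {
                    windowShown = true
                    showFirstWindow()   // the drag has not ended, so recommend applications/services
                }
            }
        }
        return true
    }
}
```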
  • the first interface is an interface that does not support drag-in; the method further includes: the first terminal device receives a third operation, where the third operation is an operation of dragging a second object from the display interface of the second terminal device to the first interface and then directly ending the dragging of the second object; in response to the third operation, the first terminal device saves the second object, or opens a default application and passes the second object to the default application.
  • directly ending the dragging operation on the first object may refer to: the time between when the user drags the first object to the first interface and when the user ends the dragging operation on the first object is less than the first duration.
  • the method further includes: the first terminal device receives an operation of dragging the first object to the first application identifier; the first application identifier is one of the application identifiers included in the first window; in response to the operation of dragging the first object to the first application identifier:
  • when the application corresponding to the first application identifier is an application that supports dragging in the first object, the first terminal device opens the application corresponding to the first application identifier and passes the first object to the application corresponding to the first application identifier.
  • or, when the application corresponding to the first application identifier is an application that does not support dragging in the first object, the first terminal device displays first prompt information, and the first prompt information is used to prompt that the application corresponding to the first application identifier does not support dragging in the first object.
  • the displaying the first prompt information by the first terminal device includes: displaying the first prompt information by the first terminal device by changing a display state of the first application identifier.
  • the first terminal device may darken or gray out the first application identifier to serve as the first prompt information.
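  • one plausible way to "gray out" an unsupported application identifier on Android is to dim and desaturate its icon view; the following sketch relies on that assumption and is not the prompt mechanism defined by this application.

```kotlin
import android.graphics.ColorMatrix
import android.graphics.ColorMatrixColorFilter
import android.widget.ImageView

// Dim and desaturate an app icon to signal that it does not accept the dragged object.
fun markAsNotDroppable(icon: ImageView) {
    icon.alpha = 0.4f
    icon.colorFilter = ColorMatrixColorFilter(ColorMatrix().apply { setSaturation(0f) })
}

// Restore the normal appearance once the drag ends or leaves the identifier.
fun clearNotDroppableMark(icon: ImageView) {
    icon.alpha = 1f
    icon.clearColorFilter()
}
```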
  • the method further includes: the first terminal device receives an operation of dragging the first object to a first service identifier; the first service identifier is one of the service identifiers included in the second window; in response to the operation of dragging the first object to the first service identifier, when the service corresponding to the first service identifier is a service that supports dragging in the first object, the first terminal device opens the service corresponding to the first service identifier and passes the first object to the service corresponding to the first service identifier; or, when the service corresponding to the first service identifier is a service that does not support dragging in the first object, the first terminal device displays second prompt information, and the second prompt information is used to prompt that the service corresponding to the first service identifier does not support dragging in the first object.
  • the method further includes: the first terminal device receives a fourth operation, where the fourth operation is an operation of dragging the second object from the display interface of the second terminal device to a first area in the first interface; the first area is a blank area; in response to the fourth operation, the first terminal device saves the second object.
  • the user may drag the drag object to the first area, so as to trigger the first terminal device to save the drag object.
  • the first window includes a second area or a first icon; the method further includes: the first terminal device receives an operation of dragging the first object onto the second area or the first icon; In response to the operation of dragging the first object onto the second area or the first icon, the first terminal device saves the first object.
  • the user may drag the dragged object to the second area or the first icon, so as to trigger the first terminal device to save the dragged object.
  • the first interface is the system desktop, and the first interface includes one or more application identifiers; the method further includes: the first terminal device receives an operation of closing the first window; in response to the operation of closing the first window, the first terminal device closes the first window; the first terminal device receives an operation of dragging the first object to a second application identifier; the second application identifier is one of the application identifiers included in the system desktop; in response to the operation of dragging the first object to the second application identifier, when the application corresponding to the second application identifier is an application that supports dragging in the first object, the first terminal device opens the application corresponding to the second application identifier and passes the first object to the application corresponding to the second application identifier; or, the first terminal device displays a third window, and the third window includes service identifiers of one or more services included in the application corresponding to the second application identifier.
  • the first terminal device displays first prompt information, and the first prompt information is used to prompt that the application corresponding to the second application identifier does not support dragging in the first object.
  • the first terminal device displaying the first window includes: the first terminal device displays the first window in a full-screen display manner, or in a non-full-screen display manner, or in a drawer display manner.
  • the non-full-screen display may specifically include half-screen display (that is, the area of the first window occupies half of the display screen area of the first terminal device), one-third-screen display (that is, the area of the first window occupies one third of the display screen area of the first terminal device), and the like, without limitation.
  • the method further includes: when the first terminal device detects that the dragging of the first object on the first terminal device is interrupted, the first terminal device displays a first floating window corresponding to the first object at a first position.
  • the first position may be the edge of the desktop (such as the first interface).
  • the method further includes: the first terminal device receives a drag operation on the first floating window; in response to the drag operation on the first floating window, the first terminal device changes the first floating window into the drag shadow of the first object, so that the user can continue dragging the first object.
  • the user can continue to drag the first object through the first floating window without dragging the first object from the second terminal device again.
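  • as a hedged Android analogue of resuming the drag from the floating window (the floatingWindowView and the ClipData payload below are assumptions, and this is not the mechanism described by this application), a system drag whose shadow replaces the floating window could be restarted when the user grabs it:

```kotlin
import android.content.ClipData
import android.view.MotionEvent
import android.view.View

// When the user grabs the floating window, restart a system drag whose shadow
// replaces the floating window, so the interrupted drag can continue.
fun attachResumeDrag(floatingWindowView: View, payload: ClipData) {
    floatingWindowView.setOnTouchListener { v, event ->
        if (event.action == MotionEvent.ACTION_MOVE) {
            v.startDragAndDrop(payload, View.DragShadowBuilder(v), /* localState = */ null, /* flags = */ 0)
            v.visibility = View.GONE   // the drag shadow now stands in for the floating window
            true
        } else {
            false
        }
    }
}
```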
  • the method further includes: when the first object is dragged into the first interface, the first terminal device displays a first UI dynamic effect at the screen edge of the first interface; the first UI dynamic effect is used to prompt the user that the first object has been dragged into the first interface.
  • the display area of the first UI dynamic effect is related to the position where the dragging shadow of the first object is dragged in on the first interface.
  • the first terminal device may display the first UI dynamic effect within a certain range (such as a preset distance range) above and below the position where the drag shadow of the first object is dragged in on the first interface (that is, the position where the drag shadow appears at the screen edge of the first terminal device).
  • the position where the drag shadow of the first object is dragged in on the first interface may be determined according to the position where the drag shadow of the first object is dragged out on the display interface of the second terminal device. For example, if the distance between the position where the drag shadow is dragged out on the display interface of the second terminal device and the upper (top) screen edge of the second terminal device is 30% of the distance between the upper and lower screen edges, then the distance between the position where the drag shadow of the first object is dragged in on the first interface and the upper (top) screen edge of the first terminal device may also be 30% of the distance between the upper and lower screen edges.
  • the position where the dragged shadow of the first object is dragged in on the first interface may also be a relative receiving position in the physical space, which is not limited here.
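  • the proportional mapping in the example above (30% from the top edge on the source maps to 30% from the top edge on the sink) reduces to a simple ratio; a minimal sketch follows, with the function name chosen here purely for illustration.

```kotlin
// Map the vertical position where the drag shadow left the source screen onto the
// sink screen, preserving the relative distance from the top edge
// (e.g. 30% of the source height maps to 30% of the sink height).
fun mapDragInY(sourceY: Float, sourceHeight: Float, sinkHeight: Float): Float {
    require(sourceHeight > 0f) { "source screen height must be positive" }
    val ratio = (sourceY / sourceHeight).coerceIn(0f, 1f)
    return ratio * sinkHeight
}
```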
  • the method further includes: when the first object is dragged so that the drag shadow of the first object moves away from the screen edge, the first terminal device highlights or color-enhances the area, within the display region of the first UI dynamic effect, that is related to the position of the drag shadow of the first object.
  • the enhanced color of the first UI dynamic effect may be related to the color of the first object or the color of the first interface. For example, the enhanced color may be the same as the color of the first object; or, when the first interface is the desktop, the enhanced color may be similar to the color of the desktop wallpaper, for example, a color selected from the dominant colors of the desktop wallpaper.
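  • if the enhanced color is chosen from the dominant colors of the desktop wallpaper, one plausible way to extract such a color on Android is the AndroidX Palette library; this is an assumption about tooling, not part of the described method.

```kotlin
import android.graphics.Bitmap
import android.graphics.Color
import androidx.palette.graphics.Palette

// Pick a highlight color close to the desktop wallpaper's dominant color,
// falling back to white if extraction fails.
fun highlightColorFromWallpaper(wallpaper: Bitmap): Int {
    val palette = Palette.from(wallpaper).generate()
    return palette.getDominantColor(Color.WHITE)
}
```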
  • the method further includes: when the first object is dragged so that the drag shadow of the first object moves away from the screen edge, the first terminal device displays a trailing effect of the first object; the display area of the trailing effect moves along with the drag shadow of the first object; as the display area of the trailing effect moves with the drag shadow, the trailing effect gradually becomes smaller or remains unchanged, and the display brightness and/or color of the trailing effect gradually fades; when the drag shadow of the first object moves beyond a preset distance, the first terminal device no longer displays the trailing effect.
  • the method further includes: in response to ending the operation of dragging the first object, the first terminal device no longer displays the first UI dynamic effect.
  • the embodiment of the present application provides a cross-device dragging device, which can be applied to the first terminal device described in the first aspect above, so that the first terminal device can implement the method described in the first aspect and any possible implementation manner of the first aspect.
  • the functions of the device can be realized by hardware, and can also be realized by hardware executing corresponding software.
  • the hardware or software includes one or more modules or units corresponding to the steps in the method described in the first aspect and any possible implementation manner of the first aspect.
  • the device may include: a display unit, a receiving unit, a processing unit, etc., and the display unit, the receiving unit, and the processing unit may cooperate to implement the method described in the first aspect and any possible implementation manner of the first aspect.
  • the display unit may be used to display the first interface; the receiving unit may be used to receive the first operation; the processing unit may be used to control the display unit to display the first window in response to the first operation.
  • the display unit, the receiving unit, and the processing unit can cooperate to realize the functions corresponding to all the steps of the method described in the first aspect and any possible implementation manner of the first aspect, which will not be repeated here.
  • an embodiment of the present application provides an electronic device, where the electronic device may be the first terminal device described in the foregoing first aspect.
  • the electronic device includes: a processor, and a memory for storing processor-executable instructions; when the processor is configured to execute the instructions, the electronic device implements the method described in the first aspect and any possible implementation manner of the first aspect.
  • an embodiment of the present application provides a computer-readable storage medium, on which computer program instructions are stored; when the computer program instructions are executed by an electronic device, the electronic device implements the method described in the first aspect and any possible implementation manner of the first aspect.
  • the embodiment of the present application provides a computer program product, including computer readable code, or a non-volatile computer readable storage medium bearing computer readable code; when the computer readable code runs in an electronic device, the processor in the electronic device implements the method described in the first aspect and any possible implementation manner of the first aspect.
  • the embodiment of the present application also provides a cross-device dragging method, applied to a first terminal device; the method includes: the first terminal device displays a first interface; the first interface includes application identifiers of one or more applications; the first terminal device receives an operation of dragging a first object to a second application identifier; the second application identifier is one of the application identifiers included in the first interface, and the application corresponding to the second application identifier is an application that supports dragging in the first object; the first object comes from the display interface of a second terminal device; in response to the operation of dragging the first object to the second application identifier, the first terminal device opens the application corresponding to the second application identifier and passes the first object to the application corresponding to the second application identifier; or, the first terminal device displays a third window, and the third window includes service identifiers of one or more services included in the application corresponding to the second application identifier.
  • the method further includes: the first terminal device receives an operation of dragging a second object to a third application identifier; the third application identifier is one of the application identifiers included in the first interface, and the application corresponding to the third application identifier is an application that does not support dragging in the second object; the second object comes from the display interface of the second terminal device; in response to the operation of dragging the second object to the third application identifier, the first terminal device displays first prompt information, and the first prompt information is used to prompt that the application corresponding to the third application identifier does not support dragging in the second object.
  • the application identifier includes an application icon or a card.
  • the card includes one or more service identifiers; the method further includes: the first terminal device receives an operation of dragging the first object to a second service identifier; the second service identifier is one of the service identifiers included in the card; in response to the operation of dragging the first object to the second service identifier, when the service corresponding to the second service identifier is a service that supports dragging in the first object, the first terminal device opens the service corresponding to the second service identifier and passes the first object to the service corresponding to the second service identifier; or, when the service corresponding to the second service identifier is a service that does not support dragging in the first object, the first terminal device displays second prompt information, and the second prompt information is used to prompt that the service corresponding to the second service identifier does not support dragging in the first object.
  • the first interface includes a first folder; the first folder includes application identifiers of one or more applications; the second application identifier is one of the application identifiers included in the first folder; the operation of dragging the first object to the second application identifier includes: opening the first folder, and dragging the first object to the second application identifier.
  • the method further includes: when the first object is dragged into the first interface, the first terminal device displays a first UI dynamic effect at the screen edge of the first interface; the first UI dynamic effect is used to prompt the user that the first object has been dragged into the first interface.
  • the display area of the first UI dynamic effect is related to the position where the dragging shadow of the first object is dragged in on the first interface.
  • the first terminal device may display the first UI dynamic effect within a certain range (such as a preset distance range) above and below the position where the drag shadow of the first object is dragged in on the first interface (that is, the position where the drag shadow appears at the screen edge of the first terminal device).
  • the position where the drag shadow of the first object is dragged in on the first interface may be determined according to the position where the drag shadow of the first object is dragged out on the display interface of the second terminal device. For example, if the distance between the position where the drag shadow is dragged out on the display interface of the second terminal device and the upper (top) screen edge of the second terminal device is 30% of the distance between the upper and lower screen edges, then the distance between the position where the drag shadow of the first object is dragged in on the first interface and the upper (top) screen edge of the first terminal device may also be 30% of the distance between the upper and lower screen edges.
  • the position where the dragged shadow of the first object is dragged in on the first interface may also be a relative receiving position in the physical space, which is not limited here.
  • the method further includes: when the first object is dragged so that the drag shadow of the first object moves away from the screen edge, the first terminal device highlights or color-enhances the area, within the display region of the first UI dynamic effect, that is related to the position of the drag shadow of the first object.
  • the enhanced color of the first UI dynamic effect may be related to the color of the first object or the color of the first interface. For example, the enhanced color may be the same as the color of the first object; or, when the first interface is the desktop, the enhanced color may be similar to the color of the desktop wallpaper, for example, a color selected from the dominant colors of the desktop wallpaper.
  • the method further includes: when the first object is dragged so that the drag shadow of the first object moves away from the screen edge, the first terminal device displays a trailing effect of the first object; the display area of the trailing effect moves along with the drag shadow of the first object; as the display area of the trailing effect moves with the drag shadow, the trailing effect gradually becomes smaller or remains unchanged, and the display brightness and/or color of the trailing effect gradually fades; when the drag shadow of the first object moves beyond a preset distance, the first terminal device no longer displays the trailing effect.
  • the method further includes: in response to ending the operation of dragging the first object, the first terminal device no longer displays the first UI dynamic effect.
  • the embodiment of the present application provides a cross-device dragging device, which can be applied to the first terminal device described in the sixth aspect, so that the first terminal device can implement the method described in the sixth aspect and any possible implementation manner of the sixth aspect.
  • the functions of the device can be realized by hardware, and can also be realized by hardware executing corresponding software.
  • the hardware or software includes one or more modules or units corresponding to the sixth aspect and the steps in the method described in any possible implementation manner of the sixth aspect.
  • the device may include: a display unit, a receiving unit, a processing unit, etc., and the display unit, the receiving unit, and the processing unit may cooperate to implement the method described in the sixth aspect and any possible implementation manner of the sixth aspect.
  • the display unit can be used to display the first interface; the receiving unit can be used to receive the operation of dragging the first object to the second application identifier; the processing unit can be used to, in response to the operation of dragging the first object to the second application identifier, open the application corresponding to the second application identifier and pass the first object to the application corresponding to the second application identifier, or to control the display unit to display the third window, and the like.
  • the display unit, the receiving unit, and the processing unit can cooperate to realize the functions corresponding to all the steps of the method described in the sixth aspect and any possible implementation manner of the sixth aspect, which will not be repeated here.
  • the embodiment of the present application provides an electronic device, and the electronic device may be the first terminal device described in the sixth aspect.
  • the electronic device includes: a processor, and a memory for storing processor-executable instructions; when the processor is configured to execute the instructions, the electronic device implements the method described in the sixth aspect and any possible implementation manner of the sixth aspect.
  • the embodiment of the present application provides a computer-readable storage medium, on which computer program instructions are stored; when the computer program instructions are executed by the electronic device, the electronic device implements the method described in the sixth aspect and any possible implementation manner of the sixth aspect.
  • the embodiment of the present application provides a computer program product, including computer readable code, or a non-volatile computer readable storage medium carrying computer readable code; when the computer readable code runs in an electronic device, the processor in the electronic device implements the method described in the sixth aspect and any possible implementation manner of the sixth aspect.
  • the embodiment of the present application also provides a cross-device dragging method, applied to a second terminal device; the method includes: the second terminal device displays a second interface, and the second interface includes a first object; in response to a second operation, the second terminal device displays a second UI dynamic effect at the screen edge of the second interface; the second operation is an operation of dragging the first object on the second interface; the second UI dynamic effect is used to prompt the user to drag the first object across devices; the display position of the second UI dynamic effect is related to the orientation of the first terminal device relative to the second terminal device, and the first terminal device is connected to the second terminal device.
  • the display position of the second UI dynamic effect may be the screen edge of the second terminal device, and the screen edge is the side where the first terminal device connected to the second terminal device is located.
  • the screen of the second terminal device can be divided into an upper screen edge, a lower screen edge, a left screen edge, and a right screen edge; when the first terminal device connected to the second terminal device is located on the right side of the second terminal device, the display position of the second UI dynamic effect may be the right screen edge of the second terminal device.
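  • as a small hedged sketch of that edge choice (the Direction enum and the way the relative direction is obtained are assumptions made here for illustration), the sink device's relative direction could be mapped to the screen edge on which the hint animation is shown:

```kotlin
// Hypothetical relative direction of the connected (sink) device as seen from this device.
enum class Direction { LEFT, RIGHT, ABOVE, BELOW }

enum class ScreenEdge { LEFT, RIGHT, TOP, BOTTOM }

// Show the cross-device hint animation on the edge facing the connected device,
// e.g. if the sink sits to the right, animate the right screen edge.
fun edgeForConnectedDevice(direction: Direction): ScreenEdge = when (direction) {
    Direction.LEFT -> ScreenEdge.LEFT
    Direction.RIGHT -> ScreenEdge.RIGHT
    Direction.ABOVE -> ScreenEdge.TOP
    Direction.BELOW -> ScreenEdge.BOTTOM
}
```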
  • the method further includes: as the position of the drag shadow of the first object in the second interface gradually approaches the screen edge where the second UI dynamic effect is located, the second terminal device gradually increases the display range of the second UI dynamic effect.
  • the area where the display range of the second UI animation effect increases is related to the position where the drag shadow of the first object is located in the second interface.
  • the position of the drag shadow of the first object in the second interface will gradually approach the second UI dynamic effect, or gradually approach the screen edge where the second UI dynamic effect is located. As the position of the drag shadow of the first object in the second interface gradually approaches the screen edge where the second UI dynamic effect is located, the second terminal device can gradually enhance the display effect of the second UI dynamic effect, for example, by increasing the display range of the second UI dynamic effect.
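  • a minimal sketch of this "grow as the shadow approaches the edge" behavior, assuming the right screen edge and an arbitrary maximum range chosen for illustration:

```kotlin
// Compute the display range (in pixels) of the edge animation from the drag shadow's
// horizontal distance to the right screen edge: the closer the shadow, the larger the range.
fun edgeEffectRangePx(shadowX: Float, screenWidth: Float, maxRangePx: Float = 96f): Float {
    val distanceToEdge = (screenWidth - shadowX).coerceAtLeast(0f)
    val proximity = 1f - (distanceToEdge / screenWidth).coerceIn(0f, 1f)
    return proximity * maxRangePx
}
```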
  • the method further includes: when the first object is dragged into the display area of the second UI dynamic effect, the second terminal device highlights or color-enhances the area, within the display region of the second UI dynamic effect, that corresponds to the position of the drag shadow of the first object.
  • the enhanced color of the second UI dynamic effect may be related to the color of the first object or the color of the second interface. For example, the enhanced color may be the same as the color of the first object; or, when the second interface is the desktop, the enhanced color may be the same as or similar to the color of the desktop wallpaper.
  • the embodiment of the present application provides a cross-device dragging device, which can be applied to the second terminal device described in the eleventh aspect above, so that the second terminal device can implement the method described in the eleventh aspect and any possible implementation manner of the eleventh aspect.
  • the functions of the device can be realized by hardware, and can also be realized by hardware executing corresponding software.
  • the hardware or software includes one or more modules or units corresponding to the steps in the method described in the eleventh aspect and any possible implementation manner of the eleventh aspect.
  • the device may include: a display unit, a receiving unit, a processing unit, etc., and the display unit, the receiving unit, and the processing unit may cooperate to implement the method described in the eleventh aspect and any possible implementation manner of the eleventh aspect.
  • the display unit can be used to display the second interface; the receiving unit can be used to receive the second operation; the processing unit can be used to respond to the second operation, control the display unit to display the second UI dynamic effect on the screen edge of the second interface, etc.
  • the display unit, the receiving unit, and the processing unit can cooperate to implement the functions corresponding to all the steps of the method described in the eleventh aspect and any possible implementation manner of the eleventh aspect, and details will not be repeated here.
  • the embodiment of the present application provides an electronic device, and the electronic device may be the second terminal device described in the eleventh aspect above.
  • the electronic device includes: a processor, and a memory for storing processor-executable instructions; when the processor is configured to execute the instructions, the electronic device implements the method described in the eleventh aspect and any possible implementation manner of the eleventh aspect.
  • an embodiment of the present application provides a computer-readable storage medium, on which computer program instructions are stored; when the computer program instructions are executed by an electronic device, the electronic device implements the method described in the eleventh aspect and any possible implementation manner of the eleventh aspect.
  • the embodiment of the present application provides a computer program product, including computer readable codes, or a non-volatile computer readable storage medium bearing computer readable codes; when the computer readable codes run in an electronic device, the processor in the electronic device implements the method described in the eleventh aspect and any possible implementation manner of the eleventh aspect.
  • the embodiment of the present application provides a cross-device dragging method, applied to a first terminal device; the method includes: the first terminal device displays a first interface; when a first object in a second interface displayed by a second terminal device is dragged into the first interface, the first terminal device displays a first UI dynamic effect at the screen edge of the first interface; the first UI dynamic effect is used to prompt the user that the first object has been dragged into the first interface.
  • the display area of the first UI dynamic effect is related to the position where the dragging shadow of the first object is dragged in on the first interface.
  • the first terminal device may display the first UI dynamic effect within a certain range (such as a preset distance range) above and below the position where the drag shadow of the first object is dragged in on the first interface (that is, the position where the drag shadow appears at the screen edge of the first terminal device).
  • the position where the drag shadow of the first object is dragged in on the first interface may be determined according to the position where the drag shadow of the first object is dragged out on the display interface of the second terminal device. For example, if the distance between the position where the drag shadow is dragged out on the display interface of the second terminal device and the upper (top) screen edge of the second terminal device is 30% of the distance between the upper and lower screen edges, then the distance between the position where the drag shadow of the first object is dragged in on the first interface and the upper (top) screen edge of the first terminal device may also be 30% of the distance between the upper and lower screen edges.
  • the position where the dragged shadow of the first object is dragged in on the first interface may also be a relative receiving position in the physical space, which is not limited here.
  • the method further includes: when the first object is dragged so that the drag shadow of the first object moves away from the screen edge, the first terminal device highlights or color-enhances the area, within the display region of the first UI dynamic effect, that is related to the position of the drag shadow of the first object.
  • the enhanced color of the first UI dynamic effect may be related to the color of the first object or the color of the first interface. For example, the enhanced color may be the same as the color of the first object; or, when the first interface is the desktop, the enhanced color may be similar to the color of the desktop wallpaper, for example, a color selected from the dominant colors of the desktop wallpaper.
  • the method further includes: when the first object is dragged so that the drag shadow of the first object moves away from the screen edge, the first terminal device displays a trailing effect of the first object; the display area of the trailing effect moves along with the drag shadow of the first object; as the display area of the trailing effect moves with the drag shadow, the trailing effect gradually becomes smaller or remains unchanged, and the display brightness and/or color of the trailing effect gradually fades; when the drag shadow of the first object moves beyond a preset distance, the first terminal device no longer displays the trailing effect.
  • the method further includes: in response to ending the operation of dragging the first object, the first terminal device no longer displays the first UI dynamic effect.
  • the embodiment of the present application provides a cross-device dragging device, which can be applied to the first terminal device described in the sixteenth aspect, so that the first terminal device can implement the method described in the sixteenth aspect and any possible implementation manner of the sixteenth aspect.
  • the functions of the device can be realized by hardware, and can also be realized by hardware executing corresponding software.
  • the hardware or software includes one or more modules or units corresponding to the sixteenth aspect and the steps in the method described in any possible implementation manner of the sixteenth aspect.
  • the device may include: a display unit, a receiving unit, a processing unit, etc., and the display unit, the receiving unit, and the processing unit may cooperate to implement the method described in the sixteenth aspect and any possible implementation manner of the sixteenth aspect .
  • the display unit can be used to display the first interface;
  • the receiving unit can be used to receive the operation that the first object is dragged into the first interface;
  • the processing unit can be used to, when the first object in the second interface displayed by the second terminal device is dragged into the first interface, control the display unit to display the first UI dynamic effect and the like at the screen edge of the first interface.
  • the display unit, the receiving unit, and the processing unit can cooperate to implement the functions corresponding to all the steps of the method described in the sixteenth aspect and any possible implementation manner of the sixteenth aspect, and details will not be repeated here.
  • the embodiment of the present application provides an electronic device, and the electronic device may be the first terminal device described in the sixteenth aspect above.
  • the electronic device includes: a processor, and a memory for storing processor-executable instructions; when the processor is configured to execute the instructions, the electronic device implements the method described in the sixteenth aspect and any possible implementation manner of the sixteenth aspect.
  • the embodiment of the present application provides a computer-readable storage medium, on which computer program instructions are stored; when the computer program instructions are executed by the electronic device, the electronic device implements the method described in the sixteenth aspect and any possible implementation manner of the sixteenth aspect.
  • the embodiment of the present application provides a computer program product, including computer-readable codes, or a non-volatile computer-readable storage medium bearing computer-readable codes; when the computer-readable codes run in an electronic device, the processor in the electronic device implements the method described in the sixteenth aspect and any possible implementation manner of the sixteenth aspect.
  • FIG. 1 is a schematic structural diagram of a mobile phone provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of the composition of the multi-device hybrid dragging system provided by the embodiment of the present application;
  • FIG. 3 is a schematic flow diagram of a cross-device dragging method provided in an embodiment of the present application
  • FIG. 4 is a schematic diagram of a scene of dragging and dropping based on non-screen projection provided by the embodiment of the present application;
  • FIG. 5 is a schematic diagram of a scene of dragging based on screen projection provided by the embodiment of the present application.
  • FIG. 6 is a schematic diagram of a display interface of a mobile phone provided by an embodiment of the present application.
  • FIG. 7 is another schematic diagram of the display interface of the mobile phone provided by the embodiment of the present application.
  • FIG. 8 is another schematic diagram of the display interface of the mobile phone provided by the embodiment of the present application.
  • FIG. 9 is another schematic diagram of the display interface of the mobile phone provided by the embodiment of the present application.
  • FIG. 10A is another schematic diagram of the display interface of the mobile phone provided by the embodiment of the present application.
  • FIG. 10B is a schematic diagram of a scenario in which a user triggers a mobile phone to display an application recommendation panel in a drawer display mode provided by an embodiment of the present application;
  • FIG. 10C is another schematic diagram of the display interface of the mobile phone provided by the embodiment of the present application.
  • FIG. 11 is another schematic diagram of the display interface of the mobile phone provided by the embodiment of the present application.
  • FIG. 12 is another schematic diagram of the display interface of the mobile phone provided by the embodiment of the present application.
  • FIG. 13A is a schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • FIG. 13B is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • FIG. 14A is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • FIG. 14B is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • FIG. 15 is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • Fig. 16 is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • FIG. 17 is a schematic diagram of a scene of a mobile phone switching display interface provided by an embodiment of the present application.
  • FIG. 18A is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • FIG. 18B is a schematic diagram of a scene where a mobile phone opens "folder 1" 1801 provided by the embodiment of the present application;
  • FIG. 19A is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • FIG. 19B is a schematic diagram of a scene that triggers the mobile phone to display all application icons of "folder 1" 1901 provided by the embodiment of the present application;
  • FIG. 19C is another schematic diagram of the display interface of the mobile phone provided by the embodiment of the present application.
  • FIG. 20 is another schematic diagram of the display interface of the mobile phone provided by the embodiment of the present application.
  • FIG. 21 is a schematic diagram of a PC displaying UI dynamic effects provided by an embodiment of the present application.
  • FIG. 22 is another schematic diagram of a PC displaying UI dynamic effects provided by an embodiment of the present application.
  • FIG. 23 is another schematic diagram of a PC displaying UI dynamic effects provided by an embodiment of the present application.
  • FIG. 24 is a schematic diagram of a mobile phone displaying UI motion effects provided by an embodiment of the present application.
  • FIG. 25 is another schematic diagram of a mobile phone displaying UI dynamic effects provided by the embodiment of the present application.
  • FIG. 26 is another schematic diagram of the mobile phone displaying UI dynamic effects provided by the embodiment of the present application.
  • FIG. 27 is another schematic diagram of the mobile phone displaying UI dynamic effects provided by the embodiment of the present application.
  • FIG. 28 is a schematic structural diagram of the cross-device dragging device provided by the embodiment of the present application.
  • FIG. 29 is another schematic structural diagram of the cross-device dragging device provided by the embodiment of the present application.
  • references to “one embodiment” or “some embodiments” or the like in this specification mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • appearances of the phrases “in one embodiment,” “in some embodiments,” “in other embodiments,” etc. in various places in this specification do not necessarily all refer to the same embodiment, but mean “one or more but not all embodiments” unless specifically stated otherwise.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless specifically stated otherwise.
  • the term “connected” includes both direct and indirect connections, unless otherwise stated.
  • the terms “first” and “second” are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Thus, a feature defined as “first” or “second” may explicitly or implicitly include one or more of these features.
  • In multi-device collaboration scenarios involving terminal devices such as mobile phones, tablet computers, personal computers (PCs), and smart home devices (such as televisions), the user can drag an object from terminal device 1 to terminal device 2 (that is, cross-device drag and drop), and open or save it on terminal device 2.
  • the dragged objects may include: files (such as documents, pictures, music, videos, etc.), text/literal content, application icons, widgets (widgets), and the like.
  • the terminal device 1 may be called a source (source) end or a pull-out end
  • the terminal device 2 may be called a sink (sink) end or a pull-in end.
  • a device that acts as a source in one pair of relationships may also be a sink in another pair of relationships; that is, a terminal device may be the source for one terminal device and may also be the receiver for another terminal device.
  • Currently, when cross-device dragging is implemented between the source end and the receiving end, the receiving end needs to have a definite drag-in application: the user needs to open the drag-in application on the receiving end in advance, and only then can the user drag objects on the source end into the drag-in application on the receiving end.
  • the drag-in application can then save or open the dragged object.
  • when the user has not opened the drag-in application on the receiving end in advance, the user can also drag the object on the source end to the desktop of the receiving end, and the receiving end saves the dragged object to a local default storage path (such as the file manager) or opens it with a default application program (such as a browser).
  • an embodiment of the present application provides a cross-device dragging method, which is applicable to a scene where an object is dragged between any two terminal devices.
  • the receiver end can recommend one or more application programs (applications for short) to the user.
  • the user can quickly select the application program to open the object or save the object according to the application program recommended by the receiving end.
  • the source end and the receiver end may establish a connection in a wired or wireless manner. Based on the established connection, the source and sink can work together.
  • the wireless communication protocol used when the source end and the receiving end establish a connection in a wireless manner can be a wireless fidelity (wireless fidelity, Wi-Fi) protocol, a bluetooth (bluetooth) protocol, a ZigBee protocol, a near field communication (near field communication, NFC) protocol, etc., and may also be various cellular network protocols, which are not specifically limited here.
  • the source end and the receiving end can be mobile phones, tablet computers, handheld computers, PCs, cellular phones, personal digital assistants (personal digital assistants, PDAs), wearable devices (such as smart watches), smart home devices (such as television sets), on-board computers, game consoles, augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) devices, etc.
  • This application does not limit the specific device forms of the source end and the receiver end.
  • FIG. 1 is a schematic structural diagram of a mobile phone provided by an embodiment of the present application.
  • the mobile phone can comprise a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (subscriber identification module, SIM) card interface 195, and the like.
  • the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (neural-network processing unit, NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can be the nerve center and command center of the phone.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the processor 110 is reduced, thus improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (general-purpose input/output, GPIO) interface, a SIM interface, and/or a USB interface, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone by executing instructions stored in the internal memory 121 .
  • the mobile phone can implement the steps performed by the source end in the cross-device drag method provided by the embodiment of the present application by executing the instructions of the internal memory 121.
  • the mobile phone can implement the steps performed by the receiving end in the cross-device drag method provided by the embodiment of the present application by executing the instructions of the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the charging management module 140 is configured to receive a charging input from a charger. While the charging management module 140 is charging the battery 142 , it can also provide power for the mobile phone through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 can also receive the input of the battery 142 to provide power for the mobile phone.
  • the wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in a mobile phone can be used to cover single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied to mobile phones.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide applications on the mobile phone including wireless local area networks (wireless local area networks, WLAN) (such as Wi-Fi network), Bluetooth (bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), FM (frequency modulation, FM), NFC, infrared technology (infrared, IR) and other wireless communication solutions.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation on it, amplify it, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the mobile phone is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the mobile phone can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a BeiDou navigation satellite system (beidou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
  • a mobile phone can be coupled with the mobile communication module 150 and interact with another mobile phone through the wireless communication module 160, such as: the mobile phone can serve as a source to send a first message, a second message, etc. to another mobile phone as a receiver.
  • the mobile phone can realize the audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • the mobile phone detects the intensity of the touch operation according to the pressure sensor 180A.
  • the mobile phone can also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • the gyroscope sensor 180B can be used to determine the motion posture of the mobile phone.
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the mobile phone can use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • the acceleration sensor 180E can detect the acceleration of the mobile phone in various directions (generally three axes).
  • the distance sensor 180F is used to measure the distance.
  • the mobile phone can use the proximity light sensor 180G to detect that the user holds the mobile phone close to the ear to talk, so that the screen can be automatically turned off to save power.
  • the proximity light sensor 180G can also be used in leather case mode, automatic unlock and lock screen in pocket mode.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the mobile phone can use the collected fingerprint characteristics to implement fingerprint unlocking, application lock access, fingerprint photographing, fingerprint call answering, and the like.
  • the touch sensor 180K is also known as a “touch panel”.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the mobile phone, which is different from the position of the display screen 194 .
  • the bone conduction sensor 180M can acquire vibration signals.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the mobile phone can realize shooting function through ISP, camera 193 , video codec, GPU, display screen 194 and application processor.
  • the mobile phone may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light emitting diodes (quantum dot light emitting diodes, QLED), etc.
  • the mobile phone may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the mobile phone realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to realize contact and separation with the mobile phone.
  • the mobile phone can support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the mobile phone interacts with the network through the SIM card to realize functions such as calling and data communication.
  • the mobile phone employs an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the mobile phone and cannot be separated from the mobile phone.
  • the structure shown in FIG. 1 does not constitute a specific limitation on the mobile phone.
  • the mobile phone may include more or fewer components than shown, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • when the source or sink is another terminal device such as a tablet computer, a handheld computer, a PC, a cellular phone, a personal digital assistant (PDA), a wearable device (such as a smart watch), or a smart home device (such as a TV), the structure of these other terminal devices can also refer to the above-mentioned FIG. 1.
  • the structures of these other terminal devices may be based on the structure shown in FIG. 1 with components added or reduced.
  • each terminal device may include a multi-device hybrid drag system, and the source end and the receiving end may respectively implement their functions described below through this system.
  • FIG. 2 is a schematic composition diagram of a multi-device hybrid dragging system provided by an embodiment of the present application.
  • the multi-device hybrid drag-and-drop system may include: a UI layer, a cross-device capability layer, an event processing module, a drag event producer, a drag event consumer, and a message module (DragDropEventBus).
  • the UI layer can be used by the terminal device to realize functions such as displaying shadows and bubbles during the dragging process, receiving dragging events, and controlling the display and hiding of related windows.
  • the cross-device capability layer is used for terminal devices to control the entire life cycle of drag and drop, and to ensure that the drag shadow is consistent with gestures or pointer actions.
  • the pointer may refer to a mouse pointer, a stylus pointer, a touchpad pointer, and the like.
  • the pointer can be a static or dynamic image, and the display style of the pointer may be different in different situations.
  • the event processing module can be used in the terminal device to implement functions such as processing drag events at both ends of the drag event producer and drag event consumer, as well as mouse over, click, and move events.
  • the terminal device When the terminal device is used as the source, it can be used as the initiator of the drag action, and use the modules related to the drag event producer to perform corresponding functions.
  • the terminal device When the terminal device is used as the receiving end, it can be used as the receiving device of the drag action, and use the modules related to the drag event consumer to perform corresponding functions.
  • the message module can be used as a transmission channel for drag events, mouse events, gesture events, etc., and terminal devices can transmit events or information with other devices through the message module. For example, information exchange can be performed between the source end and the receiver end based on their respective message modules.
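  • As a purely illustrative sketch (not part of the original description), the following Kotlin fragment shows one possible way the drag event producer, drag event consumer, and message module could be wired together; all type and function names other than DragDropEventBus are hypothetical assumptions.

```kotlin
// Illustrative sketch only: hypothetical interfaces modelling the module split described
// above (drag event producer on the source end, drag event consumer on the sink end).

// Event carried between devices by the message module; all fields are assumptions.
data class CrossDeviceDragEvent(val type: String, val payload: ByteArray, val shadow: ByteArray?)

// Message module (DragDropEventBus): transmission channel for drag/mouse/gesture events.
interface DragDropEventBus {
    fun send(event: CrossDeviceDragEvent)                  // forward an event to the peer device
    fun subscribe(onEvent: (CrossDeviceDragEvent) -> Unit) // receive events from the peer device
}

// Source-end role: initiates the drag and publishes drag events to the bus.
class DragEventProducer(private val bus: DragDropEventBus) {
    fun onDragStarted(objectBytes: ByteArray, shadow: ByteArray) =
        bus.send(CrossDeviceDragEvent("DRAG_START", objectBytes, shadow))     // first message

    fun onDragReachedEdge() =
        bus.send(CrossDeviceDragEvent("DRAG_ENTER_PEER", ByteArray(0), null)) // second message
}

// Sink-end role: consumes drag events and drives the UI layer (shadow, recommendation panel).
class DragEventConsumer(bus: DragDropEventBus, private val onUiUpdate: (CrossDeviceDragEvent) -> Unit) {
    init { bus.subscribe { event -> onUiUpdate(event) } }
}
```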
  • the source end and the receiver end described in the embodiments of the present application may be touch screen or non-touch screen.
  • the source end may be touch screen and the sink end non-touch screen; or, the source end may be non-touch screen and the sink end touch screen; or, both the source end and the sink end may be touch screen, or both may be non-touch screen.
  • when the terminal device is a touch screen device, the user can control the terminal device by means of tapping, sliding, and the like; for example, the user can drag an object on the display screen of the terminal device with a finger.
  • when the terminal device is a non-touch screen device, the terminal device can be connected to an input device such as a mouse, a keyboard, or a touch panel, and the user can control the terminal device through the input device; for example, the user can drag an object on the display screen through the mouse.
  • the terminal device when the terminal device is a touch screen device, the terminal device may also be connected to an input device such as a mouse, a keyboard, or a touch panel, and the user may control the terminal device through the input device, which is not limited here.
  • the source end may be the second terminal device, and the receiving end may be the first terminal device.
  • the source end may be the first terminal device, and the sink end may be the second terminal device.
  • FIG. 3 is a schematic flowchart of a cross-device dragging method provided by an embodiment of the present application.
  • the cross-device dragging method may include:
  • the source end receives a drag operation on a first object.
  • the source end may display an interface, where the interface includes the first object.
  • the first object may be text (also referred to as literal or character content), a file, a folder, a window, a component, or the like in the display interface of the source end.
  • Files can include files in one or more of the following formats, such as word documents, Excel workbooks, PowerPoint presentations, bitmaps, image files, plain text files, sound files, movie files, flash animation files, web files, compressed files, and so on.
  • when the source end is a touch screen device, the drag operation on the first object may be an operation in which the user presses the first object with a finger, a stylus, or the like on the display screen (touch screen) of the source end and drags it to move.
  • when the source end is a non-touch screen device, the source end can be connected to an input device such as a mouse, a keyboard, or a touch panel, and display a pointer corresponding to the input device on the display interface; in this case, the drag operation on the first object may be an operation in which the user controls, through the input device, the pointer to drag the first object to move.
  • the source end displays a drag shadow of the first object in response to a drag operation on the first object, and sends a first message to the receiver end, where the first message includes the first object and the drag shadow of the first object.
  • the first message may be a drag start message.
  • the first object included in the first message is also the specific content to be dragged.
  • the receiving end receives the first message. After receiving the first message, the receiving end may execute S303.
  • the receiving end generates a drag shadow of the first object according to the first message.
  • after the source end detects that the first object is dragged to the edge of the display interface of the source end, the source end sends a second message to the receiving end, and the second message is used to notify the receiving end that the first object is about to be dragged to the display interface of the receiving end.
  • the receiving end receives the second message.
  • that the source end detects that the first object is dragged to the edge of the display interface of the source end may mean that the source end detects that the distance between the position to which the first object is dragged and the edge of the display interface of the source end is less than a preset distance threshold, for example, less than 2 pixels or less than 0.1 centimeter (cm).
  • the source can determine the coordinate position of the pointer on the display screen according to the initial position and relative displacement of the pointer, so as to determine whether the pointer slides to the edge of the display screen.
  • that the pointer slides to the edge of the display screen indicates that the first object is dragged to the edge of the display interface of the source end.
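  • The edge check described above can be sketched as follows in Kotlin; the 2-pixel threshold and all function names are illustrative assumptions rather than the actual implementation.

```kotlin
// Minimal sketch of the edge check: pointer position = initial position + relative
// displacement, then compared against a preset distance threshold.
const val EDGE_THRESHOLD_PX = 2

// Coordinate of the pointer on the display screen, from its initial position and displacement.
fun pointerPosition(initialX: Int, initialY: Int, dx: Int, dy: Int): Pair<Int, Int> =
    Pair(initialX + dx, initialY + dy)

// True when the dragged object (pointer) is within the threshold of any screen edge.
fun reachedEdge(x: Int, y: Int, screenWidth: Int, screenHeight: Int): Boolean =
    x <= EDGE_THRESHOLD_PX || y <= EDGE_THRESHOLD_PX ||
    screenWidth - x <= EDGE_THRESHOLD_PX || screenHeight - y <= EDGE_THRESHOLD_PX
```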
  • the receiving end may execute S305.
  • the receiving end displays the drag shadow of the first object, and displays an application recommendation panel, where the application recommendation panel includes application identifiers corresponding to one or more applications.
  • the application identifier may be an application icon of the application.
  • the application recommendation panel may also be called a recommendation panel, a recommendation window, an application recommendation window, an application recommendation pop-up window, etc., and the name of the application recommendation panel is not limited here.
  • the aforementioned application recommendation panel, recommendation panel, recommendation window, application recommendation window, application recommendation pop-up window, etc. may be referred to as the first window.
  • the above operation of dragging the first object from the display interface of the source end to the display interface of the receiver end may be referred to as a first operation. That is, the receiving end may display the first window in response to the first operation.
  • the above-mentioned cross-device dragging between the source end and the receiving end may refer to dragging based on non-screen projection.
  • the user can use a set of input devices (such as a mouse) to control both the source end and the receiving end, and drag the first object from the display interface of the source end to the display interface of the receiving end.
  • the keyboard and mouse sharing technology may refer to a technology of using an input device (such as a mouse or a touch pad) of one terminal to realize control of other terminals.
  • after receiving the first message, the receiving end can create a virtual input device, which has the same functions as a conventional input device such as a mouse or a touchpad, and can be used by the receiving end to simulate corresponding input events.
  • for example, assuming that the source end is a PC, the input device of the PC is a mouse, and the receiving end is a mobile phone, the virtual input device created by the mobile phone has the same function as a conventional mouse; it can be regarded as a mouse shared by the PC with the mobile phone, and can be used by the mobile phone to simulate mouse events, so that the mouse of the PC can control the mobile phone.
  • take the case where the operating system of the mobile phone is the Android system as an example.
  • the mobile phone can use the uinput capability of linux to realize the creation of virtual input devices.
  • uinput is a kernel layer module that can simulate an input device.
  • a process can create a virtual input device with a specific function. Once the virtual input device is created, it can simulate corresponding events.
  • this application does not limit the specific implementation principle of the keyboard and mouse sharing technology.
  • the keyboard and mouse sharing technology may also be implemented based on other principles between the source end and the receiving end, so that the user can use one set of input devices to control both the source end and the sink end.
  • FIG. 4 is a schematic diagram of a scenario of dragging based on non-screen projection provided by an embodiment of the present application.
  • the display interface 401 of the PC may include a first object 402 and a mouse pointer (a small arrow in the figure, not marked).
  • the dragging operation on the first object 402 may be an operation in which the user clicks and drags the first object 402 with a mouse.
  • the user can move the pointer of the mouse to the first object 402 , click and hold the left button of the mouse and move the mouse, so that the first object 402 can be dragged.
  • the PC may display the drag shadow 403 of the first object 402 .
  • the user can use the mouse of the PC to drag the first object 402 in the display interface 401 of the PC from the display interface 401 of the PC to the display interface 404 of the mobile phone.
  • the aforementioned cross-device drag between PC and mobile phone is drag and drop based on non-screen projection.
  • the PC may send a second message to the mobile phone, notifying the mobile phone that the first object 402 will be dragged to the display interface 404 of the mobile phone.
  • the mobile phone After the mobile phone receives the second message, it can display the drag shadow 403 of the first object 402, and display the application recommendation panel 405.
  • the application recommendation panel 405 includes application identifiers corresponding to one or more applications, such as the application icon of application A , the application icon of application B, . . . , the application icon of application F, and so on.
  • the drag shadow 403 of the first object 402 displayed on the display interface 404 of the mobile phone is linked with the drag shadow 403 of the first object 402 displayed on the display interface 401 of the PC: as the drag proceeds, the drag shadow 403 of the first object 402 displayed on the display interface 404 of the mobile phone gradually grows until it is complete, while the drag shadow 403 of the first object 402 displayed on the display interface 401 of the PC gradually shrinks until it disappears.
  • the cross-device dragging between the source end and the receiving end may also refer to dragging based on screen projection.
  • the receiver end can project the display interface to the display screen of the source end, or the source end can project the display interface to the display screen of the receiver end.
  • based on the reverse control capability (that is, the ability to use the input device of the source end to control the receiving end) or the keyboard and mouse sharing technology, the user can use the input device of the source end to control the receiving end.
  • FIG. 5 is a schematic diagram of a drag-and-drop scenario based on screen projection provided by an embodiment of the present application.
  • the mobile phone can project the display interface of the mobile phone to the display screen of the PC.
  • the display screen of the PC can include the display interface 401 of the PC and the display interface 404 of the mobile phone.
  • the display interface 401 of the PC may include a first object 402 and a mouse pointer (a small arrow in the figure, not marked).
  • the dragging operation on the first object 402 may be an operation in which the user clicks and drags the first object 402 with a mouse.
  • the user can move the pointer of the mouse to the first object 402 , click and hold the left button of the mouse and move the mouse, so that the first object 402 can be dragged.
  • the PC may display the drag shadow 403 of the first object 402 .
  • the user can use the mouse of the PC to drag the first object 402 in the display interface 401 of the PC from the display interface 401 of the PC to the display interface 404 of the mobile phone.
  • the aforementioned cross-device drag between PC and mobile phone is drag and drop based on screen projection.
  • the PC can send a second message to the mobile phone, informing the mobile phone that the first object 402 will be dragged to the display interface 404 of the mobile phone; after the mobile phone receives the second message, it can display the drag shadow 403 of the first object 402, and display the application recommendation panel 405.
  • the application recommendation panel 405 includes application identifiers corresponding to one or more applications, such as the application icon of application A , the application icon of application B, . . . , the application icon of application F, and so on.
  • the application identification included in the above application recommendation panel may be manually defined or preset.
  • the user or the service provider at the receiving end may pre-configure the application identifiers of which applications are displayed in the application recommendation panel. That is, the application identification included in the application recommendation panel is fixed.
  • the above application recommendation panel may also include application identifications of all applications installed on the receiving end. For example, taking the above example of dragging the first object from the display interface of the PC to the display interface of the mobile phone, assuming that there are N applications installed on the mobile phone (N is an integer greater than 0), the application recommendation panel may include the Application icons for N applications.
  • the application identification included in the above application recommendation panel may also be an application identification of an application that can open the first object determined by the receiving end according to the type of the first object.
  • the receiving end can determine the type of the first object according to the first object included in the first message; or, the first message can include a separate field for indicating the type of the first object, and the receiving end can determine the type of the first object according to this field.
  • the receiving end may select applications that can open the first object from all applications installed on the receiving end according to the type of the first object, and display application identifications of these applications that can open the first object in the application recommendation panel. For example, assuming that the word application and the excel application are installed on the receiving end, and the first object is a word document, the recommendation panel may include an application icon of the word application.
  • after the receiving end determines the type of the first object, it can obtain, according to the type information of the first object, a list of applications supporting the type through the queryIntentActivities method of the PackageManager.
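  • On an Android receiving end, such a lookup could, for example, be expressed as follows; the function name and the MIME type passed in are illustrative assumptions, while queryIntentActivities and MATCH_DEFAULT_ONLY are standard PackageManager APIs.

```kotlin
import android.content.Context
import android.content.Intent
import android.content.pm.PackageManager
import android.content.pm.ResolveInfo

// Sketch: list applications on the receiving end that can open an object of the
// given MIME type (e.g. "application/msword" for a word document).
fun appsSupportingType(context: Context, mimeType: String): List<ResolveInfo> {
    val intent = Intent(Intent.ACTION_VIEW).apply { type = mimeType }
    return context.packageManager.queryIntentActivities(intent, PackageManager.MATCH_DEFAULT_ONLY)
}
```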
  • the receiving end displays the dragging shadow of the first object, which means that the first object is dragged to the display interface of the receiving end.
  • the user may continue to perform the above-mentioned drag operation on the first object (for example, continue to use the mouse to drag and drop), and drag the first object to move in the display interface of the receiving end.
  • the user may drag the first object to any application identifier displayed in the application recommendation panel by continuing to perform the above-mentioned drag operation on the first object, so as to open the first object through the application corresponding to the application identifier.
  • if the user drags the first object onto the application identifier of an application that cannot open the first object, the receiving end may, in response to this operation, display first prompt information, where the first prompt information is used to prompt that the application corresponding to the application identifier cannot open the first object.
  • if the user drags the first object onto the application identifier of an application that can open the first object and ends the operation of dragging the first object, the receiving end may, in response, open the first object through the application corresponding to the application identifier.
  • the receiving end opens the first object through the application corresponding to the application identifier, which may include: the receiving end opens the application corresponding to the application identifier, and transfers the first object to the application corresponding to the application identifier.
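  • One possible way to “open the application corresponding to the application identifier and transfer the first object to it” on an Android receiving end is an explicit ACTION_VIEW intent, sketched below; the package name, URI, and MIME type are assumptions supplied by the caller, and this is only one realization of the behavior described above.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Sketch: open the dropped object with the application whose identifier it was dropped on.
fun openWithApp(context: Context, targetPackage: String, uri: Uri, mimeType: String) {
    val intent = Intent(Intent.ACTION_VIEW).apply {
        setDataAndType(uri, mimeType)                      // the first object to be opened
        setPackage(targetPackage)                          // restrict to the chosen application
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION or Intent.FLAG_ACTIVITY_NEW_TASK)
    }
    context.startActivity(intent)
}
```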
  • the user may drag and drop the first object onto the first application identifier, and the first application identifier is one of the application identifiers included in the first window (that is, the above-mentioned application recommendation panel).
  • the receiving end may display first prompt information.
  • FIG. 6 is a schematic diagram of a display interface of the mobile phone provided by the embodiment of the present application.
  • when the user uses the mouse to drag the first object from the display interface of the PC to the display interface of the mobile phone, the mobile phone can display an application recommendation panel 601, and the display interface of the mobile phone also includes the drag shadow 602 of the first object.
  • the application recommendation panel 601 includes application identifications corresponding to application A, application B, application C, application D, application E, application F, and application G, respectively.
  • application A, application B, and application C are all applications that can open the first object
  • application D, application E, application F, and application G are all applications that cannot open the first object.
  • the mobile phone can monitor the drag drop event.
  • the mobile phone may open the first object through the application corresponding to the application identifier in response to the user dragging the first object onto the application identifier and ending the operation of dragging the first object.
  • for example, assuming that the first object is a word document and application A is a word application, if the user drags the first object onto the application identifier of application A and ends the operation of dragging the first object, the mobile phone may open the word document through application A.
  • if the user drags the first object onto the application identifier of an application that cannot open the first object, the mobile phone may, in response to the operation of dragging the first object onto the application identifier, display first prompt information, and the first prompt information is used to prompt that the application corresponding to the application identifier cannot open the first object.
  • for example, assuming that the first object is a word document and application D is an excel application, if the user drags the first object onto the application identifier of application D, the mobile phone may display the first prompt information.
  • the receiving end can display the first prompt information by changing the display state of the application identifier (such as the application icon).
  • FIG. 7 is another schematic diagram of a display interface of a mobile phone provided by an embodiment of the present application.
  • the mobile phone may, in response to the user's operation of dragging the first object onto the application identifier of application D, change the originally displayed “D” pattern in the application icon of application D to display a “slash” 603, so as to prompt the user that application D cannot open the first object.
  • FIG. 8 is another schematic diagram of a display interface of a mobile phone provided in an embodiment of the present application.
  • the mobile phone may, in response to the user's operation of dragging the first object onto the application identifier of application D, change the originally displayed “D” pattern in the application icon of application D to display an “X” 604, so as to prompt the user that application D cannot open the first object.
  • the above “slash” 603 and “X” 604 shown in FIG. 7 and FIG. 8 may also be referred to as prohibition signs.
  • the prohibition sign is the above-mentioned first prompt information.
  • the prohibition sign may also be other patterns, which are not limited here.
  • displaying the first prompt information by changing the display state of the application identifier (such as an application icon) may also include: the receiving end darkens or grays out the application identifier, and the darkened or grayed-out effect of the application identifier is the above-mentioned first prompt information. For example, the receiving end may change the display color of the application identifier to gray; after being grayed out, the application corresponding to the application identifier may be regarded as inactive, which prompts the user that the application corresponding to the application identifier cannot open the first object.
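  • As a hedged illustration, graying out an application icon on Android can be achieved with a desaturating color filter; the helper name and the alpha value below are assumptions, and this is only one possible rendering of the first prompt information.

```kotlin
import android.graphics.ColorMatrix
import android.graphics.ColorMatrixColorFilter
import android.widget.ImageView

// Sketch: gray out the icon of an application that cannot open the dragged object.
fun grayOutIcon(icon: ImageView) {
    val matrix = ColorMatrix().apply { setSaturation(0f) } // remove color saturation
    icon.colorFilter = ColorMatrixColorFilter(matrix)
    icon.imageAlpha = 128                                  // additionally dim the icon
}
```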
  • in the above examples, the first prompt information is implemented in the form of a non-text prompt.
  • the receiving end may also display text prompts as the first prompt information on a display interface (eg, around or on the application logo). It should be noted that, the present application does not limit the implementation manner of the first prompt information.
  • when the application identifiers included in the above application recommendation panel are manually defined or preset, or when the application recommendation panel includes the application identifiers of all applications installed on the receiving end, the receiving end may, when displaying the application recommendation panel, also display, in the application recommendation panel, both the application identifiers of applications that cannot open the first object and the application identifiers of applications that can open the first object, so as to prompt the user which applications can open the first object and which applications cannot open the first object.
  • the user may directly drag the first object onto the application identifier of the application that can open the first object, so as to trigger the corresponding application to open the first object.
  • the receiving end may determine the type of the first object according to the first message, and the specific manner of determining the type of the first object may refer to the foregoing embodiments, and details are not repeated here.
  • the receiving end can classify the application identifiers included in the application recommendation panel, and determine which applications corresponding to the application identifiers can open the first object, and which applications corresponding to the application identifiers cannot open the first object.
  • when the receiving end displays the application recommendation panel, for the application identifiers of the applications that cannot open the first object, the receiving end may darken or gray out such application identifiers.
  • for example, the receiving end can change the display color, in the application recommendation panel, of the application identifiers of applications that cannot open the first object to gray, so as to prompt the user that the corresponding applications cannot open the first object.
  • for the application identifiers of applications that can open the first object, the receiving end can highlight such application identifiers to prompt the user that the applications corresponding to this type of application identifier can open the first object.
  • in this way, the user does not need to open, in advance, the interface of an application that can open the first object on the receiving end.
  • the user can select, according to the application recommendation panel displayed on the receiving end, the application with which to open the first object.
  • the drag-and-drop operation is thus simpler, and the receiving end responds to the dragged object more intelligently.
  • the purpose of the user dragging the first object to the display interface of the receiving end may be to save the first object on the receiving end.
  • the user can drag the first object to a blank area of the display interface of the receiving end (such as an area without application icons), and end the operation of dragging the first object, and the receiving end can save the first object, for example, in a default storage path.
  • the blank area may refer to a blank area in the application recommendation panel, or other blank areas in the display interface of the receiving end except the application recommendation panel.
  • the blank area in the aforementioned application recommendation panel, or the blank area other than the application recommendation panel in the display interface of the receiving end may be referred to as the first area, and the operation of dragging the first object to the first area may be It is called the fourth operation, and the dragged object corresponding to the fourth operation may also be called the second object.
  • the second object includes the aforementioned first object.
  • the application recommendation panel may also include a dedicated area for triggering saving of the first object, or an icon (icon) for triggering saving of the first object, and the user may drag the first object to the application recommendation panel In the area specially used to trigger saving of the first object, or on the icon of the trigger saving of the first object, and end the operation of dragging the first object, so as to trigger the receiving end to save the first object.
  • the aforementioned area dedicated to triggering saving of the first object may be called a second area
  • the icon dedicated to triggering saving of the first object may be called a first icon.
  • FIG. 9 is another schematic diagram of a display interface of a mobile phone provided in an embodiment of the present application.
  • the application recommendation panel may include a storage area 901 .
  • the user may drag the first object into the saving area 901, and end the operation of dragging the first object.
  • the mobile phone may save the first object in response to the user dragging the first object into the saving area 901 and ending the operation of dragging the first object.
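  • A minimal sketch of saving the dropped object to a default storage path on an Android receiving end is shown below; the target file name and directory are illustrative assumptions, and this is only one way the saving step could be realized.

```kotlin
import android.content.Context
import android.view.DragEvent
import java.io.File

// Sketch: when the drag ends over the save area (or a blank area), persist the dropped
// content to a default storage path.
fun saveDroppedObject(context: Context, event: DragEvent): File? {
    val item = event.clipData?.getItemAt(0) ?: return null
    val target = File(context.filesDir, "dropped_object")  // default storage path (assumption)
    val uri = item.uri
    if (uri != null) {
        // Dragged file: copy its content from the content URI.
        context.contentResolver.openInputStream(uri)?.use { input ->
            target.outputStream().use { output -> input.copyTo(output) }
        } ?: return null
    } else {
        // Dragged text content.
        val text = item.text ?: return null
        target.writeText(text.toString())
    }
    return target
}
```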
  • the way the receiving end displays the application recommendation panel may include: full-screen display or non-full-screen display.
  • the above FIG. 4 and FIG. 5 take non-full-screen display as an example, and FIG. 6 to FIG. 9 take full-screen display as an example.
  • non-full-screen display can specifically include half-screen display (that is, the area of the application recommendation panel occupies half of the display screen area of the mobile phone), one-third screen display (that is, the area of the application recommendation panel occupies one third of the display screen area of the mobile phone), etc.
  • the method of displaying the application recommendation panel by the receiving end may also be a drawer display.
  • the steps described in S305 above may include: the receiving end displays the drag shadow of the first object, and displays the drawer button (drawer icon) corresponding to the application recommendation panel; the receiving end responds to The user drags the first object onto the drawer button, and stays on the drawer button for a preset duration (for example, a fourth duration) to display the application recommendation panel in a drawer display mode.
  • the preset duration can be 2 seconds, 3 seconds, 5 seconds, etc., without limitation.
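  • The “stay on the drawer button for a preset duration” behavior could, for example, be sketched with a drag listener and a delayed callback as follows; the 2000 ms default and all names are illustrative assumptions.

```kotlin
import android.os.Handler
import android.os.Looper
import android.view.DragEvent
import android.view.View

// Sketch: show the application recommendation panel only after the dragged object has
// stayed on the drawer button for a preset duration.
fun attachDrawerHoverListener(drawerButton: View, showPanel: () -> Unit, holdMillis: Long = 2000L) {
    val handler = Handler(Looper.getMainLooper())
    val showRunnable = Runnable { showPanel() }
    drawerButton.setOnDragListener { _, event ->
        when (event.action) {
            DragEvent.ACTION_DRAG_ENTERED -> handler.postDelayed(showRunnable, holdMillis)
            DragEvent.ACTION_DRAG_EXITED,
            DragEvent.ACTION_DRAG_ENDED -> handler.removeCallbacks(showRunnable)
        }
        true
    }
}
```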
  • FIG. 10A is another schematic diagram of a display interface of the mobile phone provided in the embodiment of the present application.
  • the mobile phone may display the drag shadow 602 of the first object, and display the drawer button 1001 corresponding to the application recommendation panel.
  • FIG. 10B is a schematic diagram of a scenario in which a user triggers a mobile phone to display an application recommendation panel in a drawer display mode according to an embodiment of the present application.
  • the user may drag the first object onto the drawer button 1001 and stay on the drawer button 1001 for a preset period of time.
  • the mobile phone may display the application recommendation panel 601 in a drawer display mode in response to the user's operation of dragging the first object onto the drawer button 1001 and staying on the drawer button 1001 for a preset period of time.
  • the mobile phone can simulate a click event and inject it into the system; the simulated event of clicking the drawer button 1001 may trigger the mobile phone to display the application recommendation panel 601.
  • the above are some possible display modes of the application recommendation panel, and the application does not limit the display mode of the application recommendation panel.
  • when the receiving end displays the application recommendation panel, if the number of application identifiers included in the application recommendation panel is too large for the panel to display all of them, the receiving end may display some of the application identifiers in the application recommendation panel and display the rest of the application identifiers by scrolling/sliding within the application recommendation panel.
  • FIG. 10C is another schematic diagram of the display interface of the mobile phone provided in the embodiment of the present application.
  • the mobile phone may preferentially display some application identifiers (such as the application identifiers of application A to application I) on the application recommendation panel, and the rest of the application identifiers can be displayed by sliding further down.
  • if the user wants to view the rest of the application identifiers, the user can drag the first object to the bottom area where the application identifiers are displayed, triggering the mobile phone to slide up the application identifiers displayed in the application recommendation panel so that the rest of the application identifiers are displayed.
  • the user may also trigger the mobile phone to slide to display more application logos by means of multi-point/finger touch (focus points other than the focus point of the first object), which is not limited here.
  • for example, another finger can be used to drag the rectangular black slider shown in FIG. 10C (the actual display effect may also be in other colors), triggering the mobile phone to slide to display more application identifiers.
  • the partial application identifiers displayed preferentially in the application recommendation panel may be the application identifiers of the applications most frequently used by the user.
  • FIG. 11 is another schematic diagram of a display interface of a mobile phone provided by an embodiment of the present application.
  • the mobile phone can also display the preferentially displayed application identifiers and the remaining application identifiers as separate groups in the application recommendation panel.
  • the application recommendation panel may include two groups: "recommended applications" 1101 and "more applications" 1102.
  • the application identifiers that are displayed preferentially can be shown in the "recommended applications" 1101 group, and the remaining application identifiers can be shown in the "more applications" 1102 group, where the application identifiers in the "more applications" 1102 group can support sliding display.
  • when the user drags the first object onto an application identifier in the application recommendation panel, the receiving end can display (pop up) the service menu (or function menu) of the application corresponding to the application identifier next to the application identifier.
  • the service menu may include the service identifiers of one or more services included in the application corresponding to the application identifier.
  • the user may continue the above-mentioned dragging operation on the first object, drag the first object onto a certain service identifier (such as the first service identifier) displayed in the service menu, and end the dragging operation on the first object; in response, the receiving end may open the first object through the application corresponding to the application identifier and start the service corresponding to the service identifier to process/operate on the first object.
  • the user may drag and drop the first object onto the first application identifier, and the first application identifier is one of the application identifiers included in the first window (that is, the above-mentioned application recommendation panel).
  • the receiving end may display a service menu corresponding to the application.
  • the service menu corresponding to the application may also be referred to as the second window.
  • for example, assume that the application recommendation panel includes an application identifier of an image processing application P, and the image processing application P includes services such as preview, filter, and mosaic.
  • FIG. 12 is another schematic diagram of a display interface provided in the embodiment of this application.
  • the application recommendation panel may include the application identification 1201 of the image processing application P.
  • a service menu 1202 of the image processing application P is displayed.
  • the service menu 1202 may include service identifiers such as "preview", "filter", and "mosaic".
  • the user can continue the dragging operation on the picture, drag the picture onto any of the service identifiers such as "Preview", "Filter", or "Mosaic", and end the dragging operation on the picture; in response, the mobile phone can open the picture through the image processing application P and start the service corresponding to the service identifier to process/operate on the picture.
  • for example, the mobile phone can, in response to the user dragging the picture onto "preview" and ending the dragging operation, open the picture through the image processing application P and start the preview service of the image processing application P to preview the picture.
  • the service identifier included in the above service menu may be manually defined or preset.
  • for example, the user or the manufacturer of the receiving end may pre-configure which service identifiers are included in the service menu corresponding to each application; that is, the service identifiers included in the service menu are fixed.
  • the service menu may also include the service identifiers of all services corresponding to the application. For example, taking dragging the first object onto the application identifier of the above-mentioned image processing application P as an example, assuming that the image processing application P includes N services (N is an integer greater than 0), the service menu corresponding to the image processing application P may include the service identifiers of the aforementioned N services.
  • the service identifiers included in the above service menu may also be the service identifiers of services that can open the first object, as determined by the receiving end according to the type of the first object.
  • the receiving end can determine the type of the first object according to the first object included in the first message, or the first message can include a separate field for indicating the type of the first object, and the receiving end can determine the type of the first object according to this field.
  • according to the type of the first object, the receiving end can then select, from all the services included in the application corresponding to the application identifier, the services that can open the first object, and display the service identifiers of these services in the service menu.
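  • as a hedged sketch of this type-based filtering on an Android-style receiving end, the query below resolves the dragged object's MIME type and asks the package manager which components of the target application can open it; the class name, parameters and the ACTION_VIEW probe are illustrative assumptions, not the patented implementation.

    import android.content.Context;
    import android.content.Intent;
    import android.content.pm.PackageManager;
    import android.content.pm.ResolveInfo;
    import android.net.Uri;
    import java.util.List;

    public class ServiceMenuBuilder {
        public static List<ResolveInfo> openableServices(Context context, Uri firstObjectUri,
                                                         String appPackageName) {
            // Determine the type of the first object, e.g. "image/png".
            String mimeType = context.getContentResolver().getType(firstObjectUri);

            Intent probe = new Intent(Intent.ACTION_VIEW);
            probe.setDataAndType(firstObjectUri, mimeType);
            probe.setPackage(appPackageName); // only look inside the application behind the identifier

            PackageManager pm = context.getPackageManager();
            // Every ResolveInfo is a component of that application which declares it can open the
            // first object; its label/icon can be shown as a service identifier in the service menu.
            return pm.queryIntentActivities(probe, 0);
        }
    }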
  • if the user drags the first object onto the service identifier of a service that cannot open the first object, the receiving end may display second prompt information, which is used to prompt that the service corresponding to the service identifier cannot open the first object.
  • for the second prompt information, reference may be made to the first prompt information.
  • for example, the receiving end may display the second prompt information by changing the display state of the service identifier, which will not be repeated here.
  • when the user drags the first object to the display interface of the receiving end, the receiving end may also first determine whether the current display interface supports drag-in. When the current display interface of the receiving end is an interface that does not support drag-in, the receiving end may display the application recommendation panel. When the current display interface of the receiving end is an interface that supports drag-in, the receiving end may not display the application recommendation panel.
  • the interface that does not support drag-in may include a system desktop (desktop for short), a system pop-up window, an application interface that does not support drag-in, and the like.
  • the interface supporting drag-in may include an application interface supporting drag-in.
  • the application interface that supports drag-in may be the chat interface of some chat applications installed on the mobile phone.
  • if the current display interface of the receiving end can respond to the drag event, the current display interface can add a View.OnDragListener to listen for the drag event and return a return value (True).
  • the framework layer of the receiving end can judge, based on the return value (True), whether the current display interface supports responding to drag events. If the current display interface does not support responding to drag events, it is an interface that does not support drag-in; if the current display interface supports responding to drag events, it supports drag-in.
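  • a minimal sketch of this check in standard Android terms is shown below: a view of the current display interface registers an OnDragListener and returns true for ACTION_DRAG_STARTED, which is the return value the framework can use to decide that the interface supports drag-in; the class and method names are illustrative.

    import android.view.DragEvent;
    import android.view.View;

    public class DropTargetSetup {
        public static void enableDragIn(View contentView) {
            contentView.setOnDragListener((v, event) -> {
                switch (event.getAction()) {
                    case DragEvent.ACTION_DRAG_STARTED:
                        return true; // "True": this interface supports responding to the drag event
                    case DragEvent.ACTION_DROP:
                        // Handle the dropped first object, e.g. read event.getClipData().
                        return true;
                    default:
                        return true;
                }
            });
        }
    }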
  • in this way, in scenarios where the user's dragging purpose is already an interface that supports drag-in, the user's actual dragging requirements can be better met.
  • the receiving end does not need to display an application recommendation panel, and the interface that supports drag-in can directly respond to the drag event of the first object.
  • for example, if the interface that supports drag-in is the application interface of an application, the application can respond to the drag event of the first object and perform operations such as creating/editing, viewing, attaching, sending, inserting, searching and jumping according to the first object.
  • when the user drags the first object to the display interface of the receiving end, if the receiving end judges that the current display interface is an interface that does not support drag-in (such as the desktop of a mobile phone), the receiving end may also determine the user's intention according to the further interaction action performed after the user drags the first object to the display interface of the receiving end, and determine whether to display the application recommendation panel according to the user's intention.
  • the receiving end may classify user intentions into two types: saving the first object, or looking for an application to open the first object.
  • if the user's further interaction action is to directly end the dragging operation on the first object, the receiving end may determine that the user intends to save the first object.
  • the receiving end may respond to the user dragging the first object to the display interface of the receiving end and directly end the dragging operation on the first object, and save the first object, for example, in a default storage path.
  • the receiving end may determine that the user intends to find a certain application to open the first object.
  • the first object may be a picture, and the user may want to find an application in the receiving end to preview the picture.
  • the receiving end may display the above-mentioned application recommendation panel in response to the trigger condition that the user has not finished dragging the first object and the pointer stays in the display interface of the receiving end for a first duration, so as to achieve the purpose of recommending applications to the user.
  • here, directly ending the dragging operation on the first object may also refer to: the time between when the user drags the first object to the display interface of the receiving end and when the user ends the dragging operation on the first object is less than the above-mentioned first duration.
  • the receiving end may classify user intentions into the following two types: opening the first object by default, or searching for an application to open the first object.
  • if the receiving end judges that the current display interface does not support drag-in, and the further interaction action after the user drags the first object to the display interface of the receiving end is to directly end the dragging operation on the first object (such as loosening/releasing the mouse), the receiving end may determine that the user intends to open the first object in a default manner.
  • the receiving end may, in response to the user dragging the first object to the display interface of the receiving end and directly ending the dragging operation on the first object, open the first object in a default manner, for example, open the first object in a default browser.
  • if the receiving end determines that the current display interface does not support drag-in, and the further interaction action after the user drags the first object to the display interface of the receiving end is that the user does not end the drag operation on the first object and the pointer stays in the display interface of the receiving end for the first duration, the receiving end can determine that the user intends to find a certain application to open the first object, and the receiving end may display the application recommendation panel, which is not repeated here.
  • the above-mentioned trigger condition for displaying the application recommendation panel, "the user has not finished dragging the first object, and the pointer stays in the display interface of the receiving end for a first duration", can also be replaced with "the user has not finished dragging the first object, and the dragging/sliding distance of the pointer on the display interface of the receiving end is greater than a first threshold"; for example, the first threshold can be 3 centimeters (cm), 5 cm, etc., and the size of the first threshold is not limited.
  • that is, if the receiving end determines that the current display interface does not support drag-in, and the further interaction action after the user drags the first object to the display interface of the receiving end is that the user does not end the drag operation on the first object and the drag/slide distance of the pointer on the display interface of the receiving end is greater than the first threshold, the receiving end can display the above-mentioned application recommendation panel.
  • the above two trigger conditions for displaying the application recommendation panel can also be combined in an "and" or "or" relationship. When the two trigger conditions are in an "and" relationship, the receiving end may display the application recommendation panel only when both trigger conditions occur at the same time.
  • when the above two trigger conditions are in an "or" relationship, the receiving end can display the application recommendation panel when either one of the trigger conditions occurs.
  • This application does not limit the triggering conditions for displaying the application recommendation panel at the receiving end.
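  • purely as an illustration of how such a trigger policy could be evaluated (the duration, threshold and class/field names are assumptions, not values from this application), a receiving end might combine the two conditions as follows:

    public class PanelTriggerPolicy {
        private static final long FIRST_DURATION_MS = 1000;   // assumed "first duration"
        private static final float FIRST_THRESHOLD_PX = 300f; // assumed "first threshold" in pixels

        private final boolean useAndRelation; // true: both conditions must hold; false: either one

        public PanelTriggerPolicy(boolean useAndRelation) {
            this.useAndRelation = useAndRelation;
        }

        public boolean shouldShowRecommendationPanel(boolean dragEnded,
                                                     long hoverMillis,
                                                     float dragDistancePx) {
            if (dragEnded) {
                return false; // the user already finished the drag; no panel is needed
            }
            boolean dwellLongEnough = hoverMillis >= FIRST_DURATION_MS;
            boolean movedFarEnough = dragDistancePx > FIRST_THRESHOLD_PX;
            return useAndRelation ? (dwellLongEnough && movedFarEnough)
                                  : (dwellLongEnough || movedFarEnough);
        }
    }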
  • the receiving end can judge the user's intention according to the further interaction after the user drags the first object to the display interface of the receiving end, and determine whether to display the application recommendation panel according to the user's intention, so that the receiving end can know the user's dragging purpose more accurately and provide the user with complete dragging interaction functions based on that purpose.
  • the operation in which the user drags an object to the display interface of the receiving end and directly ends the dragging of the object can be referred to as the third operation.
  • the drag object corresponding to the third operation may also be referred to as a second object.
  • the second object may include the first object described above.
  • when the user drags the first object to the display interface of the receiving end, if the receiving end determines that the current display interface is an interface that supports drag-in (such as an application interface of a third-party application), the receiving end may also judge the user's intention according to the further interaction after the user drags the first object to the display interface of the receiving end, and determine whether to display the application recommendation panel according to the user's intention.
  • the receiving end can divide the user's intention into the following two types: using the relevant functions in the current display interface to perform subsequent operations on the first object (that is, the receiving end responds to the drag event of the first object in the current display interface), or finding an application to open the first object.
  • if the receiving end judges that the current display interface supports drag-in, and the further interaction after the user drags the first object to the display interface of the receiving end is to directly end the dragging operation on the first object, the receiving end may determine that the user intends to perform subsequent operations on the first object using the relevant functions in the current display interface.
  • the receiving end may, in response to the user dragging the first object to the display interface of the receiving end and directly ending the dragging operation on the first object, respond to the drag event of the first object, for example, by creating/editing, viewing, adding as an attachment, sending, inserting, searching and jumping based on the first object.
  • if the further interaction is that the user has not finished dragging the first object and the pointer stays on the display interface of the receiving end for the first duration, and/or the user has not finished dragging the first object and the dragging/sliding distance of the pointer on the display interface of the receiving end is greater than the first threshold, the receiving end can determine that the user does not have a clear drag-in destination and intends to find a certain application to open the first object. At this point, the receiving end may display the application recommendation panel.
  • when the user drags the first object to the display interface of the receiving end, if the current display interface of the receiving end is a desktop, the receiving end may not display the above-mentioned application recommendation panel.
  • the receiving end may, in response to the user dragging the first object onto an application icon (such as the second application identifier), prompt the user that the application corresponding to the application icon can open the first object (that is, supports drag-in); at this time, the user can end the drag operation on the first object (for example, release the mouse), and the receiving end may, in response to the user ending the drag operation on the first object, open the first object through the application corresponding to the application icon.
  • alternatively, the receiving end may prompt the user that the application corresponding to the application icon cannot open the first object (that is, does not support drag-in), and the user can continue to drag the first object onto other application icons.
  • the receiving end may prompt the user whether the application corresponding to the application icon can open the first object by changing the display state of the application icon.
  • FIG. 13A is a schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • the desktop of the mobile phone includes an application icon of application A and an application icon of application B.
  • application A is an application that can open the first object
  • application B is an application that cannot open the first object.
  • the mobile phone may, in response to the user's operation of dragging the first object onto the application icon of application A, change the display state of the application icon of application A to an activatable state (as shown by the dotted line in FIG. 13A, which may represent an activatable state), so as to prompt the user that application A can open the first object.
  • the mobile phone can also change the display state of the application icon of application A (such as brightening, changing the display effect or color, etc.) in other ways to prompt the user that application A can open the first object, which is not limited here.
  • the user may end the dragging operation on the first object (for example, release the mouse), and the mobile phone may open the first object through application A in response to the user ending the dragging operation on the first object.
  • FIG. 13B is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • the mobile phone can, in response to the user's operation of dragging the first object onto the application icon of application B, change the originally displayed "B" pattern in the application icon of application B to display a slash "/", so as to remind the user that application B cannot open the first object.
  • the mobile phone may also change the display state of the application icon of application B (such as darkening, graying out, etc.) in other ways to prompt the user that application B cannot open the first object, and no limitation is set here.
  • the user can continue to drag the first object onto other application icons.
  • the receiving end can also display a corner mark around the drag shadow or pointer (such as a mouse pointer) of the first object (for example, at the upper left corner, lower left corner, upper right corner, or lower right corner) to prompt the user whether the application corresponding to the application icon can open the first object.
  • FIG. 14A is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • the desktop of the mobile phone includes the application icon of application A and the application icon of application B.
  • application A is an application that can open the first object
  • application B is an application that cannot open the first object.
  • the mobile phone may, in response to the user's operation of dragging the first object onto the application icon of application A, display a corner mark in the shape of a plus sign "+" 1401 at the upper left corner of the pointer (or the upper left corner of the first object) to prompt the user that application A can open the first object.
  • the user may end the dragging operation on the first object (for example, release the mouse), and the mobile phone may open the first object through application A in response to the user ending the dragging operation on the first object.
  • FIG. 14B is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • the mobile phone may, in response to the user's operation of dragging the first object onto the application icon of application B, display a corner mark in the shape of a minus sign "-" 1402 at the upper left corner of the pointer (or the upper left corner of the first object) to prompt the user that application B cannot open the first object.
  • the user can continue to drag the first object onto other application icons.
  • the above-mentioned corner marks used to prompt the user that application A can open the first object and/or that application B cannot open the first object may also be in other shapes, which are not limited here.
  • the method of prompting the user whether the application can open the first object by changing the display state of the application icon shown in Figures 13A-13B, and the method of prompting the user whether the application can open the first object by adding a corner mark shown in Figures 14A-14B, are only example descriptions.
  • the receiving end may also use other methods (such as text prompts) to prompt the application to open or fail to open the first object, which is not limited in this application.
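  • as a hedged, illustrative sketch (not the patented implementation) of such icon-state prompting on an Android-style receiving end, an application icon view could react to the drag entering it as follows; the class name and the externally supplied "canOpenFirstObject" check are assumptions.

    import android.view.DragEvent;
    import android.widget.ImageView;

    public class IconDragFeedback {
        public static void bind(ImageView appIcon, boolean canOpenFirstObject) {
            appIcon.setOnDragListener((v, event) -> {
                switch (event.getAction()) {
                    case DragEvent.ACTION_DRAG_ENTERED:
                        // Activate (keep bright) supported targets, grey out unsupported ones.
                        v.setAlpha(canOpenFirstObject ? 1.0f : 0.4f);
                        return true;
                    case DragEvent.ACTION_DRAG_EXITED:
                    case DragEvent.ACTION_DRAG_ENDED:
                        v.setAlpha(1.0f); // restore the normal display state of the icon
                        return true;
                    case DragEvent.ACTION_DROP:
                        return canOpenFirstObject; // only supported applications accept the drop
                    default:
                        return true;
                }
            });
        }
    }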
  • the receiving end prompting that the application corresponding to a certain application icon cannot open the first object may also be regarded as displaying the first prompt information.
  • the first terminal device may display first prompt information in response to the operation of dragging the second object to the third application identifier, and the first prompt information is used to prompt that the application corresponding to the third application identifier does not support dragging the first object .
  • the first prompt information is the display information shown in FIGS. 13A-13B and 14A-14B above.
  • FIGS. 13A-14B only show an application icon of an application that can open the first object, and an application icon of an application that cannot open the first object.
  • the desktop of the receiving end may further include application icons of more applications.
  • similar to the manner in which the receiving end changes the display state of an application icon or adds a corner mark to remind the user that the application corresponding to the application icon can open the first object, and adds a corner mark to remind the user that the application corresponding to the application icon cannot open the first object, the receiving end can also change the display state of an application identifier or add a corner mark to prompt the user that the application corresponding to the application identifier can open the first object, and can also add a corner mark to prompt the user that the application corresponding to the application identifier cannot open the first object.
  • when the user drags the first object to the display interface of the receiving end, if the current display interface of the receiving end is the desktop and the receiving end does not display the application recommendation panel, in some possible scenarios the user's purpose of dragging the first object to the display interface of the receiving end may also be to save the first object on the receiving end or to open the first object in a default manner.
  • for example, the user may drag the first object to a blank area of the desktop of the receiving end (such as an area without application icons) and end the operation of dragging the first object; in response, the receiving end may save the first object or open the first object in a default manner, for example, save the first object in a default storage path.
  • the receiving end may also, in response to the user dragging the first object onto the application icon of an application that cannot open the first object and ending the operation of dragging the first object, save the first object or open the first object in a default manner.
  • the receiving end may also determine the user's intention according to further interaction actions after the user drags the first object to the desktop of the receiving end.
  • if the user's further interaction action after dragging the first object to the desktop of the receiving end is to directly end the dragging operation on the first object (such as loosening/releasing the mouse), the receiving end can determine that the user intends to save the first object or open the first object in a default manner.
  • the receiving end may respond to the user dragging the first object to the desktop of the receiving end and directly ending the operation of dragging the first object, saving the first object or opening the first object in a default mode, such as: saving in the default storage path.
  • otherwise, the receiving end can determine that the user intends to find a certain application to open the first object.
  • the receiving end may, in response to the user's operation of dragging the first object onto an application icon, prompt the user that the application corresponding to the application icon can open the first object (that is, supports drag-in), or prompt the user that the application corresponding to the application icon cannot open the first object (that is, does not support drag-in), according to the method described in the above-mentioned embodiment in which the receiving end does not display the application recommendation panel.
  • when the user drags the first object to the desktop of the receiving end, if the receiving end determines according to the above method that the user intends to find a certain application to open the first object, then when the user drags the first object onto the application icon of an application that cannot open the first object and ends the drag operation on the first object, or when the user drags the first object to a blank area of the desktop and ends the drag operation on the first object, the receiving end can, in response to the aforementioned operation, change the shadow of the first object into a bubble-shaped floating window, such as a floating ball or a bubble (or called a drag bubble), and attach (move) the bubble-shaped floating window corresponding to the first object to the edge of the desktop for display.
  • the bubble-shaped floating window corresponding to the first object can support the user to drag and drop the first object again.
  • when the user performs a drag operation on the bubble-shaped floating window, the corresponding bubble-shaped floating window is changed back into the shadow of the first object for the user to continue dragging the first object.
  • the aforementioned bubble-shaped floating window may be referred to as the first floating window, and in some other embodiments, the first floating window may also be in other shapes, which are not limited.
  • FIG. 15 is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • if the mobile phone determines according to the above method that the user intends to find a certain application to open the first object, then when the user drags the first object onto application B (as mentioned above, application B is an application that cannot open the first object) and ends the drag operation on the first object, or when the user drags the first object to a blank area of the desktop and ends the drag operation on the first object, the mobile phone may, in response to the aforementioned operation, change the shadow of the first object into a drag bubble 1501 and attach the drag bubble 1501 to the edge of the desktop for display.
  • the user can perform a drag operation on the drag bubble 1501, and the mobile phone can change the drag bubble 1501 back into the shadow of the first object for the user to continue dragging the first object.
  • the expression styles of the dragging bubble 1501 and the shadow of the first object may be the same or different, which is not limited here. It should be understood that when the user performs a drag operation on the drag bubble 1501 , the subsequent dragging process of the first object is the same as that of the first object described in the foregoing embodiment, and will not be repeated here.
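  • purely as an illustrative sketch of such an edge-attached drag bubble on an Android-style receiving end (assuming the launcher/system UI may add overlay windows; the class, view and position parameters are assumptions), the interrupted drag shadow could be replaced by a small floating view pinned to the screen edge:

    import android.content.Context;
    import android.graphics.PixelFormat;
    import android.view.Gravity;
    import android.view.View;
    import android.view.WindowManager;

    public class DragBubble {
        public static void attachToEdge(Context context, View bubbleView, int yPosition) {
            WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
            WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                    WindowManager.LayoutParams.WRAP_CONTENT,
                    WindowManager.LayoutParams.WRAP_CONTENT,
                    WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                    WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                    PixelFormat.TRANSLUCENT);
            lp.gravity = Gravity.TOP | Gravity.END; // snap the bubble to the screen edge
            lp.x = 0;                               // flush with the edge
            lp.y = yPosition;                       // keep the height at which the drag was interrupted
            wm.addView(bubbleView, lp);
            // Touching/dragging bubbleView can later restore the drag shadow and resume the drag.
        }
    }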
  • the receiving end may include multiple interfaces.
  • the desktop at the receiving end may include a main interface and one or more other interfaces.
  • Application icons can be included in both the main interface and other interfaces.
  • the receiving end may also support the user to drag the first object to slide and switch between the main interface and other interfaces, so as to select the application that the user wants to use.
  • FIG. 16 is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • the desktop of the mobile phone can include the main interface shown in (a) in Figure 16 and other interfaces shown in (b) in Figure 16, and the main interface shown in (a) in Figure 16 and the other interfaces shown in (b) in Figure 16 can be switched and displayed in the foreground.
  • FIG. 17 is a schematic diagram of a scene of switching a display interface of a mobile phone provided by an embodiment of the present application. As shown in Figure 17, suppose that when the user drags the first object to the desktop of the mobile phone, the main interface shown in (a) in Figure 16 is displayed in the foreground of the mobile phone.
  • when the user drags the first object to the edge of the main interface shown in (a) in Figure 16, the mobile phone can transparently pass the drag movement event to the desktop launcher, and control the desktop to respond to the drag movement event and slide to the next interface.
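  • a hedged sketch of this launcher-side behaviour is shown below: drag-location events near the screen edge make the desktop slide to the adjacent page; the Workspace interface and its snap methods are hypothetical stand-ins for the real desktop pager.

    import android.view.DragEvent;
    import android.view.View;

    public class DesktopEdgeScroll {
        private static final int EDGE_ZONE_PX = 48; // how close to the edge counts as "at the edge"

        public static void install(View desktopRoot, Workspace workspace) {
            desktopRoot.setOnDragListener((v, event) -> {
                if (event.getAction() == DragEvent.ACTION_DRAG_LOCATION) {
                    float x = event.getX();
                    // A real implementation would debounce these calls while the drag lingers at the edge.
                    if (x > v.getWidth() - EDGE_ZONE_PX) {
                        workspace.snapToNextPage();     // slide to the next desktop interface
                    } else if (x < EDGE_ZONE_PX) {
                        workspace.snapToPreviousPage(); // slide back to the previous interface
                    }
                }
                return true; // keep receiving drag events
            });
        }

        /** Hypothetical launcher workspace interface, standing in for the real desktop pager. */
        public interface Workspace {
            void snapToNextPage();
            void snapToPreviousPage();
        }
    }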
  • when the user wants to switch the desktop of the receiving end to the next interface (such as switching from the main interface to another interface), the user can also first end the dragging operation on the first object, triggering the receiving end to change the shadow of the first object into a bubble-shaped floating window and attach (move) the bubble-shaped floating window corresponding to the first object to the edge of the desktop for display.
  • the user can then switch the display interface of the receiving end from the main interface to another interface through a normal screen-switching operation (such as sliding the screen of the mobile phone).
  • during this process, the bubble-shaped floating window corresponding to the first object is still attached to the edge of the desktop (now the other interface) for display, and the user can continue the drag operation on the first object based on the bubble-shaped floating window corresponding to the first object, continue dragging the first object in the other interface, and select an application that can open the first object.
  • the present application does not limit the specific manner of how to trigger the operation of switching the display interface of the mobile phone.
  • the desktop of the receiving end may also include: a folder, and the folder includes one or more application icons.
  • the receiving end may also support the user to drag the first object onto the application icons included in the folder, so that the user can select the desired application from the application icons included in the folder.
  • FIG. 18A is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • the desktop of the mobile phone may include the application icon of application A, the application icon of application B, and "folder 1" 1801, and "folder 1" 1801 may include the application icon of application D, the application icon of application E, and the application icon of application F.
  • FIG. 18B is a schematic diagram of a scene where a mobile phone opens "folder 1" 1801 according to an embodiment of the present application. As shown in FIG. 18B, assuming that the user drags the first object onto "folder 1" 1801 on the desktop of the mobile phone shown in FIG. 18A, the mobile phone may open "folder 1" 1801 and display the display interface corresponding to "folder 1" 1801.
  • the display interface corresponding to "folder 1" 1801 may include an application icon of application D, an application icon of application E, and an application icon of application F. The user may continue to drag the first object, and based on any one of the application icon of application D, the application icon of application E, and the application icon of application F, select the application corresponding to the application icon to open the first object.
  • for example, the mobile phone can simulate a click event and inject it into the system; the simulated event of clicking to open "folder 1" 1801 can trigger the mobile phone to open folder 1 and display the display interface corresponding to folder 1.
  • the desktop of the receiving end may also include: a large folder (such as the first folder).
  • the large folder can also include multiple application icons, and for the multiple application icons included in the large folder, the large folder can directly display at least N of the application icons in a manner similar to the desktop, where N can be, for example, 8, 9, and so on.
  • the receiving end may display all application icons of the large folder in response to the user's operation of clicking the folding button.
  • the receiving end may also support the user to drag the first object onto the application icons included in the large folder, so that the user can select the desired application from the application icons included in the large folder.
  • the application icon included in the aforementioned large folder may also be referred to as a second application identifier.
  • FIG. 19A is another schematic diagram of the desktop of the mobile phone provided by the embodiment of the present application.
  • the desktop of the mobile phone may include "folder 1" 1901, and "folder 1" 1901 is displayed in the form of a large folder.
  • "Folder 1" 1901 includes application icons of applications such as application A, application B, application C...application I, and so on.
  • the application icons of application A to application H can be directly displayed in "folder 1" 1901, and the application icons of application I and the other applications after application I are superimposed and displayed as an overlapping button 1902 in "folder 1" 1901.
  • the mobile phone may respond to the operation of dragging the first object onto the application icon of the application to open the first object through the application.
  • FIG. 19B is a schematic diagram of a scenario where a mobile phone is triggered to display all application icons of "folder 1" 1901 provided by an embodiment of the present application.
  • if the user wants to select application I or one of the other applications after application I to open the first object, the user can first drag the first object to the overlapping button 1902 and hover for a second duration (for example, 1 second); the mobile phone may display all the application icons of "folder 1" 1901 in response to the user's operation of dragging the first object to the overlapping button 1902 and hovering for the second duration.
  • FIG. 19B takes the case where all application icons of "folder 1" 1901 include the application icons of application A to application L as an example.
  • the user can also click the overlapping button 1902 by means of multi-finger touch, triggering the mobile phone to display all the application icons of the "folder 1" 1901 .
  • for example, the user can use another finger to click the overlapping button 1902, and the mobile phone can, in response to the user's operation of clicking the overlapping button 1902, display all the application icons of "folder 1" 1901.
  • in the embodiment in which the receiving end does not display the application recommendation panel, when the user performs the drag operation on the first object and drags the first object onto an application icon (which can be an application icon on the desktop, or an application icon in a folder or a large folder), if the application corresponding to the application icon can open the first object and includes multiple services (or called functions), the receiving end can also display (pop up) the service menu (or called function menu) of the application corresponding to the application icon next to the application icon.
  • the service menu may include the service identifiers of one or more services included in the application corresponding to the application icon.
  • the user may continue the above-mentioned dragging operation on the first object and drag the first object onto a certain service identifier displayed in the service menu; in response, the receiving end may open the first object through the application corresponding to the application icon and start the service corresponding to the service identifier to process/operate on the first object.
  • the display mode of the service menu may be similar to that shown in FIG. 12 above, and the difference is that in this embodiment, the service menu is displayed on the desktop.
  • when the user performs the drag operation on the first object and drags the first object onto an application icon, if the application corresponding to the application icon cannot open the first object, the first prompt information can also be displayed in the manner described in the above-mentioned embodiments to prompt the user.
  • when the user drags the first object to the display interface of the receiving end, if the current display interface of the receiving end is the desktop and the receiving end does not display the application recommendation panel, then when the user performs the drag operation on the first object and drags the first object onto a certain application icon, the service menu corresponding to the application icon displayed by the receiving end may be referred to as a third window.
  • that is, the receiving end may receive the operation of dragging the first object onto the second application identifier and, in response to the operation of dragging the first object onto the second application identifier, display the third window, where the third window includes the service identifiers of one or more services included in the application corresponding to the second application identifier.
  • the receiving end described in the above embodiments opening the first object through the application corresponding to an application identifier (application icon), or displaying the service menu of the application corresponding to the application identifier, in response to the user's operation of dragging the first object onto the application identifier, may refer to: when the user drags the first object onto the application identifier (application icon) and the stay time is longer than a third duration, the receiving end, in response to the user's operation of dragging the first object onto the application identifier (application icon), opens the first object through the application corresponding to the application identifier, or displays the service menu of the application corresponding to the application identifier.
  • the third duration may be 2 seconds, 3 seconds, and so on.
  • similarly, the receiving end opening the first object through the service corresponding to a service identifier in response to the user's operation of dragging the first object onto the service identifier may refer to: when the user drags the first object onto the service identifier and the stay time is longer than the third duration, the receiving end, in response to the user's operation of dragging the first object onto the service identifier, opens the first object through the service corresponding to the service identifier.
  • the above embodiments respectively introduce the cases of displaying the application recommendation panel and not displaying the application recommendation panel.
  • the receiving end may also determine whether to display the application recommendation panel or display the service menu according to the position of the shadow of the first object or the pointer (that is, the position to which the first object is dragged).
  • for example, depending on that position, the receiving end may display the above-mentioned application recommendation panel, or may display the service menu of the application corresponding to the application icon at that position.
  • the display interface of the receiving end may also include a card corresponding to the application.
  • Cards, also known as FA cards, are areas that appear directly on the desktop of the mobile phone and have certain functions; these functions can be provided by the application corresponding to the card.
  • the user can drag the first object onto the card, open the first object through the application corresponding to the card, or drag the first object onto a certain service corresponding to the card, and open the first object through the service.
  • the service corresponding to the card refers to the service that the application corresponding to the card can provide.
  • FIG. 19C is another schematic diagram of the display interface of the mobile phone provided by the embodiment of the present application.
  • the display interface of the mobile phone may include a card corresponding to the gallery.
  • the card corresponding to the gallery may also include the service identifiers of services such as "microfilm creation", "free creation" and "puzzle creation" (such as the patterns and text in the figure).
  • the first object can be dragged onto any of the service identifiers corresponding to the aforementioned "microfilm creation", "free creation", "puzzle creation" and other services, and the mobile phone can, in response to the operation of dragging the first object onto the service identifier, open the first object using the service corresponding to the service identifier. For example, the user can drag the picture onto "puzzle creation", and the mobile phone can start the "puzzle creation" service of the gallery to open the picture for the user to create a collage.
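  • as an illustrative sketch only (the action string, package name and class below are hypothetical and do not denote the real gallery entry point), dropping the picture onto a card service could be dispatched to the corresponding application roughly as follows:

    import android.content.Context;
    import android.content.Intent;
    import android.net.Uri;

    public class CardServiceDispatch {
        public static void openWithCardService(Context context, Uri pictureUri) {
            Intent intent = new Intent("com.example.gallery.action.CREATE_COLLAGE"); // hypothetical action
            intent.setPackage("com.example.gallery");                                // hypothetical package
            intent.setDataAndType(pictureUri, "image/*");
            intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION
                    | Intent.FLAG_ACTIVITY_NEW_TASK);
            context.startActivity(intent); // the gallery opens the picture in its collage service
        }
    }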
  • if a service corresponding to the card cannot open the first object, prompt information may also be displayed to remind the user in the manner described in the foregoing embodiments, for example, the second prompt information; the second prompt information may also be referred to as the first prompt information, which is not limited.
  • the service identifier in the foregoing card may be called a second service identifier.
  • One or more of the application icons, folders, large folders, cards, etc. described in the above embodiments may also be displayed on the display interface of the mobile phone at the same time, and there is no limitation here.
  • in some other embodiments, when the user drags the first object to the display interface of the receiving end, the receiving end can display the application recommendation panel regardless of whether the current display interface of the receiving end is the desktop, but the user can actively close the application recommendation panel.
  • FIG. 20 is another schematic diagram of a display interface of the mobile phone provided by the embodiment of the present application.
  • the receiving end may display an application recommendation panel.
  • the application recommendation panel may include a close button 2001; the user may drag the first object onto the close button 2001 and stay on the close button 2001 for a preset duration (such as a fifth duration), and the mobile phone may, in response to the user's operation of dragging the first object onto the close button 2001 and staying on the close button 2001 for the preset duration, close the application recommendation panel.
  • the mobile phone can simulate an event of clicking the close button 2001 in response to the user's operation of dragging the first object onto the close button 2001 and staying on the close button 2001 for a preset time, and the event of clicking the close button 2001 can trigger the mobile phone Close the app recommendation panel.
  • if the display interface of the mobile phone is currently the desktop, after the mobile phone closes the application recommendation panel, the display interface will be restored to the desktop.
  • the user may drag the first object to an application icon on the desktop according to the manner described in the foregoing embodiments.
  • the user may also click the close button 2001 through multi-finger touch to trigger the mobile phone to close the application recommendation panel. For example, assuming that the user drags the first object to the display interface of the receiving end with a certain finger, the user can use another finger to click the close button 2001, and the mobile phone can close the application recommendation panel in response to the user's operation of clicking the close button 2001.
  • when the user drags the first object to the display interface of the receiving end, the user may also first end the drag operation on the first object, triggering the mobile phone to change the shadow of the first object into a bubble-shaped floating window and attach (move) the bubble-shaped floating window corresponding to the first object to the edge of the desktop for display. Then, the user can click the close button 2001 using a mouse or a finger. After the display interface returns to the desktop, the user can continue to drag the first object based on the bubble-shaped floating window corresponding to the first object, and drag the first object onto an application icon displayed on the desktop according to the method described in the previous embodiment, which will not be repeated here.
  • when the application recommendation panel is displayed in a non-full-screen manner, the user can also click an area outside the area where the application recommendation panel is located in the display interface, or drag the first object to an area other than the area where the application recommendation panel is located and stay there for a preset duration (such as the fifth duration), to trigger the receiving end to close the application recommendation panel.
  • This application does not limit the operation method of closing the application recommendation panel.
  • the above operation of closing the application recommendation panel is also an operation of closing the first window, and the receiving end may close the first window in response to any one of the above operations of closing the first window.
  • after the receiving end responds to the operation of closing the first window, the user may perform an operation of dragging the first object onto the second application identifier, where the second application identifier is one of the application identifiers included in the desktop.
  • in response to the operation of dragging the first object onto the second application identifier, the receiving end may open the application corresponding to the second application identifier and pass the first object to the application corresponding to the second application identifier; or, the receiving end may display a third window (that is, the service menu of the application corresponding to the desktop icon), where the third window includes the service identifiers of one or more services included in the application corresponding to the second application identifier.
  • alternatively, if the application corresponding to the second application identifier does not support dragging in the first object, the receiving end may display the first prompt information, where the first prompt information is used to prompt that the application corresponding to the second application identifier does not support dragging in the first object.
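  • a hedged sketch of "open the application corresponding to the second application identifier and pass the first object to it" in standard Android terms could look like the snippet below; the share-style intent, package name and MIME type are illustrative assumptions rather than the patented mechanism.

    import android.content.Context;
    import android.content.Intent;
    import android.net.Uri;

    public class OpenWithTarget {
        public static void passFirstObject(Context context, Uri firstObjectUri,
                                           String targetPackage, String mimeType) {
            Intent intent = new Intent(Intent.ACTION_SEND);
            intent.setType(mimeType);                             // e.g. "image/png"
            intent.putExtra(Intent.EXTRA_STREAM, firstObjectUri); // the dragged first object
            intent.setPackage(targetPackage);                     // the application behind the icon
            intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION
                    | Intent.FLAG_ACTIVITY_NEW_TASK);
            context.startActivity(intent);
        }
    }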
  • in some scenarios, when the user is dragging the first object, the dragging behavior may be interrupted.
  • the factor causing the interruption of the dragging behavior may be the user's active interruption or passive interruption.
  • the display interface (or application recommendation panel) of the receiving end may include an interruption area, and the user may drag the first object to the interruption area and end the dragging operation on the first object, so as to realize active interruption of the dragging behavior.
  • when the user drags the first object onto the application icon of an application that cannot open the first object and ends the dragging operation on the first object, or drags the first object to a blank area of the desktop and ends the dragging operation on the first object, it may also be considered that the user actively interrupts the dragging behavior.
  • when the receiving end receives a phone call or some notification, or when the receiving end pops up some system pop-up windows, the dragging behavior may also be interrupted; this type of interruption scenario can be considered as passive interruption.
  • when the dragging behavior is interrupted, the receiving end can change the shadow of the first object into a bubble-shaped floating window, such as a floating ball or a bubble (or called a drag bubble), and attach (move) the bubble-shaped floating window corresponding to the first object to the edge of the desktop for display.
  • the bubble-shaped floating window corresponding to the first object can support the user to drag and drop the first object again.
  • when the user performs a drag operation on the bubble-shaped floating window, the corresponding bubble-shaped floating window is changed back into the shadow of the first object for the user to continue dragging the first object.
  • for the bubble-shaped floating window corresponding to the first object, reference may be made to the aforementioned FIG. 15, and it is not shown again in the accompanying drawings here.
  • depending on the scenario, the receiving end may display the application recommendation panel again, or the receiving end may not display the application recommendation panel. That is, when the user re-clicks the bubble-shaped floating window corresponding to the first object and drags it, the response action of the receiving end can be consistent with the response action of the receiving end when the first object was dragged to the display interface of the receiving end for the first time.
  • the aforementioned bubble-shaped floating window may be the first floating window described in the foregoing embodiments, and the position where the first floating window is displayed may be referred to as a first position.
  • the first location may be the edge of the desktop.
  • the application recommendation panel may include not only the application identification, but also service identifications of services corresponding to some or all of the applications. Similar to application identifiers, these service identifiers may be defined or preset; or may be determined by the receiving end according to the type of the first object, which is not limited here. Alternatively, the receiving end may also only display a service recommendation panel (without displaying an application recommendation panel), and the service recommendation panel may include service identifiers of services corresponding to some applications, but not include application identifiers. That is, the first window may include application identifiers corresponding to one or more applications, and/or service identifiers corresponding to one or more services.
  • the application that opens the first object described in the above embodiments is not limited to the application that can display the specific content of the first object by launching the application interface.
  • the application that opens the first object may also include: an application capable of operating the first object and obtaining relevant information contained in the first object, or an application capable of bookmarking the first object, and the like.
  • for example, the first object may be a two-dimensional code picture, and the applications for opening the first object may include: a gallery application capable of displaying the two-dimensional code picture, a code-scanning application capable of recognizing the two-dimensional code (such as a camera), and a collection application (such as a memo, favorites, etc.) capable of saving the two-dimensional code picture.
  • an application that can open the first object is not limited to the explanations given in the preceding examples, and this application does not limit the applications that can open the first object.
  • the foregoing application that can open the first object may be called an application that supports dragging in the first object, and correspondingly, an application that cannot open the first object may be called an application that does not support dragging in the first object.
  • the source end may prompt the user to drag and drop the first object across devices.
  • for example, the source end may display a user interface (UI) motion effect, such as a second UI motion effect, which is used to prompt the user to perform cross-device dragging of the first object.
  • the operation of dragging the first object on the second interface can be called the second operation.
  • the display position of the UI animation effect is related to the orientation of the receiving end relative to the source end.
  • the display position of the UI animation effect can be the edge of the screen of the source end.
  • the edge of the screen is the side where the sink connected to the source is located.
  • the screen of the source terminal can be divided into upper screen edge, lower screen edge, left screen edge, and right screen edge.
  • the display position of the animation effect can be the right screen edge of the source.
  • the source end can gradually enhance the display effect of the UI animation effect, for example, the display range of the UI animation effect can be increased.
  • the source end can also highlight or enhance the color of the UI animation effect in the area related to the position of the dragging shadow of the first object.
  • the color used when the color of the UI animation effect is enhanced can be related to the color of the first object or the color of the display interface of the source end; for example, the color used for the enhancement can be the same as the color of the first object, or, when the display interface of the source end is the desktop, the color used for the enhancement can be the same as or similar to the color of the desktop wallpaper.
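  • one hedged way to obtain such a color on an Android-style source end is sketched below using the AndroidX Palette library (the class name and fallback handling are assumptions); the dominant color of the first object's bitmap, or of the wallpaper bitmap when the desktop is displayed, could drive the enhancement color:

    import android.graphics.Bitmap;
    import android.graphics.Color;
    import androidx.palette.graphics.Palette;

    public class GlowColorPicker {
        /** Returns a color close to the dominant color of the given bitmap. */
        public static int pickGlowColor(Bitmap source, int fallbackColor) {
            // Synchronous generation; in practice this would run off the UI thread.
            Palette palette = Palette.from(source).generate();
            Palette.Swatch dominant = palette.getDominantSwatch();
            return dominant != null ? dominant.getRgb() : fallbackColor;
        }

        public static int defaultFallback() {
            return Color.WHITE;
        }
    }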
  • FIG. 21 is a schematic diagram of a PC displaying UI motion effects provided by an embodiment of the present application.
  • the display interface 2101 of the PC may include a first object 2102 and a mouse pointer (a small arrow in the figure, not marked). The user can move the pointer of the mouse to the first object 2102, click and hold the left button of the mouse and move the mouse, so that the first object 2102 can be dragged.
  • the PC may display a UI animation effect 2103, and the UI animation effect 2103 may be used to prompt the user to perform cross-device dragging of the first object 2102.
  • the display position of the UI dynamic effect 2103 is related to the orientation of the mobile phone relative to the PC.
  • the display position of the UI dynamic effect 2103 may be the right screen edge of the PC.
  • FIG. 22 is another schematic diagram of a PC displaying UI motion effects provided by an embodiment of the present application.
  • during the dragging, the position of the dragging shadow of the first object 2102 in the display interface of the PC gradually approaches the screen edge where the UI motion effect 2103 is located.
  • as the position of the dragging shadow of the first object 2102 in the display interface of the PC gradually approaches the screen edge where the UI motion effect 2103 is located, the PC may gradually increase the display range of the UI motion effect 2103.
  • the PC increasing the display range of the UI motion effect 2103 may include: the PC increasing all areas of the UI motion effect 2103 synchronously, that is, all areas of the UI motion effect 2103 increase to the same extent, for example, by the same width.
  • the area where the display range of the UI dynamic effect 2103 increases may be related to the position of the drag shadow of the first object 2102 in the display interface of the PC.
  • the PC may increase only the partial area of the UI motion effect 2103 that is close to the position of the dragging shadow of the first object 2102 in the display interface of the PC, for example, increasing 30% of the area of the UI motion effect 2103, where that 30% area is the part close to the position of the dragging shadow of the first object 2102 in the display interface of the PC.
  • alternatively, the PC may increase all areas of the UI motion effect 2103, with the part of the UI motion effect 2103 that is close to the position of the dragging shadow of the first object 2102 in the display interface of the PC being enlarged to a greater extent.
  • when the area where the display range of the UI motion effect 2103 increases is related to the position of the dragging shadow of the first object 2102 in the display interface of the PC, an effect of the UI motion effect being attracted and magnified as the dragging shadow of the first object 2102 approaches can be presented.
  • the present application does not limit the specific implementation of the PC increasing the display range of the UI animation effect 2103 .
  • the PC may no longer increase the display range of the UI animation effect 2103.
  • the maximum display range of the UI dynamic effect 2103 may also be preset in the PC, and when the display range of the UI dynamic effect 2103 reaches the maximum display range, the PC will no longer increase the display range of the UI dynamic effect 2103 .
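  • a minimal sketch of this behavior, assuming the growth is driven by the shadow's distance to the edge, interpolated linearly, and clamped to a preset maximum (the names, units, and linear interpolation are assumptions):

```kotlin
// Hypothetical sketch: widen the edge motion effect as the drag shadow approaches
// the edge, and clamp it to a preset maximum width.
fun motionEffectWidth(
    distanceToEdgePx: Float,   // distance from the drag shadow to the animated edge
    triggerDistancePx: Float,  // distance at which the effect starts to grow
    minWidthPx: Float,
    maxWidthPx: Float
): Float {
    if (distanceToEdgePx >= triggerDistancePx) return minWidthPx
    // Closer to the edge -> larger proportion of the growth range.
    val progress = 1f - (distanceToEdgePx / triggerDistancePx)
    return (minWidthPx + progress * (maxWidthPx - minWidthPx)).coerceAtMost(maxWidthPx)
}
```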
  • FIG. 23 is another schematic diagram of a PC displaying UI dynamic effects provided by an embodiment of the present application.
  • the PC may also highlight or color-enhance the area of the UI motion effect 2103 that is related to the position of the dragging shadow of the first object 2102.
  • in FIG. 23, the area filled with oblique lines represents the effect of highlighting or color enhancement.
  • the color used when the UI motion effect 2103 is color-enhanced may be related to the color of the first object 2102 or the color of the display interface of the PC. For example, the color may be the same as the color of the first object 2102, or, when the display interface of the PC is the desktop, the color may be similar to the color of the desktop wallpaper, for example, a color selected from the main tones of the desktop wallpaper.
  • this application does not limit the color used when the UI motion effect 2103 is color-enhanced; for example, the color may also be a default color.
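  • a minimal sketch of this color selection, assuming ARGB integer colors and that a dominant wallpaper color has already been extracted (both assumptions not taken from this application):

```kotlin
// Hypothetical sketch: choose the color used when the edge effect is color-enhanced.
fun enhancementColor(
    objectColor: Int?,            // ARGB color sampled from the dragged object, if any
    isDesktopShown: Boolean,
    dominantWallpaperColor: Int?, // main tone of the desktop wallpaper, if available
    defaultColor: Int
): Int = when {
    isDesktopShown && dominantWallpaperColor != null -> dominantWallpaperColor
    objectColor != null                              -> objectColor
    else                                             -> defaultColor
}
```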
  • the area where the UI motion effect 2103 is highlighted or color-enhanced may be the same as the area occupied by the dragging shadow of the first object 2102 in the display area of the UI motion effect 2103 (this is taken as an example in FIG. 23).
  • the area where the UI animation effect 2103 is highlighted or colored may be larger than the area occupied by the dragging shadow of the first object 2102 in the display area of the UI animation effect 2103 .
  • the area where the UI animation effect 2103 is highlighted or colored can also be a fixed size, regardless of the area occupied by the drag shadow of the first object 2102 in the display area of the UI animation effect 2103. This application does not limit the area where the UI dynamic effect 2103 is highlighted or enhanced in color.
  • the UI motion effect may be kept at a certain distance from the adjacent screen edges (such as the upper and lower screen edges). For example, the area occupied by the UI motion effect at the right screen edge may be the middle 80% of the entire right screen edge, with a distance of 10% from each of the upper and lower screen edges.
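  • a minimal sketch of such a layout along the right screen edge, assuming the 80%/10% proportions given above (the names are illustrative):

```kotlin
// Hypothetical sketch: lay out the motion effect along the right screen edge so it
// covers the middle 80% of the edge, leaving a 10% margin at the top and bottom.
data class Span(val start: Float, val end: Float)

fun motionEffectSpan(screenHeightPx: Float, marginRatio: Float = 0.10f): Span =
    Span(start = screenHeightPx * marginRatio,
         end = screenHeightPx * (1f - marginRatio))
```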
  • although the display effect of the UI motion effect is shown as a wave shape in FIG. 21 to FIG. 23 above, in some other examples the UI motion effect may also be presented as a rectangle, or in other regular or irregular shapes; the present application does not limit the shape of the UI motion effect.
  • the above embodiments take the source end being a PC and the receiving end being a mobile phone as an example to illustrate how the source end uses the UI motion effect to prompt the user to drag the first object across devices, and how the UI motion effect changes as the first object moves in the display interface of the source end.
  • the edge of the screen of the receiver end may also display a UI motion effect, reflecting the effect of the first object passing through.
  • the UI animation effect displayed on the screen edge of the receiving end may be called the first UI animation effect
  • the UI animation effect displayed on the screen edge of the source end may be called the second UI animation effect.
  • the first UI dynamic effect may be used to prompt the user that the first object is dragged into the display interface (such as the first interface) of the receiving end.
  • FIG. 24 is a schematic diagram of a mobile phone displaying UI motion effects provided by an embodiment of the present application.
  • a UI animation effect 2401 may be displayed on the screen edge of the mobile phone.
  • the edge of the screen where the mobile phone displays UI animation effects 2401 is opposite to the edge of the screen where the PC displays UI animation effects 2103.
  • for example, when the PC displays the UI motion effect 2103 on the right screen edge, the mobile phone displays the UI motion effect 2401 on the left screen edge.
  • the UI animation effect 2401 displayed on the mobile phone and the UI animation effect 2103 displayed on the PC may be the same (for example, both are in a wave shape), or they may be different.
  • the UI dynamic effect 2401 displayed on the mobile phone is also in the wave shape as an example.
  • the UI dynamic effect 2401 displayed on the mobile phone may also be rectangular, or other regular or irregular shapes.
  • the area where the mobile phone displays the UI animation effect 2401 is related to the position where the mouse pointer or the dragged shadow of the first object 2102 is dragged into the display interface of the mobile phone.
  • the mobile phone may display the UI motion effect 2401 within a certain range (such as a preset distance range) above and below the position where the dragging shadow of the first object 2102 is dragged in on the display interface of the mobile phone (that is, the position where it appears on the edge of the mobile phone screen).
  • the position where the dragging shadow of the first object 2102 is dragged in on the display interface of the mobile phone can be determined according to the position where the dragging shadow of the first object 2102 is dragged out of the display interface of the PC. For example, if the distance between the position where the dragging shadow of the first object 2102 is dragged out of the display interface of the PC and the upper (top) screen edge of the PC is 30% of the distance between the upper and lower screen edges, then the distance between the position where the dragging shadow of the first object 2102 is dragged in on the display interface of the mobile phone and the upper (top) screen edge of the mobile phone may also be 30% of the distance between the upper and lower screen edges.
  • the position where the dragged shadow of the first object 2102 is dragged in on the display interface of the mobile phone may also be a relative receiving position in the physical space, which is not limited here.
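  • a minimal sketch of this proportional mapping, assuming both positions are measured in pixels from the top screen edge (the names are illustrative):

```kotlin
// Hypothetical sketch: map the position where the shadow leaves the source screen to
// the position where it enters the sink screen, keeping the same ratio along the edge.
fun entryPositionOnSink(
    exitYOnSourcePx: Float,   // vertical position where the shadow left the source edge
    sourceHeightPx: Float,
    sinkHeightPx: Float
): Float {
    val ratio = exitYOnSourcePx / sourceHeightPx   // e.g. 0.30 = 30% from the top edge
    return ratio * sinkHeightPx
}
```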
  • FIG. 25 is another schematic diagram of a mobile phone displaying UI motion effects provided by an embodiment of the present application.
  • as the user continues dragging, the position of the dragging shadow of the first object 2102 in the display interface of the mobile phone gradually moves away from the screen edge where the UI motion effect 2401 is located.
  • the mobile phone can gradually increase the display range of the UI dynamic effect 2401 .
  • the specific method for the mobile phone to gradually increase the display range of the UI motion effect 2401 is similar to the method used by the PC to increase the display range of the UI motion effect 2103: all areas of the UI motion effect 2401 may be increased synchronously, or the area where the display range increases may be related to the position of the dragging shadow of the first object 2102 in the display interface of the mobile phone.
  • the mobile phone may no longer increase the display range of UI animation 2401.
  • the maximum display range of the UI dynamic effect 2401 may also be preset in the mobile phone, and when the display range of the UI dynamic effect 2401 reaches the maximum display range, the mobile phone no longer increases the display range of the UI dynamic effect 2401 .
  • the mobile phone may not increase the display range of the UI dynamic effect 2401. That is, the display range of the UI animation effect 2401 may remain unchanged.
  • while the first object 2102 is being dragged so that its dragging shadow moves away from the screen edge, the mobile phone may also highlight or color-enhance the area of the UI motion effect 2401 that is related to the location of the dragging shadow of the first object 2102.
  • the area filled with oblique lines in Figure 25 represents the effect of highlighting or color enhancement.
  • the color used when the UI motion effect 2401 is color-enhanced may be related to the color of the first object 2102 or the color of the display interface of the mobile phone. For example, the color may be the same as the color of the first object 2102, or, when the display interface of the mobile phone is the desktop, the color may be similar to the color of the desktop wallpaper, for example, a color selected from the main tones of the desktop wallpaper.
  • this application does not limit the color used when the UI motion effect 2401 is color-enhanced; for example, the color may also be a default color.
  • the area where the UI motion effect 2401 is highlighted or color-enhanced may be the same as the area occupied by the dragging shadow of the first object 2102 in the display area of the UI motion effect 2401 (this is taken as an example in FIG. 25).
  • the area where the UI motion effect 2401 is highlighted or color-enhanced may also be larger than the area occupied by the dragging shadow of the first object 2102 in the display area of the UI motion effect 2401.
  • the area where the UI animation effect 2401 is highlighted or colored can also be a fixed size, regardless of the area occupied by the drag shadow of the first object 2102 in the display area of the UI animation effect 2401. This application does not limit the area where the UI dynamic effect 2401 is highlighted or enhanced in color.
  • FIG. 26 is another schematic diagram of a mobile phone displaying UI motion effects provided by an embodiment of the present application.
  • the user continues to drag the first object 2102 on the display interface of the mobile phone, so that the dragging shadow of the first object 2102 moves away from the edge of the screen; the mobile phone may also display a trailing effect 2601 of the first object 2102. That is, when the first object 2102 is dragged until its dragging shadow is away from the edge of the screen, the mobile phone may display the trailing effect 2601 of the first object 2102.
  • the smearing effect 2601 may be presented by means of highlight projection display.
  • the display area of the trailing effect 2601 may be greater than, smaller than, or equal to the display area of the dragging shadow of the first object 2102 .
  • the display area of the trailing effect 2601 may move following the dragged shadow of the first object 2102 .
  • the trailing effect 2601 may gradually become smaller or remain unchanged, and the trailing effect 2601 may gradually fade until it disappears.
  • after the dragging shadow of the first object 2102 has moved more than a preset distance, the trailing effect 2601 may disappear, that is, the mobile phone no longer displays the trailing effect.
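  • a minimal sketch of this fading behavior, assuming a linear fade over a preset distance (the names and the linear relationship are assumptions):

```kotlin
// Hypothetical sketch: fade the trailing effect as the drag shadow moves away from the
// edge, and stop drawing it once the shadow has travelled a preset distance.
// Returns null when the trail should no longer be shown.
fun trailAlpha(travelledPx: Float, fadeOutDistancePx: Float): Float? {
    if (travelledPx >= fadeOutDistancePx) return null
    return 1f - (travelledPx / fadeOutDistancePx)   // 1.0 at the edge, fading toward 0
}
```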
  • FIG. 27 is another schematic diagram of a mobile phone displaying UI motion effects provided by an embodiment of the present application.
  • the mobile phone may no longer display the UI animation 2401 in response to the user finishing the dragging operation of the first object 2102 . That is, after the user releases the first object 2102, the UI dynamic effect 2401 at the edge of the mobile phone screen can disappear.
  • the UI dynamic effects (including the UI dynamic effects 2103 displayed on the PC and the UI dynamic effects 2401 displayed on the mobile phone) and the trailing effects described in the above embodiments are all illustrative descriptions.
  • the UI motion effect can also be called UI traversal motion effect, traversal motion effect, traversal light effect, etc.
  • the trailing effect may also be called a trailing motion effect, a highlight projection, and the like; this application does not limit the names of effects such as the trailing effect.
  • an embodiment of the present application provides a cross-device dragging apparatus, which can be applied to the above-mentioned receiving end to implement the steps performed by the receiving end in the cross-device dragging method described in the foregoing embodiments.
  • the receiving end may be the first terminal device.
  • the functions of the apparatus may be implemented by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules or units corresponding to the steps performed by the receiving end in the above cross-device dragging method.
  • FIG. 28 is a schematic structural diagram of a cross-device dragging apparatus provided by an embodiment of the present application.
  • as shown in FIG. 28, the apparatus may include: a display unit 2801, a receiving unit 2802, and a processing unit 2803.
  • the display unit 2801 , the receiving unit 2802 , and the processing unit 2803 may be configured to cooperate to implement functions corresponding to the steps performed by the receiving end in the cross-device drag method described in the foregoing embodiments.
  • for example, the display unit 2801 may be used to display the first interface; the receiving unit 2802 may be used to receive the first operation; and the processing unit 2803 may be used to control the display unit to display the first window in response to the first operation.
  • the display unit 2801, the receiving unit 2802, and the processing unit 2803 can cooperate to implement the functions corresponding to all the steps performed by the receiving end in the cross-device drag method described in the foregoing embodiments, which will not be repeated here.
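  • a minimal sketch of how the three units of FIG. 28 could cooperate (the interface names and the callback style are assumptions; the source-end apparatus of FIG. 29 could be sketched analogously, with the second UI motion effect shown in place of the first window):

```kotlin
// Hypothetical sketch of the cooperation described for FIG. 28; all names are illustrative.
interface DisplayUnit {
    fun showFirstInterface()
    fun showFirstWindow(identifiers: List<String>)   // recommendation panel contents
}

interface ReceivingUnit {
    // Registers a callback fired when the first operation (cross-device drag-in) arrives.
    fun onFirstOperation(handler: (draggedObjectType: String) -> Unit)
}

class ProcessingUnit(private val display: DisplayUnit, receiving: ReceivingUnit) {
    init {
        display.showFirstInterface()
        receiving.onFirstOperation { objectType ->
            // In response to the first operation, ask the display unit to show the
            // first window with identifiers recommended for the dragged object's type.
            display.showFirstWindow(recommendFor(objectType))
        }
    }

    private fun recommendFor(objectType: String): List<String> =
        emptyList() // placeholder: the selection logic is described in the embodiments above
}
```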
  • An embodiment of the present application also provides a cross-device dragging device, which can be applied to the above-mentioned source end to implement the steps performed by the source end in the cross-device drag method described in the foregoing embodiments.
  • the source end may be the second terminal device.
  • the functions of the apparatus may be implemented by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules or units corresponding to the steps executed by the source in the above cross-device drag method.
  • FIG. 29 is another schematic structural diagram of a cross-device dragging apparatus provided by an embodiment of the present application.
  • as shown in FIG. 29, the apparatus may include: a display unit 2901, a receiving unit 2902, and a processing unit 2903.
  • the display unit 2901 , the receiving unit 2902 , and the processing unit 2903 may be configured to cooperate to implement functions corresponding to the steps performed by the source in the cross-device drag method described in the foregoing embodiments.
  • for example, the display unit 2901 may be used to display the second interface; the receiving unit 2902 may be used to receive the second operation; and the processing unit 2903 may be used to, in response to the second operation, control the display unit to display the second UI motion effect on the screen edge of the second interface.
  • the display unit 2901, the receiving unit 2902, and the processing unit 2903 can cooperate to realize the functions corresponding to all the steps performed by the source in the cross-device drag method described in the foregoing embodiments, which will not be repeated here.
  • the division of modules (or called units) in the above device is only a division of logical functions, and may be fully or partially integrated into a physical entity or physically separated during actual implementation.
  • the units in the apparatus may all be implemented in the form of software invoked by a processing element, or all in the form of hardware; alternatively, some units may be implemented in the form of software invoked by a processing element and other units in the form of hardware.
  • each unit may be a separately disposed processing element, or may be integrated in a certain chip of the apparatus; in addition, a unit may also be stored in the memory in the form of a program, which is invoked and executed by a certain processing element of the apparatus to perform the function of the unit. Moreover, all or part of these units may be integrated together, or implemented independently.
  • the processing element described here may also be referred to as a processor, and may be an integrated circuit with signal processing capability. During implementation, each step of the above method, or each of the above units, may be implemented by an integrated logic circuit of hardware in the processor element, or in the form of software invoked by the processing element.
  • the units in the above apparatus may be one or more integrated circuits configured to implement the above method, for example: one or more application specific integrated circuits (ASIC), or one or more digital signal processors (DSP), or one or more field programmable gate arrays (FPGA), or a combination of at least two of these integrated circuit forms.
  • the processing element can be a general-purpose processor, such as a central processing unit (central processing unit, CPU) or other processors that can call programs.
  • these units can be integrated together and implemented in the form of a system-on-a-chip (SOC).
  • the units of the above apparatus that implement the corresponding steps in the above method may be implemented in the form of a processing element scheduling a program.
  • the device may include a processing element and a storage element, and the processing element invokes a program stored in the storage element to execute the steps performed by the source end or the sink end in the methods described in the above method embodiments.
  • the storage element may be a storage element on the same chip as the processing element, that is, an on-chip storage element.
  • the program for executing the above method may be stored in a storage element on a different chip from the processing element, that is, an off-chip storage element.
  • the processing element invokes or loads a program from the off-chip storage element to the on-chip storage element, so as to invoke and execute the steps performed by the source end or the sink end in the methods described in the above method embodiments.
  • the embodiment of the present application also provides an electronic device.
  • the electronic device may be the aforementioned source or sink.
  • the electronic device includes: a processor, and a memory for storing processor-executable instructions; when the processor is configured to execute the instructions, the electronic device implements the steps performed by the source end or the receiving end in the methods described in the foregoing embodiments.
  • the memory can be located inside the electronic device or outside the electronic device.
  • there may be one or more processors.
  • the electronic device may be a mobile phone, or a tablet computer, a wearable device, a vehicle-mounted device, an AR/VR device, a notebook computer, a UMPC, a netbook, a PDA, and the like.
  • the units of the electronic device that implement the steps in the above method may be configured as one or more processing elements, where the processing elements may be integrated circuits, for example: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of these types of integrated circuits; these integrated circuits may be integrated together to form a chip.
  • an embodiment of the present application further provides a chip, and the chip can be applied to the above-mentioned electronic device.
  • the chip includes one or more interface circuits and one or more processors; the interface circuits and the processors are interconnected through lines; the processor receives and executes computer instructions from the memory of the electronic device through the interface circuits, so as to implement the steps performed by the source end or the receiving end in the methods described in the foregoing embodiments.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • when the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
  • the software product is stored in a program product, such as a computer-readable storage medium, and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive (U disk), a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.
  • an embodiment of the present application also provides a computer-readable storage medium, on which computer program instructions are stored; when the computer program instructions are executed by an electronic device, the electronic device implements the steps performed by the source end or the receiving end in the methods described in the foregoing embodiments.
  • an embodiment of the present application further provides a computer program product, including computer-readable code, or a non-volatile computer-readable storage medium carrying computer-readable code; when the computer-readable code runs in an electronic device, a processor in the electronic device implements the steps performed by the source end or the receiving end in the methods described in the foregoing embodiments.


Abstract

This application provides a cross-device dragging method, an electronic device, and a storage medium, and relates to the field of electronic devices. In the method, a first terminal device may display a first interface and receive a first operation, where the first operation is an operation of dragging a first object from the display interface of a second terminal device to the first interface. In response to the first operation, the first terminal device may display a first window, where the first window includes application identifiers corresponding to one or more applications and/or service identifiers corresponding to one or more services. In this application, when the first object is dragged from the second terminal device to the first terminal device, the first terminal device can respond to the dragging of the first object more intelligently, and the user's way of performing cross-device dragging is simpler.

Description

跨设备拖拽方法、电子设备及存储介质
本申请要求于2021年10月18日提交国家知识产权局、申请号为202111210629.5、申请名称为“跨设备拖拽方法、电子设备及存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及电子设备领域,尤其涉及一种跨设备拖拽方法、电子设备及存储介质。
背景技术
在多设备协同场景中,用户可以将终端设备1中的对象拖拽到终端设备2中(即跨设备拖拽),在终端设备2中打开或保存。例如,被拖拽的对象可以包括:文件(如文档、图片、音乐、视频等)、文本/文字内容、应用图标、微件/部件(widget)等。其中,终端设备1可以被称作源(source)端或拖出端,终端设备2可以被称作接收(sink)端或拖入端。
目前,一种实现方式中,源端和接收端之间实现跨设备拖拽时,需要接收端有明确的拖入应用,并且用户需要提前在接收端打开该拖入应用,然后用户可以将源端中的对象拖入接收端中的该拖入应用中。该拖入应用可以保存或打开拖拽过来的对象。或者,另外一种实现方式中,用户没有在接收端提前打开拖入应用时,用户也可以将源端中的对象拖入接收端的桌面,接收端可以将被拖拽过来的对象保存到本地的默认存储路径(如文件管理器)或使用默认的应用程序(如浏览器)打开。
上述现有的源端和接收端之间实现跨设备拖拽的方式较为繁琐,接收端对被拖拽对象的响应不够智能。
发明内容
本申请实施例提供一种跨设备拖拽方法、电子设备及存储介质,用户在不同设备之间对某个对象进行跨设备拖拽时,接收端可以更加智能地响应对该对象的拖拽。
第一方面,本申请实施例提供一种跨设备拖拽方法,所述方法应用于第一终端设备;所述方法包括:第一终端设备显示第一界面;第一终端设备接收第一操作,第一操作为将第一对象由第二终端设备的显示界面拖拽至第一界面的操作;响应于第一操作,第一终端设备显示第一窗口;第一窗口包括一个或多个应用对应的应用标识,和/或,一个或多个服务对应的服务标识。
该方法可以适用于在任意两个终端设备(如第一终端设备和第二终端设备)之间进行拖拽对象的场景。以用户将第二终端设备中的对象拖拽到第一终端设备中为例,该方法中,用户将第二终端设备中的对象拖拽到第一终端设备中时,用户无需提前在第一终端设备打开可以打开第一对象的应用界面,第一终端设备可以为用户推荐第一窗口,第一窗口包括一个或多个应用对应的应用标识,和/或,一个或多个服务对应的服务标识。用户在拖拽对象的过程中,可以根据第一窗口中显示的应用标识或服务标识,快速地选择打开该对象的应用程序或服务。拖拽方式更加简单,第一终端设备对 被拖拽对象的响应更加智能。
一种可能的实现方式中,第一窗口中包括的应用标识与第一对象的类型相关。
可选地,所述第一终端设备显示第一窗口之前,所述方法还包括:第一终端设备获取第一对象的类型;第一终端设备根据第一对象的类型,确定第一窗口中包括的应用标识。
一些实现方式中,所述第一终端设备根据第一对象的类型,确定第一窗口中包括的应用标识,包括:第一终端设备根据第一对象的类型,从第一终端设备安装的所有应用中确定支持拖入第一对象的应用;第一终端设备将支持拖入第一对象的应用的应用标识,作为第一窗口中包括的应用标识。
例如,第一终端设备接收到来自第二终端设备的第一消息后,可以根据第一消息中包括的第一对象确定第一对象的类型,或者,第一消息中可以单独包含一个字段用于指示第一对象的类型,第一终端设备可以根据该字段确定第一对象的类型。第一终端设备可以根据第一对象的类型,从第一终端设备安装的所有应用中选择可以打开第一对象的应用,并在第一窗口中显示这些可以打开第一对象的应用的应用标识。
另一种可能的实现方式中,第一窗口中包括的应用标识为预设的应用标识。
例如,用户或第一终端设备的服务厂家可以在预先配置在第一窗口中显示哪些应用的应用标识。
又一种可能的实现方式中,第一窗口中包括的应用标识为第一终端设备上安装的所有应用的应用标识。
例如,假设第一终端设备上安装有N个应用(N为大于0的整数),则第一窗口中可以包括第一终端设备安装的N个应用的应用图标。
可选地,所述方法还包括:第一终端设备接收将第一对象拖拽至第一应用标识的操作;第一应用标识为第一窗口中包括的应用标识中的一个;响应于将第一对象拖拽至第一应用标识的操作,当第一应用标识对应的应用为支持拖入第一对象的应用时,第一终端设备显示第二窗口,第二窗口包括第一应用标识对应的应用包括的一个或多个服务的服务标识;或者,当第一应用标识对应的应用为不支持拖入第一对象的应用时,第一终端设备显示第一提示信息,第一提示信息用于提示第一应用标识对应的应用不支持拖入第一对象。
当第一应用标识对应的应用为支持拖入第一对象的应用时,第一终端设备显示第二窗口,可以供用户将第一对象直接拖拽到第一应用标识对应的应用中的某个服务上打开。当第一应用标识对应的应用为不支持拖入第一对象的应用时,第一终端设备显示第一提示信息,可以提示第一应用标识对应的应用无法打开第一对象。
一种可能的实现方式中,第二窗口中包括的服务标识与第一对象的类型相关。
可选地,所述第一终端设备显示第二窗口之前,所述方法还包括:第一终端设备获取第一对象的类型;第一终端设备根据第一对象的类型、以及第一应用标识对应的应用包括的所有服务,确定第二窗口中包括的服务标识。
一些实现方式中,所述第一终端设备根据第一对象的类型、以及第一应用标识对应的应用包括的所有服务,确定第二窗口中包括的服务标识,包括:第一终端设备根据第一对象的类型,从第一应用标识对应的应用包括的所有服务中确定支持拖入第一 对象的服务;第一终端设备将支持拖入第一对象的服务的服务标识,作为第二窗口中包括的服务标识。
例如,第一终端设备接收到第一消息后,可以根据第一消息中包括的第一对象确定第一对象的类型,或者,第一消息中可以单独包含一个字段用于指示第一对象的类型,第一终端设备可以根据该字段确定第一对象的类型。当用户将第一对象拖拽至第一窗口中的第一应用标识上时,第一终端设备可以根据第一对象的类型,从第一应用标识对应的应用包括的所有服务中选择可以打开第一对象的服务,并在第二窗口中显示这些打开第一对象的服务的服务标识。
另一种可能的实现方式中,第二窗口中包括的服务标识为预设的服务标识。
例如,用户或第一终端设备的服务厂家可以在预先配置每个应用对应的第二窗口中包括哪些服务标识。也即,第二窗口中包括的服务标识是固定的。
又一种可能的实现方式中,第二窗口中包括的服务标识为第一应用标识对应的应用包括的所有服务的服务标识。
例如,假设第一应用标识对应的应用包括N个服务(N为大于0的整数),则第一应用标识对应的应用对应的第二窗口中可以包括前述N个服务的服务标识。
一些可能的实现方式中,第一界面包括支持拖入的界面和不支持拖入的界面;所述第一终端设备显示第一窗口,包括:当第一界面为不支持拖入的界面时,第一终端设备显示第一窗口。
示例性地,不支持拖入的界面可以包括系统桌面(简称桌面)、系统弹窗、不支持拖入的应用界面等。支持拖入的界面可以包括支持拖入的应用界面。例如,以第一终端设备为手机为例,支持拖入的应用界面可以是手机上安装的一些聊天应用的聊天界面。
一些可能的实现方式中,所述第一终端设备显示第一窗口,包括:当第一对象被拖拽至第一界面后,对第一对象的拖拽操作未结束、且指针在第一界面中的停留时间达到第一时长,和/或,对第一对象的拖拽操作未结束、且指针在第一界面中的滑动距离大于第一阈值时,第一终端设备显示第一窗口。
例如,当第一对象被拖拽至第一界面后进一步的交互动作为:用户未结束对第一对象的拖拽操作,且指针(如鼠标的指针)在第一界面中停留时间达到第一时长,则第一终端设备可以确定用户意图为寻找某个应用打开第一对象,此时,第一终端设备可以显示第一窗口。或者,当第一对象被拖拽至第一界面后进一步的交互动作为:用户未结束对第一对象的拖拽操作,且指针在接收端的显示界面中的拖动/滑动距离大于第一阈值,则第一终端设备可以确定用户意图为寻找某个应用打开第一对象,此时,第一终端设备可以显示第一窗口。又或者,当第一对象被拖拽至第一界面后进一步的交互动作为:用户未结束对第一对象的拖拽操作,且指针(如鼠标的指针)在第一界面中停留时间达到第一时长、指针在接收端的显示界面中的拖动/滑动距离大于第一阈值,则第一终端设备可以确定用户意图为寻找某个应用打开第一对象,此时,第一终端设备可以显示第一窗口。
本方式中,第一终端设备可以根据第一对象被拖拽至第一界面后,用户进一步的交互动作来判断用户意图,并根据用户意图确定是否显示第一窗口。
可选地,第一界面为不支持拖入的界面;所述方法还包括:第一终端设备接收第三操作,第三操作为将第二对象由第二终端设备的显示界面拖拽至第一界面后,直接结束对第二对象进行拖拽的操作;响应于第三操作,第一终端设备保存第二对象,或者,打开默认应用、并将第二对象传递给默认应用。
示例性地,直接结束对第一对象的拖拽操作可以是指:用户将第一对象拖拽到第一界面后至用户结束对第一对象的拖拽操作之间的时间小于第一时长。
可选地,所述方法还包括:第一终端设备接收将第一对象拖拽至第一应用标识的操作;第一应用标识为第一窗口中包括的应用标识中的一个;响应于将第一对象拖拽至第一应用标识的操作,当第一应用标识对应的应用为支持拖入第一对象的应用时,第一终端设备打开第一应用标识对应的应用,并将第一对象传递给第一应用标识对应的应用。或者,当第一应用标识对应的应用为不支持拖入第一对象的应用时,第一终端设备显示第一提示信息,第一提示信息用于提示第一应用标识对应的应用不支持拖入第一对象。
一些可能的实现方式中,所述第一终端设备显示第一提示信息,包括:第一终端设备通过改变第一应用标识的显示状态显示第一提示信息。
例如,第一终端设备可以通过将第一应用标识变暗或灰化来作为第一提示信息进行显示。
可选地,所述方法还包括:第一终端设备接收将第一对象拖拽至第一服务标识的操作;第一服务标识为第二窗口中包括的服务标识中的一个;响应于将第一对象拖拽至第一服务标识的操作,当第一服务标识对应的服务为支持拖入第一对象的服务时,第一终端设备打开第一服务标识对应的服务,并将第一对象传递给第一服务标识对应的服务。或者,当第一服务标识对应的服务为不支持拖入第一对象的服务时,第一终端设备显示第二提示信息,第二提示信息用于提示第一服务标识对应的服务不支持拖入第一对象。
一些可能的实现方式中,所述方法还包括:第一终端设备接收第四操作,第四操作为将第二对象由第二终端设备的显示界面拖拽至第一界面中的第一区域的操作;第一区域为空白区域;响应于第四操作,第一终端设备保存第二对象。
本方式中,用户可以将拖拽对象拖拽至第一区域,以触发第一终端设备保存该拖拽对象。
另外一些可能的实现方式中,第一窗口中包括第二区域或第一图标;所述方法还包括:第一终端设备接收将第一对象拖拽至第二区域或第一图标上的操作;响应于将第一对象拖拽至第二区域或第一图标上的操作,第一终端设备保存第一对象。
本方式中,用户可以将拖拽对象拖拽至第二区域或第一图标,以触发第一终端设备保存该拖拽对象。
一些可能的实现方式中,第一界面为系统桌面,第一界面中包括一个或多个应用标识;所述方法还包括:第一终端设备接收关闭第一窗口的操作;响应于关闭第一窗口的操作,第一终端设备关闭第一窗口;第一终端设备接收将第一对象拖拽至第二应用标识的操作;第二应用标识为系统桌面中包括的应用标识中的一个;响应于将第一对象拖拽至第二应用标识的操作,当第二应用标识对应的应用为支持拖入第一对象的 应用时,第一终端设备打开第二应用标识对应的应用,并将第一对象传递给第二应用标识对应的应用;或者,第一终端设备显示第三窗口,第三窗口包括第二应用标识对应的应用包括的一个或多个服务的服务标识。或者,当第二应用标识对应的应用为不支持拖入第一对象的应用时,第一终端设备显示第一提示信息,第一提示信息用于提示第二应用标识对应的应用不支持拖入第一对象。
本方式中,当用户不想选择第一窗口中推荐的应用打开第一对象时,可以主动关闭第一窗口,从桌面选择其他应用打开第一对象。
可选地,所述第一终端设备显示第一窗口,包括:第一终端设备以全屏显示的方式、或者非全屏显示的方式、又或者抽屉显示的方式显示所述第一窗口。
示例性地,非全屏显示具体可以包括半屏显示(即第一窗口的区域占第一终端设备的显示屏幕区域的一半)、三分之一屏显示(即第一窗口的区域占第一终端设备的显示屏幕区域的三分之一)等,不作限制。
一些可能的实现方式中,所述方法还包括:当第一终端设备检测到第一对象在第一终端设备上被拖拽的拖拽行为发生中断时,第一终端设备在第一界面中的第一位置显示第一对象对应的第一浮窗。
示例性地,第一位置可以是桌面(如第一界面)的边缘。
可选地,所述方法还包括:第一终端设备接收对第一浮窗的拖拽操作;响应于对第一浮窗的拖拽操作,第一终端设备将第一浮窗改变为第一对象的拖拽阴影,供用户继续对第一对象进行拖拽。
本方式中,当用户对第一对象的拖拽行为中断后,用户可以通过第一浮窗继续对第一对象的拖拽操作,无需重新从第二终端设备拖拽第一对象。
可选地,所述方法还包括:当第一对象被拖入第一界面时,第一终端设备在第一界面的屏幕边缘显示第一UI动效;第一UI动效用于提示用户第一对象被拖入第一界面。
一种可能的实现方式中,第一UI动效的显示区域与第一对象的拖拽阴影在第一界面中被拖入的位置相关。
例如,第一终端设备可以在第一对象的拖拽阴影在第一界面中被拖入的位置(即出现在第一终端设备屏幕边缘上的位置)上下一定范围(如预设的距离范围)内显示第一UI动效。其中,第一对象的拖拽阴影在第一显示界面中被拖入的位置可以根据第一对象的拖拽阴影在第二终端设备的显示界面中被拖出的位置确定,如:第一对象的拖拽阴影在第二终端设备的显示界面中被拖出的位置距离第二终端设备的上侧(顶部)屏幕边缘之间的距离为上下两侧屏幕之间距离的30%,则第一对象的拖拽阴影在第一界面中被拖入的位置距离第一终端设备的上侧(顶部)屏幕边缘之间的距离也可以为上下两侧屏幕之间距离的30%。或者,第一对象的拖拽阴影在第一界面中被拖入的位置也可以是物理空间内相对的接收位置,在此不作限制。
可选地,所述方法还包括:在第一对象被拖拽至第一对象的拖拽阴影远离屏幕边缘的过程中,第一终端设备将第一UI动效的显示区域中与第一对象的拖拽阴影的所在位置相关的区域进行高亮显示或颜色加强。
示例性地,第一UI动效颜色加强时的颜色可以和第一对象的颜色或第一界面的颜 色相关,如:第一UI动效颜色加强时的颜色可以和第一对象的颜色相同,或者,第一界面为桌面时,第一UI动效颜色加强时的颜色可以和桌面壁纸的颜色相近,如:选择桌面壁纸主色调中的颜色。
可选地,所述方法还包括:在第一对象被拖拽至第一对象的拖拽阴影远离屏幕边缘的过程中,第一终端设备显示第一对象的拖尾效果;拖尾效果的显示区域跟随第一对象的拖拽阴影而移动;在拖尾效果的显示区域跟随第一对象的拖拽阴影移动的过程中,拖尾效果逐渐变小或保持不变,且拖尾效果的显示亮度和/或颜色逐渐变淡;当第一对象的拖拽阴影移动超过预设的距离后,第一终端设备不再显示拖尾效果。
可选地,所述方法还包括:响应于结束对第一对象进行拖拽的操作,第一终端设备不再显示第一UI动效。
第二方面,本申请实施例提供一种跨设备拖拽装置,该装置可以应用于上述第一方面中所述的第一终端设备,以使第一终端设备实现如第一方面及第一方面的任意一种可能的实现方式所述的方法。该装置的功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与第一方面及第一方面的任意一种可能的实现方式所述的方法中的步骤相对应的模块或单元。
例如,该装置可以包括:显示单元、接收单元、处理单元等,显示单元、接收单元和处理单元可以配合实现如第一方面及第一方面的任意一种可能的实现方式所述的方法。如:显示单元可以用于显示第一界面;接收单元可以用于接收第一操作;处理单元可以用于响应于第一操作,控制显示单元显示第一窗口等。类似地,显示单元、接收单元和处理单元可以配合实现如第一方面及第一方面的任意一种可能的实现方式所述的方法的全部步骤对应的功能,在此不再一一赘述。
第三方面,本申请实施例提供一种电子设备,该电子设备可以是上述第一方面中所述的第一终端设备。电子设备包括:处理器,用于存储处理器可执行指令的存储器;处理器被配置为执行所述指令时,使得电子设备实现如第一方面及第一方面的任意一种可能的实现方式所述的方法。
第四方面,本申请实施例提供一种计算机可读存储介质,其上存储有计算机程序指令;当所述计算机程序指令被电子设备执行时,使得电子设备实现如第一方面及第一方面的任意一种可能的实现方式所述的方法。
第五方面,本申请实施例提供一种计算机程序产品,包括计算机可读代码,或者承载有计算机可读代码的非易失性计算机可读存储介质,当所述计算机可读代码在电子设备中运行时,所述电子设备中的处理器实现如第一方面及第一方面的任意一种可能的实现方式所述的方法。
上述第二方面至第五方面所具备的有益效果,可参考第一方面中所述,在此不再赘述。
第六方面,本申请实施例还提供一种跨设备拖拽方法,所述方法应用于第一终端设备;所述方法包括:第一终端设备显示第一界面;第一界面包括一个或多个应用的应用标识;第一终端设备接收将第一对象拖拽至第二应用标识的操作;第二应用标识为第一界面中包括的应用标识中的一个、且第二应用标识对应的应用为支持拖入第一对象的应用;第一对象来自第二终端设备的显示界面;响应于将第一对象拖拽至第二 应用标识的操作,第一终端设备打开第二应用标识对应的应用,并将第一对象传递给第二应用标识对应的应用;或者,第一终端设备显示第三窗口,第三窗口包括第二应用标识对应的应用包括的一个或多个服务的服务标识。
可选地,所述方法还包括:第一终端设备接收将第二对象拖拽至第三应用标识的操作;第三应用标识为第一界面中包括的应用标识中的一个、且第三应用标识对应的应用为不支持拖入第二对象的应用;第二对象来自第二终端设备的显示界面;响应于将第二对象拖拽至第三应用标识的操作,第一终端设备显示第一提示信息,第一提示信息用于提示第三应用标识对应的应用不支持拖入第一对象。
一种可能的实现方式中,应用标识包括应用图标或卡片。
一种可能的实现方式中,卡片包括一个或多个服务标识;所述方法还包括:第一终端设备接收将第一对象拖拽至第二服务标识的操作;第二服务标识为卡片包括的服务标识中的一个;响应于将第一对象拖拽至第二服务标识的操作,当第二服务标识对应的服务为支持拖入第一对象的服务时,第一终端设备打开第二服务标识对应的服务,并将第一对象传递给第二服务标识对应的服务;或者,当第二服务标识对应的服务为不支持拖入第一对象的服务时,第一终端设备显示第二提示信息,第二提示信息用于提示第二服务标识对应的服务不支持拖入第一对象。
一种可能的实现方式中,第一界面包括第一文件夹;第一文件夹包括一个或多个应用的应用标识;第二应用标识为第一文件夹包括的应用标识中的一个;所述将第一对象拖拽至第二应用标识的操作,包括:打开第一文件夹,并将第一对象拖拽至第二应用标识的操作。
可选地,所述方法还包括:当第一对象被拖入第一界面时,第一终端设备在第一界面的屏幕边缘显示第一UI动效;第一UI动效用于提示用户第一对象被拖入第一界面。
一种可能的实现方式中,第一UI动效的显示区域与第一对象的拖拽阴影在第一界面中被拖入的位置相关。
例如,第一终端设备可以在第一对象的拖拽阴影在第一界面中被拖入的位置(即出现在第一终端设备屏幕边缘上的位置)上下一定范围(如预设的距离范围)内显示第一UI动效。其中,第一对象的拖拽阴影在第一显示界面中被拖入的位置可以根据第一对象的拖拽阴影在第二终端设备的显示界面中被拖出的位置确定,如:第一对象的拖拽阴影在第二终端设备的显示界面中被拖出的位置距离第二终端设备的上侧(顶部)屏幕边缘之间的距离为上下两侧屏幕之间距离的30%,则第一对象的拖拽阴影在第一界面中被拖入的位置距离第一终端设备的上侧(顶部)屏幕边缘之间的距离也可以为上下两侧屏幕之间距离的30%。或者,第一对象的拖拽阴影在第一界面中被拖入的位置也可以是物理空间内相对的接收位置,在此不作限制。
可选地,所述方法还包括:在第一对象被拖拽至第一对象的拖拽阴影远离屏幕边缘的过程中,第一终端设备将第一UI动效的显示区域中与第一对象的拖拽阴影的所在位置相关的区域进行高亮显示或颜色加强。
示例性地,第一UI动效颜色加强时的颜色可以和第一对象的颜色或第一界面的颜色相关,如:第一UI动效颜色加强时的颜色可以和第一对象的颜色相同,或者,第一 界面为桌面时,第一UI动效颜色加强时的颜色可以和桌面壁纸的颜色相近,如:选择桌面壁纸主色调中的颜色。
可选地,所述方法还包括:在第一对象被拖拽至第一对象的拖拽阴影远离屏幕边缘的过程中,第一终端设备显示第一对象的拖尾效果;拖尾效果的显示区域跟随第一对象的拖拽阴影而移动;在拖尾效果的显示区域跟随第一对象的拖拽阴影移动的过程中,拖尾效果逐渐变小或保持不变,且拖尾效果的显示亮度和/或颜色逐渐变淡;当第一对象的拖拽阴影移动超过预设的距离后,第一终端设备不再显示拖尾效果。
可选地,所述方法还包括:响应于结束对第一对象进行拖拽的操作,第一终端设备不再显示第一UI动效。
第七方面,本申请实施例提供一种跨设备拖拽装置,该装置可以应用于上述第六方面中所述的第一终端设备,以使第一终端设备实现如第六方面及第六方面的任意一种可能的实现方式所述的方法。该装置的功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与第六方面及第六方面的任意一种可能的实现方式所述的方法中的步骤相对应的模块或单元。
例如,该装置可以包括:显示单元、接收单元、处理单元等,显示单元、接收单元和处理单元可以配合实现如第六方面及第六方面的任意一种可能的实现方式所述的方法。如:显示单元可以用于显示第一界面;接收单元可以用于接收将第一对象拖拽至第二应用标识的操作;处理单元可以用于响应于将第一对象拖拽至第二应用标识的操作,打开第二应用标识对应的应用,并将第一对象传递给第二应用标识对应的应用;或者,控制显示单元显示第三窗口等。类似地,显示单元、接收单元和处理单元可以配合实现如第六方面及第六方面的任意一种可能的实现方式所述的方法的全部步骤对应的功能,在此不再一一赘述。
第八方面,本申请实施例提供一种电子设备,该电子设备可以是上述第六方面中所述的第一终端设备。电子设备包括:处理器,用于存储处理器可执行指令的存储器;处理器被配置为执行所述指令时,使得电子设备实现如第六方面及第六方面的任意一种可能的实现方式所述的方法。
第九方面,本申请实施例提供一种计算机可读存储介质,其上存储有计算机程序指令;当所述计算机程序指令被电子设备执行时,使得电子设备实现如第六方面及第六方面的任意一种可能的实现方式所述的方法。
第十方面,本申请实施例提供一种计算机程序产品,包括计算机可读代码,或者承载有计算机可读代码的非易失性计算机可读存储介质,当所述计算机可读代码在电子设备中运行时,所述电子设备中的处理器实现如第六方面及第六方面的任意一种可能的实现方式所述的方法。
上述第七方面至第十方面所具备的有益效果,可参考第六方面中所述,在此不再赘述。
第十一方面,本申请实施例还提供一种跨设备拖拽方法,所述方法应用于第二终端设备;所述方法包括:第二终端设备显示第二界面,第二界面包括第一对象;响应于第二操作,第二终端设备在第二界面的屏幕边缘显示第二UI动效;第二操作为在第二界面拖拽第一对象的操作;第二UI动效用于提示用户能够对第一对象进行跨设备拖 拽;第二UI动效的显示位置与第一终端设备相对于第二终端设备的方位相关,第一终端设备与第二终端设备连接。
例如,第二UI动效的显示位置可以是第二终端设备的屏幕边缘,且该屏幕边缘为与第二终端设备连接的第一终端设备所在的一侧。例如,第二终端设备的屏幕可以分为上侧屏幕边缘、下侧屏幕边缘、左侧屏幕边缘、以及右侧屏幕边缘,则当与第二终端设备连接的第一终端设备所在的一侧为第二终端设备的右侧时,第二UI动效的显示位置可以是第二终端设备的右侧屏幕边缘。
可选地,所述方法还包括:在第一对象的拖拽阴影在第二界面中所在的位置逐渐靠近第二UI动效所在的屏幕边缘的过程中,第二终端设备逐渐增大第二UI动效的显示范围。
一种可能的实现方式中,第二UI动效的显示范围增大的区域与第一对象的拖拽阴影在第二界面中所在的位置相关。
用户将第一对象由第二界面向第一界面进行拖拽的过程中,第一对象的拖拽阴影在第二界面中所在的位置会逐渐靠近第二UI动效,或者说逐渐靠近第二UI动效所在的屏幕边缘。随着第一对象的拖拽阴影在第二界面中所在的位置逐渐靠近第二UI动效所在的屏幕边缘,第二终端设备可以逐步加强第二UI动效的显示效果,如:可以增大第二UI动效的显示范围。
可选地,所述方法还包括:在第一对象被拖拽至第二UI动效的显示区域中时,第二终端设备将第二UI动效的显示区域中与第一对象的拖拽阴影的所在位置相关的区域进行高亮显示或颜色加强。
示例性地,第二UI动效颜色加强时的颜色可以和第一对象的颜色或第二界面的颜色相关,如:第二UI动效颜色加强时的颜色可以和第一对象的颜色相同,或者,第二界面为桌面时,第二UI动效颜色加强时的颜色可以和桌面壁纸的颜色相同或相近。
第十二方面,本申请实施例提供一种跨设备拖拽装置,该装置可以应用于上述第十一方面中所述的第二终端设备,以使第二终端设备实现如第十一方面及第十一方面的任意一种可能的实现方式所述的方法。该装置的功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与第十一方面及第十一方面的任意一种可能的实现方式所述的方法中的步骤相对应的模块或单元。
例如,该装置可以包括:显示单元、接收单元、处理单元等,显示单元、接收单元和处理单元可以配合实现如第十一方面及第十一方面的任意一种可能的实现方式所述的方法。如:显示单元可以用于显示第二界面;接收单元可以用于接收第二操作;处理单元可以用于响应于第二操作,控制显示单元在第二界面的屏幕边缘显示第二UI动效等。类似地,显示单元、接收单元和处理单元可以配合实现如第十一方面及第十一方面的任意一种可能的实现方式所述的方法的全部步骤对应的功能,在此不再一一赘述。
第十三方面,本申请实施例提供一种电子设备,该电子设备可以是上述第十一方面中所述的第二终端设备。电子设备包括:处理器,用于存储处理器可执行指令的存储器;处理器被配置为执行所述指令时,使得电子设备实现如第十一方面及第十一方面的任意一种可能的实现方式所述的方法。
第十四方面,本申请实施例提供一种计算机可读存储介质,其上存储有计算机程序指令;当所述计算机程序指令被电子设备执行时,使得电子设备实现如第十一方面及第十一方面的任意一种可能的实现方式所述的方法。
第十五方面,本申请实施例提供一种计算机程序产品,包括计算机可读代码,或者承载有计算机可读代码的非易失性计算机可读存储介质,当所述计算机可读代码在电子设备中运行时,所述电子设备中的处理器实现如第十一方面及第十一方面的任意一种可能的实现方式所述的方法。
上述第十二方面至第十五方面所具备的有益效果,可参考第十一方面中所述,在此不再赘述。
第十六方面,本申请实施例提供一种跨设备拖拽方法,所述方法应用于第一终端设备;所述方法包括:第一终端设备显示第一界面;当第二终端设备显示的第二界面中的第一对象被拖入第一界面时,第一终端设备在第一界面的屏幕边缘显示第一UI动效;第一UI动效用于提示用户第一对象被拖入第一界面。
一种可能的实现方式中,第一UI动效的显示区域与第一对象的拖拽阴影在第一界面中被拖入的位置相关。
例如,第一终端设备可以在第一对象的拖拽阴影在第一界面中被拖入的位置(即出现在第一终端设备屏幕边缘上的位置)上下一定范围(如预设的距离范围)内显示第一UI动效。其中,第一对象的拖拽阴影在第一显示界面中被拖入的位置可以根据第一对象的拖拽阴影在第二终端设备的显示界面中被拖出的位置确定,如:第一对象的拖拽阴影在第二终端设备的显示界面中被拖出的位置距离第二终端设备的上侧(顶部)屏幕边缘之间的距离为上下两侧屏幕之间距离的30%,则第一对象的拖拽阴影在第一界面中被拖入的位置距离第一终端设备的上侧(顶部)屏幕边缘之间的距离也可以为上下两侧屏幕之间距离的30%。或者,第一对象的拖拽阴影在第一界面中被拖入的位置也可以是物理空间内相对的接收位置,在此不作限制。
可选地,所述方法还包括:在第一对象被拖拽至第一对象的拖拽阴影远离屏幕边缘的过程中,第一终端设备将第一UI动效的显示区域中与第一对象的拖拽阴影的所在位置相关的区域进行高亮显示或颜色加强。
示例性地,第一UI动效颜色加强时的颜色可以和第一对象的颜色或第一界面的颜色相关,如:第一UI动效颜色加强时的颜色可以和第一对象的颜色相同,或者,第一界面为桌面时,第一UI动效颜色加强时的颜色可以和桌面壁纸的颜色相近,如:选择桌面壁纸主色调中的颜色。
可选地,所述方法还包括:在第一对象被拖拽至第一对象的拖拽阴影远离屏幕边缘的过程中,第一终端设备显示第一对象的拖尾效果;拖尾效果的显示区域跟随第一对象的拖拽阴影而移动;在拖尾效果的显示区域跟随第一对象的拖拽阴影移动的过程中,拖尾效果逐渐变小或保持不变,且拖尾效果的显示亮度和/或颜色逐渐变淡;当第一对象的拖拽阴影移动超过预设的距离后,第一终端设备不再显示拖尾效果。
可选地,所述方法还包括:响应于结束对第一对象进行拖拽的操作,第一终端设备不再显示第一UI动效。
第十七方面,本申请实施例提供一种跨设备拖拽装置,该装置可以应用于上述第 十六方面中所述的第一终端设备,以使第一终端设备实现如第十六方面及第十六方面的任意一种可能的实现方式所述的方法。该装置的功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与第十六方面及第十六方面的任意一种可能的实现方式所述的方法中的步骤相对应的模块或单元。
例如,该装置可以包括:显示单元、接收单元、处理单元等,显示单元、接收单元和处理单元可以配合实现如第十六方面及第十六方面的任意一种可能的实现方式所述的方法。如:显示单元可以用于显示第一界面;接收单元可以用于接收第一对象被拖入第一界面的操作;处理单元可以用于当第二终端设备显示的第二界面中的第一对象被拖入第一界面时,控制显示单元在第一界面的屏幕边缘显示第一UI动效等。类似地,显示单元、接收单元和处理单元可以配合实现如第十六方面及第十六方面的任意一种可能的实现方式所述的方法的全部步骤对应的功能,在此不再一一赘述。
第十八方面,本申请实施例提供一种电子设备,该电子设备可以是上述第十六方面中所述的第一终端设备。电子设备包括:处理器,用于存储处理器可执行指令的存储器;处理器被配置为执行所述指令时,使得电子设备实现如第十六方面及第十六方面的任意一种可能的实现方式所述的方法。
第十九方面,本申请实施例提供一种计算机可读存储介质,其上存储有计算机程序指令;当所述计算机程序指令被电子设备执行时,使得电子设备实现如第十六方面及第十六方面的任意一种可能的实现方式所述的方法。
第二十方面,本申请实施例提供一种计算机程序产品,包括计算机可读代码,或者承载有计算机可读代码的非易失性计算机可读存储介质,当所述计算机可读代码在电子设备中运行时,所述电子设备中的处理器实现如第十六方面及第十六方面的任意一种可能的实现方式所述的方法。
上述第十七方面至第二十方面所具备的有益效果,可参考第十六方面中所述,在此不再赘述。
应当理解的是,本申请中对技术特征、技术方案、有益效果或类似语言的描述并不是暗示在任意的单个实施例中可以实现所有的特点和优点。相反,可以理解的是对于特征或有益效果的描述意味着在至少一个实施例中包括特定的技术特征、技术方案或有益效果。因此,本说明书中对于技术特征、技术方案或有益效果的描述并不一定是指相同的实施例。进而,还可以任何适当的方式组合本实施例中所描述的技术特征、技术方案和有益效果。本领域技术人员将会理解,无需特定实施例的一个或多个特定的技术特征、技术方案或有益效果即可实现实施例。在其他实施例中,还可在没有体现所有实施例的特定实施例中识别出额外的技术特征和有益效果。
附图说明
图1为本申请实施例提供的一种手机的结构示意图;
图2为本申请实施例提供的多设备混合拖拽系统的组成示意图;
图3为本申请实施例提供的跨设备拖拽方法的流程示意图;
图4为本申请实施例提供的基于非投屏基础上进行拖拽的场景示意图;
图5为本申请实施例提供的基于投屏基础上进行拖拽的场景示意图;
图6为本申请实施例提供的手机的显示界面的示意图;
图7为本申请实施例提供的手机的显示界面的另一示意图;
图8为本申请实施例提供的手机的显示界面的又一示意图;
图9为本申请实施例提供的手机的显示界面的又一示意图;
图10A为本申请实施例提供的手机的显示界面的又一示意图;
图10B为本申请实施例提供的用户触发手机以抽屉显示的方式显示应用推荐面板的场景示意图;
图10C为本申请实施例提供的手机的显示界面的又一示意图;
图11为本申请实施例提供的手机的显示界面的又一示意图;
图12为本申请实施例提供的手机的显示界面的又一示意图;
图13A为本申请实施例提供的手机的桌面的示意图;
图13B为本申请实施例提供的手机的桌面的另一示意图;
图14A为本申请实施例提供的手机的桌面的又一示意图;
图14B为本申请实施例提供的手机的桌面的又一示意图;
图15为本申请实施例提供的手机的桌面的又一示意图;
图16为本申请实施例提供的手机的桌面的又一示意图;
图17为本申请实施例提供的手机切换显示界面的场景示意图;
图18A为本申请实施例提供的手机的桌面的又一示意图;
图18B为本申请实施例提供的手机打开“文件夹1”1801的场景示意图;
图19A为本申请实施例提供的手机的桌面的又一示意图;
图19B为本申请实施例提供的触发手机显示“文件夹1”1901的全部应用图标的场景示意图;
图19C为本申请实施例提供的手机的显示界面的又一示意图;
图20为本申请实施例提供的手机的显示界面的又一示意图;
图21为本申请实施例提供的PC显示UI动效的示意图;
图22为本申请实施例提供的PC显示UI动效的另一示意图;
图23为本申请实施例提供的PC显示UI动效的又一示意图;
图24为本申请实施例提供的手机显示UI动效的示意图;
图25为本申请实施例提供的手机显示UI动效的另一示意图;
图26为本申请实施例提供的手机显示UI动效的又一示意图;
图27为本申请实施例提供的手机显示UI动效的又一示意图;
图28为本申请实施例提供的跨设备拖拽装置的结构示意图;
图29为本申请实施例提供的跨设备拖拽装置的另一结构示意图。
具体实施方式
以下实施例中所使用的术语只是为了描述特定实施例的目的,而并非旨在作为对本申请的限制。如在本申请的说明书和所附权利要求书中所使用的那样,单数表达形式“一个”、“一种”、“所述”、“上述”、“该”和“这一”旨在也包括例如“一个或多个”这种表达形式,除非其上下文中明确地有相反指示。还应当理解,在本申请以下各实施例中,“至少一个”、“一个或多个”是指一个或两个以上(包含两个)。字符“/”一般表示前后关联对象是一种“或”的关系。
在本说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。术语“连接”包括直接连接和间接连接,除非另外说明。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。
在本申请实施例中,“示例性地”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性地”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性地”或者“例如”等词旨在以具体方式呈现相关概念。
手机、平板电脑、个人电脑(personal computer,PC)、智能家居设备(如电视机)等多个(如至少两个)终端设备可以进行协同使用,多个终端设备进行协同的场景可以称为多设备协同场景。在多设备协同场景中,用户可以将终端设备1中的对象拖拽到终端设备2中(即跨设备拖拽),在终端设备2中打开或保存。例如,被拖拽的对象可以包括:文件(如文档、图片、音乐、视频等)、文本/文字内容、应用图标、微件(widget)等。
其中,终端设备1可以被称作源(source)端或拖出端,终端设备2可以被称作接收(sink)端或拖入端。需要说明的是,在一对关系中作为源端的设备,在另一对关系中也可能为接收端,也就是说,对于一个终端设备来说,其既可能是作为另一个终端设备的源端,也可能作为另一个终端设备的接收端。
目前,一种实现方式中,源端和接收端之间实现跨设备拖拽时,需要接收端有明确的拖入应用,并且用户需要提前在接收端打开该拖入应用,然后用户可以将源端中的对象拖入接收端中的该拖入应用中。该拖入应用可以保存或打开拖拽过来的对象。或者,另外一种实现方式中,用户没有在接收端提前打开拖入应用时,用户也可以将源端中的对象拖入接收端的桌面,接收端可以将被拖拽过来的对象保存到本地的默认存储路径(如文件管理器)或使用默认的应用程序(如浏览器)打开。
上述现有的源端和接收端之间实现跨设备拖拽的方式较为繁琐,接收端对被拖拽对象的响应不够智能。
在此背景技术下,本申请实施例提供了一种跨设备拖拽方法,该方法可以适用于在任意两个终端设备之间进行拖拽对象的场景。该方法中,用户将源端中的对象拖拽到接收端中时,接收端可以为用户推荐一个或多个应用程序(简称应用)。用户在拖拽对象的过程中,可以根据接收端推荐的应用程序,快速地选择打开该对象的应用程序,或者,保存该对象。
可选地,在本申请实施例中,上述源端和接收端可通过有线或无线的方式建立连接。基于建立的连接,源端和接收端可配合一起使用。源端和接收端采用无线方式建 立连接时采用的无线通信协议可以为无线保真(wireless fidelity,Wi-Fi)协议、蓝牙(bluetooth)协议、ZigBee协议、近距离无线通信(near field communication,NFC)协议等,还可以是各种蜂窝网协议,在此不做具体限制。
其中,源端和接收端分别可以是手机,平板电脑,手持计算机,PC,蜂窝电话,个人数字助理(personal digital assistant,PDA),可穿戴式设备(如智能手表),智能家居设备(如电视机),车载电脑,游戏机,以及增强现实(augmented reality,AR)\虚拟现实(virtual reality,VR)设备等。本申请对源端和接收端的具体设备形态不作限制。
示例性地,以源端或接收端为手机为例,图1为本申请实施例提供的一种手机的结构示意图。如图1所示,手机可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以是手机的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,SIM接口,和/或USB接口等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行手机的各种功能应用以及数据处理。例如,当手机作为源端时,手机可以通过执行内部存储器121的 指令,实现本申请实施例提供的跨设备拖拽方法中源端所执行的步骤。或者,当手机作为接收端时,手机可以通过执行内部存储器121的指令,实现本申请实施例提供的跨设备拖拽方法中接收端所执行的步骤。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储手机使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
充电管理模块140用于从充电器接收充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为手机供电。电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141也可接收电池142的输入为手机供电。
手机的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。手机中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在手机上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在手机上的包括无线局域网(wireless local area networks,WLAN)(如Wi-Fi网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),NFC,红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还 可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,手机的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得手机可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。例如,手机可以通过移动通信模块150耦合和无线通信模块160与另一个手机进行交互,如:手机可以作为源端向另一个作为接收端的手机发送第一消息、第二消息等。
手机可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。当有触摸操作作用于显示屏194,手机根据压力传感器180A检测所述触摸操作强度。手机也可以根据压力传感器180A的检测信号计算触摸的位置。
陀螺仪传感器180B可以用于确定手机的运动姿态。气压传感器180C用于测量气压。磁传感器180D包括霍尔传感器。手机可以利用磁传感器180D检测翻盖皮套的开合。加速度传感器180E可检测手机在各个方向上(一般为三轴)加速度的大小。距离传感器180F,用于测量距离。手机可以利用接近光传感器180G检测用户手持手机贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。环境光传感器180L用于感知环境光亮度。指纹传感器180H用于采集指纹。手机可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。温度传感器180J用于检测温度。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于手机的表面,与显示屏194 所处的位置不同。
骨传导传感器180M可以获取振动信号。按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
手机可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。在一些实施例中,手机可以包括1个或N个摄像头193,N为大于1的正整数。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,手机可以包括1个或N个显示屏194,N为大于1的正整数。
手机通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和手机的接触和分离。手机可以支持1个或N个SIM卡接口,N为大于1的正整数。手机通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,手机采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在手机中,不能和手机分离。
可以理解的是,图1所示的结构并不构成对手机的具体限定。在另一些实施例中,手机可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
另外,当源端或接收端是平板电脑,手持计算机,PC,蜂窝电话,个人数字助理(personal digital assistant,PDA),可穿戴式设备(如智能手表),智能家居设备(如电视机),车载电脑,游戏机,以及增强现实(augmented reality,AR)\虚拟现实(virtual reality,VR)设备等其他终端设备时,这些其他终端设备的结构也可以参考上述图1所示。其区别在于,这些其他终端设备的结构可能是在图1给出的结构的基础上增加或减少了组件。
本申请实施例中,每个终端设备(包括源端和接收端)可以包括一个多设备混合拖拽系统,源端和接收端可以分别基于各自的多设备混合拖拽系统实现本申请实施例中所述的功能。示例性地,图2为本申请实施例提供的多设备混合拖拽系统的组成示意图。如图2所示,多设备混合拖拽系统可以包括:UI层、跨设备能力层、事件处理模块、拖拽事件生产者、拖拽事件消费者、消息模块(DragDropEventBus)。
其中,UI层可以用于终端设备实现显示拖拽过程中的阴影(shadow)、气泡(bubbles),接收拖拽事件已控制相关窗口的显示、隐藏等功能。
跨设备能力层用于终端设备实现控制整个拖拽的生命周期,保证拖拽shadow跟手势或者指针动作保持一致等功能。其中,指针可以是指鼠标指针、手写笔指针、触摸板指针等。指针可以是一个静态或动态的图像,在不同情况下指针的表现样式也可能有所不同。
事件处理模块可以用于终端设备实现处理拖拽事件生产者和拖拽事件消费者两端的拖拽事件,以及鼠标穿越、点击、移动事件等功能。
当终端设备作为源端时,可以作为拖拽动作的发起设备,使用拖拽事件生产者相关的模块执行相应的功能。当终端设备作为接收端时,可以作为拖拽动作的接收设备,使用拖拽事件消费者相关的模块执行相应的功能。
消息模块可以作为拖拽事件、鼠标事件、手势事件等的传输通道,终端设备可以通过消息模块与其他设备进行事件或信息的传输。如:源端和接收端之间可以分别基于各自的消息模块进行信息交互。
可选地,本申请实施例中所述的源端和接收端可以是触屏的、也可以是非触屏的。例如,源端是触屏的、接收端是非触屏的;或者,源端是非触屏的,接收端是触屏的;又或者,源端和接收端均是触屏的,或均是非触屏的。对于源端和接收端中的任意一个终端设备,当该终端设备是触屏的设备时,用户可以通过手指、触控笔等在该终端设备的显示屏幕(触控屏)上点击、滑动等方式对该终端设备进行控制,如用户可以通过手指在该终端设备的显示屏幕上拖拽某个对象。当该终端设备是非触屏的设备时,该终端设备可以连接鼠标、键盘、触控面板等输入设备,用户可以通过输入设备对该终端设备进行控制,如用户可以通过鼠标在该终端设备的显示屏幕上拖拽某个对象。或者,当该终端设备是触屏的设备时,该终端设备也可以连接鼠标、键盘、触控面板等输入设备,用户可以通过输入设备对该终端设备进行控制,在此不作限制。
本申请实施例中,源端可以是第二终端设备,接收端可以是第一终端设备。或者,源端可以是第一终端设备,接收端可以是第二终端设备。
示例性地,图3为本申请实施例提供的跨设备拖拽方法的流程示意图。如图3所示,该跨设备拖拽方法可以包括:
S301、源端接收对第一对象的拖拽操作。
可选地,源端可以显示一个界面,该界面中包括第一对象。第一对象可以是源端的显示界面中的文本(或称为文字,text)、文件、文件夹、窗口、部件等。文件可以包括以下一种或多种格式的文件,如word文档,Excel工作簿,PowerPoint演示文稿,位图,图像文件,纯文本文件,声音文件,影片文件,flash动画文件,网页文件,压缩文件等。
示例性地,当源端是触屏的设备时,对第一对象的拖拽操作可以是用户通过手指、触控笔等在源端的显示屏幕(触控屏)上点击并拖动第一对象进行移动的操作。当源端是非触屏的设备时,源端可以与的鼠标、键盘、触控面板等输入设备连接,并在显示界面中显示输入设备对应的指针,对第一对象的拖拽操作可以是用户可以通过输入设备控制指针拖动第一对象进行移动的操作。
S302、源端响应于对第一对象的拖拽操作,显示第一对象的拖拽阴影,并向接收端发送第一消息,第一消息包括第一对象、以及第一对象的拖拽阴影。
示例性地,第一消息可以是开始拖拽(drag start)消息。第一消息中包括的第一对象也即被拖拽的具体内容。
相应地,接收端接收第一消息。接收端接收到第一消息后,可以执行S303。
S303、接收端根据第一消息,生成第一对象的拖拽阴影。
S304、源端检测到第一对象被拖拽至源端的显示界面的边缘后,源端向接收端发送第二消息,第二消息用于通知接收端第一对象将被拖拽至接收端的显示界面上。
相应地,接收端接收第二消息。
可选地,源端检测到第一对象被拖拽至源端的显示界面的边缘可以是指源端检测到第一对象被拖拽的位置与源端的显示界面的边缘的距离小于预设的距离阈值,如:小于2个像素点,或者,小于0.1厘米(cm)等。
一种可能的实现方式中,源端可根据指针的初始位置和相对位移确定指针在显示屏上的坐标位置,从而确定指针是否滑动至显示屏的边缘,当指针滑动至显示屏的边缘时,表示第一对象被拖拽至源端的显示界面的边缘。
接收端接收到第二消息后,可以执行S305。
S305、接收端显示第一对象的拖拽阴影,并显示应用推荐面板,应用推荐面板中包括一个或多个应用对应的应用标识。
其中,应用标识可以是应用的应用图标。应用推荐面板也可以称为推荐面板、推荐窗口、应用推荐窗口、应用推荐弹窗等,在此对应用推荐面板的名称不作限制。本申请中,可以将前述应用推荐面板、推荐面板、推荐窗口、应用推荐窗口、应用推荐弹窗等称为第一窗口。
本申请实施例中,上述将第一对象由源端的显示界面拖拽到接收端的显示界面的操作可以称为第一操作。也即,接收端可以响应于第一操作,显示第一窗口。
一些实施例中,上述源端和接收端之间的跨设备拖拽可以是指基于非投屏基础上进行拖拽。在源端和接收端连接后,利用键鼠共享技术,用户可使用一套输入设备(如鼠标)实现对源端和接收端两者的控制,将第一对象由源端的显示界面拖拽到接收端的显示界面。
其中,键鼠共享技术可以是指用一个终端的输入设备(如鼠标,触摸板),实现对其他终端控制的技术。例如,一种可能的实现方式中,接收端接收到第一消息后,可创建一个虚拟输入设备,该虚拟输入设备与常规的如鼠标,触摸板等输入设备的作用相同,可用于接收端模拟对应输入事件。例如,以源端为PC,PC的输入设备为鼠标,接收端为手机例,手机创建的该虚拟输入设备与常规的鼠标作用相同,可以看作是PC共享给手机的鼠标,能够用于在手机端模拟鼠标事件,以实现PC的鼠标对手机的控制。示例性的,以手机的操作系统是安卓系统为例。手机可利用linux的uinput能力实现虚拟输入设备的创建。其中,uinput是一个内核层模块,可以模拟输入设备。通过写入/dev/uinput(或/dev/input/uinput)设备,进程可以创建具有特定功能的虚拟输入设备。一旦创建了该虚拟输入设备,其便可模拟对应的事件。需要说明的是,本申请对键鼠共享技术的具体实现原理不作限制,在其他一些实施例中,源端和接收端之间也可能基于其他原理所实现的键鼠共享技术,使得用户可使用一套输入设备实现对源端和接收端两者的控制。
示例性地,以源端为PC、接收端为手机,PC的输入设备为鼠标为例,图4为本申请实施例提供的基于非投屏基础上进行拖拽的场景示意图。
如图4所示,PC的显示界面401中可以包括第一对象402、以及鼠标的指针(图中的小箭头,未标出)。对第一对象402的拖拽操作可以是用户使用鼠标点击并拖动第一对象402的操作。例如,用户可以将鼠标的指针移动至第一对象402上,点击长按鼠标的左键并移动鼠标,从而,可以对第一对象402进行拖拽。响应于对第一对象402的拖拽操作,PC可以显示第一对象402的拖拽阴影403。利用键鼠共享技术,用户可使用PC的鼠标将PC的显示界面401中的第一对象402从PC的显示界面401拖拽到手机的显示界面404。前述PC和手机之间的跨设备拖拽即为基于非投屏基础上进行的拖拽。
当PC检测到第一对象402被拖拽至PC的显示界面401的边缘后(也即,检测到鼠标的指针或者第一对象402的拖拽阴影403移动至PC的显示界面401的边缘后),PC可以向手机发送第二消息,通知手机第一对象402将被拖拽至手机的显示界面404上。手机接收到第二消息后,可以显示第一对象402的拖拽阴影403,并显示应用推荐面板405,应用推荐面板405中包括一个或多个应用对应的应用标识,如:应用A的应用图标、应用B的应用图标、…、应用F的应用图标等。
需要说明的是,在拖拽第一对象402由PC的显示界面401穿越至手机的显示界面404的过程中,手机的显示界面404上显示的第一对象402的拖拽阴影403与PC的显示界面401上显示的第一对象402的拖拽阴影403相关,手机的显示界面404上显示的第一对象402的拖拽阴影403会逐渐增大至完整,PC的显示界面401上显示的第一对象402的拖拽阴影403会逐渐缩小至消失。
另外一些实施例中,源端和接收端之间的跨设备拖拽也可以是指基于投屏基础上进行拖拽。在源端和接收端连接后,接收端可以将显示界面投屏至源端的显示屏上,或者,源端可以将显示界面投屏至接收端的显示屏上。以接收端将显示界面投屏至源端的显示屏为例,利用反向控制能力(指利用源端的输入设备控制接收端的能力)和键鼠共享技术,用户可以使用源端的输入设备实现对源端和接收端两者的控制,将第一对象由源端的显示界面拖拽到接收端的显示界面。
例如,同样以源端为PC、接收端为手机,PC的输入设备为鼠标为例,图5为本申请实施例提供的基于投屏基础上进行拖拽的场景示意图。
如图5所示,手机与PC建立连接后,手机可以将手机的显示界面投屏至PC的显示屏上,此时,PC的显示屏上可以包括PC的显示界面401,以及手机的显示界面404。PC的显示界面401中可以包括第一对象402、以及鼠标的指针(图中的小箭头,未标出)。对第一对象402的拖拽操作可以是用户使用鼠标点击并拖动第一对象402的操作。例如,用户可以将鼠标的指针移动至第一对象402上,点击长按鼠标的左键并移动鼠标,从而,可以对第一对象402进行拖拽。响应于对第一对象402的拖拽操作,PC可以显示第一对象402的拖拽阴影403。利用反向控制能力和键鼠共享技术,用户可使用PC的鼠标将PC的显示界面401中的第一对象402从PC的显示界面401拖拽到手机的显示界面404。前述PC和手机之间的跨设备拖拽即为基于投屏基础上进行的拖拽。
当PC检测到第一对象402被拖拽至PC的显示界面401的边缘后(也即,检测到鼠标的指针或者第一对象402的拖拽阴影403移动至PC的显示界面401的边缘后),或者,PC检测到第一对象402被拖拽至靠近被投屏过来的手机的显示界面404的边缘后,PC可以向手机发送第二消息,通知手机第一对象402将被拖拽至手机的显示界面404上。手机接收到第二消息后,可以显示第一对象402的拖拽阴影403,并显示应用推荐面板405,应用推荐面板405中包括一个或多个应用对应的应用标识,如:应用A的应用图标、应用B的应用图标、…、应用F的应用图标等。
在一种可能的实现方式中,上述应用推荐面板中包括的应用标识可以是人为定义或预设的。例如,用户或接收端的服务厂家可以在预先配置在应用推荐面板中显示哪些应用的应用标识。也即,应用推荐面板中包括的应用标识是固定的。
在另外一种可能的实现方式中,上述应用推荐面板中也可以包括接收端上安装的所有应用的应用标识。例如,以上述将第一对象由PC的显示界面拖拽至手机的显示界面为例,假设手机上安装有N个应用(N为大于0的整数),则应用推荐面板中可以包括手机安装的N个应用的应用图标。
在又一种可能的实现方式中,上述应用推荐面板中包括的应用标识也可以是接收端根据第一对象的类型所确定的可以打开第一对象的应用的应用标识。例如,接收端接收到第一消息后,可以根据第一消息中包括的第一对象确定第一对象的类型,或者,第一消息中可以单独包含一个字段用于指示第一对象的类型,接收端可以根据该字段确定第一对象的类型。接收端可以根据第一对象的类型,从接收端安装的所有应用中选择可以打开第一对象的应用,并在应用推荐面板中显示这些可以打开第一对象的应用的应用标识。例如,假设接收端中安装有word应用、excel应用,第一对象为一个word文档,则推荐面板中可以包括word应用的应用图标。
可选地,以接收端的操作系统为安卓(andriod TM)系统为例,接收端在确定第一对象的类型后,可以根据第一对象的类型,通过PackageManager的queryIntentActivities方法获取支持该类型的应用列表信息。
可以理解,接收端显示第一对象的拖拽阴影,即表示第一对象被拖拽到了接收端的显示界面。在接收端显示上述应用推荐面板时,用户可以继续执行上述对第一对象的拖拽操作(如继续使用鼠标进行拖拽),拖动第一对象在接收端的显示界面中移动。用户可以通过继续执行上述对第一对象的拖拽操作,将第一对象拖拽至应用推荐面板中显示的任意一个应用标识上,以通过该应用标识对应的应用打开第一对象。
一些实施例中,如果用户将第一对象拖拽至无法打开第一对象的应用的应用标识上,则接收端可以响应于将第一对象拖拽至无法打开第一对象的应用的应用标识上的操作,显示第一提示信息,第一提示信息用于提示该应用标识对应的应用无法打开第一对象。如果用户将第一对象拖拽至可以打开第一对象的应用的应用标识上、并结束对第一对象的拖拽操作(如松开鼠标),则接收端可以响应于将第一对象拖拽至可以打开第一对象的应用的应用标识上、并结束对第一对象进行拖拽的操作,通过该应用标识对应的应用打开第一对象。其中,接收端通过该应用标识对应的应用打开第一对象,可以包括:接收端打开该应用标识对应的应用,并将第一对象传递给应用标识对应的应用。
例如,用户可以将第一对象拖拽至第一应用标识上,第一应用标识为第一窗口(即上述应用推荐面板)中包括的应用标识中的一个。响应于将第一对象拖拽至第一应用标识的操作,当第一应用标识对应的应用为无法打开第一对象的应用时,接收端可以显示第一提示信息。
以接收端为手机为例,图6为本申请实施例提供的手机的显示界面的示意图。如图6所示,当用户使用鼠标将第一对象由PC的显示界面拖拽到手机的显示界面后,手机可以显示应用推荐面板601,且手机的显示界面中还包括第一对象的拖拽阴影602。假设应用推荐面板601中包括应用A、应用B、应用C、应用D、应用E、应用F、以及应用G分别对应的应用标识。其中,应用A、应用B、以及应用C均为可以打开第一对象的应用,应用D、应用E、应用F、以及应用G均为无法打开第一对象的应用。
当用户继续执行对第一对象的拖拽操作,将第一对象拖拽至应用A、应用B、以及应用C中的任意一个应用的应用标识上,并结束对第一对象的拖拽操作(如松开鼠标)时,手机可以监听到拖拽结束(drag drop)事件。手机可以响应于用户将第一对象拖拽至该应用标识上、并结束对第一对象进行拖拽的操作,通过该应用标识对应的应用打开第一对象。例如,第一对象可以是word文档,应用A为word应用,当用户将第一对象拖拽至应用A的应用标识上时,手机可以通过应用A打开该word文档。
当用户继续执行对第一对象的拖拽操作,将第一对象拖拽至应用D、应用E、应用F、以及应用G中的任意一个应用的应用标识上时,手机可以响应于用户将第一对象拖拽至该应用标识上的操作,显示第一提示信息,第一提示信息用于提示该应用标识对应的应用无法打开第一对象。例如,第一对象可以是word文档,应用D为excel应用,当用户将第一对象拖拽至应用D的应用标识上时,手机可以显示第一提示信息。
在一些可能的实现方式中,用户将第一对象拖拽至无法打开第一对象的应用的应用标识(如第一应用标识)上时,接收端可以通过改变应用标识(如应用图标)的显示状态来显示第一提示信息。
例如,图7为本申请实施例提供的手机的显示界面的另一示意图。如图7所示,当用户继续执行对第一对象的拖拽操作,将第一对象拖拽至应用D的应用标识上时,手机可以响应于用户将第一对象拖拽至应用D的应用标识上的操作,将原来显示的应用D的应用图标中“D”的图案改变为显示一个“斜线/斜杠”603,以提示用户应用D无法打开第一对象。
又例如,图8为本申请实施例提供的手机的显示界面的又一示意图。如图8所示,当用户继续执行对第一对象的拖拽操作,将第一对象拖拽至应用D的应用标识上时,手机可以响应于用户将第一对象拖拽至应用D的应用标识上的操作,将原来显示的应用D的应用图标中“D”的图案改变为显示一个“X”604的提示,以提示用户应用D无法打开第一对象。
上述图7和图8中所示出的“斜线/斜杠”603和“X”604也可以称为禁止标识。该禁止标识即上述第一提示信息。在其他一些示例中,禁止标识也可以是其他图案,在此不作限制。
在另外一些可能的实现方式中,用户将第一对象拖拽至无法打开第一对象的应用的应用标识上时,接收端通过改变应用标识(如应用图标)的显示状态来显示第一提 示信息也可以包括:接收端将应用标识变暗或灰化,应用标识变暗或灰化的效果即上述第一提示信息。例如,接收端可以将应用标识的显示颜色变为灰色,变为灰色后,应用标识对应的应用可以视为未激活状态,能够提示用户该应用标识对应的应用无法打开第一对象。
以上示出了一些第一提示信息是以非文字提示的方式实现的示例,还有一些可能的实现方式中,用户将第一对象拖拽至无法打开第一对象的应用的应用标识上时,接收端还可以在显示界面(如:应用标识周围或应用标识上)显示文字提示作为第一提示信息。需要说明的是,本申请对第一提示信息的实现方式不作限制。
还有一些实施例中,对于上述应用推荐面板中包括的应用标识是人为定义或预设的,或者,应用推荐面板中包括的应用标识包括接收端上安装的所有应用的应用标识两种实现方式而言,接收端在显示应用推荐面板时,也可以将应用推荐面板中无法打开第一对象的应用的应用标识和可以打开第一对象的应用的应用标识进行区别显示,以提示用户哪些应用可以打开第一对象,哪些应用无法打开第一对象。用户可以直接将第一对象拖拽至可以打开第一对象的应用的应用标识上,以触发对应的应用打开第一对象。
例如,一种可能的实现方式中,接收端可以根据第一消息确定第一对象的类型,确定第一对象的类型的具体方式可以参考前述实施例中所述,不再赘述。根据第一对象的类型,接收端可以对应用推荐面板中包括的应用标识进行分类,确定出哪些应用标识对应的应用可以打开第一对象,以及哪些应用标识对应的应用无法打开第一对象。接收端在显示应用推荐面板时,对于其中无法打开第一对象的应用的应用标识,接收端可以将这类应用标识变暗或灰化。例如,接收端可以将应用推荐面板中无法打开第一对象的应用的应用标识的显示颜色变为灰色,变为灰色后,应用标识对应的应用可以视为未激活状态,能够提示用户该应用标识对应的应用无法打开第一对象。和/或,接收端在显示应用推荐面板时,对于其中可以打开第一对象的应用的应用标识,接收端可以将这类应用标识高亮显示,以提示用户该类应用标识对应的应用可以打开第一对象。
本申请实施例提供的该跨设备拖拽方法,用户无需提前在接收端打开可以打开第一对象的应用界面,用户在对第一对象进行拖拽的过程中,可以根据接收端显示的应用推荐面板,选择想要打开第一对象的应用。拖拽方式更加简单,接收端对被拖拽对象的响应更加智能。
一些可能的场景中,用户将第一对象拖拽到接收端的显示界面的目的可能是想要将第一对象保存在接收端。对此,本申请一些实施例中,用户可以将第一对象拖拽到接收端的显示界面的空白区域(如没有应用图标的区域),并结束对第一对象进行拖拽的操作,接收端可以响应于用户可以将第一对象拖拽到接收端的显示界面的空白区域,并结束对第一对象进行拖拽的操作,保存第一对象,如:可以将第一对象保存在默认的存储路径。其中,该空白区域可以是指应用推荐面板中的空白区域,或者,接收端的显示界面中除了应用推荐面板之外的其他空白区域。本申请中,前述应用推荐面板中的空白区域,或者,接收端的显示界面中除了应用推荐面板之外的其他空白区域可以称为第一区域,将第一对象拖拽到第一区域的操作可以称为第四操作,第四操 作对应的拖拽对象也可以称为第二对象。第二对象包括前述第一对象。
另外一些实施例中,应用推荐面板中也可以包括一个专门用于触发保存第一对象的区域,或者,触发保存第一对象的图标(icon),用户可以将第一对象拖拽到应用推荐面板中专门用于触发保存第一对象的区域,或者,触发保存第一对象的icon上,并结束对第一对象进行拖拽的操作,以触发接收端保存第一对象。前述专门用于触发保存第一对象的区域可以称为第二区域,专门用于触发保存第一对象的图标可以称为第一图标。
例如,图9为本申请实施例提供的手机的显示界面的又一示意图。如图9所示,一种可能的方式中,应用推荐面板中可以包括一个保存区域901。当用户想要保存第一对象时,可以将第一对象拖拽到保存区域901中,并结束对第一对象进行拖拽的操作。手机可以响应于用户将第一对象拖拽到保存区域901中,并结束对第一对象进行拖拽的操作,保存第一对象。
可选地,本申请一些实施例中,接收端显示应用推荐面板的方式可以包括:全屏显示或非全屏显示,上述图4和图5中是以非全屏显示为例,图6至图9是以全屏显示为例。非全屏显示具体可以包括半屏显示(即应用推荐面板的区域占手机显示屏幕区域的一半)、三分之一屏显示(即应用推荐面板的区域占手机显示屏幕区域的三分之一)等。
另外一些实施例中,接收端显示应用推荐面板的方式还可以是抽屉显示。当应用推荐面板的显示方式是抽屉显示时,上述S305所述的步骤可以包括:接收端显示第一对象的拖拽阴影,并显示应用推荐面板对应的抽屉按钮(抽屉图标);接收端响应于用户将第一对象拖拽至抽屉按钮上,并在抽屉按钮上停留预设时长(如第四时长)的操作,以抽屉显示的方式显示应用推荐面板。其中,预设时长可以2秒、3秒、5秒等,不作限制。
例如,以接收端为手机为例,图10A为本申请实施例提供的手机的显示界面的又一示意图。如图10A所示,手机接收到第二消息后,可以显示第一对象的拖拽阴影602,并显示应用推荐面板对应的抽屉按钮1001。图10B为本申请实施例提供的用户触发手机以抽屉显示的方式显示应用推荐面板的场景示意图。如图10B所示,用户可以将第一对象拖拽至抽屉按钮1001上,并在抽屉按钮1001上停留预设时长。手机可以响应于用户将第一对象拖拽至抽屉按钮1001上,并在抽屉按钮1001上停留预设时长的操作,以抽屉显示的方式显示应用推荐面板601。
示例性地,手机可以模拟一个点击事件注入系统,当用户将第一对象拖拽至抽屉按钮1001上,并在抽屉按钮1001上停留预设时长上时,点击事件注入系统可以生成一个点击抽屉按钮1001的事件,点击抽屉按钮1001的事件可以触发手机显示应用推荐面板601。
以上所述均为应用推荐面板的一些可能的显示方式,本申请对应用推荐面板的显示方式并不作限制。
可选地,当接收端显示应用推荐面板时,如果应用推荐面板中包括的应用标识的数量过多导致应用推荐面板无法全部显示这些应用标识,则接收端可以在应用推荐面板中显示部分应用标识,将剩余部分的应用标识在应用推荐面板下方滚动/滑动显示。
例如,以接收端为手机为例,图10C为本申请实施例提供的手机的显示界面的又一示意图。如图10C所示,当应用推荐面板所包含的区域无法全部显示应用推荐面板中包括的应用标识时,手机可以在应用推荐面板中优先显示部分应用标识(如应用A至应用I的应用标识),其余部分应用标识可以在下方滑动显示。当用户想要查看其余部分应用标识时,可以将第一对象拖拽至显示应用标识的底部区域,触发手机将应用推荐面板中显示的应用标识向上滑动,从而使得其余部分的应用标识显示出来。或者,用户也可以通过多点/多指触控(除了第一对象的焦点外的其余焦点)的方式触发手机滑动显示更多的应用标识,在此不作限制。例如,当用户使用某个手指拖动第一对象时,可以使用另外一个手指拖动图10C中显示的长方形黑色滑动条(实际显示效果也可能是其他颜色),触发手机滑动显示更多的应用标识。
可选地,上述接收端对应用标识进行滑动显示的场景中,应用推荐面板中优先显示的部分应用标识可以是被用户使用频率最高/较高的应用的应用标识。
示例性地,图11为本申请实施例提供的手机的显示界面的又一示意图。如图11所示,当应用推荐面板所包含的区域无法全部显示应用推荐面板中包括的应用标识时,手机还可以在应用推荐面板中将优先显示的部分应用标识与其余部分应用标识进行分组显示。如:应用推荐面板中可以包括“推荐应用”1101和“更多应用”1102两个分组。优先显示的部分应用标识可以在“推荐应用”1101的分组中显示,其余部分应用标识可以在“更多应用”1102的分组中显示,且“更多应用”1102分组显示的应用标识可以支持滑动显示。
一些实施例中,当用户通过继续执行上述对第一对象的拖拽操作,将第一对象拖拽至应用推荐面板中显示的某个应用标识上时,如果该应用标识对应的应用可以打开第一对象,且包括多个可以提供的服务(或称为功能),则接收端可以在该应用标识旁显示(弹出)该应用标识对应的应用的服务菜单(或称为功能菜单),服务菜单中可以包括一个或多个该应用标识对应的应用包括的服务的服务标识。用户可以继续执行上述对第一对象的拖拽操作,将第一对象拖拽至服务菜单中显示的某个服务标识(如第一服务标识)上、并结束对第一对象的拖拽操作,接收端可以响应于用户将第一对象拖拽至服务菜单中显示的某个服务标识上、并结束对第一对象进行拖拽的操作,通过该应用标识对应的应用打开第一对象,并启动该服务标识对应的服务对第一对象进行处理/操作。
例如,用户可以将第一对象拖拽至第一应用标识上,第一应用标识为第一窗口(即上述应用推荐面板)中包括的应用标识中的一个。响应于将第一对象拖拽至第一应用标识的操作,当第一应用标识对应的应用为可以打开第一对象的应用时,接收端可以显示该应用对应的服务菜单。在本申请中,应用对应的服务菜单也可以称为第二窗口。
以接收端为手机,第一对象为图片,应用推荐面板中包括某图像处理应用P的应用标识,图像处理应用P包括预览、滤镜、马赛克等服务为例,图12为本申请实施例提供的手机的显示界面的又一示意图。如图12所示,应用推荐面板中可以包括图像处理应用P的应用标识1201,用户将图片拖拽至图像处理应用P的应用标识1201上时,手机可以在图像处理应用P的应用标识1201旁显示图像处理应用P的服务菜单1202。服务菜单1202中可以包括“预览”、“滤镜”、“马赛克”等服务标识。用户可以继续执行对图片的拖拽操作,将图片拖拽至“预览”、“滤镜”、“马赛克”等中的任意一个服务标识上、并结束对图片的拖拽操作,手机可以响应于用户将图片拖拽至该服务标识上、并结束对图片进行拖拽的操作,通过图像处理应用P打开图片,并启动该服务标识对应的服务对第一对象进行处理/操作。如:用户将图片拖拽至“预览”上、并结束对图片的拖拽操作时,手机可以响应于用户将图片拖拽至“预览”上、并结束对图片进行拖拽的操作,通过图像处理应用P打开第一对象,并启动图像处理应用P的预览服务,预览图片。
可选地,在一种可能的实现方式中,上述服务菜单中包括的服务标识可以是人为定义或预设的。例如,用户或接收端的服务厂家可以预先配置每个应用对应的服务菜单中包括哪些服务标识。也即,服务菜单中包括的服务标识是固定的。
在另外一种可能的实现方式中,上述服务菜单中也可以包括应用对应的所有服务的服务标识。例如,以将第一对象拖拽至上述图像处理应用P的应用标识上为例,假设图像处理应用P包括N个服务(N为大于0的整数),则图像处理应用P对应的服务菜单中可以包括前述N个服务的服务标识。
在又一种可能的实现方式中,上述服务菜单中包括的服务标识也可以是接收端根据第一对象的类型所确定的可以打开第一对象的服务的服务标识。例如,接收端接收到第一消息后,可以根据第一消息中包括的第一对象确定第一对象的类型,或者,第一消息中可以单独包含一个字段用于指示第一对象的类型,接收端可以根据该字段确定第一对象的类型。当用户将第一对象拖拽至应用推荐面板中的某个应用标识上时,接收端可以根据第一对象的类型,从该应用标识对应的应用包括的所有服务中选择可以打开第一对象的服务,并在服务菜单中显示这些打开第一对象的服务的服务标识。
需要说明的是,如果用户将第一对象拖拽至无法打开第一对象的服务的服务标识(如第一服务标识)上,则接收端可以显示第二提示信息,第二提示信息用于提示该服务标识对应的服务无法打开第一对象。第二提示信息可以参考第一提示信息,例如,接收端可以通过改变服务标识的显示状态来显示第二提示信息,此处不再赘述。
一些实施例中,当用户将第一对象拖拽至接收端的显示界面时,接收端也可以先判断当前的显示界面是否为支持拖入的界面。当接收端当前的显示界面为不支持拖入的界面时,接收端可以显示应用推荐面板。当接收端当前的显示界面为支持拖入的界面时,接收端可以不显示应用推荐面板。
其中,不支持拖入的界面可以包括系统桌面(简称桌面)、系统弹窗、不支持拖入的应用界面等。支持拖入的界面可以包括支持拖入的应用界面。例如,以接收端为手机为例,支持拖入的应用界面可以是手机上安装的一些聊天应用的聊天界面。
示例性地,以接收端的操作系统为安卓系统为例,当用户将第一对象拖拽至接收端的显示界面时,如果接收端当前的显示界面可以响应拖拽事件,则可以对当前的显示界面增加View.OnDragListener监听,并返回一个返回值(True)。接收端的框架(Framework)层可以根据这个返回值(True),判断当前的显示界面是否支持响应拖拽事件,如果当前的显示界面不支持响应拖拽事件,则表示当前的显示界面为不支持拖入的界面。如果当前的显示界面支持响应拖拽事件,则表示当前的显示界面为支持拖入的界面。
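示例性地,支持拖入的界面注册拖拽监听并在拖拽开始时返回True的一种可能的写法如下,代码仅为示意:

```java
import android.view.DragEvent;
import android.view.View;

public class DropTargetSetup {
    // 为支持拖入的界面(如聊天应用的输入区域)中的视图注册拖拽监听
    public static void enableDropFor(View targetView) {
        targetView.setOnDragListener((view, event) -> {
            switch (event.getAction()) {
                case DragEvent.ACTION_DRAG_STARTED:
                    // 返回True表示该视图愿意响应本次拖拽事件,
                    // 框架层据此可以判定当前显示界面为支持拖入的界面
                    return true;
                case DragEvent.ACTION_DROP:
                    // 此处可读取event.getClipData(),根据第一对象进行创建/编辑、加附件、发送等
                    return true;
                default:
                    return true;
            }
        });
    }
}
```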
该实施例中可以考虑到用户的拖拽目的原本就是一些支持拖入的界面的场景,能够更符合用户的实际拖拽需求。当接收端当前的显示界面为支持拖入的界面时,接收端无需显示应用推荐面板,支持拖入的界面可以直接响应第一对象的拖拽事件。例如,支持拖入的界面为某个应用的应用界面时,该应用可以响应于第一对象的拖拽事件,根据第一对象进行创建/编辑、查看、加附件、发送、插入、搜索和跳转等。
一些实施例中,当用户将第一对象拖拽至接收端的显示界面时,如果接收端判断得到当前的显示界面为不支持拖入的界面(如手机桌面),则接收端还可以根据用户将第一对象拖拽至接收端的显示界面后进一步的交互动作来判断用户意图,并根据用户意图确定是否显示应用推荐面板。
例如,一种可能的实现方式中,接收端可以将用户意图划分为两类:保存第一对象,或者寻找某个应用打开第一对象。当用户将第一对象拖拽至接收端的显示界面时,如果接收端判断得到当前的显示界面为不支持拖入的界面,且用户将第一对象拖拽至接收端的显示界面后进一步的交互动作为:直接结束对第一对象的拖拽操作(如松开/释放鼠标),则接收端可以确定用户意图为保存第一对象。此时,接收端可以响应于用户将第一对象拖拽至接收端的显示界面后直接结束对第一对象的拖拽的操作,保存第一对象,如:保存在默认的存储路径。当用户将第一对象拖拽至接收端的显示界面时,如果接收端判断得到当前的显示界面为不支持拖入的界面,且用户将第一对象拖拽至接收端的显示界面后进一步的交互动作为:用户未结束对第一对象的拖拽操作,且指针(如鼠标的指针)在接收端的显示界面中停留时间达到第一时长(如3秒、5秒等,对第一时长不作限制),则接收端可以确定用户意图为寻找某个应用打开第一对象。例如,第一对象可能是图片,用户可能想要寻找接收端中的某个应用来预览该图片。此时,接收端可以响应于用户未结束对第一对象的拖拽操作,且指针在接收端的显示界面中停留时间达到第一时长的触发条件,显示上述应用推荐面板,以实现根据用户意图向用户推荐应用的目的。
一些实施例中,直接结束对第一对象的拖拽操作也可以是指:用户将第一对象拖拽到接收端的显示界面后至用户结束对第一对象的拖拽操作之间的时间小于上述第一时长。
又例如,另一种可能的实现方式中,接收端可以将用户意图划分为以下两类:以默认方式打开第一对象,或者寻找某个应用打开第一对象。当用户将第一对象拖拽至接收端的显示界面时,如果接收端判断得到当前的显示界面为不支持拖入的界面,且用户将第一对象拖拽至接收端的显示界面后进一步的交互动作为:直接结束对第一对象的拖拽操作(如松开/释放鼠标),则接收端可以确定用户意图为以默认方式打开第一对象。此时,接收端可以响应于用户将第一对象拖拽至接收端的显示界面后直接结束对第一对象的拖拽的操作,以默认方式打开第一对象,如:以默认浏览器打开第一对象。类似地,当用户将第一对象拖拽至接收端的显示界面时,如果接收端判断得到当前的显示界面为不支持拖入的界面,且用户将第一对象拖拽至接收端的显示界面后进一步的交互动作为:用户未结束对第一对象的拖拽操作,且指针在接收端的显示界面中停留时间达到第一时长,则接收端可以确定用户意图为寻找某个应用打开第一对象,接收端可以显示应用推荐面板,不再赘述。
可选地,还有一些可能的实现方式中,上述显示应用推荐面板的触发条件:“用户未结束对第一对象的拖拽操作,且指针在接收端的显示界面中停留时间达到第一时长”也可以被替换为“用户未结束对第一对象的拖拽操作,且指针在接收端的显示界面中的拖动/滑动距离大于第一阈值”,如:第一阈值可以是3厘米(cm)、5cm等,对第一阈值的大小不作限制。也即,当用户将第一对象拖拽至接收端的显示界面时,如果接收端判断得到当前的显示界面为不支持拖入的界面,且用户将第一对象拖拽至接收端的显示界面后进一步的交互动作为:用户未结束对第一对象的拖拽操作,且指针在接收端的显示界面中的拖动/滑动距离大于第一阈值,则接收端可以显示上述应用推荐面板。
或者,还有一些可能的实现方式中,上述两种显示应用推荐面板的触发条件也可以为“和”或者“或”的关系,当上述两种显示应用推荐面板的触发条件为“和”的关系时,上述两种显示应用推荐面板的触发条件同时发生时,接收端可以显示上述应用推荐面板。当上述两种显示应用推荐面板的触发条件为“或”的关系时,其中任意一种触发条件发生时,接收端均可以显示上述应用推荐面板。本申请对接收端显示应用推荐面板的触发条件不作限制。
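示例性地,上述“停留时间达到第一时长”与“拖动/滑动距离大于第一阈值”两种触发条件的一种可能的判断逻辑如下,代码仅为示意(其中PanelTriggerDetector名称及各阈值取值均为示例性假设;实际实现中停留时长也可以通过延时任务判断):

```java
import android.os.SystemClock;
import android.view.DragEvent;

public class PanelTriggerDetector {
    private static final long FIRST_DURATION_MS = 3000;   // 第一时长,如3秒(示例值)
    private static final float FIRST_DISTANCE_PX = 300f;  // 第一阈值对应的像素距离(示例值)

    private long enterTime;
    private boolean hasLastPoint;
    private float lastX, lastY, movedDistance;
    private boolean panelShown;

    // 在接收端显示界面的拖拽监听中,将收到的每个DragEvent传入本方法
    public void onDragEvent(DragEvent event) {
        switch (event.getAction()) {
            case DragEvent.ACTION_DRAG_ENTERED:
                enterTime = SystemClock.uptimeMillis();
                hasLastPoint = false;
                movedDistance = 0f;
                panelShown = false;
                break;
            case DragEvent.ACTION_DRAG_LOCATION: {
                if (hasLastPoint) {
                    movedDistance += (float) Math.hypot(event.getX() - lastX, event.getY() - lastY);
                }
                lastX = event.getX();
                lastY = event.getY();
                hasLastPoint = true;
                boolean dwellLongEnough =
                        SystemClock.uptimeMillis() - enterTime >= FIRST_DURATION_MS;
                boolean movedFarEnough = movedDistance >= FIRST_DISTANCE_PX;
                // 此处以两种触发条件为"或"的关系为例;若采用"和"的关系,将"||"改为"&&"即可
                if (!panelShown && (dwellLongEnough || movedFarEnough)) {
                    panelShown = true;
                    showRecommendPanel();
                }
                break;
            }
            default:
                break;
        }
    }

    private void showRecommendPanel() {
        // 显示应用推荐面板(第一窗口),具体显示方式不作限制
    }
}
```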
本实施例中,接收端可以根据用户将第一对象拖拽至接收端的显示界面后进一步的交互动作来判断用户意图,并根据用户意图确定是否显示应用推荐面板,从而接收端可以更准确地知道用户拖拽的目的,并基于用户拖拽的目的为用户提供完备的拖拽交互功能。
本申请实施例中,当接收端当前的显示界面为不支持拖入的界面时,用户将某个对象拖拽至接收端的显示界面后,直接结束对该对象进行拖拽的操作可以称为第三操作。第三操作对应的拖拽对象也可以称为第二对象。第二对象可以包括上述第一对象。
可选地,一些实施例中,当用户将第一对象拖拽至接收端的显示界面时,如果接收端判断得到当前的显示界面为支持拖入的界面(如三方应用的应用界面),则接收端也可以根据用户将第一对象拖拽至接收端的显示界面后进一步的交互动作来判断用户意图,并根据用户意图确定是否显示应用推荐面板。
例如,一种可能的实现方式中,对于当前的显示界面为支持拖入的界面的情况,接收端可以将用户意图划分为以下两类:使用当前的显示界面中的相关功能对第一对象执行后续操作(即,接收端在当前的显示界面中响应第一对象的拖拽事件),或者寻找某个应用打开第一对象。当用户将第一对象拖拽至接收端的显示界面时,如果接收端判断得到当前的显示界面为支持拖入的界面,且用户将第一对象拖拽至接收端的显示界面后进一步的交互动作为:直接结束对第一对象的拖拽操作,则接收端可以确定用户意图为使用当前的显示界面中的相关功能对第一对象执行后续操作。此时,接收端可以响应于用户将第一对象拖拽至接收端的显示界面后直接结束对第一对象的拖拽的操作,响应第一对象的拖拽事件,如:根据第一对象进行创建/编辑、查看、加附件、发送、插入、搜索和跳转等。当用户将第一对象拖拽至接收端的显示界面时,如果接收端判断得到当前的显示界面为支持拖入的界面,且用户将第一对象拖拽至接收端的显示界面后进一步的交互动作为:用户未结束对第一对象的拖拽操作,且指针在接收端的显示界面中停留时间达到第一时长,和/或,用户未结束对第一对象的拖拽操作,且指针在接收端的显示界面中的拖动/滑动距离大于第一阈值,则接收端可以确定用户没有明确的拖入目的地,用户意图为寻找某个应用打开第一对象。此时,接收端可以显示应用推荐面板。
一些实施例中,当用户将第一对象拖拽至接收端的显示界面时,如果接收端当前的显示界面为桌面,则接收端也可以不显示上述应用推荐面板。当用户将第一对象拖拽至桌面上显示的某个应用图标上时,如果该应用图标对应的应用为可以打开第一对象的应用,则接收端可以响应于用户将第一对象拖拽至该应用图标(如第二应用标识)上的操作,提示用户该应用图标对应的应用可以打开第一对象(即支持拖入);此时,用户可以结束对第一对象的拖拽操作(如松开鼠标),接收端可以响应于用户结束对第一对象进行拖拽的操作,通过该应用图标对应的应用打开第一对象。如果该应用图标对应的应用为无法打开第一对象的应用,则接收端可以响应于用户将第一对象拖拽至该应用图标(如第三应用标识)上的操作,提示用户该应用图标对应的应用无法打开第一对象(即不支持拖入),用户可以继续将第一对象拖拽至其他应用图标上。
一种可能的实现方式中,接收端可以通过改变应用图标的显示状态,提示用户该应用图标对应的应用是否可以打开第一对象。
例如,以接收端为手机为例,图13A为本申请实施例提供的手机的桌面的示意图。如图13A所示,假设手机的桌面包括应用A的应用图标和应用B的应用图标。其中,应用A为可以打开第一对象的应用,应用B为无法打开第一对象的应用。当用户将第一对象拖拽至应用A的应用图标上时,手机可以响应于用户将第一对象拖拽至应用A的应用图标上的操作,将应用A的应用图标的显示状态改变为可激活状态(如图13A中的虚线所示可以表示可激活状态),以提示用户应用A可以打开第一对象。或者,手机也可以通过其他方式改变应用A的应用图标的显示状态(如变亮、改变显示效果或颜色等),以提示用户应用A可以打开第一对象,在此不作限制。此时,用户可以结束对第一对象的拖拽操作(如松开鼠标),手机可以响应于用户结束对第一对象进行拖拽的操作,通过应用A打开第一对象。
图13B为本申请实施例提供的手机的桌面的另一示意图。如图13B所示,当用户将第一对象拖拽至应用B的应用图标上时,手机可以响应于用户将第一对象拖拽至应用B的应用图标上的操作,将原来显示的应用B的应用图标中“B”的图案改变为显示一个“斜线/斜杠”,以提示用户应用B无法打开第一对象。或者,手机也可以通过其他方式改变应用B的应用图标的显示状态(如变暗、灰化等),以提示用户应用B无法打开第一对象,在此也不作限制。用户可以继续将第一对象拖拽至其他应用图标上。
另外一种可能的实现方式中,接收端也可以通过在第一对象的拖拽阴影或者指针(如鼠标指针)周围(如左上角、左下角、右上角、右下角等)显示一个角标的方式,提示用户应用图标对应的应用是否可以打开第一对象。
例如,同样以接收端为手机为例,图14A为本申请实施例提供的手机的桌面的又一示意图。如图14A所示,同样假设手机的桌面包括应用A的应用图标和应用B的应用图标。其中,应用A为可以打开第一对象的应用,应用B为无法打开第一对象的应用。当用户将第一对象拖拽至应用A的应用图标上时,手机可以响应于用户将第一对象拖拽至应用A的应用图标上的操作,在指针左上角(也可以是第一对象左上角)显示一个加号“+”1401形状的角标,以提示用户应用A可以打开第一对象。此时,用户可以结束对第一对象的拖拽操作(如松开鼠标),手机可以响应于用户结束对第一对象进行拖拽的操作,通过应用A打开第一对象。
图14B为本申请实施例提供的手机的桌面的又一示意图。如图14B所示,当用户将第一对象拖拽至应用B的应用图标上时,手机可以响应于用户将第一对象拖拽至应用B的应用图标上的操作,在指针左上角(也可以是第一对象左上角)显示一个减号“-”1402形状的角标,以提示用户应用B无法打开第一对象。用户可以继续将第一对象拖拽至其他应用图标上。
其他实现方式中,上述用于提示用户应用A可以打开第一对象,和/或,无法打开第一对象的角标也可以是其他形状,在此不作限制。
需要说明的是,上述图13A-图13B中所示的通过改变应用图标的显示状态提示用户应用是否可以打开第一对象的方式,以及上述图14A-图14B中所示的通过增加角标提示用户应用是否可以打开第一对象的方式,均为示例性说明。在其他一些实现方式中,接收端也可以通过其他方式(如文字提示)提示应用可以打开或无法打开第一对象,本申请对此不作限制。接收端提示某个应用图标对应的应用无法打开第一对象,也可以认为是显示第一提示信息。如:第一终端设备可以响应于将第二对象拖拽至第三应用标识的操作,显示第一提示信息,第一提示信息用于提示第三应用标识对应的应用不支持拖入第一对象。其中,第一提示信息即上述图13A-图13B、以及图14A-图14B中所示的显示信息。
另外,可以理解,上述图13A-图14B中仅示出了一个可以打开第一对象的应用的应用图标、以及一个无法打开第一对象的应用的应用图标。在部分场景中,接收端的桌面还可以包括更多个应用的应用图标。当用户将第一对象拖拽至这些应用图标上时,手机的具体响应方式可以参考上述图13A-图14B所示。
一些实施例中,上述图13A-图14B所示的示例中所提到的“当用户将第一对象拖拽至桌面上显示的某个应用图标上时,如果该应用图标对应的应用为可以打开第一对象的应用,则接收端改变应用图标的显示状态或者增加角标,以提示用户应用图标对应的应用可以打开第一对象”、以及“当用户将第一对象拖拽至桌面上显示的某个应用图标上时,如果该应用图标对应的应用为无法打开第一对象的应用,则接收端增加角标,以提示用户应用图标对应的应用无法打开第一对象”的技术方案,也可以适用于前述实施例中所述的应用推荐面板中。
例如,用户将第一对象拖拽至应用推荐面板上显示的某个应用标识上时,如果该应用标识对应的应用为可以打开第一对象的应用,则接收端也可以改变应用标识的显示状态或者增加角标,以提示用户应用标识对应的应用可以打开第一对象。又例如,用户将第一对象拖拽至应用推荐面板上显示的某个应用标识上时,如果该应用标识对应的应用为无法打开第一对象的应用,则接收端也可以增加角标,以提示用户应用标识对应的应用无法打开第一对象。
对于上述用户将第一对象拖拽至接收端的显示界面时,如果接收端当前的显示界面为桌面,接收端不显示应用推荐面板的实施例而言,一些可能的场景中,用户将第一对象拖拽到接收端的显示界面的目的也可能是想要将第一对象保存在接收端或以默认方式打开第一对象。对此,一些实现方式中,用户可以将第一对象拖拽到接收端的桌面的空白区域(如没有应用图标的区域),并结束对第一对象进行拖拽的操作,接收端可以响应于用户将第一对象拖拽到接收端的桌面的空白区域,并结束对第一对象进行拖拽的操作,保存第一对象或以默认方式打开第一对象,如:可以将第一对象保存在默认的存储路径。或者,用户将第一对象拖拽至无法打开第一对象的应用对应的应用图标上、并结束对第一对象的拖拽操作时,接收端也可以响应于用户将第一对象拖拽至无法打开第一对象的应用对应的应用图标上、并结束对第一对象进行拖拽的操作,保存第一对象或以默认方式打开第一对象。
另外一些实现方式中,接收端也可以根据用户将第一对象拖拽至接收端的桌面后进一步的交互动作来判断用户意图。当用户将第一对象拖拽至接收端的桌面时,如果用户进一步的交互动作为:直接结束对第一对象的拖拽操作(如松开/释放鼠标),则接收端可以确定用户意图为保存第一对象或以默认方式打开第一对象。此时,接收端可以响应于用户将第一对象拖拽至接收端的桌面后直接结束对第一对象的拖拽的操作,保存第一对象或以默认方式打开第一对象,如:保存在默认的存储路径。当用户将第一对象拖拽至接收端的桌面时,如果用户进一步的交互动作为:用户未结束对第一对象的拖拽操作,且指针(如鼠标的指针)在接收端的桌面中停留时间达到第一时长,和/或,用户未结束对第一对象的拖拽操作,且指针在接收端的桌面中的拖动/滑动距离大于第一阈值,则接收端可以确定用户意图为寻找某个应用打开第一对象。此时,接收端可以按照上述接收端不显示应用推荐面板的实施例中所述的方式,响应于用户将第一对象拖拽至应用图标上的操作,提示用户该应用图标对应的应用可以打开第一对象(即支持拖入);或者,响应于用户将第一对象拖拽至应用图标上的操作,提示用户该应用图标对应的应用无法打开第一对象(即不支持拖入)。
可选地,本实现方式中,当用户将第一对象拖拽至接收端的桌面时,如果接收端按照上述方式确定用户意图为寻找某个应用打开第一对象后,当用户将第一对象拖拽至无法打开第一对象的应用对应的应用图标上、并结束对第一对象的拖拽操作时,或者当用户将第一对象拖拽至桌面的空白区域、并结束对第一对象的拖拽操作时,接收端可以响应于前述操作,将第一对象的阴影改变为一个泡泡形状浮窗,如:悬浮球、气泡(或称为拖拽气泡)等形式,并将第一对象对应的泡泡形状浮窗吸附(移动)到桌面的边缘进行显示。该第一对象对应的泡泡形状浮窗可以支持用户对第一对象再次进行拖拽操作,用户重新点击第一对象对应的泡泡形状浮窗并进行拖拽时,接收端可以将第一对象对应的泡泡形状浮窗又改变为第一对象的阴影,供用户继续对第一对象进行拖拽。前述泡泡形状浮窗可以称为第一浮窗,其他一些实施例中,第一浮窗也可以是其他形状,不作限制。
例如,以第一对象对应的泡泡形状浮窗为拖拽气泡、接收端为手机为例,图15为本申请实施例提供的手机的桌面的又一示意图。如图15所示,当用户将第一对象拖拽至手机的桌面时,如果手机按照上述方式确定用户意图为寻找某个应用打开第一对象,则当用户将第一对象拖拽至应用B的应用图标上(如上所述,应用B是无法打开第一对象的应用)、并结束对第一对象的拖拽操作时,或者当用户将第一对象拖拽至桌面的空白区域、并结束对第一对象的拖拽操作时,手机可以响应于前述操作,将第一对象的阴影改变为一个拖拽气泡1501,并将拖拽气泡1501吸附到桌面的边缘进行显示。当用户想要执行对第一对象的拖拽操作时,可以对拖拽气泡1501执行拖拽操作,手机可以将拖拽气泡1501又改变为第一对象的阴影,供用户继续对第一对象进行拖拽。可选地,拖拽气泡1501和第一对象的阴影的表现样式可以相同,也可以不同,在此不作限制。应当理解,用户对拖拽气泡1501执行拖拽操作时,后续对第一对象的拖拽过程与前述实施例中所述的对第一对象的拖拽过程相同,不再赘述。
可选地,对于上述用户将第一对象拖拽至接收端的显示界面时,如果接收端当前的显示界面为桌面,接收端不显示应用推荐面板的实施例而言,一些可能的场景中,接收端的桌面可以包括多个界面,如:接收端的桌面可以包括一个主界面和一个或多个其他界面。主界面和其他界面中都可以包括应用图标。对于该场景,本申请实施例中,接收端还可以支持用户拖拽第一对象在主界面和其他界面中进行滑动切换,以选择用户想要使用的应用。
例如,以接收端为手机为例,图16为本申请实施例提供的手机的桌面的又一示意图。如图16所示,手机的桌面可以包括如图16中的(a)所示的主界面、以及如图16中的(b)所示的其他界面,图16中的(a)所示的主界面和图16中的(b)所示的其他界面可以彼此切换显示在前端。图17为本申请实施例提供的手机切换显示界面的场景示意图。如图17所示,假设用户将第一对象拖拽至手机的桌面时,手机前端显示的是图16中的(a)所示的主界面,此时,如果用户想要在图16中的(b)所示的其他界面中寻找打开第一对象的应用的应用图标,则用户可以将第一对象拖拽至图16中的(a)所示的主界面的边缘,手机可以响应于用户将第一对象拖拽至图16中的(a)所示的主界面的边缘的操作,将显示界面由图16中的(a)所示的主界面切换/滑动至图16中的(b)所示的其他界面。从而,用户可以在图16中的(b)所示的其他界面中继续拖拽第一对象,选择可以打开第一对象的应用。
示例性地,以手机的操作系统为安卓系统为例,当用户将第一对象拖拽至图16中的(a)所示的主界面的边缘时,手机可以将该拖拽移动事件透传到桌面启动器(launcher)上,控制桌面响应该拖拽移动事件,滑到下一界面。
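示例性地,桌面启动器响应拖拽移动事件并在拖拽阴影到达屏幕边缘时滑动到下一界面,一种可能的实现示意如下(其中PageSwitcher接口及snapToNextPage等方法名为示例性假设,并非launcher的既有接口):

```java
import android.view.DragEvent;
import android.view.View;

public class LauncherEdgeScrollListener implements View.OnDragListener {
    private static final int EDGE_ZONE_PX = 60; // 屏幕边缘判定区域宽度(示例值)
    private final PageSwitcher pageSwitcher;    // 假设的翻页接口,由桌面启动器实现

    public interface PageSwitcher {
        void snapToNextPage();
        void snapToPrevPage();
    }

    public LauncherEdgeScrollListener(PageSwitcher pageSwitcher) {
        this.pageSwitcher = pageSwitcher;
    }

    @Override
    public boolean onDrag(View workspace, DragEvent event) {
        if (event.getAction() == DragEvent.ACTION_DRAG_LOCATION) {
            float x = event.getX();
            if (x >= workspace.getWidth() - EDGE_ZONE_PX) {
                // 拖拽阴影到达右侧边缘,切换到下一界面;实际实现中可增加停留时长/防抖判断,避免连续翻页
                pageSwitcher.snapToNextPage();
            } else if (x <= EDGE_ZONE_PX) {
                pageSwitcher.snapToPrevPage(); // 到达左侧边缘,切换回上一界面
            }
        }
        return true;
    }
}
```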
可选地,其他一些实现方式中,用户想要切换接收端的桌面至下一个界面(如由主界面切换至其他界面)时,也可以先结束对第一对象的拖拽操作,触发接收端将第一对象的阴影改变为一个泡泡形状浮窗,并将第一对象对应的泡泡形状浮窗吸附(移动)到桌面的边缘进行显示。然后,用户可以通过正常的切屏操作(如滑动手机屏幕)将接收端的显示界面由主界面切换至其他界面。当接收端的显示界面由主界面切换至其他界面时,第一对象对应的泡泡形状浮窗仍然吸附在桌面(此时为其他界面)的边缘进行显示,用户可以基于第一对象对应的泡泡形状浮窗继续执行对第一对象的拖拽操作,在其他界面中继续拖拽第一对象,选择可以打开第一对象的应用。本申请对如何触发手机切换显示界面的操作的具体方式不作限制。
可选地,对于上述用户将第一对象拖拽至接收端的显示界面时,如果接收端当前的显示界面为桌面,接收端不显示应用推荐面板的实施例而言,另外一些可能的场景中,接收端的桌面中也可能包括:文件夹,文件夹中包括一个或多个应用图标。对于该场景,本申请实施例中,接收端还可以支持用户拖拽第一对象至文件夹中包括的应用图标上,供用户从文件夹中包括的应用图标中选择想要使用的应用。
例如,以接收端为手机为例,图18A为本申请实施例提供的手机的桌面的又一示意图。如图18A所示,手机的桌面可以包括应用A的应用图标、应用B的应用图标、以及“文件夹1”1801,“文件夹1”1801中可以包括应用D的应用图标、应用E的应用图标、以及应用F的应用图标。图18B为本申请实施例提供的手机打开“文件夹1”1801的场景示意图。如图18B所示,假设用户将第一对象拖拽至如图18A所示的手机的桌面时,如果用户想要选择“文件夹1”1801中的某个应用图标对应的应用打开第一对象,则用户可以将第一对象拖拽至“文件夹1”1801上悬停第二时长(如1秒),手机可以响应于用户将第一对象拖拽至“文件夹1”1801上悬停第二时长的操作,打开“文件夹1”1801,显示“文件夹1”1801对应的显示界面。“文件夹1”1801对应的显示界面可以包括应用D的应用图标、应用E的应用图标、以及应用F的应用图标。用户可以继续拖拽第一对象,基于应用D的应用图标、应用E的应用图标、以及应用F的应用图标中的任意一个应用图标,选择该应用图标对应的应用打开第一对象。
示例性地,手机可以模拟一个点击事件注入系统。用户将第一对象拖拽至文件夹1上悬停第二时长时,点击事件注入系统可以生成一个点击打开“文件夹1”1801的事件,点击打开“文件夹1”1801的事件可以触发手机打开文件夹1,显示文件夹1对应的显示界面。
可选地,对于上述用户将第一对象拖拽至接收端的显示界面时,如果接收端当前的显示界面为桌面,接收端不显示应用推荐面板的实施例而言,还有一些可能的场景中,接收端的桌面中也可能包括:大文件夹(如第一文件夹)。相对于上述图18A-图18B中所示的文件夹而言,大文件夹中也可以包括多个应用图标,但对于大文件夹包括的多个应用图标,大文件夹中可以按照与桌面类似的方式直接显示至多N个应用图标,如N可以是8、9等。当大文件夹包括的应用图标的数量大于N时,对于N个之外的应用图标(或者说剩余的应用图标),可以在大文件夹中折叠为一个折叠按钮,显示在大文件夹的最后。用户想要查看N个之外的应用图标时,可以点击该折叠按钮。接收端可以响应于用户点击折叠按钮的操作,显示大文件夹的全部应用图标。对于前述大文件夹的场景,本申请实施例中,接收端还可以支持用户拖拽第一对象至大文件夹中包括的应用图标上,供用户从大文件夹中包括的应用图标中选择想要使用的应用。前述大文件夹中包括的应用图标也可以称为第二应用标识。
例如,同样以接收端为手机为例,图19A为本申请实施例提供的手机的桌面的又一示意图。如图19A所示,手机的桌面可以包括“文件夹1”1901,“文件夹1”1901以大文件夹的形式展现。“文件夹1”1901中包括:应用A、应用B、应用C…应用I等应用的应用图标。其中,应用A至应用H的应用图标可以在“文件夹1”1901中直接显示,应用I以及应用I之后的其他应用的应用图标在“文件夹1”1901中的最后叠加显示为一个重叠按钮1902。假设用户将第一对象拖拽至如图19A所示的手机的桌面后,如果用户想要选择应用A至应用H中的某个应用打开第一对象,则用户可以直接将第一对象拖拽至“文件夹1”1901中显示的该应用的应用图标上,手机可以响应于将第一对象拖拽至该应用的应用图标上的操作,通过该应用打开第一对象。
图19B为本申请实施例提供的触发手机显示“文件夹1”1901的全部应用图标的场景示意图。如图19B所示,如果用户想要选择应用I以及应用I之后的其他应用中的某个应用打开第一对象,则用户可以先将第一对象拖拽至重叠按钮1902上悬停第二时长(如1秒),手机可以响应于用户将第一对象拖拽至重叠按钮1902上悬停第二时长的操作,显示“文件夹1”1901的全部应用图标。然后,用户可以继续拖拽第一对象至应用I以及应用I之后的其他应用中的某个应用的应用图标上以打开第一对象,或者,也可以拖拽第一对象至应用A至应用H中的某个应用的应用图标上以打开第一对象。其中,图19B中示例性地以“文件夹1”1901的全部应用图标包括应用A至应用L的应用图标为例。
一些实施例,用户也可以通过多指触控的方式点击重叠按钮1902,触发手机显示“文件夹1”1901的全部应用图标。例如,假设用户用某个手指拖动第一对象至接收端的显示界面,则用户可以使用另一个手指点击重叠按钮1902,手机可以响应于用户点击重叠按钮1902的操作,显示“文件夹1”1901的全部应用图标。
可选地,当“文件夹1”1901的全部应用图标无法在一个页面全部呈现时,也可以参照前述实施例中所述的方式滚动/滑动显示,在此不再赘述。
可选地,对于上述用户将第一对象拖拽至接收端的显示界面时,如果接收端当前的显示界面为桌面,接收端不显示应用推荐面板的实施例而言,当用户通过执行对第一对象的拖拽操作,将第一对象拖拽至某个应用图标(可以是桌面中的应用图标,也可以是文件夹或大文件夹中的应用图标)上时,如果该应用图标对应的应用可以打开第一对象,且包括多个可以提供的服务(或称为功能),则接收端也可以在该应用图标旁显示(弹出)该应用图标对应的应用的服务菜单(或称为功能菜单),服务菜单中可以包括一个或多个该应用图标对应的应用包括的服务的服务标识。用户可以继续执行上述对第一对象的拖拽操作,将第一对象拖拽至服务菜单中显示的某个服务标识上,接收端可以响应于用户将第一对象拖拽至服务菜单中显示的某个服务标识上的操作,通过该应用标识对应的应用打开第一对象,并启动该服务标识对应的服务对第一对象进行处理/操作。例如,服务菜单的显示方式可以与上述图12所示类似,其区别在于,本实施例中,服务菜单是在桌面进行显示。当用户通过执行对第一对象的拖拽操作,将第一对象拖拽至某个应用图标上时,如果该应用图标对应的应用无法打开第一对象,也可以按照前述实施例中所述的方式,显示第一提示信息提示用户。对于上述用户将第一对象拖拽至接收端的显示界面时,如果接收端当前的显示界面为桌面,接收端不显示应用推荐面板的实施例而言,当用户通过执行对第一对象的拖拽操作,将第一对象拖拽至某个应用图标时,接收端显示的该应用图标对应的服务菜单可以称为第三窗口。如:接收端可以接收将第一对象拖拽至第二应用标识的操作,并响应于将第一对象拖拽至第二应用标识的操作,显示第三窗口,第三窗口包括第二应用标识对应的应用包括的一个或多个服务的服务标识。
可选地,以上各实施例中所述的接收端响应于用户将第一对象拖拽至某个应用标识(应用图标)上的操作,通过该应用标识对应的应用打开第一对象,或者,显示该应用标识对应的应用的服务菜单,可以是指:当用户将第一对象拖拽至某个应用标识(应用图标)上、且停留时长大于第三时长时,接收端响应于用户将第一对象拖拽至该应用标识(应用图标)上的操作,通过该应用标识对应的应用打开第一对象,或者,显示该应用标识对应的应用的服务菜单。例如,第三时长可以是2秒、3秒等。
类似地,以上各实施例中所述的接收端响应于用户将第一对象拖拽至某个服务标识上的操作,通过该服务标识对应的服务打开第一对象,可以是指:当用户将第一对象拖拽至某个服务标识上、且停留时长大于第三时长时,接收端响应于用户将第一对象拖拽至该服务标识上的操作,通过该服务标识对应的服务打开第一对象。
可选地,以上实施例中分别介绍了显示应用推荐面板和不显示应用推荐面板的情况。还有一些实施例中,当用户将第一对象拖拽至接收端的显示界面时,接收端还可以根据第一对象的阴影或指针所处的位置(即第一对象被拖拽的位置),确定显示应用推荐面板或显示服务菜单。例如,当第一对象被拖拽的当前位置为空白区域(即指针停留在空白区域)时,接收端可以显示上述应用推荐面板。当第一对象被拖拽的当前位置为某个应用图标上(即指针停留在应用图标上)时,接收端可以显示该应用图标对应的应用的服务菜单。
一些实施例中,用户将第一对象拖拽至接收端的显示界面时,接收端的显示界面也可能包括应用对应的卡片。卡片也称FA卡片,是直接出现在手机桌面上,同时具备一定功能的区域,这些功能是卡片对应的应用可以提供的。用户可以将第一对象拖拽至卡片上,通过卡片对应的应用打开第一对象,或者,将第一对象拖拽至卡片对应的某个服务上,通过该服务打开第一对象。其中,卡片对应的服务是指卡片对应的应用可以提供的服务。
以手机的显示界面包括图库对应的卡片为例,图19C为本申请实施例提供的手机的显示界面的又一示意图。如图19C所示,手机的显示界面中可以包括图库对应的卡片。用户将第一对象拖拽至图19C所示的手机显示界面后,可以继续将第一对象拖拽至图库对应的卡片上,手机可以响应于将第一对象拖拽至图库对应的卡片的操作,通过图库打开第一对象。或者,请继续参考图19C所示,图库对应的卡片中还可以包括“微电影创作”、“自由创作”、“拼图创作”等服务的服务标识(如图中的图案和文字),用户也可以将第一对象拖拽至前述“微电影创作”、“自由创作”、“拼图创作”等服务对应的任意一个服务标识上,手机可以响应于将第一对象拖拽至该服务标识上的操作,通过该服务标识对应的服务打开第一对象。如:用户可以将图片拖拽至“拼图创作”上,手机可以启动图库的“拼图创作”服务打开图片供用户进行拼图创作。
当用户通过执行对第一对象的拖拽操作,将第一对象拖拽至某个卡片或卡片中的服务标识上时,如果该卡片对应的应用或服务标识对应的服务无法打开第一对象,则也可以按照前述实施例中所述的方式,显示提示信息提示用户,如:第二提示信息,或者第二提示信息也可以称作第一提示信息,不作限制。前述卡片中的服务标识可以称为第二服务标识。
以上实施例中所述的应用图标、文件夹、大文件夹、卡片等中的一种或多种也可以同时显示在手机显示界面中,在此不作限制。
可选地,还有一些实施例中,用户将第一对象拖拽至接收端的显示界面时,不论接收端当前的显示界面是否为桌面,接收端均可以显示应用推荐面板,但用户可以主动关闭应用推荐面板。
例如,以接收端为手机为例,图20为本申请实施例提供的手机的显示界面的又一示意图。如图20所示,用户将第一对象拖拽至接收端的显示界面时,接收端可以显示应用推荐面板。应用推荐面板中可以包括一个关闭按钮2001,用户可以将第一对象拖拽至关闭按钮2001上,并在关闭按钮2001上停留预设时长(如第五时长),手机可以响应于用户将第一对象拖拽至关闭按钮2001上,并在关闭按钮2001上停留预设时长的操作,关闭应用推荐面板。如:手机可以响应于用户将第一对象拖拽至关闭按钮2001上,并在关闭按钮2001上停留预设时长的操作,模拟一个点击关闭按钮2001的事件,点击关闭按钮2001的事件可以触发手机关闭应用推荐面板。例如,如果手机的显示界面当前为桌面,则手机关闭应用推荐面板后,显示界面会恢复至桌面。用户可以按照前述实施例所述的方式,将第一对象拖拽至桌面的某个应用图标上。
一些实施例,用户将第一对象拖拽至接收端的显示界面时,也可以通过多指触控的方式点击关闭按钮2001,触发手机关闭应用推荐面板。例如,假设用户用某个手指拖动第一对象至接收端的显示界面,则用户可以使用另一个手指点击关闭按钮2001,手机可以响应于用户点击关闭按钮2001的操作,关闭应用推荐面板。
另外一些实施例中,用户将第一对象拖拽至接收端的显示界面时,也可以先结束对第一对象的拖拽操作,触发手机将第一对象的阴影改变为一个泡泡形状浮窗,并将第一对象对应的泡泡形状浮窗吸附(移动)到桌面的边缘进行显示。然后,用户可以使用鼠标或手指点击关闭按钮2001。当显示界面恢复至桌面后,用户可以基于第一对象对应的泡泡形状浮窗继续执行对第一对象的拖拽操作,按照前述实施例中所述的方式,拖拽第一对象至桌面上显示的某个应用图标上,在此不再赘述。
可选地,本实施例中,当应用推荐面板是非全屏显示时,用户还可以通过点击显示界面中应用推荐面板所在区域之外的区域,或者,第一对象拖拽至应用推荐面板所在区域之外的区域上,并在应用推荐面板所在区域之外的区域上停留预设时长(如第五时长),触发接收端关闭应用推荐面板,本申请对关闭应用推荐面板的操作方式不作限制。
上述关闭应用推荐面板的操作也即关闭第一窗口的操作,接收端可以响应于上述任意一种关闭第一窗口的操作,关闭第一窗口。接收端响应于关闭第一窗口的操作,关闭第一窗口后,用户可以执行将第一对象拖拽至第二应用标识的操作,第二应用标识为桌面中包括的应用标识中的一个。响应于将第一对象拖拽至第二应用标识的操作,当第二应用标识对应的应用为支持拖入第一对象的应用时,接收端可以打开第二应用标识对应的应用,并将第一对象传递给第二应用标识对应的应用;或者,显示第三窗口(即桌面图标对应的应用的服务菜单),第三窗口包括第二应用标识对应的应用包括的一个或多个服务的服务标识。或者,当第二应用标识对应的应用为不支持拖入第一对象的应用时,接收端可以显示第一提示信息,第一提示信息用于提示第二应用标识对应的应用不支持拖入第一对象。具体可以参考前述关于拖拽第一对象至桌面图标的相关实施例中所述,此处不再赘述。
还有一些可能的场景中,用户在拖拽第一对象的过程中,拖拽行为可能发生中断。导致拖拽行为发生中断的因素可能是用户主动中断,或者,被动中断。例如,接收端的显示界面(或者应用推荐面板)中可以包括一个中断区域,用户可以拖拽第一对象至中断区域并结束对第一对象的拖拽操作,实现主动中断拖拽行为。或者,前述实施例中所述的用户将第一对象拖拽至无法打开第一对象的应用对应的应用图标上、并结束对第一对象的拖拽操作,或者当用户将第一对象拖拽至桌面的空白区域、并结束对第一对象的拖拽操作,也可以认为是用户主动中断了拖拽行为。又例如,接收端接收到了电话或者某些通知,又或者接收端弹出某些系统弹窗时,可能导致拖拽行为中断,这类中断场景可以认为是被动中断。
对于上述用户在拖拽第一对象的过程中拖拽行为发生中断的情况,本申请实施例中,当拖拽行为发生中断时,接收端可以将第一对象的阴影改变为一个泡泡形状浮窗,如:悬浮球、气泡(或称为拖拽气泡)等形式,并将第一对象对应的泡泡形状浮窗吸附(移动)到桌面的边缘进行显示。该第一对象对应的泡泡形状浮窗可以支持用户对第一对象再次进行拖拽操作,用户重新点击第一对象对应的泡泡形状浮窗并进行拖拽时,接收端可以将第一对象对应的泡泡形状浮窗又改变为第一对象的阴影,供用户继续对第一对象进行拖拽。例如,第一对象对应的泡泡形状浮窗可以参考前述图15所示,在此不再用附图表示。
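示例性地,以安卓系统为例,将第一对象的阴影改变为吸附在屏幕边缘的泡泡形状浮窗(第一浮窗)的一种可能实现,是通过WindowManager在屏幕边缘添加一个小尺寸的悬浮视图,以下代码仅为示意(DragBubbleHelper等名称为示例性假设;使用TYPE_APPLICATION_OVERLAY通常需要系统权限或悬浮窗权限):

```java
import android.content.Context;
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

public class DragBubbleHelper {
    // 在屏幕右侧边缘显示第一对象对应的泡泡形状浮窗(拖拽气泡)
    public static void showBubbleAtEdge(Context context, View bubbleView) {
        WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                PixelFormat.TRANSLUCENT);
        lp.gravity = Gravity.END | Gravity.CENTER_VERTICAL; // 吸附到屏幕右侧边缘显示
        wm.addView(bubbleView, lp);
        // 用户重新拖拽bubbleView时,可移除该浮窗并恢复为第一对象的拖拽阴影,
        // 后续拖拽流程与前述实施例所述相同
    }
}
```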
需要说明的是,对于前述显示应用推荐面板的实施例而言,当用户重新点击第一对象对应的泡泡形状浮窗并进行拖拽时,接收端可以再次显示应用推荐面板。对于前述不显示应用推荐面板的实施例而言,当用户重新点击第一对象对应的泡泡形状浮窗并进行拖拽时,接收端可以不显示应用推荐面板。也即,当用户重新点击第一对象对应的泡泡形状浮窗并进行拖拽时接收端的响应动作,可以与第一对象第一次被拖拽至接收端的显示界面时接收端的响应动作保持一致。本申请中,前述泡泡形状浮窗可以是前述实施例中所述的第一浮窗,显示第一浮窗的位置可以称为第一位置。示例性地,第一位置可以是桌面的边缘。
可选地,本申请还有一些实施例中,当接收端显示应用推荐面板时,应用推荐面板中不仅可以包括应用标识,还可以包括部分或全部应用对应的服务的服务标识。与应用标识类似,这些服务标识可以是人为定义或预设的;或者,也可以是接收端根据第一对象的类型所确定的,在此不作限制。或者,接收端也可以仅显示服务推荐面板(不显示应用推荐面板),服务推荐面板中可以包括一些应用对应的服务的服务标识,不包括应用标识。也即,第一窗口中可以包括一个或多个应用对应的应用标识,和/或,一个或多个服务对应的服务标识。
还需要说明的是,以上实施例中所述的打开第一对象的应用并不局限于能够通过启动应用界面显示第一对象的具体内容的应用,在一些可能的场景中,打开第一对象的应用还可以包括:可以对第一对象进行操作、获取第一对象中包含的相关信息的应用,或者,可以收藏第一对象的应用等。例如,第一对象可以是一张二维码图片,打开第一对象的应用可以包括:能够显示该二维码图片的图库应用、可以对该二维码图片进行扫描获取二维码包含的信息的扫码应用(如相机)、可以收藏该二维码图片的收藏应用(如备忘录、收藏夹等)。应当理解,关于对“可以打开第一对象的应用”的定义并不限于前述示例性给出的几种解释,本申请对可以打开第一对象的应用并不作限制。前述可以打开第一对象的应用可以称为支持拖入第一对象的应用,对应的,无法打开第一对象的应用可以称为不支持拖入第一对象的应用。
可选地,本申请一些实施例中,用户在源端的显示界面中拖拽第一对象时,源端可以提示用户能够对第一对象进行跨设备拖拽。例如,当源端检测到用户开始拖拽源端的显示界面(如第二界面)中的第一对象时,源端可以显示一个用户界面(user interface,UI)动效,如第二UI动效,用于提示用户能够对第一对象进行跨设备拖拽。其中,在第二界面拖拽第一对象的操作可以称为第二操作,UI动效的显示位置与接收端相对于源端的方位相关,如:UI动效的显示位置可以是源端的屏幕边缘,且该屏幕边缘为与源端连接的接收端所在的一侧。例如,源端的屏幕可以分为上侧屏幕边缘、下侧屏幕边缘、左侧屏幕边缘、以及右侧屏幕边缘,则当与源端连接的接收端所在的一侧为源端的右侧时,UI动效的显示位置可以是源端的右侧屏幕边缘。用户将第一对象由源端的显示界面向接收端的显示界面进行拖拽的过程中,第一对象的拖拽阴影在源端的显示界面中所在的位置会逐渐靠近上述UI动效,或者说逐渐靠近UI动效所在的屏幕边缘。随着第一对象的拖拽阴影在源端的显示界面中所在的位置逐渐靠近UI动效所在的屏幕边缘,源端可以逐步加强UI动效的显示效果,如:可以增大UI动效的显示范围。当第一对象被拖拽至UI动效的显示区域中时(即第一对象的拖拽阴影进入UI动效的显示区域中时),源端可以将与第一对象的拖拽阴影所在位置相关区域的UI动效进行高亮显示或颜色加强。其中,UI动效颜色加强时的颜色可以和第一对象的颜色或源端的显示界面的颜色相关,如:UI动效颜色加强时的颜色可以和第一对象的颜色相同,或者,源端的显示界面为桌面时,UI动效颜色加强时的颜色可以和桌面壁纸的颜色相同或相近。
示例性地,同样以源端为PC、接收端为手机,PC的输入设备为鼠标为例,图21为本申请实施例提供的PC显示UI动效的示意图。如图21所示,PC的显示界面2101中可以包括第一对象2102、以及鼠标的指针(图中的小箭头,未标出)。用户可以将鼠标的指针移动至第一对象2102上,点击长按鼠标的左键并移动鼠标,从而,可以对第一对象2102进行拖拽。当PC检测到用户开始拖拽第一对象2102时,PC可以显示UI动效2103,UI动效2103可以用于提示用户能够对第一对象2102进行跨设备拖拽。
其中,UI动效2103的显示位置与手机相对于PC的方位相关,如:当手机位于PC的右侧时,UI动效2103的显示位置可以是PC的右侧屏幕边缘。
图22为本申请实施例提供的PC显示UI动效的另一示意图。如图22所示,用户将第一对象2102由PC的显示界面向手机的显示界面进行拖拽的过程中,第一对象2102的拖拽阴影在PC的显示界面中所在的位置会逐渐靠近UI动效2103所在的屏幕边缘。随着第一对象2102的拖拽阴影在PC的显示界面中所在的位置逐渐靠近UI动效2103所在的屏幕边缘,PC可以逐渐增大UI动效2103的显示范围。
一种实现方式中,PC增大UI动效2103的显示范围,可以包括:PC将UI动效2103的所有区域同步增大,即,UI动效2103的所有区域增大的程度相同,如:增加同等宽度。
另一种实现方式中,PC增大UI动效2103的显示范围时,UI动效2103的显示范围增大的区域可以与第一对象2102的拖拽阴影在PC的显示界面中所在的位置相关。例如,PC可以仅增大UI动效2103中与第一对象2102的拖拽阴影在PC的显示界面中所在的位置靠近的部分区域,如:增大UI动效2103的30%的区域,这30%的区域靠近第一对象2102的拖拽阴影在PC的显示界面中所在的位置。又例如,PC可以增大UI动效2103的所有区域,但UI动效2103的区域中与第一对象2102的拖拽阴影在PC的显示界面中所在的位置靠近的部分区域增大的程度更大。
相对于PC将UI动效2103的所有区域同步增大的实现方式而言,UI动效2103的显示范围增大的区域与第一对象2102的拖拽阴影在PC的显示界面中所在的位置相关,可以呈现出随着第一对象2102的拖拽阴影的靠近而吸引放大的效果。本申请对PC增大UI动效2103的显示范围的具体实现方式不作限制。
可选地,当第一对象2102的拖拽阴影在PC的显示界面中所在的位置接触到UI动效2103的显示区域、或者第一对象2102的拖拽阴影在PC的显示界面中所在的位置完全进入到UI动效2103的显示区域时,PC可以不再增大UI动效2103的显示范围。或者,PC中也可以预设有UI动效2103的最大显示范围,当UI动效2103的显示范围达到最大显示范围时,PC不再增大UI动效2103的显示范围。
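示例性地,随着拖拽阴影靠近屏幕边缘而逐步加强UI动效、并在达到最大显示范围后不再增大,一种可能的计算方式如下,其中的各取值均为示例性假设:

```java
public class EdgeGlowCalculator {
    private static final float MIN_WIDTH_PX = 20f;     // UI动效的初始显示宽度(示例值)
    private static final float MAX_WIDTH_PX = 120f;    // UI动效的最大显示范围(示例值)
    private static final float EFFECT_RANGE_PX = 600f; // 开始加强动效的距离范围(示例值)

    // 根据拖拽阴影到屏幕边缘的距离,计算UI动效当前的显示宽度:
    // 距离越近显示范围越大,达到最大显示范围后不再增大
    public static float glowWidthFor(float distanceToEdgePx) {
        if (distanceToEdgePx >= EFFECT_RANGE_PX) {
            return MIN_WIDTH_PX;
        }
        float progress = 1f - distanceToEdgePx / EFFECT_RANGE_PX; // 取值0~1,越靠近边缘越大
        return MIN_WIDTH_PX + progress * (MAX_WIDTH_PX - MIN_WIDTH_PX);
    }
}
```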
图23为本申请实施例提供的PC显示UI动效的又一示意图。如图23所示,当第一对象2102被拖拽至UI动效2103的显示区域中时,PC还可以将与第一对象2102的拖拽阴影的所在位置相关区域的UI动效2103进行高亮显示或颜色加强。图23中以斜线填充区域表示高亮显示或颜色加强的效果。
其中,UI动效2103颜色加强时的颜色可以和第一对象2102的颜色或PC的显示界面的颜色相关,如:UI动效2103颜色加强时的颜色可以和第一对象2102的颜色相同,或者,PC的显示界面为桌面时,UI动效2103颜色加强时的颜色可以和桌面壁纸的颜色相近,如:选择桌面壁纸主色调中的颜色。本申请对UI动效2103颜色加强时的颜色不作限制,如:UI动效2103颜色加强时的颜色也可以是默认的某个颜色。
一种实现方式中,UI动效2103进行高亮显示或颜色加强的区域可以与第一对象2102的拖拽阴影在UI动效2103的显示区域中所占的区域相同(图23中是以此为例)。
另外一种实现方式中,UI动效2103进行高亮显示或颜色加强的区域也可以大于第一对象2102的拖拽阴影在UI动效2103的显示区域中所占的区域。
又一种实现方式中,UI动效2103进行高亮显示或颜色加强的区域也可以是固定大小,与第一对象2102的拖拽阴影在UI动效2103的显示区域中所占的区域无关,本申请对UI动效2103进行高亮显示或颜色加强的区域也不作限制。
可选地,PC在某一侧屏幕边缘(如右侧屏幕边缘)在显示UI动效时,UI动效距离相邻的屏幕边缘(如上下两侧屏幕边缘)可以保持一定距离,如:UI动效在右侧屏幕边缘所占的区域可以为整个右侧屏幕边缘中间的80%,距离上下两侧屏幕边缘分别间隔10%的距离。
需要说明的是,以上图21至图23中虽然是以波浪形状示出了UI动效的显示效果,在其他一些示例中,UI动效也可以呈现为矩形、或者其他规则或不规则形状,本申请对UI动效的形状也不作限制。
上述实施例以源端为PC、接收端为手机为例,说明了源端通过UI动效提示用户能够对第一对象进行跨设备拖拽,以及第一对象在源端的显示界面中移动的过程中UI动效的变化。可选地,一些实施例中,当第一对象被拖拽出源端的屏幕边缘时,接收端的屏幕边缘也可以显示UI动效,体现第一对象的穿越效果。本申请中,接收端的屏幕边缘显示的UI动效可以称为第一UI动效,上述源端的屏幕边缘显示的UI动效可以称为第二UI动效。第一UI动效可以用于提示用户第一对象被拖入接收端的显示界面(如第一界面)。
例如,图24为本申请实施例提供的手机显示UI动效的示意图。如图24所示,当第一对象2102被拖拽出PC的屏幕边缘时,手机的屏幕边缘可以显示UI动效2401。可以理解,手机显示UI动效2401的屏幕边缘与PC显示UI动效2103的屏幕边缘相对,如:PC在右侧屏幕边缘显示UI动效2103,则手机在左侧屏幕边缘显示UI动效2401。
其中,手机显示的UI动效2401与PC显示的UI动效2103可以相同(如均为波浪形状),也可以不同。下面均以手机显示的UI动效2401也为波浪形状为例,其他示例中,手机显示的UI动效2401也可以是矩形、或者其他规则或不规则形状。
可选地,手机显示UI动效2401的区域与鼠标指针或者第一对象2102的拖拽阴影在手机的显示界面中被拖入的位置相关。例如,手机可以在第一对象2102的拖拽阴影在手机的显示界面中被拖入的位置(即出现在手机屏幕边缘上的位置)上下一定范围(如预设的距离范围)内显示UI动效2401。其中,第一对象2102的拖拽阴影在手机的显示界面中被拖入的位置可以根据第一对象2102的拖拽阴影在PC的显示界面中被拖出的位置确定,如:第一对象2102的拖拽阴影在PC的显示界面中被拖出的位置距离PC的上侧(顶部)屏幕边缘之间的距离为上下两侧屏幕之间距离的30%,则第一对象2102的拖拽阴影在手机的显示界面中被拖入的位置距离手机的上侧(顶部)屏幕边缘之间的距离也可以为上下两侧屏幕之间距离的30%。或者,第一对象2102的拖拽阴影在手机的显示界面中被拖入的位置也可以是物理空间内相对的接收位置,在此不作限制。
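示例性地,按照源端拖出位置在屏幕高度方向上的比例换算接收端被拖入位置的一种可能的计算方式如下,代码仅为示意:

```java
public class EntryPositionMapper {
    // exitY: 拖拽阴影在源端(如PC)被拖出屏幕边缘时的纵坐标
    // sourceHeight: 源端屏幕高度;targetHeight: 接收端(如手机)屏幕高度
    // 返回值: 拖拽阴影出现在接收端屏幕边缘上的纵坐标
    public static float mapEntryY(float exitY, float sourceHeight, float targetHeight) {
        float ratio = exitY / sourceHeight; // 例如拖出位置距离顶部30%,则比例为0.3
        return ratio * targetHeight;        // 拖入位置同样距离接收端顶部30%
    }
}
```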
图25为本申请实施例提供的手机显示UI动效的另一示意图。如图25所示,用户将第一对象2102在手机的显示界面中继续进行拖拽时,第一对象2102的拖拽阴影在手机的显示界面中所在的位置会逐渐远离UI动效2401所在的屏幕边缘。随着第一对象2102的拖拽阴影在手机的显示界面中所在的位置逐渐远离UI动效2401所在的屏幕边缘,手机可以逐渐增大UI动效2401的显示范围。
其中,手机逐渐增大UI动效2401的显示范围的具体方式与PC增大UI动效2103的显示范围的方式类似,可以将UI动效2401的所有区域同步增大,或者,UI动效2401的显示范围增大的区域可以与第一对象2102的拖拽阴影在手机的显示界面中所在的位置相关。
可选地,当第一对象2102的拖拽阴影在手机的显示界面中所在的位置离开UI动效2401的初始显示区域(即,第一对象2102刚被拖入手机的显示界面时UI动效2401的显示区域)时,手机可以不再增大UI动效2401的显示范围。或者,手机中也可以预设有UI动效2401的最大显示范围,当UI动效2401的显示范围达到最大显示范围时,手机不再增大UI动效2401的显示范围。
其他一些实现方式中,随着第一对象2102的拖拽阴影在手机的显示界面中所在的位置逐渐远离UI动效2401所在的屏幕边缘,手机也可以不增大UI动效2401的显示范围。也即,UI动效2401的显示范围可以保持不变。
请继续参考图25所示,当第一对象2102被拖拽至第一对象2102的拖拽阴影远离屏幕边缘的过程中,手机还可以将与第一对象2102的拖拽阴影的所在位置相关区域的UI动效2401进行高亮显示或颜色加强。图25中以斜线填充区域表示高亮显示或颜色加强的效果。
其中,UI动效2401颜色加强时的颜色可以和第一对象2102的颜色或手机的显示界面的颜色相关,如:UI动效2401颜色加强时的颜色可以和第一对象2102的颜色相同,或者,手机的显示界面为桌面时,UI动效2401颜色加强时的颜色可以和桌面壁纸的颜色相近,如:选择桌面壁纸主色调中的颜色。本申请对UI动效2401颜色加强时的颜色不作限制,如:UI动效2401颜色加强时的颜色也可以是默认的某个颜色。
一种实现方式中,UI动效2401进行高亮显示或颜色加强的区域可以与第一对象2102的拖拽阴影在UI动效2401的显示区域中所占的区域相同(图25中是以此为例)。
另外一种实现方式中,UI动效2401进行高亮显示或颜色加强的区域也可以大于第一对象2102的拖拽阴影在UI动效2401的显示区域中所占的区域。
又一种实现方式中,UI动效2401进行高亮显示或颜色加强的区域也可以是固定大小,与第一对象2102的拖拽阴影在UI动效2401的显示区域中所占的区域无关,本申请对UI动效2401进行高亮显示或颜色加强的区域也不作限制。
图26为本申请实施例提供的手机显示UI动效的又一示意图。如图26所示,用户将第一对象2102在手机的显示界面中继续进行拖拽,使得第一对象2102的拖拽阴影远离屏幕边缘的过程中,手机还可以显示第一对象2102的拖尾效果2601。也即,在第一对象2102被拖拽至第一对象2102的拖拽阴影远离屏幕边缘的过程中,手机可以显示第一对象2102的拖尾效果。
一些可能的示例中,拖尾效果2601可以通过高光投影显示的方式呈现。
拖尾效果2601的显示区域可以大于、小于、或等于第一对象2102的拖拽阴影的显示区域。拖尾效果2601的显示区域可以跟随第一对象2102的拖拽阴影而移动。在拖尾效果2601的显示区域跟随第一对象2102的拖拽阴影移动的过程中,拖尾效果2601可以逐渐变小或保持不变,拖尾效果2601可以逐渐变淡直至消失。例如,当第一对象2102的拖拽阴影移动超过预设的距离(如5cm,6cm等)后,拖尾效果2601可以消失,即手机不再显示拖尾效果。
图27为本申请实施例提供的手机显示UI动效的又一示意图。如图27所示,当用户结束对第一对象2102进行拖拽的操作后,手机可以响应于用户结束对第一对象2102进行拖拽的操作,不再显示UI动效2401。也即,用户将第一对象2102释放后,手机屏幕边缘的UI动效2401可以消失。
需要说明的是,上述实施例中所述的UI动效(包括PC显示的UI动效2103和手机显示的UI动效2401)、拖尾效果等名称均为示例性说明。其他一些实施例中,UI动效也可以称为UI穿越动效、穿越动效、穿越光效等,拖尾效果也可以称为拖尾动效、高光投影等,本申请对UI动效和拖尾效果等名称均不作限制。
应当理解,以上各实施例中所述仅为对本申请实施例提供的跨设备拖拽方法的示例性说明。在其他一些可能的实现方式中,以上所述的各实施例也可以删减或增加某些执行步骤,或者以上实施例中所述的部分步骤的顺序也可以进行调整,本申请对此均不作限制。
对应于前述实施例中所述的跨设备拖拽方法,本申请实施例提供一种跨设备拖拽装置,该装置可以应用于上述接收端,用于实现前述实施例所述的跨设备拖拽方法中接收端所执行的步骤。如接收端可以是第一终端设备。该装置的功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与上述跨设备拖拽方法中接收端所执行的步骤相对应的模块或单元。例如,图28为本申请实施例提供的跨设备拖拽装置的结构示意图。如图28所示,该装置可以包括:显示单元2801、接收单元2802、以及处理单元2803。显示单元2801、接收单元2802、以及处理单元2803可以用于配合实现前述实施例所述的跨设备拖拽方法中接收端执行的步骤对应的功能。
如:显示单元2801可以用于显示第一界面;接收单元2802可以用于接收第一操作;处理单元2803可以用于响应于第一操作,控制显示单元显示第一窗口等。类似地,显示单元2801、接收单元2802、以及处理单元2803可以配合实现前述实施例所述的跨设备拖拽方法中接收端执行的全部步骤对应的功能,在此不再一一赘述。
本申请实施例还提供一种跨设备拖拽装置,该装置可以应用于上述源端,用于实现前述实施例所述的跨设备拖拽方法中源端所执行的步骤。如源端可以是第二终端设备。该装置的功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与上述跨设备拖拽方法中源端所执行的步骤相对应的模块或单元。例如,图29为本申请实施例提供的跨设备拖拽装置的另一结构示意图。如图29所示,该装置可以包括:显示单元2901、接收单元2902、以及处理单元2903。显示单元2901、接收单元2902、以及处理单元2903可以用于配合实现前述实施例所述的跨设备拖拽方法中源端执行的步骤对应的功能。
如:显示单元2901可以用于显示第二界面;接收单元2902可以用于接收第二操作;处理单元2903可以用于响应于第二操作,控制显示单元在第二界面的屏幕边缘显示第二UI动效等。类似地,显示单元2901、接收单元2902、以及处理单元2903可以配合实现前述实施例所述的跨设备拖拽方法中源端执行的全部步骤对应的功能,在此不再一一赘述。
应理解以上装置中模块(或称为单元)的划分仅仅是一种逻辑功能的划分,实际实现时可以全部或部分集成到一个物理实体上,也可以物理上分开。且装置中的单元可以全部以软件通过处理元件调用的形式实现;也可以全部以硬件的形式实现;还可以部分单元以软件通过处理元件调用的形式实现,部分单元以硬件的形式实现。
例如,各个单元可以为单独设立的处理元件,也可以集成在装置的某一个芯片中实现,此外,也可以以程序的形式存储于存储器中,由装置的某一个处理元件调用并执行该单元的功能。此外这些单元全部或部分可以集成在一起,也可以独立实现。这里所述的处理元件又可以称为处理器,可以是一种具有信号处理能力的集成电路。在实现过程中,上述方法的各步骤或以上各个单元可以通过处理器元件中的硬件的集成逻辑电路实现或者以软件通过处理元件调用的形式实现。
在一个例子中,以上装置中的单元可以是被配置成实施以上方法的一个或多个集成电路,例如:一个或多个专用集成电路(application specific integrated circuit,ASIC),或,一个或多个数字信号处理器(digital signal processor,DSP),或,一个或者多个现场可编程门阵列(field programmable gate array,FPGA),或这些集成电路形式中至少两种的组合。
再如,当装置中的单元可以通过处理元件调度程序的形式实现时,该处理元件可以是通用处理器,例如中央处理器(central processing unit,CPU)或其它可以调用程序的处理器。再如,这些单元可以集成在一起,以片上系统(system-on-a-chip,SOC)的形式实现。
在一种实现中,以上装置实现以上方法中各个对应步骤的单元可以通过处理元件调度程序的形式实现。例如,该装置可以包括处理元件和存储元件,处理元件调用存储元件存储的程序,以执行以上方法实施例所述的方法中源端或接收端执行的步骤。存储元件可以为与处理元件处于同一芯片上的存储元件,即片内存储元件。
在另一种实现中,用于执行以上方法的程序可以在与处理元件处于不同芯片上的存储元件,即片外存储元件。此时,处理元件从片外存储元件调用或加载程序于片内存储元件上,以调用并执行以上方法实施例所述的方法中源端或接收端执行的步骤。
本申请实施例还提供一种电子设备。该电子设备可以是上述源端或接收端。电子设备包括:处理器,用于存储处理器可执行指令的存储器;处理器被配置为执行所述指令时,使得电子设备实现如前述实施例所述的方法中源端或接收端执行的步骤。该存储器可以位于该电子设备之内,也可以位于该电子设备之外。且该处理器包括一个或多个。
示例性地,该电子设备可以是手机,也可以是平板电脑、可穿戴设备、车载设备、AR/VR设备、笔记本电脑、UMPC、上网本、PDA等。
在又一种实现中,该电子设备实现以上方法中各个步骤的单元可以是被配置成一个或多个处理元件,这里的处理元件可以为集成电路,例如:一个或多个ASIC,或,一个或多个DSP,或,一个或者多个FPGA,或者这些类集成电路的组合。这些集成电路可以集成在一起,构成芯片。
例如,本申请实施例还提供一种芯片,该芯片可以应用于上述电子设备。芯片包括一个或多个接口电路和一个或多个处理器;接口电路和处理器通过线路互联;处理器通过接口电路从电子设备的存储器接收并执行计算机指令,以实现如前述实施例所述的方法中源端或接收端执行的步骤。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。
基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,如:程序。该软件产品存储在一个程序产品,如计算机可读存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、ROM、RAM、磁碟或者光盘等各种可以存储程序代码的介质。
例如,本申请实施例还提供一种计算机可读存储介质,其上存储有计算机程序指令;当所述计算机程序指令被电子设备执行时,使得电子设备实现如实施例所述的方法中源端或接收端执行的步骤。
又例如,本申请实施例还提供一种计算机程序产品,包括:计算机可读代码,或者承载有计算机可读代码的非易失性计算机可读存储介质,当所述计算机可读代码在电子设备中运行时,所述电子设备中的处理器实现如前述实施例所述的方法中源端或接收端执行的步骤。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (32)

  1. 一种跨设备拖拽方法,其特征在于,所述方法应用于第一终端设备;所述方法包括:
    所述第一终端设备显示第一界面;
    所述第一终端设备接收第一操作,所述第一操作为将第一对象由第二终端设备的显示界面拖拽至所述第一界面的操作;
    响应于所述第一操作,所述第一终端设备显示第一窗口;所述第一窗口包括一个或多个应用对应的应用标识,和/或,一个或多个服务对应的服务标识。
  2. 根据权利要求1所述的方法,其特征在于,所述第一窗口中包括的应用标识与所述第一对象的类型相关。
  3. 根据权利要求2所述的方法,其特征在于,所述第一终端设备显示第一窗口之前,所述方法还包括:
    所述第一终端设备获取所述第一对象的类型;
    所述第一终端设备根据所述第一对象的类型,确定所述第一窗口中包括的应用标识。
  4. 根据权利要求3所述的方法,其特征在于,所述第一终端设备根据所述第一对象的类型,确定所述第一窗口中包括的应用标识,包括:
    所述第一终端设备根据所述第一对象的类型,从所述第一终端设备安装的所有应用中确定支持拖入所述第一对象的应用;
    所述第一终端设备将所述支持拖入所述第一对象的应用的应用标识,作为所述第一窗口中包括的应用标识。
  5. 根据权利要求1-4任一项所述的方法,其特征在于,所述方法还包括:
    所述第一终端设备接收将所述第一对象拖拽至第一应用标识的操作;所述第一应用标识为所述第一窗口中包括的应用标识中的一个;
    响应于所述将所述第一对象拖拽至第一应用标识的操作,当所述第一应用标识对应的应用为支持拖入所述第一对象的应用时,所述第一终端设备显示第二窗口,所述第二窗口包括所述第一应用标识对应的应用包括的一个或多个服务的服务标识;
    或者,当所述第一应用标识对应的应用为不支持拖入所述第一对象的应用时,所述第一终端设备显示第一提示信息,所述第一提示信息用于提示所述第一应用标识对应的应用不支持拖入所述第一对象。
  6. 根据权利要求5所述的方法,其特征在于,所述第二窗口中包括的服务标识与所述第一对象的类型相关。
  7. 根据权利要求6所述的方法,其特征在于,所述第一终端设备显示第二窗口之前,所述方法还包括:
    所述第一终端设备获取所述第一对象的类型;
    所述第一终端设备根据所述第一对象的类型、以及所述第一应用标识对应的应用包括的所有服务,确定所述第二窗口中包括的服务标识。
  8. 根据权利要求7所述的方法,其特征在于,所述第一终端设备根据所述第一对象的类型、以及所述第一应用标识对应的应用包括的所有服务,确定所述第二窗口中包括的服务标识,包括:
    所述第一终端设备根据所述第一对象的类型,从所述第一应用标识对应的应用包括的所有服务中确定支持拖入所述第一对象的服务;
    所述第一终端设备将所述支持拖入所述第一对象的服务的服务标识,作为所述第二窗口中包括的服务标识。
  9. 根据权利要求1-8任一项所述的方法,其特征在于,所述第一界面包括支持拖入的界面和不支持拖入的界面;所述第一终端设备显示第一窗口,包括:
    当所述第一界面为不支持拖入的界面时,所述第一终端设备显示第一窗口。
  10. 根据权利要求1-9任一项所述的方法,其特征在于,所述第一终端设备显示第一窗口,包括:
    当所述第一对象被拖拽至所述第一界面后,对所述第一对象的拖拽操作未结束、且指针在所述第一界面中的停留时间达到第一时长,和/或,对所述第一对象的拖拽操作未结束、且指针在所述第一界面中的滑动距离大于第一阈值时,所述第一终端设备显示第一窗口。
  11. 根据权利要求10所述的方法,其特征在于,所述第一界面为不支持拖入的界面;所述方法还包括:
    所述第一终端设备接收第三操作,所述第三操作为将第二对象由第二终端设备的显示界面拖拽至所述第一界面后,直接结束对所述第二对象进行拖拽的操作;
    响应于所述第三操作,所述第一终端设备保存所述第二对象,或者,打开默认应用、并将所述第二对象传递给所述默认应用。
  12. 根据权利要求1-11任一项所述的方法,其特征在于,所述方法还包括:
    所述第一终端设备接收将所述第一对象拖拽至第一应用标识的操作;所述第一应用标识为所述第一窗口中包括的应用标识中的一个;
    响应于所述将所述第一对象拖拽至第一应用标识的操作,当所述第一应用标识对应的应用为支持拖入所述第一对象的应用时,所述第一终端设备打开所述第一应用标识对应的应用,并将所述第一对象传递给所述第一应用标识对应的应用;
    或者,当所述第一应用标识对应的应用为不支持拖入所述第一对象的应用时,所述第一终端设备显示第一提示信息,所述第一提示信息用于提示所述第一应用标识对应的应用不支持拖入所述第一对象。
  13. 根据权利要求12所述的方法,其特征在于,所述第一终端设备显示第一提示信息,包括:所述第一终端设备通过改变所述第一应用标识的显示状态显示所述第一提示信息。
  14. 根据权利要求5-8任一项所述的方法,其特征在于,所述方法还包括:
    所述第一终端设备接收将所述第一对象拖拽至第一服务标识的操作;所述第一服务标识为所述第二窗口中包括的服务标识中的一个;
    响应于所述将所述第一对象拖拽至第一服务标识的操作,当所述第一服务标识对应的服务为支持拖入所述第一对象的服务时,所述第一终端设备打开所述第一服务标识对应的服务,并将所述第一对象传递给所述第一服务标识对应的服务;
    或者,当所述第一服务标识对应的服务为不支持拖入所述第一对象的服务时,所述第一终端设备显示第二提示信息,所述第二提示信息用于提示所述第一服务标识对应的服务不支持拖入所述第一对象。
  15. 根据权利要求1-14任一项所述的方法,其特征在于,所述方法还包括:
    所述第一终端设备接收第四操作,所述第四操作为将第二对象由第二终端设备的显示界面拖拽至所述第一界面中的第一区域的操作;所述第一区域为空白区域;
    响应于所述第四操作,所述第一终端设备保存所述第二对象。
  16. 根据权利要求1-15任一项所述的方法,其特征在于,所述第一窗口中包括第二区域或第一图标;所述方法还包括:
    所述第一终端设备接收将所述第一对象拖拽至所述第二区域或所述第一图标上的操作;
    响应于所述将所述第一对象拖拽至所述第二区域或所述第一图标上的操作,所述第一终端设备保存所述第一对象。
  17. 根据权利要求1-16任一项所述的方法,其特征在于,所述第一界面为系统桌面,所述第一界面中包括一个或多个应用标识;所述方法还包括:
    所述第一终端设备接收关闭所述第一窗口的操作;
    响应于所述关闭所述第一窗口的操作,所述第一终端设备关闭所述第一窗口;
    所述第一终端设备接收将所述第一对象拖拽至第二应用标识的操作;所述第二应用标识为所述系统桌面中包括的应用标识中的一个;
    响应于所述将所述第一对象拖拽至第二应用标识的操作,当所述第二应用标识对应的应用为支持拖入所述第一对象的应用时,所述第一终端设备打开所述第二应用标识对应的应用,并将所述第一对象传递给所述第二应用标识对应的应用;或者,所述第一终端设备显示第三窗口,所述第三窗口包括所述第二应用标识对应的应用包括的一个或多个服务的服务标识;
    或者,当所述第二应用标识对应的应用为不支持拖入所述第一对象的应用时,所述第一终端设备显示第一提示信息,所述第一提示信息用于提示所述第二应用标识对应的应用不支持拖入所述第一对象。
  18. 根据权利要求1-17任一项所述的方法,其特征在于,所述第一终端设备显示第一窗口,包括:
    所述第一终端设备以全屏显示的方式、或者非全屏显示的方式、又或者抽屉显示的方式显示所述第一窗口。
  19. 根据权利要求1-18任一项所述的方法,其特征在于,所述方法还包括:
    当所述第一终端设备检测到所述第一对象在所述第一终端设备上被拖拽的拖拽行为发生中断时,所述第一终端设备在所述第一界面中的第一位置显示所述第一对象对应的第一浮窗。
  20. 根据权利要求19所述的方法,其特征在于,所述方法还包括:
    所述第一终端设备接收对所述第一浮窗的拖拽操作;
    响应于所述对所述第一浮窗的拖拽操作,所述第一终端设备将所述第一浮窗改变为所述第一对象的拖拽阴影,供用户继续对所述第一对象进行拖拽。
  21. 一种跨设备拖拽方法,其特征在于,所述方法应用于第一终端设备;所述方法包括:
    所述第一终端设备显示第一界面;所述第一界面包括一个或多个应用的应用标识;
    所述第一终端设备接收将第一对象拖拽至第二应用标识的操作;所述第二应用标识为所述第一界面中包括的应用标识中的一个、且所述第二应用标识对应的应用为支持拖入所述第一对象的应用;所述第一对象来自第二终端设备的显示界面;
    响应于所述将第一对象拖拽至第二应用标识的操作,所述第一终端设备打开所述第二应用标识对应的应用,并将所述第一对象传递给所述第二应用标识对应的应用;或者,所述第一终端设备显示第三窗口,所述第三窗口包括所述第二应用标识对应的应用包括的一个或多个服务的服务标识。
  22. 根据权利要求21所述的方法,其特征在于,所述方法还包括:
    所述第一终端设备接收将第二对象拖拽至第三应用标识的操作;所述第三应用标识为所述第一界面中包括的应用标识中的一个、且所述第三应用标识对应的应用为不支持拖入所述第二对象的应用;所述第二对象来自第二终端设备的显示界面;
    响应于所述将第二对象拖拽至第三应用标识的操作,所述第一终端设备显示第一提示信息,所述第一提示信息用于提示所述第三应用标识对应的应用不支持拖入所述第一对象。
  23. 根据权利要求21或22所述的方法,其特征在于,所述应用标识包括应用图标或卡片。
  24. 根据权利要求23所述的方法,其特征在于,所述卡片包括一个或多个服务标识;所述方法还包括:
    所述第一终端设备接收将第一对象拖拽至第二服务标识的操作;所述第二服务标识为所述卡片包括的服务标识中的一个;
    响应于所述将所述第一对象拖拽至第二服务标识的操作,当所述第二服务标识对应的服务为支持拖入所述第一对象的服务时,所述第一终端设备打开所述第二服务标识对应的服务,并将所述第一对象传递给所述第二服务标识对应的服务;
    或者,当所述第二服务标识对应的服务为不支持拖入所述第一对象的服务时,所述第一终端设备显示第二提示信息,所述第二提示信息用于提示所述第二服务标识对应的服务不支持拖入所述第一对象。
  25. 根据权利要求21-24任一项所述的方法,其特征在于,所述第一界面包括第一文件夹;所述第一文件夹包括一个或多个应用的应用标识;所述第二应用标识为所述第一文件夹包括的应用标识中的一个;
    所述将第一对象拖拽至第二应用标识的操作,包括:打开所述第一文件夹,并将第一对象拖拽至所述第二应用标识的操作。
  26. 根据权利要求1-25任一项所述的方法,其特征在于,所述方法还包括:
    当所述第一对象被拖入所述第一界面时,所述第一终端设备在所述第一界面的屏幕边缘显示第一UI动效;所述第一UI动效用于提示用户所述第一对象被拖入所述第一界面。
  27. 根据权利要求26所述的方法,其特征在于,所述第一UI动效的显示区域与所述第一对象的拖拽阴影在所述第一界面中被拖入的位置相关。
  28. 根据权利要求26或27所述的方法,其特征在于,所述方法还包括:
    在所述第一对象被拖拽至所述第一对象的拖拽阴影远离所述屏幕边缘的过程中,所述第一终端设备将所述第一UI动效的显示区域中与所述第一对象的拖拽阴影的所在位置相关的区域进行高亮显示或颜色加强。
  29. 根据权利要求26-28任一项所述的方法,其特征在于,所述方法还包括:
    在所述第一对象被拖拽至所述第一对象的拖拽阴影远离所述屏幕边缘的过程中,所述第一终端设备显示所述第一对象的拖尾效果;
    所述拖尾效果的显示区域跟随所述第一对象的拖拽阴影而移动;在所述拖尾效果的显示区域跟随所述第一对象的拖拽阴影移动的过程中,所述拖尾效果逐渐变小或保持不变,且所述拖尾效果的显示亮度和/或颜色逐渐变淡;当所述第一对象的拖拽阴影移动超过预设的距离后,所述第一终端设备不再显示所述拖尾效果。
  30. 一种电子设备,其特征在于,包括:处理器,用于存储所述处理器可执行指令的存储器;
    所述处理器被配置为执行所述指令时,使得所述电子设备实现如权利要求1-29任一项所述的方法。
  31. 一种计算机可读存储介质,其上存储有计算机程序指令;其特征在于,当所述计算机程序指令被电子设备执行时,使得电子设备实现如权利要求1-29任一项所述的方法。
  32. 一种计算机程序产品,包括计算机可读代码,或者承载有计算机可读代码的非易失性计算机可读存储介质,其特征在于,当所述计算机可读代码在电子设备中运行时,所述电子设备中的处理器实现如权利要求1-29任一项所述的方法。
PCT/CN2022/120658 2021-10-18 2022-09-22 跨设备拖拽方法、电子设备及存储介质 WO2023065957A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22882568.3A EP4390645A1 (en) 2021-10-18 2022-09-22 Cross-device dragging method, and electronic device and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111210629.5A CN115993924A (zh) 2021-10-18 2021-10-18 跨设备拖拽方法、电子设备及存储介质
CN202111210629.5 2021-10-18

Publications (1)

Publication Number Publication Date
WO2023065957A1 true WO2023065957A1 (zh) 2023-04-27

Family

ID=85992730

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/120658 WO2023065957A1 (zh) 2021-10-18 2022-09-22 跨设备拖拽方法、电子设备及存储介质

Country Status (3)

Country Link
EP (1) EP4390645A1 (zh)
CN (1) CN115993924A (zh)
WO (1) WO2023065957A1 (zh)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104137048A (zh) * 2011-12-28 2014-11-05 诺基亚公司 提供应用的打开实例
US20150319202A1 (en) * 2010-09-09 2015-11-05 Opentv, Inc. Methods and systems for drag and drop content sharing in a multi-device environment
CN105556451A (zh) * 2013-06-07 2016-05-04 苹果公司 用于多个显示器的用户界面
CN105892851A (zh) * 2016-03-29 2016-08-24 北京金山安全软件有限公司 一种可视资源传输方法、装置及电子设备
CN107222936A (zh) * 2017-06-26 2017-09-29 广东欧珀移动通信有限公司 一种数据处理方法、装置及终端
CN107943439A (zh) * 2016-10-13 2018-04-20 阿里巴巴集团控股有限公司 界面移动方法、装置、智能终端、服务器和操作系统
CN108829323A (zh) * 2018-06-22 2018-11-16 联想(北京)有限公司 信息处理方法、装置及电子设备
CN112083867A (zh) * 2020-07-29 2020-12-15 华为技术有限公司 一种跨设备的对象拖拽方法及设备

Also Published As

Publication number Publication date
CN115993924A (zh) 2023-04-21
EP4390645A1 (en) 2024-06-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22882568

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2022882568

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2022882568

Country of ref document: EP

Effective date: 20240322

NENP Non-entry into the national phase

Ref country code: DE