WO2021083052A1 - Object sharing method and electronic device - Google Patents

Object sharing method and electronic device

Info

Publication number
WO2021083052A1
WO2021083052A1 (PCT/CN2020/123315)
Authority
WO
WIPO (PCT)
Prior art keywords
interface
identifiers
application
electronic device
input
Prior art date
Application number
PCT/CN2020/123315
Other languages
English (en)
Chinese (zh)
Inventor
张沛然
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2021083052A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/54Interprogram communication

Definitions

  • the embodiments of the present application relate to the field of communication technology, and in particular, to an object sharing method and electronic equipment.
  • For example, the process of sharing an image from application A to application B can be as follows: the user first triggers the electronic device to select the target image in application A and save it to the local album, then triggers the electronic device to switch application A to the background and open application B, and then triggers the electronic device to open the local album from within application B, find the target image, and share it.
  • the embodiments of the present application provide an object sharing method and electronic device to solve the problem of low operational efficiency of cross-application data transmission in related technologies.
  • an embodiment of the present application provides an object sharing method applied to an electronic device.
  • The method includes: receiving a user's drag input on a target object in a first interface; in response to the drag input, displaying N identifiers; and, in the case where the end position of the drag input and the display position of a first identifier meet a preset condition, sharing the target object to a first application; wherein each of the N identifiers is used to indicate an application program, the first identifier is used to indicate the first application program, and N is a positive integer.
  • An embodiment of the present application provides an electronic device. The electronic device includes: a receiving module, a display module, and a sharing module. The receiving module is configured to receive a user's drag input on a target object in a first interface; the display module is configured to display N identifiers in response to the drag input received by the receiving module; the sharing module is configured to share the target object to a first application when the end position of the drag input and the display position of a first identifier meet a preset condition; wherein each of the N identifiers is used to indicate an application program, the first identifier is used to indicate the first application program, and N is a positive integer.
  • an embodiment of the present application provides an electronic device, including a processor, a memory, and a computer program stored on the memory and capable of running on the processor.
  • When the computer program is executed by the processor, the steps of the object sharing method in the first aspect are implemented.
  • an embodiment of the present application provides a computer-readable storage medium storing a computer program on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the object sharing method in the first aspect are implemented.
  • In the embodiments of the present application, the electronic device may receive a user's drag input on a target object in a first interface; display N identifiers in response to the drag input; and, when the end position of the drag input and the display position of a first identifier meet a preset condition, share the target object to a first application; wherein each of the N identifiers is used to indicate an application, the first identifier is used to indicate the first application, and N is a positive integer.
  • the user can trigger the electronic device to share the target object from the first interface to the first application through a drag input, which simplifies the operation steps and can improve the operation efficiency of data transmission between applications.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the application
  • FIG. 2 is the first flowchart of the object sharing method provided by an embodiment of the application;
  • FIG. 3 is the first schematic interface diagram of the object sharing method provided by an embodiment of the application;
  • FIG. 5 is the second flowchart of the object sharing method provided by an embodiment of the application;
  • FIG. 6 is the third schematic interface diagram of the object sharing method provided by an embodiment of the application;
  • FIG. 7 is the third flowchart of the object sharing method provided by an embodiment of the application;
  • FIG. 8 is a schematic structural diagram of an electronic device provided by an embodiment of the application.
  • FIG. 9 is a schematic diagram of hardware of an electronic device provided by an embodiment of the application.
  • The terms “first”, “second”, “third”, and “fourth” in the specification and claims of this application are used to distinguish different objects, rather than to describe a specific order of the objects.
  • For example, the first interface, the second interface, the third interface, and the fourth interface are used to distinguish different interfaces, rather than to describe a specific order of the interfaces.
  • In the embodiments of this application, words such as “exemplary” or “for example” are used to present examples, illustrations, or descriptions. Any embodiment or design solution described as “exemplary” or “for example” in the embodiments of the present application should not be construed as being more preferable or advantageous than other embodiments or design solutions. To be precise, words such as “exemplary” or “for example” are used to present related concepts in a specific manner.
  • In the embodiments of this application, “multiple” refers to two or more than two; for example, multiple processing units refers to two or more processing units, multiple elements refers to two or more elements, and so on.
  • the embodiment of the present application provides an object sharing method.
  • In this method, the electronic device can receive a user's drag input on a target object in a first interface; display N identifiers in response to the drag input; and, if the end position of the drag input and the display position of a first identifier meet a preset condition, share the target object to a first application; wherein each of the N identifiers is used to indicate an application, the first identifier is used to indicate the first application, and N is a positive integer.
  • the user can trigger the electronic device to share the target object from the first interface to the first application through a drag input, which simplifies the operation steps and can improve the operation efficiency of data transmission between applications.
  • the following takes the Android operating system as an example to introduce the software environment to which the object sharing method provided in the embodiments of the present application is applied.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of this application.
  • the architecture of the Android operating system includes 4 layers, namely: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
  • the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
  • the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
  • the system runtime library layer includes a library (also called a system library) and an Android operating system runtime environment.
  • the library mainly provides various resources needed by the Android operating system.
  • the Android operating system operating environment is used to provide a software environment for the Android operating system.
  • the kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software.
  • the kernel layer is based on the Linux kernel to provide core system services and hardware-related drivers for the Android operating system.
  • Taking the Android operating system as an example, a developer can develop, based on the system architecture of the Android operating system shown in FIG. 1, a software program that implements the object sharing method provided by the embodiments of this application, so that the object sharing method can run on the Android operating system shown in FIG. 1. That is, the processor or the electronic device can implement the object sharing method provided in the embodiments of the present application by running the software program in the Android operating system.
  • the electronic device in the embodiment of the present application may be a mobile electronic device or a non-mobile electronic device.
  • Mobile electronic devices can be mobile phones, tablet computers, notebook computers, handheld computers, vehicle terminals, wearable devices, ultra-mobile personal computers (UMPC), netbooks, or personal digital assistants (personal digital assistants, PDAs), etc.
  • the non-mobile electronic device may be a personal computer (PC), a television (television, TV), a teller machine or a self-service machine, etc.; the embodiment of the application does not specifically limit it.
  • the execution subject of the object sharing method provided by the embodiments of the present application can be the aforementioned electronic device (including mobile electronic devices and non-mobile electronic devices), or can be a functional module and/or functional entity in the electronic device that can implement the method.
  • the details can be determined according to actual use requirements, and the embodiments of the present application do not limit it.
  • the following takes an electronic device as an example to illustrate the object sharing method provided in the embodiment of the present application.
  • an embodiment of the present application provides an object sharing method, which may include the following steps 201 to 203.
  • Step 201 The electronic device receives the user's drag input of the target object in the first interface.
  • The first interface may be a chat interface, a circle-of-friends interface, etc. in an instant social application;
  • the first interface may also be a search interface, a search result interface, etc. in a search application;
  • the first interface may also be the shopping interface in a shopping application, or the e-book interface in an e-book application, etc.; the first interface may also be any other feasible interface, which may be determined according to actual use requirements and is not limited in this embodiment.
  • the target object may be text, image, video file, audio file, etc., which may be specifically determined according to actual use requirements, and the embodiment of the present application does not limit it.
  • The input parameters of the drag input include at least one of the following: the movement direction of the drag input, the movement trajectory of the drag input, and the end position of the drag input; the input parameters of the drag input may also include other feasible parameters, which are not limited in the embodiments of this application.
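  • As an illustration only (the patent does not specify any data structure or API), the following Kotlin sketch shows one way such input parameters could be tracked on Android; the class and field names are assumptions.

```kotlin
import android.graphics.PointF
import kotlin.math.abs

// Hypothetical container for the drag-input parameters named above:
// trajectory, end position, and a derived movement direction.
data class DragParams(val trajectory: MutableList<PointF> = mutableListOf()) {

    // End position of the drag input: the last sampled trajectory point.
    val endPosition: PointF? get() = trajectory.lastOrNull()

    // Primary movement direction, derived from the first and last sampled points.
    fun primaryDirection(): String? {
        val start = trajectory.firstOrNull() ?: return null
        val end = endPosition ?: return null
        val dx = end.x - start.x
        val dy = end.y - start.y
        return if (abs(dx) >= abs(dy)) {
            if (dx >= 0f) "RIGHT" else "LEFT"
        } else {
            if (dy >= 0f) "DOWN" else "UP"
        }
    }
}
```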
  • Step 202 In response to the drag input, the electronic device displays N identifiers.
  • Step 203 When the end position of the drag input and the display position of the first identifier meet a preset condition, the electronic device shares the target object to the first application.
  • Each of the N identifiers is used to indicate an application program, and the first identifier is used to indicate a first application program; where N is a positive integer.
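  • As a minimal sketch of steps 201 and 203 (not the patent's own implementation), the following Kotlin code uses the standard Android drag-and-drop listener and an ACTION_SEND intent; the choice of these particular APIs, the package name, and the image URI are assumptions made for illustration. Displaying the N identifiers (step 202) would be handled by the view hosting the first interface.

```kotlin
import android.content.Intent
import android.net.Uri
import android.view.DragEvent
import android.view.View

// Attached to the view showing one of the N identifiers (e.g. an app icon).
class ShareDropListener(
    private val targetPackage: String,   // package name of the "first application" (assumed)
    private val sharedImageUri: Uri      // the "target object" being dragged (assumed)
) : View.OnDragListener {

    override fun onDrag(iconView: View, event: DragEvent): Boolean = when (event.action) {
        DragEvent.ACTION_DRAG_STARTED -> true      // step 201: the drag input has been received
        DragEvent.ACTION_DROP -> {                 // step 203: the drag ended on this identifier
            val share = Intent(Intent.ACTION_SEND).apply {
                type = "image/*"
                putExtra(Intent.EXTRA_STREAM, sharedImageUri)
                setPackage(targetPackage)          // route the share to the first application
                addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
            }
            iconView.context.startActivity(share)
            true
        }
        else -> false
    }
}
```

  • In a real implementation the drop coordinates from the DragEvent could also feed the "preset condition" distance check discussed later.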
  • Each of the N identifiers may be the application icon of the indicated application, the name of the indicated application, an interface thumbnail of the indicated application, or any other identifier that can be used to indicate an application program, which is not limited in the embodiment of the present application.
  • The N applications indicated by the N identifiers may be the applications currently running in the background (and may also include the application currently running in the foreground, that is, the application corresponding to the first interface, hereinafter referred to as the second application; they may also include an application displayed in split screen with the second application, or an application displayed on a different screen of the electronic device (a multi-screen electronic device) from the second application, which is not limited in the embodiment of the application); the N applications may also be N preset applications, or any N applications, which may be specifically determined according to actual use requirements and is not limited in the embodiment of the present application.
  • In this way, the number of the N identifiers can be reduced, which makes it easier for the user to find the first identifier, improves the speed of the user's cross-application data transmission, and thereby enhances the user experience.
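  • One way to collect candidate applications for the N identifiers (an assumption for illustration, not something prescribed by the patent) is to ask the package manager which installed applications can receive the target object's content type:

```kotlin
import android.content.Context
import android.content.Intent
import android.graphics.drawable.Drawable

// Returns label/icon pairs for applications able to receive content of the given MIME type.
// These pairs could back the N identifiers (icons or names) that are displayed.
fun candidateAppIdentifiers(context: Context, mimeType: String): List<Pair<CharSequence, Drawable>> {
    val pm = context.packageManager
    val sendIntent = Intent(Intent.ACTION_SEND).setType(mimeType)
    return pm.queryIntentActivities(sendIntent, 0).map { resolveInfo ->
        resolveInfo.loadLabel(pm) to resolveInfo.loadIcon(pm)
    }
}
```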
  • The N identifiers can be superimposed and displayed on the first interface, or displayed on the desktop of the electronic device (in this case, the N identifiers may be application icons on the desktop, or identifiers other than the application icons on the desktop), which may be specifically determined according to actual usage requirements and is not limited in the embodiment of the present application.
  • step 202 may be specifically implemented by the following step 202a or step 202b.
  • Step 202a In response to the drag input, the electronic device displays the N identifiers on the first interface according to a preset display effect when the input parameters of the drag input meet the first condition.
  • The first condition may include at least one of the following: the movement direction of the drag input includes a first direction (that is, part of the movement of the drag input is in the first direction), and the movement trajectory of the drag input passes through a first area of the screen of the electronic device.
  • the first condition may also include other feasibility conditions, which may be specifically determined according to actual use requirements, which are not limited in the embodiment of the present application.
  • The preset display effect may be that the size of each identifier gradually changes from a first threshold to a second threshold; the first threshold may be smaller than the second threshold (the identifier grows from small to large), or the first threshold may be greater than the second threshold (the identifier shrinks from large to small). The values of the first threshold and the second threshold are determined according to actual use requirements and are not limited in the embodiment of the present application.
  • The preset display effect may also be that each identifier sequentially moves from a first side of the electronic device to a second side of the electronic device, where the first side and the second side are two opposite sides of the electronic device. For example, each identifier rises in sequence from the bottom of the electronic device toward the top of the electronic device.
  • the preset display effect can also be other feasible display effects, which can be specifically determined according to actual use requirements, and the embodiment of the present application does not limit it.
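  • The following Kotlin sketch illustrates the two example effects described above (growing from a small size to full size, and rising in from the bottom of the screen); the scale values, delay, and duration are illustrative assumptions only.

```kotlin
import android.view.View

// Animates one identifier view in with a "small to large" scale effect combined with
// a rise from below the visible area toward its final position.
fun animateIdentifierIn(iconView: View, screenHeightPx: Float, startDelayMs: Long = 0L) {
    iconView.scaleX = 0.3f                    // first threshold: small
    iconView.scaleY = 0.3f
    iconView.translationY = screenHeightPx    // start below the bottom edge
    iconView.animate()
        .scaleX(1.0f)                         // second threshold: full size
        .scaleY(1.0f)
        .translationY(0f)                     // rise toward the top
        .setStartDelay(startDelayMs)          // stagger so identifiers appear in sequence
        .setDuration(250L)
        .start()
}
```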
  • Step 202b In response to the drag input, the electronic device updates the first interface to the desktop of the electronic device when the input parameter of the drag input meets the second condition.
  • the N identifiers are application icons in the desktop.
  • The second condition may include at least one of the following: the movement direction of the drag input includes a second direction (that is, part of the movement of the drag input is in the second direction), and the movement trajectory of the drag input passes through a second area of the screen. The second condition may also include other feasible conditions, which may be specifically determined according to actual usage requirements and are not limited in the embodiment of the present application.
  • the second condition and the first condition may be the same or different, which is not limited in the embodiment of the present application.
  • In order that the electronic device can determine different N identifiers according to the input parameters of the drag input, the first condition and the second condition are different.
  • The end position of the drag input and the display position of the first identifier satisfying the preset condition may mean that the end position of the drag input coincides with the display position of the first identifier, or that the distance between the end position of the drag input and the display position of the first identifier is less than a first threshold (the first threshold may be determined according to actual usage requirements and is not limited in the embodiment of the present application); it may also be another feasible condition, which is not limited in the embodiment of the present application.
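  • A simple sketch of such a check is shown below; the bounds-or-distance test and the pixel threshold are assumptions used for illustration.

```kotlin
import android.graphics.PointF
import android.graphics.RectF
import kotlin.math.hypot

// Returns true when the drop point lands inside the identifier's bounds, or lies
// within thresholdPx of the identifier's centre (the "preset condition" above).
fun meetsPresetCondition(dropPoint: PointF, identifierBounds: RectF, thresholdPx: Float): Boolean {
    if (identifierBounds.contains(dropPoint.x, dropPoint.y)) return true
    val dx = dropPoint.x - identifierBounds.centerX()
    val dy = dropPoint.y - identifierBounds.centerY()
    return hypot(dx, dy) < thresholdPx
}
```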
  • the target object is displayed on the second interface of the first application.
  • the first interface and the second interface may be different interfaces in the same application program, or may be different interfaces in different application programs, which is not limited in the embodiment of the present application.
  • The first interface and the second interface can be superimposed and displayed on the same screen, displayed in split screen on the same screen, or displayed separately on different screens, which is specifically determined according to actual usage requirements and is not limited here.
  • the electronic device updates the first interface to the second interface, and the second interface includes the target object.
  • the electronic device displays the first interface and the second interface on separate screens, and the second interface includes the target object.
  • Optionally, the first interface is displayed on a first screen of the electronic device, and when the end position of the drag input and the display position of the first identifier meet the preset condition, the electronic device displays the second interface on a second screen of the electronic device, where the second interface includes the target object.
  • Optionally, the electronic device can control the N identifiers to move to the left in turn so that the user can view the N identifiers; after the user finds the required identifier among the N identifiers, the user can move the image to the left for a certain distance by dragging it, to trigger the electronic device to control the N identifiers to stop moving.
  • For example, when the first interface is displayed on the first screen of the electronic device and the second interface is displayed on the second screen, the user can perform a drag input that drags the target object in the first interface from the first interface to the second interface, to trigger the electronic device to display the target object in the second interface.
  • Similarly, when the first interface and the second interface are displayed in split screen on the first screen of the electronic device, the user can perform a drag input that drags the target object in the first interface from the first interface to the second interface, to trigger the electronic device to display the target object in the second interface.
  • the embodiment of the present application provides an object sharing method.
  • Through this method, the electronic device can receive a user's drag input on a target object in a first interface; display N identifiers in response to the drag input; and, when the end position of the drag input and the display position of a first identifier meet the preset condition, share the target object to a first application; wherein each of the N identifiers is used to indicate an application, the first identifier is used to indicate the first application, and N is a positive integer.
  • In this way, the user can trigger the electronic device to share the target object from the first interface to the first application through a drag input, which simplifies the operation steps and can improve the operational efficiency of cross-application data transmission (that is, when the application to which the first interface belongs and the first application are not the same application).
  • Optionally, the electronic device may first display a batch of identifiers (each identifier indicates an application program). If the required identifier is not among them, the user can use the drag input to trigger the electronic device to switch to another batch of identifiers, so that the user can find the required first identifier.
  • The object sharing method provided in the embodiment of the present application may further include the following step 204, and the above step 202 can be specifically implemented by the following step 202c.
  • Step 204 In response to the drag input, the electronic device displays M identifiers when the input parameter of the drag input satisfies the third condition.
  • Each of the M identifiers is used to indicate an application program, where M is a positive integer.
  • For the description of the input parameter of the drag input satisfying the third condition, reference may be made to the related description of the input parameter of the drag input satisfying the first condition in step 202a, which will not be repeated here.
  • the third condition and the first condition may be the same or different, which is not limited in the embodiment of the present application.
  • Step 202c The electronic device updates the M identifiers to the N identifiers when the input parameter of the drag input meets the fourth condition.
  • For the description of the input parameter of the drag input satisfying the fourth condition, reference may be made to the related description of the input parameter of the drag input satisfying the second condition in step 202b, which will not be repeated here.
  • the fourth condition and the second condition may be the same or different, which is not limited in the embodiment of the present application.
  • In step 204 and step 202c, the third condition and the fourth condition are different.
  • For example, when the trajectory of the drag input passes through the first area, the electronic device first displays M identifiers. If the user does not find the required first identifier among the M identifiers, the user can continue to drag the target object; when the fourth condition is met, the electronic device updates the M identifiers to the N identifiers, the user finds the first identifier among the N identifiers, and continues to drag the target object to the first identifier.
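  • The following sketch shows one possible way to switch from the first batch of M identifiers to the N identifiers as the drag continues; the screen region used for the fourth condition and the callback-based rendering are assumptions, not details from the patent.

```kotlin
import android.graphics.PointF
import android.graphics.RectF

// Tracks the ongoing drag and swaps identifier batches when the trajectory enters
// the (assumed) region associated with the fourth condition.
class IdentifierBatchSwitcher(
    private val mIdentifiers: List<String>,        // first batch (M identifiers)
    private val nIdentifiers: List<String>,        // second batch (N identifiers)
    private val fourthConditionRegion: RectF,      // assumed screen area for the fourth condition
    private val render: (List<String>) -> Unit     // callback that displays a batch of identifiers
) {
    private var showingN = false

    // Step 204: the third condition has been met, so show the M identifiers first.
    fun onThirdConditionMet() {
        showingN = false
        render(mIdentifiers)
    }

    // Called for each sampled point of the drag trajectory (step 202c).
    fun onDragMoved(point: PointF) {
        if (!showingN && fourthConditionRegion.contains(point.x, point.y)) {
            showingN = true
            render(nIdentifiers)                   // update the M identifiers to the N identifiers
        }
    }
}
```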
  • the M identifiers and the N identifiers may be completely different, or may be partly the same or partly different, which is not limited in the embodiment of the present application.
  • The display format of the M identifiers and the display format of the N identifiers may be the same or different, which is not limited in the embodiment of the present application.
  • For example, the M identifiers are superimposed and displayed on the first interface, and the N identifiers are also superimposed and displayed on the first interface.
  • the M identifiers are superimposed and displayed on the first interface, and each of the M identifiers is an identifier of an application running in the background; the N identifiers are displayed on the desktop of the electronic device, and the The N identifiers are application icons on the desktop.
  • Optionally, the above-mentioned displaying of the M identifiers is: superimposing and displaying the M identifiers on the first interface, where each of the M identifiers is an identifier of an application running in the background; and the above-mentioned updating of the M identifiers to the N identifiers is: updating the first interface to the desktop of the electronic device, where the N identifiers are application icons on the desktop.
  • When the N identifiers are displayed on the desktop of the electronic device, since the user is more familiar with the application icons on the desktop, it is easier to find the first identifier, which can improve the speed of the user's cross-application data transmission and improve the user experience.
  • For example, the desktop includes N identifiers, and the N identifiers include app A. As shown in (a) of FIG. 6, the user continues to drag the image and ends the drag input at the position of app A; then, as shown in (a) of FIG. 4, the electronic device updates the desktop to the second interface and displays the image in the second interface.
  • The first application program may include multiple interfaces that can be used to display the target object. Therefore, before displaying the second interface including the target object, the electronic device first displays at least one interface identifier (each interface identifier is used to indicate an interface) for the user to select the interface identifier indicating the second interface.
  • step 203 can be specifically implemented by the following steps 203a to 203c.
  • Step 203a The electronic device displays at least one interface identifier when the end position of the drag input and the display position of the first identifier meet a preset condition.
  • Each interface identifier is used to indicate a different interface in the first application.
  • Each interface identifier can be an interface name, an interface thumbnail, or another identifier that can indicate an interface, which can be specifically determined according to actual usage requirements and is not limited in the embodiment of the present application.
  • Step 203b The electronic device receives the user's target input of the target interface identifier in the at least one interface identifier.
  • the target interface identifier is used to indicate the second interface.
  • The target input can be the user's click input on the target interface identifier, the user's sliding input on the target interface identifier, or the user's multi-finger input on the target interface identifier; the target input may also be any other feasible input by the user on the target interface identifier, which may be specifically determined according to actual use requirements and is not limited in the embodiment of this application.
  • The above-mentioned click input may be a single-click input, a double-click input, a triple-click input, or a click input of any other number of clicks; the above-mentioned sliding input may be an upward sliding input, a downward sliding input, a leftward sliding input, a rightward sliding input, or a sliding input in any other direction; the above-mentioned multi-finger input may be a two-finger or three-finger long-press input, a two-finger or three-finger sliding input, a two-finger or three-finger zoom input, or any other multi-finger input; these may be determined according to actual usage requirements and are not limited in the embodiment of the present application.
  • Step 203c In response to the target input, the electronic device displays the target object on the second interface.
  • For example, the electronic device displays at least one interface identifier ("interface 1", "interface 2", "interface 3", "interface 4", "interface 5", etc.), and the user selects the target interface identifier "interface 2" (corresponding to the second interface); then, as shown in (a) of FIG. 4, the electronic device updates the first interface to the second interface and displays the image on the second interface.
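  • As an illustration of steps 203a to 203c (not the patent's own implementation), the sketch below offers the interface identifiers in a simple chooser and, on selection, opens the chosen interface of the first application; the mapping from interface names to activity components is a hypothetical assumption.

```kotlin
import android.app.AlertDialog
import android.content.ComponentName
import android.content.Context
import android.content.Intent
import android.net.Uri

// Shows interface identifiers (step 203a), receives the target input (step 203b), and
// displays the target object in the chosen second interface (step 203c).
fun chooseTargetInterface(
    context: Context,
    imageUri: Uri,
    interfaces: Map<String, ComponentName>   // e.g. "interface 2" -> a specific activity (assumed)
) {
    val names = interfaces.keys.toTypedArray()
    AlertDialog.Builder(context)
        .setTitle("Share to")
        .setItems(names) { _, which ->
            val intent = Intent(Intent.ACTION_SEND).apply {
                type = "image/*"
                putExtra(Intent.EXTRA_STREAM, imageUri)
                component = interfaces[names[which]]          // open the selected interface
                addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
            }
            context.startActivity(intent)
        }
        .show()
}
```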
  • Each of the drawings in the embodiments of the present application is illustrated as an example in combination with the drawings of an independent embodiment. In specific implementation, each of the drawings may also be implemented in combination with any other drawings that can be combined, and the embodiments of the present application are not limited thereto.
  • the object sharing method provided in the embodiment of the present application may further include the above-mentioned step 204, and the above-mentioned step 202 may be specifically implemented by the above-mentioned step 202c.
  • an embodiment of the present application provides an electronic device 120.
  • The electronic device 120 includes: a receiving module 121, a display module 122, and a sharing module 123; the receiving module 121 is configured to receive a user's drag input on a target object in a first interface;
  • the display module 122 is configured to display N identifiers in response to the drag input received by the receiving module 121;
  • the sharing module 123 is configured to share the target object to the first application when the end position of the drag input and the display position of the first identifier meet the preset condition; wherein each of the N identifiers is used to indicate an application, the first identifier is used to indicate the first application, and N is a positive integer.
  • Optionally, the display module 122 is specifically configured to display the N identifiers on the first interface according to a preset display effect when the input parameters of the drag input meet the first condition; or, when the input parameters of the drag input meet the second condition, to update the first interface to the desktop of the electronic device, where the N identifiers are application program icons on the desktop.
  • Optionally, the display module 122 is further configured to display M identifiers before displaying the N identifiers, when the input parameter of the drag input satisfies the third condition, where each of the M identifiers is used to indicate an application; the display module 122 is specifically configured to update the M identifiers to the N identifiers when the input parameter of the drag input meets the fourth condition; where M is a positive integer.
  • Optionally, the display module 122 is specifically configured to superimpose and display the M identifiers on the first interface, where each of the M identifiers is an identifier of an application running in the background; and the display module 122 is specifically configured to update the first interface to the desktop of the electronic device, where the N identifiers are application icons on the desktop.
  • Optionally, the sharing module 123 is specifically configured to: display at least one interface identifier, where each interface identifier is used to indicate a different interface in the first application program; receive a user's target input on a target interface identifier among the at least one interface identifier, where the target interface identifier is used to indicate the second interface; and, in response to the target input, display the target object in the second interface.
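  • Purely as a structural sketch (the interfaces below are illustrative assumptions, not an API defined by the patent), the module split of the electronic device 120 could be expressed as:

```kotlin
import android.net.Uri
import android.view.DragEvent

// Mirrors the receiving / display / sharing module decomposition described above.
interface ReceivingModule { fun onDragInput(event: DragEvent) }

interface DisplayModule { fun showIdentifiers(identifiers: List<CharSequence>) }

interface SharingModule { fun shareTo(targetPackage: String, targetObject: Uri) }

// The "electronic device 120" simply composes the three modules.
class ElectronicDevice120(
    val receivingModule: ReceivingModule,
    val displayModule: DisplayModule,
    val sharingModule: SharingModule
)
```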
  • the electronic device provided in the embodiment of the present application can implement each process shown in any one of FIG. 2 to FIG. 7 in the foregoing method embodiment, and in order to avoid repetition, details are not described herein again.
  • the embodiment of the present application provides an electronic device, which can receive a drag input of a user on a target object in a first interface; in response to the drag input, display N identifiers; at the end position of the drag input If the display position of the first identifier meets the preset condition, the target object is shared to the first application; wherein, each of the N identifiers is used to indicate an application, and the first identifier is used to indicate For the first application, N is a positive integer.
  • the user can trigger the electronic device to share the target object from the first interface to the first application through a drag input, which simplifies the operation steps and can improve the operation efficiency of data transmission between applications.
  • FIG. 9 is a schematic diagram of the hardware structure of an electronic device that implements each embodiment of the present application.
  • The electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other components.
  • Those skilled in the art can understand that the structure of the electronic device shown in FIG. 9 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than those shown in the figure, combine certain components, or use a different component layout.
  • electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, in-vehicle electronic devices, wearable devices, and pedometers.
  • the user input unit 107 is configured to receive a drag input of the user on the target object in the first interface; the display unit 106 is configured to display N identifiers in response to the drag input; and the processor 110 is configured to When the end position of the drag input and the display position of the first mark meet the preset conditions, the target object is shared to the first application, where each of the N marks is used to indicate an application, The first identifier is used to indicate the first application, and N is a positive integer.
  • In this way, the electronic device can receive a user's drag input on a target object in a first interface; display N identifiers in response to the drag input; and, when the end position of the drag input and the display position of a first identifier satisfy the preset condition, share the target object to a first application; wherein each of the N identifiers is used to indicate an application, the first identifier is used to indicate the first application, and N is a positive integer.
  • the user can trigger the electronic device to share the target object from the first interface to the first application through a drag input, which simplifies the operation steps and can improve the operation efficiency of data transmission between applications.
  • The radio frequency unit 101 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, downlink data from the base station is received and then processed by the processor 110; in addition, uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the electronic device provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic device 100 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used to receive audio or video signals.
  • The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042, and the graphics processor 1041 is configured to process the image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 106.
  • the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 for output in the case of a telephone call mode.
  • the electronic device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
  • The proximity sensor can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear.
  • The accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used for identifying the posture of the electronic device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration), vibration-recognition-related functions (such as pedometer and tapping), and the like; the sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 107 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the electronic device.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • The touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, the user's operations on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts: a touch detection device and a touch controller.
  • The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 110; it also receives and executes commands sent by the processor 110.
  • the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 1071 can be overlaid on the display panel 1061.
  • After the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event.
  • Although the touch panel 1071 and the display panel 1061 are used as two independent components to implement the input and output functions of the electronic device, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to implement the input and output functions of the electronic device, which is not specifically limited here.
  • the interface unit 108 is an interface for connecting an external device and the electronic device 100.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • The interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the electronic device 100, or can be used to transfer data between the electronic device 100 and the external device.
  • the memory 109 can be used to store software programs and various data.
  • the memory 109 may mainly include a program storage area and a data storage area.
  • The program storage area may store an operating system, an application program required by at least one function (such as a sound playback function, an image playback function, etc.), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.), and the like.
  • The memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • The processor 110 is the control center of the electronic device; it uses various interfaces and lines to connect the various parts of the entire electronic device, and, by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, performs various functions of the electronic device and processes data, so as to monitor the electronic device as a whole.
  • The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may also not be integrated into the processor 110.
  • The electronic device 100 may also include a power source 111 (such as a battery) for supplying power to various components. Optionally, the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as charging, discharging, and power consumption management through the power management system.
  • the electronic device 100 includes some functional modules not shown, which will not be repeated here.
  • Optionally, an embodiment of the present application further provides an electronic device, which may include the aforementioned processor 110 shown in FIG. 9, a memory 109, and a computer program stored in the memory 109 and capable of running on the processor 110.
  • When the computer program is executed by the processor 110, each process of the object sharing method shown in any one of FIG. 2 to FIG. 7 in the foregoing method embodiment is implemented, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • The embodiment of the present application also provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the object sharing method shown in any one of FIG. 2 to FIG. 7 is implemented, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the computer-readable storage medium such as read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk, or optical disk, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed in embodiments of the present application are an object sharing method and an electronic device. The method comprises: receiving a drag input by a user on a target object in a first interface; displaying N identifiers in response to the drag input; and sharing the target object with a first application program when an end position of the drag input and a display position of a first identifier satisfy a preset condition, wherein each of the N identifiers is used to indicate an application program, the first identifier is used to indicate the first application program, and N is a positive integer.
PCT/CN2020/123315 2019-10-28 2020-10-23 Procédé de partage d'objet et dispositif électronique WO2021083052A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911033618.7 2019-10-28
CN201911033618.7A CN110851051B (zh) 2019-10-28 2019-10-28 一种对象分享方法及电子设备

Publications (1)

Publication Number Publication Date
WO2021083052A1 true WO2021083052A1 (fr) 2021-05-06

Family

ID=69598090

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/123315 WO2021083052A1 (fr) 2019-10-28 2020-10-23 Procédé de partage d'objet et dispositif électronique

Country Status (2)

Country Link
CN (1) CN110851051B (fr)
WO (1) WO2021083052A1 (fr)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110851051B (zh) * 2019-10-28 2021-06-08 维沃移动通信有限公司 一种对象分享方法及电子设备
CN111600931A (zh) * 2020-04-13 2020-08-28 维沃移动通信有限公司 一种信息分享方法及电子设备
CN111522674B (zh) * 2020-04-30 2024-05-07 维沃移动通信(杭州)有限公司 多媒体内容的跨应用处理方法及电子设备
CN111881647A (zh) * 2020-06-09 2020-11-03 维沃移动通信有限公司 标识显示方法、装置及电子设备
CN111796733B (zh) * 2020-06-28 2022-05-17 维沃移动通信(杭州)有限公司 图像显示方法、图像显示装置和电子设备
CN112269523B (zh) * 2020-10-28 2023-05-26 维沃移动通信有限公司 对象编辑处理方法、装置及电子设备
CN113055525A (zh) * 2021-03-30 2021-06-29 维沃移动通信有限公司 文件分享方法、装置、设备和存储介质
CN113452744B (zh) * 2021-03-30 2023-06-09 维沃移动通信有限公司 文件分享方法、装置、设备和存储介质
CN113360879B (zh) * 2021-05-27 2023-09-22 维沃移动通信(杭州)有限公司 显示控制方法、装置、电子设备及介质
CN115407909A (zh) * 2021-05-27 2022-11-29 Oppo广东移动通信有限公司 内容分享方法、装置、终端及存储介质
CN114001748B (zh) * 2021-10-28 2024-03-22 维沃移动通信有限公司 导航路线显示方法、装置、设备及介质
CN114338897B (zh) * 2021-12-16 2024-01-16 杭州逗酷软件科技有限公司 对象的分享方法、装置、电子设备以及存储介质
CN114327189B (zh) * 2022-03-07 2022-09-30 深圳传音控股股份有限公司 操作方法、智能终端及存储介质
WO2023169236A1 (fr) * 2022-03-07 2023-09-14 深圳传音控股股份有限公司 Procédé d'opération, terminal intelligent et support de stockage
CN115202555A (zh) * 2022-06-23 2022-10-18 维沃移动通信有限公司 信息处理方法、装置
CN115309309A (zh) * 2022-08-17 2022-11-08 维沃移动通信有限公司 内容分享方法、装置、电子设备及介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9075583B1 (en) * 2013-03-15 2015-07-07 Emc Corporation Layout design for a mobile application using selected governance, risk management and compliance rules
CN106527882A (zh) * 2016-09-29 2017-03-22 北京小米移动软件有限公司 一种内容分享的方法、装置及终端
CN107247746A (zh) * 2017-05-23 2017-10-13 努比亚技术有限公司 一种数据分享方法及终端
CN108762954A (zh) * 2018-05-29 2018-11-06 维沃移动通信有限公司 一种对象分享方法及移动终端
CN110851051A (zh) * 2019-10-28 2020-02-28 维沃移动通信有限公司 一种对象分享方法及电子设备

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102087662A (zh) * 2011-01-24 2011-06-08 深圳市同洲电子股份有限公司 一种信息的搜索方法及搜索装置
CN102508707A (zh) * 2011-11-21 2012-06-20 宇龙计算机通信科技(深圳)有限公司 信息编辑方法和终端
US9116596B2 (en) * 2012-06-10 2015-08-25 Apple Inc. Sharing images and comments across different devices
KR20140007163A (ko) * 2012-07-09 2014-01-17 삼성전자주식회사 모바일 장치에 클립보드 기능을 제공하는 방법 및 장치
US10599250B2 (en) * 2013-05-06 2020-03-24 Qeexo, Co. Using finger touch types to interact with electronic devices
CN105955607B (zh) * 2016-04-22 2020-06-19 北京小米移动软件有限公司 内容分享方法和装置
CN106603823A (zh) * 2016-11-28 2017-04-26 努比亚技术有限公司 一种内容分享方法、装置及终端
CN107977152A (zh) * 2017-11-30 2018-05-01 努比亚技术有限公司 一种基于双屏移动终端的图片分享方法、终端和存储介质
CN108958580B (zh) * 2018-06-28 2021-07-23 维沃移动通信有限公司 一种显示控制方法及终端设备
CN109471742A (zh) * 2018-11-07 2019-03-15 Oppo广东移动通信有限公司 信息处理方法、装置、电子设备及可读存储介质
CN109582475A (zh) * 2018-11-27 2019-04-05 维沃移动通信有限公司 一种分享方法及终端
CN109683761B (zh) * 2018-12-17 2021-07-23 北京小米移动软件有限公司 内容收藏方法、装置及存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9075583B1 (en) * 2013-03-15 2015-07-07 Emc Corporation Layout design for a mobile application using selected governance, risk management and compliance rules
CN106527882A (zh) * 2016-09-29 2017-03-22 北京小米移动软件有限公司 一种内容分享的方法、装置及终端
CN107247746A (zh) * 2017-05-23 2017-10-13 努比亚技术有限公司 一种数据分享方法及终端
CN108762954A (zh) * 2018-05-29 2018-11-06 维沃移动通信有限公司 一种对象分享方法及移动终端
CN110851051A (zh) * 2019-10-28 2020-02-28 维沃移动通信有限公司 一种对象分享方法及电子设备

Also Published As

Publication number Publication date
CN110851051A (zh) 2020-02-28
CN110851051B (zh) 2021-06-08

Similar Documents

Publication Publication Date Title
WO2021083052A1 (fr) Procédé de partage d'objet et dispositif électronique
WO2021115329A1 (fr) Procédé de commande d'application, et dispositif électronique
WO2021104365A1 (fr) Procédé de partage d'objets et dispositif électronique
WO2021197263A1 (fr) Procédé de partage de contenu et dispositif électronique
WO2020258929A1 (fr) Procédé de commutation d'interface de dossier et dispositif terminal
WO2021218902A1 (fr) Procédé et appareil de commande d'affichage et dispositif électronique
WO2021083132A1 (fr) Procédé de déplacement d'icônes et dispositif électronique
WO2021082711A1 (fr) Procédé d'affichage d'image et dispositif électronique
WO2021057337A1 (fr) Procédé de fonctionnement et dispositif électronique
WO2021012931A1 (fr) Procédé et terminal de gestion d'icônes
CN111338530B (zh) 应用程序图标的控制方法和电子设备
CN109917995B (zh) 一种对象处理方法及终端设备
WO2021104163A1 (fr) Procédé d'agencement d'icônes et dispositif électronique
WO2020151525A1 (fr) Procédé d'envoi de message et dispositif terminal
WO2021129536A1 (fr) Procédé de déplacement d'icône et dispositif électronique
WO2021129538A1 (fr) Procédé de commande et dispositif électronique
CN110888707A (zh) 一种消息发送方法及电子设备
WO2020207379A1 (fr) Procédé de division d'écran et dispositif terminal
WO2020182035A1 (fr) Procédé de traitement d'image et dispositif terminal
WO2020199783A1 (fr) Procédé d'affichage d'interface et dispositif terminal
WO2020215982A1 (fr) Procédé de gestion d'icône de bureau et dispositif terminal
WO2021164716A1 (fr) Procédé d'affichage et dispositif électronique
WO2020181945A1 (fr) Procédé d'affichage d'identifiant et borne
CN108228902B (zh) 一种文件显示方法及移动终端
CN110908554B (zh) 长截图的方法及终端设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20882207

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20882207

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 01.03.2023)