WO2024008017A1 - Content sharing method, graphical interface and related device - Google Patents


Info

Publication number
WO2024008017A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
window
interface element
application
sharing
Prior art date
Application number
PCT/CN2023/105191
Other languages
English (en)
French (fr)
Inventor
毕晟
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2024008017A1 publication Critical patent/WO2024008017A1/zh

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 — Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 — Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547 — Touch pads, in which fingers can move on a surface
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 — Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 — Interaction techniques using icons
    • G06F3/0482 — Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 — Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 — Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0486 — Drag-and-drop
    • G06F3/0487 — Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 — Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • This application relates to the field of terminal technology, and in particular to content sharing methods, graphical interfaces and related devices.
  • This application provides a content sharing method, graphical interface and related devices.
  • developers do not need to manually adapt an application for drag-and-drop sharing in advance in order to realize content sharing of the application, which reduces the developer's workload.
  • this application provides a content sharing method.
  • the method includes: a first device displays a first window, and the first window includes one or more interface elements; the first device detects a first operation acting on the first window; after detecting the first operation, the first device detects a drag operation acting on a first interface element among the one or more interface elements; in response to the drag operation, the first device displays, in a second window, a second interface element corresponding to the first interface element, or the first device sends the transmission content of the first interface element to a second device; wherein, before the first device detects the first operation, the drag operation on the first interface element is not used to trigger the first device to display the second interface element in the second window or to send the transmission content to the second device.
  • a sharing mode is provided.
  • the electronic device can enter the sharing mode after detecting the first operation.
  • the electronic device can automatically change the response behavior of interface elements so that, in the sharing mode, an interface element can be shared from one window to another through the user's drag operation, or the transmission content of the interface element can be shared with other devices.
  • the workload of developers to manually adapt applications for drag-and-drop sharing is reduced, the application scenarios for content sharing are expanded, and the user experience is improved.
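The mode-gated behavior described above can be illustrated with a minimal sketch. All names here (`Device`, `first_operation`, `drag`) are hypothetical stand-ins, not the patented implementation: the point is only that the same drag gesture shares nothing before the first operation and shares the element's content after it.

```python
class InterfaceElement:
    def __init__(self, name, content):
        self.name = name
        self.content = content

class Device:
    """Toy model of the first device: a sharing-mode flag gates the drag response."""
    def __init__(self):
        self.sharing_mode = False
        self.second_window = []   # contents shared into the second window

    def first_operation(self):
        # e.g. a special gesture on the first window that enters the sharing mode
        self.sharing_mode = True

    def drag(self, element):
        # Outside the sharing mode, the drag does not trigger any sharing
        if not self.sharing_mode:
            return None
        # In the sharing mode, the drag shares the element's transmission content
        self.second_window.append(element.content)
        return element.content

device = Device()
photo = InterfaceElement("photo", "cat.jpg")
assert device.drag(photo) is None        # before the first operation: no sharing
device.first_operation()                 # enter sharing mode
assert device.drag(photo) == "cat.jpg"   # the same drag now shares the content
```

The application that owns `photo` never registered a drag handler; the gating lives entirely in the (hypothetical) system-side `Device` object, mirroring the claim that no per-application adaptation is needed.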
  • when the electronic device displays the first window, the second window may be displayed at the same time. In this case, the electronic device may directly share the transmission content of an interface element between the two simultaneously displayed windows.
  • alternatively, when the electronic device displays the first window, it does not display the second window, and after the electronic device detects the drag operation, it triggers the display of the second window. In this way, the electronic device realizes window switching while sharing content across windows.
  • before the first device displays the second interface element corresponding to the first interface element in the second window, or sends the transmission content corresponding to the first interface element to the second device, the method further includes: the first device displays a screenshot of the first interface element that moves following the movement trajectory of the drag operation.
  • the electronic device can follow the movement of the user's finger to display a screenshot of the interface element, making drag-and-drop sharing more interesting.
  • the first window and the second window belong to the same application or different applications.
  • Electronic devices can realize content sharing between different applications, that is, drag the content of one application to another application; or they can realize content sharing between different pages within the same application, that is, drag the content of one page to another page for sharing.
  • the second interface element includes: transmission content or an identification of the transmission content.
  • the identification is an icon, or a screenshot of the first interface element.
  • the first operation is used to trigger the first device to enter the first mode, and after the first device detects the first operation acting on the first window, the method further includes:
  • the first device displays prompt information in the first window, and the prompt information is used to indicate that the first device has entered the first mode.
  • the first prompt information is used to highlight the first interface element.
  • the first prompt information can be expressed by changing the display effect of the first interface element, adding additional information, and so on.
  • the display effect can include static effects such as position, size, color, brightness, transparency, saturation and shadow, as well as dynamic effects such as interface element jitter.
  • the additional information can be expressed as a border around the interface element, an icon in the upper right corner of the interface element, and so on. In this way, the user can know, through the first prompt information, which interface elements in the currently displayed window can be dragged.
  • the first prompt information includes a border of the first interface element.
  • the method further includes: the first device stops displaying the third interface element in the first window.
  • the electronic device may control only some interface elements to enter the sharing mode, that is, only some interface elements support drag-and-drop sharing.
  • interface elements that do not support drag-and-drop sharing may fall outside the scope of the first prompt information, that is, no animation effect, display-effect change or additional information is applied to such elements.
  • the electronic device may also hide interface elements that do not support drag-and-drop sharing. In this way, applications containing sensitive private information can be prevented from leaking the user's private information through content sharing, and unimportant interface elements can be prevented from interfering with the interface elements the user wants to drag and share.
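The split between highlighted and hidden elements can be sketched as a simple filter pass. The `shareable` flag and the element names are illustrative assumptions; the claim does not specify how shareability is decided, only that non-shareable elements receive no prompt or are hidden.

```python
elements = [
    {"name": "avatar", "shareable": True},
    {"name": "password_field", "shareable": False},  # private: hidden in sharing mode
    {"name": "article_text", "shareable": True},
]

def enter_sharing_mode(elements):
    """Partition the window's elements for the sharing-mode UI:
    shareable ones get the first prompt information (e.g. a border),
    non-shareable ones are left unprompted or hidden."""
    highlighted = [e["name"] for e in elements if e["shareable"]]
    hidden = [e["name"] for e in elements if not e["shareable"]]
    return highlighted, hidden

highlighted, hidden = enter_sharing_mode(elements)
assert highlighted == ["avatar", "article_text"]
assert hidden == ["password_field"]
```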
  • the drag operation is specifically: a sliding operation from the position of the first interface element to a specified position.
  • the first window is a window of the first application
  • the second window is a window of the second application
  • the designated position may refer to the position of the icon of the second application; or, the designated position may refer to the position of the icon of the second device, or the position of the icon of a first contact, where the second device is the device used by the first contact.
  • the icon of the second application may be displayed in a list containing icons of multiple applications, the icon of the second device may be displayed in a list containing icons of multiple devices, and the icon of the first contact may be displayed in a list containing icons of multiple contacts.
  • users can share the transmission content to designated applications or designated devices or designated contacts according to their own needs, improving user operability.
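Resolving the designated position amounts to hit-testing the drop point against the icon list shown during the drag. The target list, coordinates and rectangle layout below are invented for illustration; only the hit-test idea comes from the text.

```python
# Icon row displayed during the drag: app, device, and contact targets.
# Rectangles are (left, top, right, bottom) in screen coordinates.
targets = [
    {"kind": "app",     "name": "Notes",  "rect": (0,   0, 100, 100)},
    {"kind": "device",  "name": "Tablet", "rect": (100, 0, 200, 100)},
    {"kind": "contact", "name": "Alice",  "rect": (200, 0, 300, 100)},
]

def hit_test(x, y, targets):
    """Return the target whose rectangle contains the drop point, or None."""
    for t in targets:
        left, top, right, bottom = t["rect"]
        if left <= x < right and top <= y < bottom:
            return t
    return None

assert hit_test(150, 50, targets)["name"] == "Tablet"   # share to the second device
assert hit_test(250, 50, targets)["kind"] == "contact"  # share to the first contact
assert hit_test(400, 50, targets) is None               # dropped outside any icon
```

Dropping on a contact icon would then route the transmission content to that contact's device, per the claim that the second device is the device used by the first contact.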
  • the second window is displayed in the first user interface, and the first user interface further includes the first window.
  • after sharing the transmission content to the second window, the electronic device can display the first window and the second window at the same time. In this way, after the content is shared successfully, the user can view the display content of both the content sharer and the content receiver.
  • the first interface element includes one or more interface elements.
  • the electronic device can realize drag-and-drop sharing of one or more interface elements through a single drag operation.
  • the user can thus quickly share multiple interface elements in one drag operation, which makes content sharing convenient to operate.
  • the first interface element includes N interface elements, N ≥ 2, and N is a positive integer.
  • before the first device detects the drag operation acting on the first interface element among the one or more interface elements, the method further includes: the first device detects a selection operation acting on the N interface elements.
  • the method further includes: the first device changes or creates the response behavior of M interface elements in the first window, enabling the first device to display an identifier in the second window or send the transmission content to the second device in response to a drag operation on the first interface element among the M interface elements, where M ≥ 1 and M is a positive integer.
  • electronic devices can automatically change the response behavior of interface elements in sharing mode, avoiding the trouble of developers manually declaring applications or interface elements that support sharing, and reducing developers' workload.
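"Changing or creating the response behavior" can be modeled as a system-side pass that installs a drag handler on every element, whether or not the application registered one. The `Element`/handler names are hypothetical; the sketch only demonstrates that sharing works without the app declaring anything.

```python
class Element:
    def __init__(self, name):
        self.name = name
        self.on_drag = None   # the application registered no drag handler

def make_share_handler(shared):
    """Build a handler that records the dragged element as shared."""
    def handler(element):
        shared.append(element.name)
    return handler

def enter_sharing_mode(elements, shared):
    """System-side pass over the window: create or replace each element's
    drag response so that a drag now triggers sharing."""
    for e in elements:
        e.on_drag = make_share_handler(shared)

shared = []
elems = [Element("title"), Element("thumbnail")]
enter_sharing_mode(elems, shared)
elems[1].on_drag(elems[1])       # user drags the thumbnail
assert shared == ["thumbnail"]
```

On leaving the sharing mode, a real system would restore the original handlers; that restore step is omitted here.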
  • the method further includes:
  • the first device obtains the transmission content from the information of the first interface element.
  • the electronic device can automatically determine the transmission content required in the sharing process based on the information of the interface elements in the sharing mode, avoiding the trouble of developers manually declaring the transmission content and reducing the developer's workload.
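Deriving the transmission content from an element's own information can be sketched as a dispatch on the element type. The dict keys (`"text"`, `"source"`, `"path"`) are illustrative assumptions about what information an element carries; the text itself only says the content is obtained automatically from the element.

```python
def transmission_content(element):
    """Derive the content to transmit from the interface element's information."""
    kind = element["type"]
    if kind == "text":
        return element["text"]                       # the displayed string itself
    if kind == "image":
        # a picture control may display only part of a higher-resolution original;
        # prefer the original source when the element records one
        return element.get("source", element["thumbnail"])
    if kind == "file":
        return element["path"]
    return None

assert transmission_content({"type": "text", "text": "hello"}) == "hello"
assert transmission_content(
    {"type": "image", "thumbnail": "small.jpg", "source": "full.jpg"}
) == "full.jpg"
assert transmission_content({"type": "file", "path": "/tmp/a.pdf"}) == "/tmp/a.pdf"
```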
  • the method is executed by a system unit of the first device, and the system unit and the application to which the first window belongs are different modules of the first device.
  • the system unit may be located at the framework layer of the first device.
  • the sharing mode can be defined as a system-level drag-and-drop sharing mode. In this way, any application under the system can respond to the first operation and enter the sharing mode, realizing sharing of application content, expanding the application scenarios of content sharing, and improving the user's drag-and-drop sharing experience.
  • embodiments of the present application provide an electronic device, which includes a memory, one or more processors, and one or more programs; when the one or more processors execute the one or more programs, the electronic device is caused to implement the method described in the first aspect or any implementation manner of the first aspect.
  • embodiments of the present application provide a computer-readable storage medium including instructions, which, when run on an electronic device, cause the electronic device to implement the method described in the first aspect or any implementation manner of the first aspect.
  • Figure 1 is a schematic diagram of the hardware structure of an electronic device 100 provided by an embodiment of the present application.
  • FIGS 2A-2G, 3A-3F, and 4A-4D are some user interfaces provided by embodiments of the present application.
  • Figure 5 is a schematic diagram of the software structure of the electronic device 100 provided by the embodiment of the present application.
  • Figure 6 is a flow chart of interactions between internal modules in the software structure of the electronic device 100 provided by the embodiment of the present application;
  • Figure 7 is a schematic diagram of a tree structure of windows, controls and layouts provided by the embodiment of the present application.
  • Figure 8 is a schematic diagram of part of the layout and controls in the user interface 10 provided by the embodiment of the present application.
  • Figure 9 is a schematic flowchart of a content sharing method provided by an embodiment of the present application.
  • the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of this application, unless otherwise specified, "plurality" means two or more.
  • An embodiment of the present application provides a content sharing method.
  • the method includes: an electronic device displays one or more interface elements in a first window. After the electronic device enters the sharing mode, the electronic device can detect the user's drag operation acting on a target interface element among the one or more interface elements, determine the transmission content according to the target interface element, and share the transmission content to a second window, displaying in the second window the transmission content or an identifier corresponding to the transmission content; or, the electronic device can share the transmission content with other devices to realize content sharing.
  • interface elements refer to a series of elements in the user interface that meet user interaction requirements, including controls such as pictures, text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars and widgets, or a combination of these controls.
  • after entering the sharing mode, the electronic device can change or create the response behavior of one or more currently displayed interface elements, so that it can detect a sharing operation acting on an interface element and trigger the sharing of that element.
  • before entering the sharing mode, the same operation on the interface element either triggers no behavior, or triggers the electronic device to perform some other behavior for the interface element that is different from sharing it.
  • for example, suppose the interface element is a picture. Before the electronic device enters the sharing mode, a click operation on the picture can be used to trigger the display of a high-definition version of the picture; after entering the sharing mode, the click operation on the picture can be used to trigger sharing of the picture.
  • developers do not need to manually declare the content that can be dragged in the application.
  • the electronic device can automatically adjust the response behavior of each interface element by detecting whether it has entered the sharing mode, so that it can detect the sharing operation acting on an interface element and realize sharing of that element.
  • after the electronic device detects the sharing operation acting on the target interface element, it can automatically determine the transmission content based on the target interface element, and implement the sharing of the target interface element based on that transmission content.
  • the transmission content can include text, pictures, voice, tables, videos, files, etc.
  • the transmission content can be a piece of text, which can be the text displayed in a text control, or can include other text in addition to the text displayed in the text control.
  • the transmission content can be a high-definition picture corresponding to a picture control.
  • the content displayed in the picture control can be part of the high-definition picture, and its resolution can be lower than that of the high-definition picture.
  • the transmission content can be a file. It can be seen that the electronic device automatically obtains the content corresponding to the interface element as the transmission content, so the developer does not need to manually declare the transmission content used during sharing.
  • based on the interface element selected by the user, the electronic device can automatically use the content contained in that interface element as the transmission content.
  • the electronic device can display the transmission content, or an identifier corresponding to the transmission content, in the second window.
  • the identifier can be a presentation form of the transmission content; for example, it can be a screenshot of the target interface element, or a preset icon.
  • the embodiment of the present application does not limit the form of the identifier.
  • the content sharing method provided by the embodiment of the present application provides a sharing mode.
  • in this sharing mode, the electronic device can automatically change the response behavior of an application's interface elements, realizing the application's self-adaptation to drag-and-drop sharing.
  • the content of any application can be shared in this sharing mode. Developers do not need to declare in advance which applications or interface elements support sharing, nor the transmission content used during sharing, and users can still share the application's content. This reduces developers' workload, expands the application scenarios of content sharing, and improves the user experience.
  • FIG. 1 shows a schematic diagram of the hardware structure of the electronic device 100 .
  • the electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device and/or a smart city device; the embodiment of this application places no special restrictions on the specific type of the electronic device.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the processor 110 may be configured to change or create the response behavior of one or more currently displayed interface elements after the electronic device 100 enters the sharing mode, and, according to the user's operation, find the target interface element among the one or more interface elements and determine the transmission content based on the target interface element. Specific descriptions of changing or creating the response behavior of interface elements, determining target interface elements, and determining transmission content can be found in the subsequent method embodiments and are not discussed here.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands.
  • the mobile communication module 150 can provide solutions for wireless communication including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194.
  • the wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and other technologies.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , demodulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing and is connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, etc.
  • Display 194 includes a display panel.
  • the display screen 194 can be used to display a user interface, including a first user interface, a first window, a second window, and other user interfaces related to content sharing.
  • the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the optical signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • Camera 193 is used to capture still images or video.
  • the internal memory 121 may include one or more random access memories (RAM) and one or more non-volatile memories (NVM).
  • the external memory interface 120 can be used to connect an external non-volatile memory to expand the storage capacity of the electronic device 100 .
  • the external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos in the external non-volatile memory.
  • the electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 170 may also be used to encode and decode audio signals.
  • the speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to hands-free calls.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 answers a call or a voice message, the voice can be heard by bringing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C.
  • the headphone interface 170D is used to connect wired headphones.
  • the pressure sensor 180A is used to sense pressure signals and can convert the pressure signals into electrical signals.
  • the gyro sensor 180B may be used to determine the motion posture of the electronic device 100 .
  • Air pressure sensor 180C is used to measure air pressure.
  • Magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may utilize the magnetic sensor 180D to detect opening and closing of the flip holster.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes), and can detect the magnitude and direction of gravity when the electronic device 100 is stationary. It can also be used to identify the posture of the electronic device, and is applied in scenarios such as landscape/portrait switching and pedometers.
  • Distance sensor 180F for measuring distance.
  • Electronic device 100 can measure distance via infrared or laser.
  • Proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light outwardly through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects.
  • the electronic device 100 can use the proximity light sensor 180G to detect when the user holds the electronic device 100 close to the ear for talking, so as to automatically turn off the screen to save power.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
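  • The adaptive adjustment described above can be modeled as a mapping from sensed ambient brightness to a backlight level. The piecewise-linear curve, its thresholds, and the function name below are illustrative assumptions, not values from the embodiment:

```python
def adaptive_brightness(ambient_lux: float,
                        min_level: int = 10,
                        max_level: int = 255,
                        max_lux: float = 10000.0) -> int:
    """Map sensed ambient light (lux) to a screen backlight level.

    Hypothetical linear curve: dark rooms get the minimum level,
    bright sunlight saturates at the maximum level.
    """
    # Clamp the sensor reading to the supported range.
    lux = max(0.0, min(ambient_lux, max_lux))
    level = min_level + (max_level - min_level) * (lux / max_lux)
    return round(level)
```

In practice a display driver would also smooth transitions over time; this sketch only captures the lux-to-level mapping.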
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket to prevent accidental touching.
  • Fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to achieve fingerprint unlocking, access to application locks, fingerprint photography, fingerprint answering of incoming calls, etc.
  • Temperature sensor 180J is used to detect temperature.
  • the touch sensor 180K is also known as a "touch device".
  • the touch sensor 180K can be disposed on the display screen 194.
  • the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen".
  • the touch sensor 180K is used to detect a touch operation on or near the touch sensor 180K.
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 194 .
  • the touch sensor 180K can be used to detect the user's sharing operation and pass the user operation to the processor 110, so that the processor 110 triggers the electronic device 100 to enter the sharing mode, or triggers the electronic device 100 to share interface elements, and so on.
  • Bone conduction sensor 180M can acquire vibration signals.
  • the buttons 190 include a power button, a volume button, etc.
  • the buttons 190 may be mechanical buttons or touch buttons.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for vibration prompts for incoming calls and can also be used for touch vibration feedback.
  • the indicator 192 may be an indicator light, which may be used to indicate charging status, power changes, or may be used to indicate messages, missed calls, notifications, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be connected to or separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • FIGS. 2A-2G illustrate some user interfaces involved when the electronic device 100 performs content sharing between applications.
  • FIG. 2A exemplarily shows the user interface 10 provided by the entertainment application after the electronic device 100 starts the entertainment application.
  • the entertainment application can be used to provide users with entertainment services such as social networking, chatting, watching videos, listening to music, etc.
  • the entertainment application can refer to a Weibo application.
  • the user interface 10 may include: a status bar 101, a first menu bar 102, a browsing area 103, and a second menu bar 104, wherein:
  • the status bar 101 may include one or more signal strength indicators for mobile communication signals, one or more signal strength indicators for wireless fidelity (WiFi) signals, a battery status indicator, and a time indicator.
  • the first menu bar 102 may include one or more options, and the electronic device 100 may detect an operation on the option and activate a function corresponding to the option.
  • the first menu bar 102 may include: a photo option, a follow option, a recommendation option, a same-city option, and a more option. Among them, the photo option can be used to start the photographing function; the follow option can be used to trigger the electronic device 100 to display, in the browsing area 103, the entertainment content that the user follows; the recommendation option can be used to trigger the electronic device 100 to display recommended entertainment content in the browsing area 103; the same-city option can be used to trigger the electronic device 100 to display, in the browsing area 103, the entertainment content published on the entertainment application by other users in the city where the user is located; and the more option can be used to trigger the electronic device 100 to display other, more hidden functions, such as publishing entertainment content in the form of text, pictures, videos, etc.
  • the browsing area 103 can be used to display entertainment content published by different users on the entertainment application. Among them, for the entertainment content posted by a user, the browsing area 103 can display the user's avatar, name, and published entertainment content, as well as sharing options, comment options, like options, etc. for the entertainment content. As shown in FIG. 2A, the browsing area 103 may include a picture control 103A, which is displayed as a thumbnail of a picture. The electronic device 100 may detect a user operation, such as a click operation, on the picture control 103A to trigger the display of the picture corresponding to the picture control 103A.
  • the picture corresponding to the picture control 103A can be a high-definition large picture; the content it displays can be more than the content displayed by the thumbnail in the picture control 103A, and/or its definition can be higher than the definition of the thumbnail in the picture control 103A.
  • the second menu bar 104 may include one or more options, and the electronic device 100 may detect an operation on the option and activate a function corresponding to the option.
  • the second menu bar 104 may include: a home page option, a discovery option, a message option, and a local option. These options can be used to trigger the electronic device 100 to display different pages provided by the entertainment application in the user interface 10, for example, to display the home page, discovery page, message page, and local page in the user interface 10.
  • the user interface 10 shown in FIG. 2A may be the content displayed by the electronic device 100 when the home page option is selected.
  • the electronic device 100 may trigger the display of the user interface 10 as shown in FIG. 2A after detecting a user operation, such as a click operation, on the application icon of the entertainment application.
  • the electronic device 100 can also trigger the display of the user interface 10 shown in FIG. 2A by detecting the user's voice command for starting the entertainment application. The embodiment of the present application does not limit the manner in which the electronic device 100 is triggered to display the user interface 10 shown in FIG. 2A.
  • this application takes the electronic device 100 starting an entertainment application as an example to describe the content sharing method provided by the embodiment of the application.
  • the embodiment of the application does not limit the applications started by the electronic device 100.
  • the electronic device 100 can also display user interfaces provided by application programs such as music applications, chat applications, office applications, etc.
  • the user interface 10 shown in FIG. 2A is only an illustrative example.
  • the user interface provided by the entertainment application may include more or fewer controls, and the user interface 10 does not constitute a limitation on the embodiments of the present application.
  • when the electronic device 100 detects a user operation on the user interface 10, such as a quick triple-click operation, in response to the operation, the electronic device 100 enters the sharing mode.
  • the electronic device 100 can detect a sharing operation acting on the currently displayed content, and trigger sharing of interface elements acting on the sharing operation, such as sharing to other applications, sharing to other devices, and so on.
  • the sharing mode may also be called by other names, and the embodiment of the present application does not limit the name of the sharing mode.
  • when an electronic device such as a smartphone is in the "sharing mode" and detects a user's sharing operation, the electronic device can trigger the sharing of the interface elements targeted by the sharing operation.
  • “Sharing mode” can be a service and function provided by the electronic device 100, which can support the electronic device 100 to switch applications and realize data sharing between applications or data sharing between devices.
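  • As a concrete illustration, the "quick triple-click" trigger mentioned above can be modeled as a small tap counter that fires when three taps arrive in quick succession. The 0.4-second maximum gap between taps and all names here are hypothetical, since the embodiment does not specify values:

```python
class TripleTapDetector:
    """Enter sharing mode after three taps in quick succession.

    The 0.4 s maximum gap between consecutive taps is an assumed
    threshold for illustration only.
    """

    def __init__(self, max_gap: float = 0.4):
        self.max_gap = max_gap
        self.taps = []  # timestamps of taps in the current burst

    def on_tap(self, t: float) -> bool:
        """Feed a tap timestamp; return True when the trigger fires."""
        if self.taps and t - self.taps[-1] > self.max_gap:
            self.taps = []          # too slow: start a new burst
        self.taps.append(t)
        if len(self.taps) >= 3:
            self.taps = []          # fire once, then reset
            return True
        return False
```

A real implementation would receive these timestamps from the platform's touch-event pipeline; the counting logic is the same.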
  • the electronic device 100 can display a border around the currently displayed interface element.
  • the border can be used to prompt the user that the electronic device 100 has entered the sharing mode, and the electronic device 100 can detect operations acting on the interface elements surrounded by these borders and trigger sharing of those interface elements.
  • the electronic device 100 can traverse the interface elements included in the currently displayed content, determine the size (such as width and height) and position information of these interface elements, and draw borders around these interface elements based on the size and position information.
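  • The traversal described above can be sketched as follows. The element tree, the (x, y, width, height) geometry, the `shareable` flag, and all names are illustrative assumptions, not APIs from the embodiment:

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    """A hypothetical interface element with position and size."""
    name: str
    x: int
    y: int
    width: int
    height: int
    shareable: bool = True
    children: list = field(default_factory=list)

def collect_borders(root: Element) -> list:
    """Traverse the element tree and compute a border rectangle
    (left, top, right, bottom) for every element that supports
    sharing; non-shareable elements simply get no border."""
    borders = []
    stack = [root]
    while stack:
        e = stack.pop()
        if e.shareable:
            borders.append((e.name,
                            (e.x, e.y, e.x + e.width, e.y + e.height)))
        stack.extend(e.children)
    return borders
```

The resulting rectangles are what a renderer would draw as the prompt borders; elements that do not support sharing are skipped, matching the behavior described later for FIG. 2B (b) and (c).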
  • the electronic device 100 may display the user interface 10 as shown in FIG. 2B.
  • (a), (b), and (c) in FIG. 2B respectively show three user interfaces 10 that may be displayed after the electronic device 100 enters the sharing mode.
  • each interface element displayed in the user interface 10 shown in FIG. 2B is displayed with a rectangular border around it. The interface elements may include the first menu bar 102 and the photo option, follow option, recommendation option, same-city option, and more option in the first menu bar 102; the user's avatar, name, and published entertainment content in the browsing area 103, as well as the sharing options, comment options, like options, etc. for the entertainment content; and the second menu bar 104 and the home page option, discovery option, message option, local option, etc. in the second menu bar 104.
  • the interface element may refer to a single control, including a text control, a picture control, a button control, a table control, etc., such as the picture control 103A shown in (a) in FIG. 2B; it may also refer to a combination of multiple controls, such as the first menu bar 102 shown in (a) in FIG. 2B.
  • the embodiments of the present application do not limit this.
  • the border around these interface elements can also be a circular frame, a diamond frame, an irregular frame, or a wire frame that fits the shape of the interface element. The embodiment of the present application does not limit the shape of the border.
  • in addition to these borders being always displayed after the electronic device 100 enters the sharing mode, the electronic device 100 can also display these borders alternately, or display these borders at intervals, and so on. For example, after the electronic device 100 enters the sharing mode, the electronic device 100 can, in order from top to bottom, first display the border in the first menu bar 102, then display the border in the browsing area 103, and finally display the border in the second menu bar 104, and then display the borders again in this order.
  • the embodiment of the present application does not limit the display rules or display time of the border.
  • the electronic device 100 can also change the display effect of the currently displayed content.
  • the display effect includes: position, size, color, brightness, transparency, saturation, shadow, etc. In this way, the user can perceive that the electronic device 100 has entered the sharing mode based on the change in the display effect of the currently displayed content.
  • the electronic device 100 can also display prompt information.
  • the prompt information is used to remind the user that the current electronic device 100 has entered the sharing mode.
  • as shown in FIG. 2B, the electronic device 100 can display a prompt message at the bottom of the user interface 10: "You have entered sharing mode, please select the content to be shared."
  • a border may not be displayed around some interface elements that do not support sharing, or the interface elements may be directly hidden.
  • the electronic device 100 may display the user interface 10 as shown in (b) or (c) in FIG. 2B .
  • the first menu bar 102 , the browsing area 103 , and the second menu bar 104 are still displayed in the user interface 10 .
  • the electronic device 100 only displays a frame in the browsing area 103 . In this way, the user can know through the border displayed only in the browsing area 103 that only the content in the browsing area 103 supports sharing, and the content in the first menu bar 102 and the second menu bar 104 does not support sharing.
  • the electronic device 100 can exit the sharing mode after detecting the user's specified operation (such as a left swipe operation). For example, when the electronic device 100 detects a left sliding operation on the user interface 10 shown in (c) in Figure 2B, the electronic device 100 exits the sharing mode and displays the user interface as shown in Figure 2A.
  • the electronic device 100 can also change the currently displayed user interface, for example, return to the upper-level user interface of the application, or switch to the user interface of another application. After the user interface is changed, the electronic device 100 can still remain in the sharing mode.
  • the electronic device 100 can detect the sharing operation acting on the interface element in the modified user interface, and trigger the sharing of the interface element.
  • the electronic device 100 can detect the sharing operation on the picture control 103A shown in (a) in FIG. 2B , and trigger the sharing of the picture control 103A.
  • the electronic device 100 can obtain the transmission content corresponding to the picture control 103A, so that the electronic device 100 can share the transmission content to other applications.
  • the sharing operation can be performed as a drag operation from gesture 1 to gesture 2 in Figure 2C, to gesture 3 in Figure 2D, to gesture 4 in Figure 2E, and to gesture 5 in Figure 2F.
  • during the above drag operation, the content displayed by the electronic device 100 may differ.
  • the electronic device 100 can obtain a screenshot picture 103B that is the same as the picture control 103A by taking a screenshot of the picture control 103A.
  • the screenshot picture 103B can move following the user's touch point on the display screen after the user triggers the sharing operation.
  • the picture that moves following the user's touch point on the display screen can be the picture corresponding to the picture control 103A. The definition of the picture can be higher than that of the screenshot of the picture control 103A, and/or the picture can display more content than the screenshot of the picture control 103A.
  • the embodiment of this application does not limit the content displayed during the user's dragging process.
  • the electronic device 100 may stop displaying the border when the electronic device 100 begins to detect the sharing operation.
  • the electronic device 100 can display the application list 105 shown in FIG. 2D at the bottom of the display screen. The application list 105 is used to display one or more application icons.
  • the application list 105 may include: a first icon 105A, a second icon 105B, and a third icon 105C.
  • the first icon 105A can be used to trigger the startup of the SMS application
  • the second icon 105B can be used to trigger the startup of the settings application
  • the third icon 105C can be used to trigger the startup of the gallery application.
  • for example, the application list 105 may be displayed when the screenshot picture 103B is dragged close to the bottom of the user interface 10.
  • the applications corresponding to the application icons displayed in the application list 105 may be applications that are frequently used on the electronic device 100, applications that the electronic device 100 has used recently, applications that are running in the background of the electronic device 100, and so on. The embodiment of the present application does not limit the association between the application icons displayed in the application list 105 and the electronic device 100.
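  • One way to model the selection of icons for the application list 105 is to rank candidate applications by whether they run in the background, how often they are used, and how recently they were used. This scoring policy, the dictionary shape, and the function name are purely illustrative assumptions:

```python
def pick_app_list(apps: list, slots: int = 3) -> list:
    """Choose which apps to show in the sharing application list.

    `apps` is a list of dicts like
    {"name": ..., "launch_count": ..., "last_used": ...,
     "running_in_background": ...}.
    Hypothetical policy: background apps first, then frequently
    used, then recently used.
    """
    ranked = sorted(
        apps,
        key=lambda a: (a.get("running_in_background", False),
                       a.get("launch_count", 0),
                       a.get("last_used", 0)),
        reverse=True)
    return [a["name"] for a in ranked[:slots]]
```

Any of the criteria named in the text (frequency, recency, background state) could be weighted differently; the sketch only shows that the list is a ranked, truncated view of candidates.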
  • the embodiment of the present application does not limit the display location of the application list 105.
  • the application list 105 can be displayed on the left side of the user interface or on the right side of the user interface.
  • the electronic device 100 may trigger display of the application list 105 on the left side of the user interface after detecting that the user's drag operation on the screenshot image 103B moves to the left side of the user interface.
  • the embodiment of the present application does not limit the timing at which the electronic device 100 triggers the display of the application list 105.
  • the electronic device 100 may trigger the display of the application list 105 in the user interface 10 after entering the sharing mode.
  • the electronic device 100 may display the user interface 20 shown in (a) in FIG. 2F, where the user interface 20 is a user interface provided by the text message application; or the electronic device 100 may display the user interface 30 shown in (b) in FIG. 2F, where the user interface 30 includes the user interface 10 provided by the entertainment application and the user interface 20 provided by the text message application. That is, the electronic device 100 can simultaneously display the content of the user interface 10 and the content of the user interface 20 in a split-screen manner.
  • the user interface 20 may be a user interface displayed when the electronic device 100 historically opens a text message application.
  • the embodiment of the present application does not limit the user interface displayed by the receiving application when sharing content between applications.
  • the electronic device 100 can change the display effect of the first icon 105A.
  • the display effect can include icon size, icon color, icon position, etc. Comparing FIG. 2D and FIG. 2E, it can be seen that when the user's drag operation is within the range of the first icon 105A, the electronic device 100 can enlarge the icon size of the first icon 105A, thereby prompting the user, through the change in size, that the picture corresponding to the picture control 103A can be shared to the application corresponding to the first icon 105A.
  • the electronic device 100 can display the user interface shown in (a) or (b) in FIG. 2F after the user's drag operation stays within the range of the first icon 105A for a period of time, such as 1 second.
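  • The behavior above — detecting that the drag has entered the first icon 105A's bounds and stayed there for about 1 second — can be sketched as a hit test plus a dwell timer. The class name, the rectangle geometry, and the timestamp feed are assumptions for illustration:

```python
class DropTarget:
    """Track whether a dragged point dwells inside an icon's bounds
    long enough to commit the share (hit test + dwell timer)."""

    def __init__(self, left, top, right, bottom, dwell: float = 1.0):
        self.bounds = (left, top, right, bottom)
        self.dwell = dwell          # 1 s matches the example in the text
        self.entered_at = None      # when the drag entered the bounds

    def contains(self, x, y) -> bool:
        l, t, r, b = self.bounds
        return l <= x <= r and t <= y <= b

    def on_drag(self, x, y, now: float) -> bool:
        """Feed drag positions; return True once the share should fire."""
        if not self.contains(x, y):
            self.entered_at = None  # left the icon: reset the timer
            return False
        if self.entered_at is None:
            self.entered_at = now   # just entered the icon's range
        return now - self.entered_at >= self.dwell
```

The same structure would also drive the icon-enlargement feedback: `contains()` returning True is the moment the icon could grow.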
  • the user interface 20 may include an information display area 201 and an information input area 202 .
  • the information display area 201 is used to display information exchanged between the user and other people, that is, the information communicated between the electronic device 100 and other devices, and the information input area 202 is used to trigger the input of information.
  • the user interface 30 may include an area 301 and an area 302, where the area 301 is used to display the user interface provided by the entertainment application, and the area 302 is used to display the user interface provided by the text message application.
  • the user can view the user interfaces provided by the two applications at the same time, and the user can view the source of the image and the recipient of the image at the same time, helping the user to view the image sharing process more clearly.
  • the electronic device 100 when the electronic device 100 simultaneously displays content provided by two applications in a split-screen manner, it is not limited to the above-mentioned upper and lower split-screen methods.
  • the electronic device 100 can also display the content provided by the two applications in a left-right split-screen manner, or in a floating window.
  • as shown in FIG. 2G, the electronic device 100 can display the attachment window 203 in the user interface 20. The attachment window 203 can be used to display the files, pictures, voice messages, etc. to be sent by the user.
  • the attachment window 203 may include: a picture 203A and a delete icon 203B.
  • the picture 203A may be a picture corresponding to the picture control 103A shown in FIG. 2C.
  • the electronic device 100 detects the user's drag operation on the picture control 103A, which can trigger the electronic device 100 to share the content corresponding to the picture control 103A from the entertainment application to the text message application.
  • the interface elements affected by the user's drag operation may be different from the content shared by the electronic device 100 .
  • the interface elements displayed by the electronic device 100 are only presentation forms of the content shared by the electronic device 100 .
  • as shown in FIG. 2C, when the electronic device 100 detects a drag operation on the picture control 103A, the content shared by the electronic device 100 is a picture. For an interface element presented as text, the corresponding content can be a paragraph of text; for an interface element presented as a link, the corresponding content can be a URL.
  • when the electronic device 100 detects a user operation on the send icon 202A in the information input area 202, in response to the operation, the electronic device 100 can send the picture 203A to another device and display the picture 203A in the information display area 201.
  • the electronic device 100 may not display the application list 105 and may directly display the user interface shown in FIG. 2F, that is, directly switch to the user interface provided by another application.
  • the application can be the application most frequently used by the user, or the application recently used by the electronic device 100, or an application running in the background of the electronic device 100, or an application preset on the electronic device 100 for displaying content shared by the user, such as a picture browsing application, and so on. For example, the electronic device 100 directly switches from the user interface 10 shown in FIG. 2C to the user interface shown in FIG. 2F.
  • the sharing operation may be the drag operation shown in FIG. 2C-FIG. 2F, or may be a click operation.
  • when the electronic device 100 detects a click operation acting on an interface element shown in (a) in FIG. 2B, in response to this operation, the electronic device 100 can display the application list 105 shown in FIG. 2D at the bottom of the user interface 10. When the electronic device 100 then detects a click operation on the first icon 105A in the application list 105, in response to the operation, the electronic device 100 can share the content corresponding to the picture control 103A to the text message application, that is, display the user interface 20 shown in FIG. 2G.
  • FIGS. 3A-3F show some user interfaces involved when the electronic device 100 shares multiple contents between devices.
  • FIG. 3A shows an exemplary user interface 10 displayed when the electronic device 100 enters the sharing mode.
  • for a specific description of the user interface, please refer to the relevant description of (a) in FIG. 2B, which will not be repeated here.
  • when the electronic device 100 detects a selection operation, such as a click operation, on the picture control 103A, in response to the operation, the electronic device 100 can select the picture control 103A, determine the picture control 103A as the content to be shared, and change the display effect of the picture control 103A.
  • the display effect may include: color, size, saturation, transparency, etc.
  • the display effect can be seen in the picture control 103A shown in FIG. 3B.
  • the background color of the picture control 103A is darker than that of the picture control 103A in FIG. 3A .
  • the selected picture control 103A may display an animation effect, such as a shaking effect.
  • the user interface 10 also includes a picture control 103C.
  • when the electronic device 100 detects a selection operation, such as a click operation, on the picture control 103C, the electronic device 100 can select the picture control 103C, determine the picture control 103C as the content to be shared, and change the display effect of the picture control 103C.
  • the background color of the picture control 103C is darker than the background color of the picture control 103C in Figure 3B. It can be seen from Figure 3C that both the picture control 103A and the picture control 103C are currently selected.
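  • The multi-selection behavior shown in FIG. 3A-FIG. 3C can be modeled as a set of selected element identifiers that a click toggles, with the darkened display effect derived from membership in the set. A minimal sketch, with all names and the "darkened"/"normal" effect labels assumed:

```python
class SelectionModel:
    """Track which interface elements are marked as content to share."""

    def __init__(self):
        self.selected = set()

    def on_click(self, element_id: str) -> bool:
        """Toggle selection; return the element's new selected state."""
        if element_id in self.selected:
            self.selected.remove(element_id)
            return False
        self.selected.add(element_id)
        return True

    def display_effect(self, element_id: str) -> str:
        # Hypothetical mapping of selection state to display effect;
        # the text also mentions size, saturation, transparency, etc.
        return "darkened" if element_id in self.selected else "normal"
```

When the subsequent sharing operation fires, `self.selected` is exactly the set of contents to share at once.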
  • the electronic device 100 can detect a sharing operation, such as a drag operation, acting on the picture control 103A or the picture control 103C, which can be expressed as the sliding process from gesture 1 to gesture 2 shown in FIG. 3D.
  • the electronic device 100 can display a screenshot picture 103D.
  • the screenshot picture 103D can move along the sliding trajectory of the gesture.
  • the screenshot picture 103D can be obtained by combining or superimposing the screenshots of the picture control 103A and the picture control 103C.
  • When the electronic device 100 detects the user's sharing operation and the gesture slides to the bottom of the user interface, for example, when detecting gesture 2 in the user's sharing operation, the electronic device 100 can switch to another application, as shown in Figure 3E.
  • the user interface 20 may be a user interface provided by a text message application.
  • For a specific description of the user interface 20, please refer to the related description of Figure 2F; it will not be repeated here.
  • the electronic device 100 can display the attachment window 203 shown in Figure 3F in the user interface 20.
  • the attachment window 203 is used to display files, pictures, voices, etc. to be sent by the user.
  • the attachment window 203 may include: a picture 203C and a picture 203D.
  • the picture 203C can be a picture corresponding to the picture control 103A selected in Figure 3C
  • the picture 203D can be a picture corresponding to the picture control 103C selected in Figure 3D.
  • the electronic device 100 can allow the user to select multiple interface elements and share the multiple interface elements at the same time.
  • When a user has multiple contents to share, he or she can select them all and share them at once, which improves sharing efficiency and also simplifies the user's operations during the content sharing process.
  • Figures 4A-4D illustrate some user interfaces involved when the electronic device 100 performs content sharing between devices.
  • When the electronic device 100 detects a drag operation on the picture control 103A as shown in (a) in Figure 2B, in response to the operation, the electronic device 100 can generate and display a screenshot image 103B, which moves following the user's drag operation.
  • the electronic device 100 can display a device list 106 as shown in FIG. 4A at the bottom of the user interface.
  • The device list 106 is used to display one or more device icons.
  • the device list 106 may include: a first icon 106A, a second icon 106B, and a third icon 106C.
  • the first icon 106A can be used to trigger the sending of the picture corresponding to the picture control 103A to device 1
  • the second icon 106B can be used to trigger the sending of the picture corresponding to the picture control 103A to device 2
  • the third icon 106C can be used to trigger the sending of the picture corresponding to the picture control 103A to device 3.
  • The device corresponding to a device icon displayed in the device list 106 may be a device with which the electronic device 100 has established a connection relationship (such as a wired or wireless connection relationship), or may be a device under the same account or the same account group as the electronic device 100, etc. This embodiment of the present application does not limit the association between the devices corresponding to the device icons displayed in the device list 106 and the electronic device 100.
  • The embodiment of the present application does not limit the display position of the device list 106.
  • the device list 106 can be displayed on the left side of the user interface or on the right side of the user interface.
  • For the display position of the device list 106, please refer to the above relevant description of the display position of the application list 105; it will not be repeated here.
  • When the electronic device 100 detects that the user's drag operation on the picture control 103A reaches the range where the second icon 106B is located, that is, as shown in Figure 4B, when the second icon 106B and the screenshot picture 103B overlap, the electronic device 100 displays prompt information 107 as shown in Figure 4C in the user interface 10.
  • The prompt information 107 is used to prompt the user that the picture corresponding to the picture control 103A has been sent to device 2 corresponding to the second icon 106B.
  • the electronic device 100 can change the display effect of the second icon 106B.
  • The display effect may include: icon size, icon color, icon position, etc. Comparing Figure 4A and Figure 4B, it can be seen that when the user's drag operation acts on the range where the second icon 106B is located, the electronic device 100 can change the icon color of the second icon 106B, thereby prompting the user, through the change of color, that the picture corresponding to the picture control 103A can be shared with the device 2 corresponding to the second icon 106B.
  • After the user's drag operation acts within the range of the second icon 106B and is maintained for a period of time, such as 1 second, the electronic device 100 can share the picture corresponding to the picture control 103A with the device 2 corresponding to the second icon 106B and display the prompt information 107 as shown in Figure 4C.
  • the device 2 may display the user interface 40 as shown in FIG. 4D.
  • user interface 40 may be an exemplary user interface for an application menu displayed by device 2 .
  • the user interface 40 may include a window 401.
  • This window 401 can be used to display pictures sent by the electronic device 100 .
  • the window 401 may include a cancel option 401A, a save option 401B, a copy option 401C, and a picture 401D.
  • the cancel option 401A can be used to trigger the cancellation of obtaining the picture sent by the electronic device 100
  • the save option 401B can be used to trigger the saving of the picture sent by the electronic device 100 to the local
  • the copy option 401C can be used to trigger copying, on device 2, of the picture sent by the electronic device 100. After the picture is copied, device 2 can detect the user's paste operation when displaying another input window and paste the picture in that input window. For example, when device 2 displays a memo, it can detect the user's long press operation, trigger the display of a paste option, and, after detecting the user's confirmation operation on the paste option, trigger the copied picture to be pasted into the memo.
  • the picture 401D displays a picture sent by the electronic device 100.
  • the picture may be a picture corresponding to the picture control 103A shown in FIG. 4A.
  • the window 401 is not limited to being displayed in the user interface 40 mentioned above.
  • When device 2 detects the picture sent by the electronic device 100 while displaying a user interface provided by an application, device 2 can display the window 401 in that user interface.
  • FIGS. 2A-2G, 3A-3F, and 4A-4D only show the process of sharing pictures by the electronic device 100. It should be understood that the embodiments of the present application do not limit the shared content.
  • the electronic device 100 can also detect the user's sharing operation on a text control and share the content (such as text) corresponding to the text control to other applications or other devices.
  • The electronic device may be a portable terminal device equipped with iOS, Android, Microsoft or another operating system, such as a mobile phone, a tablet, or a wearable device; it may also be a laptop (Laptop) with a touch-sensitive surface or touch panel, or a non-portable terminal device such as a desktop computer with a touch-sensitive surface or touch panel.
  • The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. This embodiment of the present application takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
  • FIG. 5 is a schematic diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the Android system is divided into four layers, from top to bottom: application layer, application framework layer, Android runtime and system libraries, and kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include applications such as application A and application B.
  • application A and application B may be camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, Video, SMS and other applications.
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • API application programming interface
  • the application framework layer includes some predefined functions.
  • the application framework layer can include system window frame, system view frame, system drag service, etc.
  • the system window frame is used to provide windows for applications.
  • the system view framework is used to manage and display views and manage the responsive behavior of controls.
  • the display interface of the electronic device 100 may be composed of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the system drag and drop service is used to generate and manage drag and drop windows, and change the display position of the window based on the user's sharing operation (such as drag and drop operation).
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • The core library contains two parts: one is the functions that the Java language needs to call, and the other is the core library of Android.
  • the application layer and application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and application framework layer into binary files.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • System libraries can include multiple functional modules. For example: surface manager (surface manager), media libraries (Media Libraries), 3D graphics processing libraries (for example: OpenGL ES), 2D graphics engines (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • the following uses the user interface shown in FIGS. 2A to 2G as a specific example to describe in detail the interaction process between the modules in the software structure of the electronic device 100 .
  • Figures 2A-2G show the user interfaces involved when the electronic device 100, in the sharing mode, detects a drag operation on an interface element and shares the content corresponding to the interface element from one application to another application.
  • FIG. 6 is an interaction flow chart between internal modules in the software structure of the electronic device 100 provided by the embodiment of the present application.
  • the content sharing method provided by the embodiment of the present application involves application A, application B, system window frame, system view frame, and system drag service in the software structure of the electronic device 100.
  • For the system window frame, system view frame, and system drag service, please refer to the relevant content in Figure 5 above; they will not be described again here.
  • the interactions between modules in the software structure may include:
  • Application A detects the startup operation.
  • the electronic device 100 may detect a startup operation acting on application A, and the startup operation may be used to trigger the startup of application A.
  • the startup operation may be a click operation on the icon of application A.
  • the application A may refer to the entertainment application mentioned in the relevant content of Figure 2A.
  • Application A starts the display of the window of application A.
  • the electronic device 100 may display the window of application A.
  • the window may include one or more interface elements, and the one or more interface elements may be arranged, combined, or stacked up and down to form a user interface displayed on the display screen of the electronic device 100 .
  • a user interface may include one or more windows.
  • Interface elements can be divided into controls and layouts, where layout is a special type of control.
  • layout can contain other layouts or controls, but a control cannot contain other layouts or controls.
  • the user interface of application A may refer to the user interface 10 shown in FIG. 2A.
  • Figure 7 is a schematic diagram of a tree structure of windows, controls and layout provided by the embodiment of the present application.
  • the window is the root of all displayed content, and a View tree can be constructed in the window.
  • the View tree describes the overlay and arrangement relationships of the controls and layouts contained in the user interface.
  • layout 1 is located in the window.
  • Layout 1 can include layout 21, layout 22, layout 23, etc.
  • Layout 21 includes control 31, layout 22 includes control 32, and layout 23 includes control 33.
  • FIG. 8 is a schematic diagram of part of the layout and controls in the user interface 10 provided by the embodiment of the present application. As can be seen from Figure 8, control 11 and control 12 are located in layout 1.
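  • The window/layout/control tree described above (Figures 7 and 8) can be sketched as a small self-contained Java example. The class names and tree contents below are illustrative only and do not correspond to real Android framework classes:

```java
import java.util.ArrayList;
import java.util.List;

public class ViewTreeDemo {
    // A plain control is a leaf; a layout is a special control that may hold children.
    static class Control {
        final String name;
        Control(String name) { this.name = name; }
    }

    static class Layout extends Control {
        final List<Control> children = new ArrayList<>();
        Layout(String name) { super(name); }
        Layout add(Control c) { children.add(c); return this; }
    }

    // Depth-first walk of the View tree, collecting element names in order.
    static void collect(Control c, List<String> out) {
        out.add(c.name);
        if (c instanceof Layout) {
            for (Control child : ((Layout) c).children) {
                collect(child, out);
            }
        }
    }

    // Build the tree of Figure 7: root layout 1 holds layouts 21-23,
    // which hold controls 31-33 respectively.
    public static List<String> buildAndTraverse() {
        Layout layout1 = new Layout("layout1");
        Layout layout21 = new Layout("layout21");
        Layout layout22 = new Layout("layout22");
        Layout layout23 = new Layout("layout23");
        layout21.add(new Control("control31"));
        layout22.add(new Control("control32"));
        layout23.add(new Control("control33"));
        layout1.add(layout21).add(layout22).add(layout23);

        List<String> names = new ArrayList<>();
        collect(layout1, names);
        return names;
    }
}
```

  • The same depth-first walk is what a view framework can use to visit every interface element of a window, as in the traversal steps described later.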
  • Application A detects the operation of entering the sharing mode.
  • When the electronic device 100 displays the user interface provided by application A, it can detect an operation acting on the user interface, and the operation can be used to trigger the electronic device 100 to enter the sharing mode. In the sharing mode, the electronic device 100 can change the user interface currently displayed on the electronic device 100 and change the response behavior of the interface elements in the currently displayed user interface. For details, please refer to subsequent steps S106 and S107.
  • the operation may be a quick three-click operation as shown in FIG. 2A.
  • the sharing mode may also be called a drag-and-drop mode, and the embodiment of the present application does not limit this name.
  • Application A sends the operation instruction information to the system window frame.
  • the electronic device 100 sends the instruction information of the operation to the system window frame through application A.
  • the system window frame sends a rendering request to the system view frame according to the instruction information.
  • the electronic device 100 may send the rendering request to the system view frame through the system window frame.
  • This rendering request is used to trigger changes to the current display content, that is, to use sharing mode rendering for the currently displayed user interface.
  • the change in the display content may be used to prompt the user that the electronic device 100 has entered the sharing mode.
  • the system view framework uses rendering in shared mode for the window of application A.
  • the electronic device 100 may use the rendering in the sharing mode for the currently displayed window through the system view frame, that is, display the window in the sharing mode.
  • the window in the sharing mode may refer to displaying a border around each interface element in the currently displayed window.
  • the border may refer to the border around each interface element as shown in Figure 2B.
  • rendering in sharing mode is not limited to displaying borders around each interface element, but can also change the display effect of interface elements, display text prompt information, etc.
  • For details, please refer to the content of step S204 shown in Figure 9; it will not be described here.
  • the current display content may not be changed.
  • steps S105-S106 are optional steps.
  • the system view framework changes or creates the response behavior of the interface elements in the window of application A.
  • the electronic device 100 can change or create the response behavior of the interface elements in the window of application A through the system view framework. After the response behavior of the interface element is changed or created, the electronic device 100 can trigger sharing of the interface element in response to a drag operation acting on the interface element.
  • the system view framework can adjust the response behavior of the interface element to a specified operation.
  • the specified operation may refer to a drag and drop operation
  • the response behavior may refer to sharing of the interface element.
  • the electronic device 100 may detect a drag operation on an interface element in the window of application A.
  • the drag operation may be a touch operation acting on the display screen.
  • the drag operation may refer to a continuous drag operation of the user's finger acting on the display screen as shown in FIGS. 2D to 2F .
  • steps S108-S117 take the sharing operation as a drag and drop operation as an example to describe part of the process of content sharing between applications.
  • The sharing operation is not limited to a drag-and-drop operation; it can also be a click operation, a long-press-and-drag operation, etc. Specifically, the form of sharing operation that application A detects and that triggers the sharing of content is consistent with the specified operation adjusted by the system view frame in step S107.
  • Application A generates touch events based on the drag operation.
  • the electronic device 100 can generate a touch event through application A according to the drag operation.
  • the input event generated by application A based on the drag operation is a touch event.
  • when the drag operation is an operation triggered by the user through a mouse, the input event generated by application A based on the drag operation is a mouse event.
  • the embodiment of the present application does not limit the type of the input event.
  • The input event can include information such as event type, coordinates, and time.
  • The event types may include down events, move events, and up events.
  • The down event represents the beginning of a user gesture, the up event represents the end of a user gesture, and the move event represents the process of a user gesture.
  • The input event triggered by a user gesture can include a down event, multiple move events, and an up event.
  • the event type in the input event indicates whether the user's operation is a drag operation, a click operation, a long press and drag operation, etc.
  • the coordinates refer to the position on the display where the sharing operation occurs
  • the time refers to the time when the user triggers the sharing operation.
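  • The event stream described above (one down event, several move events, one up event, each carrying coordinates and a time) can be illustrated with a minimal sketch. The types and the 10-pixel threshold below are hypothetical assumptions, not the real Android MotionEvent API:

```java
public class TouchEventDemo {
    enum Type { DOWN, MOVE, UP }

    // One entry in the event stream: type, screen coordinates, and timestamp.
    static class TouchEvent {
        final Type type;
        final float x, y;
        final long timeMs;
        TouchEvent(Type type, float x, float y, long timeMs) {
            this.type = type; this.x = x; this.y = y; this.timeMs = timeMs;
        }
    }

    // Classify a complete gesture (down ... move ... up): if the pointer moved
    // farther than a small threshold between down and up, call it a drag.
    static String classify(TouchEvent[] gesture) {
        TouchEvent down = gesture[0];
        TouchEvent up = gesture[gesture.length - 1];
        double dist = Math.hypot(up.x - down.x, up.y - down.y);
        return dist > 10.0 ? "drag" : "click";
    }

    // A gesture that slides a long way down the screen.
    static TouchEvent[] sampleDrag() {
        return new TouchEvent[] {
            new TouchEvent(Type.DOWN, 100, 400, 0),
            new TouchEvent(Type.MOVE, 150, 600, 40),
            new TouchEvent(Type.MOVE, 180, 900, 80),
            new TouchEvent(Type.UP, 180, 950, 120),
        };
    }

    // A gesture that barely moves between down and up.
    static TouchEvent[] sampleClick() {
        return new TouchEvent[] {
            new TouchEvent(Type.DOWN, 100, 400, 0),
            new TouchEvent(Type.UP, 101, 402, 60),
        };
    }
}
```

  • In this model, the event type sequence and the distance travelled together distinguish a drag operation from a click operation, matching how the event type in the input event indicates the user's operation.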
  • Application A sends the touch event to the system window frame.
  • the electronic device 100 can send touch events to the system window frame through application A.
  • the system window frame sends the touch event to the system view frame.
  • the electronic device 100 may send touch events to the system view frame through the system window frame.
  • the system view framework triggers a screenshot of the target interface element based on the touch event, and determines the screenshot as the content displayed during the dragging process.
  • the target interface element is the interface element that the drag operation acts on when the electronic device 100 starts to detect the drag operation.
  • the target interface element may refer to the interface element pointed by the touch point when the user's finger starts to touch the display screen when the user initiates the drag operation.
  • the electronic device 100 can trigger a screenshot of the target interface element according to the touch event through the system view framework, and determine the screenshot as the content displayed during the dragging process.
  • The system view framework can find the target interface element of the drag operation by traversing the interface elements in the current window of application A based on the coordinate information corresponding to the down event contained in the touch event, and, according to the response behavior of the target interface element adjusted in step S107, determine whether the operation currently acting on the target interface element is the specified operation; if so, it triggers execution of the adjusted response behavior.
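  • The traversal described above can be sketched as a simple recursive hit test: walk the element tree and return the deepest element whose bounds contain the down event's coordinates. All names and bounds below are illustrative assumptions:

```java
import java.util.ArrayList;
import java.util.List;

public class HitTestDemo {
    static class Element {
        final String name;
        final int left, top, right, bottom;   // bounds in window coordinates
        final List<Element> children = new ArrayList<>();
        Element(String name, int l, int t, int r, int b) {
            this.name = name; left = l; top = t; right = r; bottom = b;
        }
        boolean contains(int x, int y) {
            return x >= left && x < right && y >= top && y < bottom;
        }
    }

    // Return the deepest element under (x, y), or null if the point misses root.
    static Element findTarget(Element root, int x, int y) {
        if (!root.contains(x, y)) return null;
        for (Element child : root.children) {
            Element hit = findTarget(child, x, y);
            if (hit != null) return hit;   // the deepest matching child wins
        }
        return root;
    }

    // A window holding two picture controls side by side, loosely modelled
    // on the picture controls 103A and 103C of the figures.
    static Element sampleWindow() {
        Element window = new Element("window", 0, 0, 1080, 2340);
        window.children.add(new Element("picture103A", 40, 800, 520, 1280));
        window.children.add(new Element("picture103C", 560, 800, 1040, 1280));
        return window;
    }
}
```

  • The element returned for the down event's coordinates plays the role of the target interface element; the framework can then check whether that element's adjusted response behavior matches the detected operation.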
  • the electronic device 100 may display a screenshot of the target interface element during the dragging process.
  • the target interface element may refer to the picture control 103A shown in FIG. 2C , which is the interface element pointed by the gesture 1 shown in FIG. 2C .
  • step S112 is an optional step.
  • the system view framework obtains the transmission content for the target interface element.
  • the electronic device 100 can obtain the content of the target interface element through the system view framework, and determine the content as the content transmitted during the sharing process. For example, when the content corresponding to the target interface element is a picture, the content shared by the electronic device 100 is the picture.
  • the system view framework can obtain the transmission content for the target interface element in the following two ways:
  • the system view framework can obtain the information carried by the target interface element itself through the external interface of the target interface element.
  • For example, for a Text control, developers can set the data corresponding to the Text control through the setstring interface of the Text control.
  • the control's configuration file indicates the control's properties, layout, size, location, and other information.
  • Developers can write the control's transmission content into the configuration file in advance to adapt to the sharing mode. In this way, when the transmission content of the control needs to be obtained, the transmission content corresponding to the control can be found in the control's configuration file.
  • For example, android:dragcontext="Henry" is code added by the developer in advance to the configuration file in order to adapt to this sharing mode.
  • Through the content added in advance, the system view framework can determine that the transmission content of the Text control is the text "Henry".
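  • The two lookup paths described above (data set at runtime through a control's external interface, and data pre-written by the developer into the control's configuration file) can be sketched as follows. The class, method, and attribute names are hypothetical, not actual Android APIs:

```java
import java.util.HashMap;
import java.util.Map;

public class DragContentDemo {
    static class TextControl {
        private String runtimeData;                                  // set via an interface at runtime
        private final Map<String, String> config = new HashMap<>(); // parsed configuration file

        // Method 1: the control's external interface carries the data directly.
        void setString(String data) { runtimeData = data; }

        // Method 2: the developer pre-writes an attribute into the configuration file.
        void setConfigAttribute(String key, String value) { config.put(key, value); }

        // Resolve the content to transmit when this control is shared:
        // prefer the runtime data, fall back to the pre-written attribute.
        String resolveDragContent() {
            if (runtimeData != null) return runtimeData;
            return config.get("dragcontext");   // may be null if never adapted
        }
    }
}
```

  • A control that was never adapted in either way would resolve to no transmission content, which is one reason a configuration-file fallback is useful for pre-adapting controls to the sharing mode.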
  • the transmission content is the content corresponding to the interface element, and the interface element is a form of presentation of the transmission content to users.
  • the content transmitted by the electronic device 100 may include: text, pictures, voices, tables, videos, files, etc.
  • the embodiment of the present application does not limit the transmitted content.
  • the system view framework sends the screenshot and transmission content of the target interface element to the system drag service.
  • the electronic device 100 can send the screenshot and transmission content of the target interface element to the system drag service through the system view framework.
  • Since step S112 is an optional step, the system view framework may also only send the transmission content to the system drag service.
  • the system drag service generates a drag window and displays screenshots in the drag window.
  • During the user's drag, the electronic device 100 can display the screenshot of the target interface element following the user's touch point on the display screen. Specifically, the electronic device 100 can generate a drag window through the system drag service and display the screenshot of the target interface element in the drag window; the system drag service can move the drag window synchronously with the position of the touch point of the user's drag operation on the display screen, so that the screenshot of the target interface element moves with the user's drag operation.
  • the screenshot displayed during the dragging process may be the screenshot image 103B.
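  • The drag-window behaviour described in step S115 can be sketched as follows: the drag window is simply re-positioned so that it tracks the latest touch point. This is an illustrative model with assumed names, not the actual system drag service implementation:

```java
public class DragWindowDemo {
    static class DragWindow {
        final int width, height;   // size of the screenshot shown in the window
        int x, y;                  // top-left corner in screen coordinates

        DragWindow(int width, int height) {
            this.width = width;
            this.height = height;
        }

        // Re-position the window so that its centre sits on the touch point,
        // called for every move event of the drag operation.
        void follow(int touchX, int touchY) {
            x = touchX - width / 2;
            y = touchY - height / 2;
        }
    }
}
```

  • Calling follow() for each move event keeps the screenshot visually attached to the user's finger for the whole drag.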
  • step S115 is an optional step.
  • the system drag service sends the transmission content to application B.
  • the electronic device 100 can send the transmission content to application B through the system drag service.
  • application B may be a preset specified application, such as a desktop, memo application, etc.
  • application B may be an application determined under preset rules, such as applications recently opened by the user, applications used most frequently by the user, etc.
  • application B may be the application selected by the user during the sharing operation. For example, when the sharing operation is a drag operation and the electronic device 100 detects that the user drags the target interface element to the application icon of application B, the system drag service can send the transmission content to application B.
  • the embodiment of this application does not limit application B.
  • Application B displays the transmission content or the identification of the transmission content in the window of application B.
  • the transmission content or the identification of the transmission content may be displayed in the window of application B.
  • the transmission content and the identification of the transmission content may also be referred to as second interface elements.
  • interface elements please refer to the above content.
  • application B can start the display of application B's window and directly display the picture or text in the window.
  • application B can start the display of the window of application B and display the playback interface of the video in the window.
  • the playback interface can display a frame of the video.
  • the playback interface can include playback controls that can be used to trigger the video to play.
  • application B can start the display of the window of application B and display the play icon of the voice in the window. The play icon can be used to trigger the play of the voice.
  • the identification of the transmission content is the presentation form of the transmission content, and the presentation form may be a preset icon or a screenshot of the target interface element, etc. This embodiment of the present application does not limit this.
  • the window of application B may refer to the user interface 20 as shown in FIG. 2G
  • the transmission content or the identification of the transmission content displayed in the user interface may refer to the picture 203A shown in FIG. 2G .
  • The window of application B can be started and displayed by application B before step S117; alternatively, after application B obtains the transmission content, application B can, in step S117, first trigger the display of the window of application B and then change the window of application B to display the transmission content or the identification of the transmission content.
  • In some embodiments, the system drag service can also send the transmission content to other devices, thereby realizing content sharing between devices.
  • the above steps S101-S117 can realize content sharing from application A to application B.
  • application A and application B in addition to being different applications, can also be the same application, and the embodiment of the present application does not limit this.
  • the electronic device 100 implements content sharing from application A to application B through the system window frame, system view frame, and system drag service in the application framework layer.
  • The system view framework is used to automatically adjust the response behavior of interface elements to drag operations and to determine the display content and transmission content during the drag process, thereby achieving a system-level content sharing effect.
  • Figure 9 is a schematic flowchart of a content sharing method provided by an embodiment of the present application.
  • the method includes:
  • the electronic device 100 displays the first window.
  • the first window may be a window of the first application.
  • the electronic device 100 may detect a user's operation on the application icon of the first application, for example, a click operation, and display the first window of the first application in response to the operation.
  • the displayed content in the first window may refer to the user interface 10 shown in FIG. 2A.
  • the first window may include one or more interface elements.
  • Interface elements refer to a series of elements in the user interface that meet user interaction requirements, including controls such as pictures, text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets, as well as combinations of these controls.
  • the electronic device 100 can detect the startup operation through application A, and start displaying the user interface of application A.
  • the first application may refer to application A
  • the displayed content in the first window may refer to the user interface of application A.
  • For details, please refer to the relevant descriptions of steps S101-S102 above.
  • the electronic device 100 detects the operation of entering the sharing mode on the first window.
  • the operation may refer to an operation acting on the first window.
  • the operation may refer to the operation shown in FIG. 2A, such as a quick three-click operation.
  • the operation may also refer to the operation of turning on the sharing mode in the drop-down menu.
  • this operation may also be called a first operation, and this operation is used to trigger the electronic device 100 to enter the sharing mode.
  • the electronic device 100 can adjust the response behavior of one or more interface elements in the first window, so that the electronic device 100 can detect the sharing operation acting on the interface element and trigger sharing of the interface element. For details, please refer to the description of subsequent step S203, which will not be discussed here.
  • the electronic device 100 can start rendering in the sharing mode for the first window, that is, change the display content, and visually remind the user that the sharing mode has been entered. For details, please refer to the description of subsequent step S204, which will not be discussed here.
  • the response behavior of the electronic device 100 to the same interface element is different before entering the sharing mode and after entering the sharing mode, or, before entering the sharing mode, the interface element has no response behavior.
  • This operation may refer to the operation mentioned in step S103 above.
  • The electronic device 100 may control only some interface elements in the first window to enter the sharing mode, or may not respond to the operation of entering the sharing mode at all. In this way, only some interface elements in the first window are allowed to support sharing, while sharing of other interface elements is prohibited. For applications that contain security- or privacy-sensitive information and do not want their in-application content shared, developers can thus adjust which content does and does not support sharing, ensuring the security and privacy of users.
  • the electronic device 100 can provide the following three levels of sharing prohibition:
  • the electronic device 100 prohibits the entire user interface from entering the sharing mode.
  • After the electronic device 100 detects the operation of entering the sharing mode on the first window, the electronic device 100 will not enter the sharing mode.
  • the electronic device 100 may prohibit a certain area in the entire user interface from entering the sharing mode, and the partial area may include multiple controls.
  • the electronic device 100 may only control the first area in the first window to enter the sharing mode. Then, after entering the sharing mode, the electronic device 100 only adjusts the response behavior of the interface elements in the first area of the first window, and only starts rendering in the sharing mode for the content of the first area in the first window.
  • the electronic device 100 can prohibit some controls in the user interface from entering the sharing mode.
  • The electronic device 100 can control only interface elements other than the first control to enter the sharing mode. Then, after entering the sharing mode, the electronic device 100 adjusts only the response behavior of the interface elements in the first window other than the first control, and starts the sharing-mode rendering only for the content in the first window other than the first control.
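The three prohibition levels above (whole window, a partial area, individual controls) can be modeled as a filter that decides which interface elements are allowed to enter the sharing mode. The sketch below is purely illustrative and not part of the claimed embodiment; all names (`SharePolicy`, `Element`, `shareable_elements`) are assumptions.

```python
# Hypothetical sketch of the three sharing-prohibition levels described above.

class Element:
    def __init__(self, name, region=None):
        self.name = name
        self.region = region  # the area of the window the element belongs to

class SharePolicy:
    def __init__(self, window_prohibited=False,
                 prohibited_regions=(), prohibited_controls=()):
        self.window_prohibited = window_prohibited           # level 1: whole window
        self.prohibited_regions = set(prohibited_regions)    # level 2: areas
        self.prohibited_controls = set(prohibited_controls)  # level 3: controls

def shareable_elements(elements, policy):
    """Return the interface elements allowed to enter the sharing mode."""
    if policy.window_prohibited:       # level 1: nothing enters sharing mode
        return []
    return [e for e in elements
            if e.region not in policy.prohibited_regions    # level 2
            and e.name not in policy.prohibited_controls]   # level 3

elems = [Element("avatar", "browse"), Element("password_box", "login"),
         Element("photo", "browse")]
policy = SharePolicy(prohibited_regions={"login"},
                     prohibited_controls={"avatar"})
print([e.name for e in shareable_elements(elems, policy)])  # ['photo']
```

A developer-supplied policy of this shape would let an application keep privacy-sensitive areas or controls out of the sharing mode while the rest of the window remains shareable.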
  • the embodiment of the present application does not limit the execution order of step S201 and step S202.
  • the electronic device 100 may first detect the operation of entering the sharing mode and then display the first window.
  • the electronic device 100 changes or creates the response behavior of the interface element in the first window.
  • the response behavior of the interface element refers to the response performed by the electronic device 100 after detecting a specified operation on the interface element.
  • The electronic device 100 changes or creates the response behavior of the interface elements in the first window, so that after entering the sharing mode, when the electronic device 100 detects a sharing operation acting on an interface element in the first window, the response is to trigger sharing of that interface element.
  • Changing the response behavior of an interface element means that, before entering the sharing mode, the electronic device 100 could detect an operation identical to the sharing operation acting on the interface element and perform some response; after entering the sharing mode, the electronic device 100 changes that response to triggering sharing of the interface element.
  • Creating the response behavior of an interface element means that, before entering the sharing mode, when the electronic device 100 detected an operation identical to the sharing operation on the interface element, it performed no response at all.
  • the electronic device 100 may determine the response of the interface element as triggering the sharing of the interface element.
  • the electronic device 100 can change or create the response behavior of the interface elements in the first window through the system view framework.
  • For details, please refer to the relevant description of step S107, which will not be repeated here.
  • the electronic device 100 may only change or create the response behavior of some interface elements in the first window.
  • This part of the interface elements can be interface elements set by the developer to allow entering the sharing mode.
  • The electronic device 100 can adjust the response behavior of M interface elements in the first window, so that when the electronic device 100 detects a sharing operation on a target interface element among the M interface elements, it can trigger sharing of that target interface element.
  • M ≥ 1, and M is a positive integer.
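The "change or create response behavior" step (S203) can be pictured as the system view framework swapping in a share-triggering handler for each of the M elements when the sharing mode is entered, and restoring the original handler on exit. This is only a schematic sketch; the class and method names are assumptions, not the actual framework API.

```python
class InterfaceElement:
    """Schematic interface element whose drag response can be rewritten."""
    def __init__(self, name, on_drag=None):
        self.name = name
        self.on_drag = on_drag   # original response behavior (may be absent)
        self._saved = None

    def enter_sharing_mode(self, share):
        # Save the original behavior, then change (or create) the response
        # so the same operation now triggers sharing of this element.
        self._saved = self.on_drag
        self.on_drag = lambda: share(self)

    def exit_sharing_mode(self):
        self.on_drag = self._saved   # restore the pre-sharing-mode behavior

shared = []
def trigger_share(element):
    shared.append(element.name)

# One element with an existing drag response, one with none at all.
pic = InterfaceElement("picture", on_drag=lambda: "scroll")
txt = InterfaceElement("text")

for e in (pic, txt):
    e.enter_sharing_mode(trigger_share)

pic.on_drag()   # in sharing mode, the drag now triggers sharing
txt.on_drag()
print(shared)   # ['picture', 'text']

pic.exit_sharing_mode()
print(pic.on_drag())  # scroll  (original behavior restored)
```

The `pic` element illustrates "changing" an existing response, while `txt` illustrates "creating" one for an element that previously had no response behavior.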
  • the electronic device 100 displays the first window in the sharing mode.
  • the electronic device 100 enters the sharing mode, starts rendering in the sharing mode for the first window, that is, displays the first window in the sharing mode.
  • the first window in this sharing mode is different from the first window before entering the sharing mode.
  • the electronic device 100 may display prompt information (eg, first prompt information) in the first window, where the prompt information is used to indicate that the electronic device 100 has entered the sharing mode.
  • the prompt information can be expressed as:
  • The electronic device 100 may add information to the first window, and the change in the first window's information before and after entering the sharing mode reminds the user that the sharing mode has been entered.
  • the prompt information may include a new border added to each interface element in the first window after entering the sharing mode.
  • The first menu bar 102 in the user interface 10 and the photo, follow, recommendation, local, and more options in the first menu bar 102; the user avatars, names, and published entertainment content in the browsing area 103, together with the sharing, comment, and like options for that content; and the second menu bar 104 and the homepage, discovery, message, and local options in the second menu bar 104: each of these interface elements is surrounded by a rectangular border.
  • After the electronic device 100 enters the sharing mode, it can traverse the interface elements in the first window, determine the size and position information of each interface element, determine the border size and position of each interface element based on that information, and display the border of each interface element in the first window. For example, assuming the electronic device 100 obtains the length L and width W of the first interface element and its position in the first window, the electronic device 100 can, after entering the sharing mode, display at that position a rectangular border with length L and width W.
  • The border can be used to remind the user that the electronic device 100 has entered the sharing mode. Furthermore, the border can also remind the user that the electronic device 100 can detect a sharing operation on the interface element surrounded by the border and trigger sharing of that interface element.
  • the electronic device 100 can determine the size and position information of each interface element in the following two ways:
  • the configuration file of the interface element can contain the size and position information of the interface element.
  • the electronic device 100 may obtain the size and position information of each interface element from the configuration file of each interface element in the process of traversing the interface elements in the first window.
  • The electronic device 100 needs to calculate the size and position information of each interface element based on the placement and arrangement of the interface elements in the first window, as well as the relative relationships between them (such as position relationships and size relationships). For example, suppose there are two interface elements, a first control and a second control; given that the length of the first control is X and that the second control is shorter than the first control by Y, the length of the second control is calculated as X - Y. For another example, if three controls are known to be arranged side by side in a window in a three-section layout, the sizes and positions of the three controls can be determined relative to the size and position of the window.
  • the way in which the electronic device 100 determines the size and position information of each interface element is not limited to the above two methods.
  • The electronic device 100 can combine the above two methods: for some interface elements, the size and position information can be obtained directly from the configuration file, while for other interface elements, the size and position information can be calculated from the layout of the interface elements. The embodiment of the present application does not limit this.
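The combined approach can be sketched as one traversal that reads bounds from an element's configuration when available (way 1) and otherwise derives them from the layout (way 2), here a three-section side-by-side layout splitting the window width. The data shapes and names are illustrative assumptions, not the framework's real structures.

```python
def element_bounds(elements, window_width):
    """Determine (x, y, w, h) for each interface element.

    Way 1: take the bounds directly from the element's configuration.
    Way 2: for elements without configured bounds, derive them from the
    layout, assuming they are arranged side by side across the window.
    """
    bounds = {}
    unconfigured = []
    for e in elements:
        if "bounds" in e:                       # way 1: from the config file
            bounds[e["name"]] = tuple(e["bounds"])
        else:
            unconfigured.append(e)
    if unconfigured:                            # way 2: relative to the window
        w = window_width // len(unconfigured)
        for i, e in enumerate(unconfigured):
            bounds[e["name"]] = (i * w, 0, w, 40)
    return bounds

elems = [{"name": "menu", "bounds": (0, 0, 300, 40)},
         {"name": "a"}, {"name": "b"}, {"name": "c"}]
print(element_bounds(elems, 300))
# "menu" comes from its config; "a", "b", "c" each get a 100-wide slot
# of the 300-wide window, and a border can then be drawn at each rect.
```

Each returned rectangle is exactly what step S204 needs to draw a border of length L and width W at the element's position.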
  • the first prompt information can be represented as graphics, icons, text, etc.
  • The electronic device 100 can display a text prompt such as "Sharing mode has been entered, please select the content to be shared" at the bottom of the first window.
  • the electronic device 100 displays a sharing icon in each interface element of the first window.
  • the electronic device 100 can change the display effect of each interface element in the first window after entering the sharing mode.
  • the display effect may include static effects on display such as position, size, color, transparency, shadow, saturation, brightness, etc., or may include dynamic effects such as dithering.
  • the electronic device 100 may reduce the saturation of each interface element in the first window.
  • the electronic device 100 may jitter and display each interface element in the first window.
  • The electronic device 100 can apply the sharing-mode rendering to the currently displayed first window through the system view framework.
  • For details, please refer to the relevant content of the aforementioned step S106, which will not be repeated here.
  • the electronic device 100 can display the first window in the sharing mode in one or both of the following ways:
  • the electronic device 100 only renders interface elements that support sharing in the sharing mode.
  • The electronic device 100 can display prompt information only in the areas of the interface elements that support sharing, for example, display a border only around the interface elements that support sharing; or, the electronic device 100 changes the display effect only of the interface elements that support sharing.
  • the interface elements in the first menu bar 102 and the second menu bar 104 do not include borders, and the interface elements in the browsing area 103 include borders.
  • the prompt information displayed in the first window can not only prompt the user that the current electronic device 100 has entered the sharing mode, but can also be used to highlight the interface elements in the first window that allow entering the sharing mode.
  • the electronic device 100 only displays interface elements that support sharing
  • the electronic device 100 may only display interface elements that support sharing, and stop displaying interface elements that do not support sharing, such as the third interface element.
  • the electronic device 100 only displays the content in the browsing area 103 .
  • step S204 is optional.
  • the electronic device 100 detects a sharing operation acting on the first interface element in the first window.
  • The electronic device 100 has set the response behavior of each interface element in the first window for the sharing operation.
  • When the electronic device 100 detects a sharing operation acting on the first interface element in the first window, the electronic device 100 can trigger sharing of the first interface element in response to the operation.
  • the sharing operation may refer to a drag and drop operation.
  • the first interface element may refer to the screenshot picture 103B as shown in Figure 2C
  • the sharing operation may refer to the drag and drop operation as shown in Figs. 2C to 2F.
  • the drag and drop operation may include the steps shown in Fig. 2C.
  • the sharing operation may also refer to the drag operation as shown in Figure 4A-4B.
  • the drag operation may include the drag operation from Gesture 1 to Gesture 2 as shown in Figure 4A, and the drag operation as shown in Figure 4B.
  • the drag operation may refer to a sliding operation from the position of the first interface element to a specified position.
  • the first interface element may include one or more interface elements.
  • the sharing operation can be used to trigger sharing of the multiple interface elements.
  • the sharing operation may also include selection operations and drag-and-drop operations on interface elements.
  • the first interface element may include a picture control 103A and a picture control 103C as shown in FIG. 3B.
  • the sharing operation may include a click operation on the picture control 103A as shown in Figure 3A, a click operation on the picture control 103C as shown in Figure 3B, and a dragging operation from Gesture 1 to Gesture 2 as shown in Figure 3D. operate.
  • the sharing operation can be one operation or a series of operations.
  • The sharing operation can be a drag operation, a click operation, or a long-press-and-drag operation; the embodiment of this application does not limit the sharing operation.
  • the electronic device 100 obtains the transmission content corresponding to the first interface element.
  • After the electronic device 100 detects the sharing operation on the first interface element, the electronic device 100 can obtain the transmission content for the first interface element.
  • The electronic device 100 can obtain the transmission content of the first interface element in two ways: 1) obtain it from the information carried by the first interface element itself; 2) obtain it from a pre-adapted configuration file.
  • For details of the two acquisition methods, please refer to the relevant content of step S113 in Figure 6, which will not be repeated here.
  • The electronic device 100 can obtain the transmission content for the first interface element through the system view framework.
  • For details, please refer to the relevant description of step S113, which will not be repeated here.
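The two acquisition paths of step S206 amount to a fallback: prefer the information the interface element itself carries, and otherwise consult a pre-adapted configuration file keyed by the element. A hedged sketch follows; the dictionary shapes and keys (`payload`, `id`) are hypothetical.

```python
def get_transmission_content(element, config):
    """Obtain the transmission content for an interface element.

    Way 1: from information the element itself carries (an attached payload).
    Way 2: otherwise, from a pre-adapted configuration file keyed by element id.
    Returns None when neither source provides content.
    """
    payload = element.get("payload")       # way 1: element's own information
    if payload is not None:
        return payload
    return config.get(element.get("id"))   # way 2: pre-adapted config file

config = {"pic_103A": {"type": "image", "uri": "content://photos/103A"}}

carries_own = {"id": "txt_1", "payload": {"type": "text", "data": "hello"}}
needs_config = {"id": "pic_103A"}
unknown = {"id": "btn_9"}

print(get_transmission_content(carries_own, config)["type"])  # text
print(get_transmission_content(needs_config, config)["uri"])  # content://photos/103A
print(get_transmission_content(unknown, config))              # None
```

An element with no payload and no config entry yields no transmission content, in which case sharing would simply not be triggered for it.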
  • the electronic device 100 triggers sharing of the transmission content.
  • the sharing of transmission content by the electronic device 100 may include the following two situations:
  • the electronic device 100 shares the transmission content between windows.
  • the electronic device 100 can share transmission content from one window to another window.
  • the electronic device 100 may display the second interface element corresponding to the first interface element in the second window in response to the sharing operation.
  • the second interface element includes: transmission content or an identification of the transmission content.
  • The identifier of the transmission content may be an icon or a screenshot of the first interface element. For a specific description of the identifier, please refer to the above-mentioned relevant content, which will not be repeated here.
  • the transmission content or the identification of the transmission content may refer to the picture 203A, or the transmission content or the identification of the transmission content may refer to the picture 203C and the picture 203D.
  • sharing between windows can be divided into the following two types:
  • the first window and the second window belong to the same application window.
  • the first window and the second window may display content of different pages of the application.
  • the first window and the second window belong to windows of different applications.
  • the first window is a window of a first application
  • the second window is a window of a second application.
  • the electronic device 100 can realize drag-and-drop sharing of content within an application and drag-and-drop sharing of content between applications, thereby improving the flexibility of content sharing.
  • When the electronic device 100 displays the second window, it may no longer display the first window. In this way, the electronic device 100 can complete window switching while completing content sharing.
  • the second window may refer to the user interface 20 shown in FIG. 2G.
  • When the electronic device 100 displays the second window, it still displays the first window on the same interface. In this way, when realizing content sharing between windows, the electronic device 100 can simultaneously display the content-sharing party and the content-receiving party.
  • the user interface may refer to the user interface 30 shown in (b) in FIG. 2F .
  • the first window is displayed in the area 301 of the user interface 20 and the second window is displayed in the area 302 .
  • the electronic device 100 shares the transmission content between devices.
  • the electronic device 100 can share the transmission content from the electronic device 100 to other devices (eg, a second device).
  • the other device may refer to a device that has established a connection relationship with the electronic device 100, or the other device and the electronic device 100 belong to the same account or device under the same group.
  • the sharing operation may be a drag operation.
  • The electronic device 100 may display a screenshot of the first interface element that moves according to the drag operation.
  • When the electronic device 100 detects a sharing operation acting on the first interface element, the electronic device 100 can take a screenshot of the first interface element to obtain a screenshot image, and then display the screenshot so that it moves following the movement trajectory of the user's sharing operation.
  • the screenshot image may be the screenshot image 103B as shown in FIGS. 2D-2E, or the screenshot image 103D as shown in FIG. 3D, or the screenshot image 103B as shown in FIG. 4B.
  • When the electronic device 100 detects a sharing operation, the electronic device 100 may trigger the display of icons of multiple applications or devices that can receive the transmission content.
  • After the electronic device 100 detects the user's operation of selecting a target application or target device icon, it shares the transmission content to the target application or target device. In this way, users can independently select the target application or target device to receive the transmitted content according to their own needs, which increases user operability.
  • After detecting that the drag operation has moved from the position of the first interface element to a designated position (such as the bottom of the display screen), or upon detecting a specified movement trajectory of the drag operation (such as moving downward), the electronic device 100 triggers display of the recipients of the transmission content.
  • the application list contains multiple application icons.
  • When the drag operation moves from the position of the first interface element to the position of one of the application icons in the application list, sharing of the transmission content corresponding to the first interface element to that application is triggered.
  • the application list may refer to the application list 105 as shown in FIG. 2D or 2E
  • the device icon may refer to the device list 106 as shown in FIG. 4A.
  • the recipients of the transmission content displayed by the electronic device 100 are not limited to the above-mentioned application list.
  • the electronic device 100 may also trigger the display of the recipient of the transmission content after detecting the operation of entering the sharing mode.
  • the recipient can also be a device list.
  • the device list may display icons of multiple devices.
  • When the first interface element is dragged to the position of one of the device icons in the device list, the electronic device 100 may trigger sharing of the transmission content corresponding to the first interface element to the device corresponding to that device icon.
  • the electronic device 100 may be called the first device, and the device corresponding to the device icon may be called the second device.
  • the recipient can also be a contact list.
  • the contact list may display icons of multiple contacts.
  • When the drag operation moves the first interface element to the icon of one of the contacts, the electronic device 100 may trigger sharing of the transmission content corresponding to the first interface element to the device used by that contact.
  • the electronic device 100 may be called the first device, and the device used by the contact may be called the second device.
  • The contact may refer to a phone contact pre-stored in the electronic device 100, a contact in a designated application (such as a WeChat application), a contact in an account group, or the like; the embodiment of the present application does not limit the contact, nor does it limit the expression form of the recipient.
  • the specified position may be the position of one of the application icons in the above-mentioned application list.
  • In this case, the electronic device 100 may trigger sharing of the transmission content between windows and display the transmission content or its identification in the window of that application. Alternatively, the designated position may be one of the device icons in the above-mentioned device list, or one of the contact icons; in that case, the electronic device 100 can trigger the transmission content corresponding to the first interface element to be sent to the other device.
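Resolving where a drag ends, whether on an application icon, a device icon, or a contact icon, can be sketched as a hit test of the drop position against the displayed recipient icons. The geometry, recipient names, and field names below are illustrative assumptions only.

```python
def resolve_drop_target(drop_pos, recipients):
    """Hit-test the drop position against recipient icons.

    recipients: list of dicts with 'kind' ('app' | 'device' | 'contact'),
    'name', and 'rect' = (x, y, w, h). Returns the hit recipient, or None,
    in which case no sharing is triggered.
    """
    x, y = drop_pos
    for r in recipients:
        rx, ry, rw, rh = r["rect"]
        if rx <= x < rx + rw and ry <= y < ry + rh:
            return r
    return None

recipients = [
    {"kind": "app",     "name": "Email",   "rect": (0, 600, 80, 80)},
    {"kind": "device",  "name": "MatePad", "rect": (80, 600, 80, 80)},
    {"kind": "contact", "name": "Alice",   "rect": (160, 600, 80, 80)},
]

hit = resolve_drop_target((100, 640), recipients)
print(hit["kind"], hit["name"])                      # device MatePad
print(resolve_drop_target((100, 100), recipients))   # None
```

Depending on the hit recipient's kind, the device would then share the transmission content into an application window, to another device, or to the device used by a contact, as described above.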
  • the first interface element may include one or more interface elements.
  • the electronic device 100 can complete sharing of one or more interface elements through one sharing operation.
  • Before detecting the sharing operation for the multiple interface elements, the electronic device 100 can detect selection operations for the multiple interface elements. The selection operation may refer to the selection operation acting on the picture control 103A as shown in Figure 3A and the selection operation acting on the picture control 103C as shown in Figure 3B.
  • In this case, the first interface element includes the picture control 103A and the picture control 103C. It can be seen that, with the content sharing method provided by the embodiments of this application, users can quickly enter the sharing mode and quickly complete sharing of content between different applications or different devices in that mode, which expands the application scenarios of content sharing and facilitates user operation.
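Selecting several interface elements and then completing one drag, as described above, can be modeled as a selection set that a single sharing operation flushes. This is a schematic sketch with hypothetical names, not the embodiment's actual mechanism.

```python
class SharingSession:
    """Accumulates selected elements; one drag shares them all."""
    def __init__(self):
        self.selected = []

    def toggle_select(self, element):
        # A click in sharing mode selects (or deselects) an element.
        if element in self.selected:
            self.selected.remove(element)
        else:
            self.selected.append(element)

    def drag_to_share(self):
        # One sharing operation shares every selected element,
        # then clears the selection.
        shared, self.selected = self.selected, []
        return shared

s = SharingSession()
s.toggle_select("picture_103A")   # click picture control 103A (Fig. 3A)
s.toggle_select("picture_103C")   # click picture control 103C (Fig. 3B)
print(s.drag_to_share())          # ['picture_103A', 'picture_103C']
print(s.selected)                 # []
```

This matches the described flow in which the first interface element may comprise several elements, all shared by the one subsequent drag operation.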
  • the above steps S201-S207 can be executed by the system unit of the electronic device 100, which can be located at the framework layer of the electronic device 100.
  • Both the system unit and the application to which the first window belongs run on the electronic device 100.
  • The sharing mode can be defined as a system-level drag-and-drop sharing mode. In this way, any application under the system can respond to the first operation and enter the sharing mode, realizing sharing of application content, which expands the application scenarios of content sharing and improves users' experience of drag-and-drop sharing.
  • For the framework layer of the electronic device 100, please refer to the relevant content in Figure 5, which will not be repeated here.
  • each step in the above method embodiment can be completed by an integrated logic circuit of hardware in the processor or instructions in the form of software.
  • the method steps disclosed in conjunction with the embodiments of this application can be directly implemented by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • This application also provides an electronic device, which may include a memory and a processor.
  • the memory can be used to store computer programs; the processor can be used to call the computer program in the memory, so that the electronic device executes the method executed by the electronic device 100 in any of the above embodiments.
  • the present application also provides a chip system, which includes at least one processor for implementing the functions involved in the method performed by the electronic device 100 in any of the above embodiments.
  • the chip system further includes a memory, the memory is used to store program instructions and data, and the memory is located within the processor or outside the processor.
  • the chip system can be composed of chips or include chips and other discrete devices.
  • processors in the chip system there may be one or more processors in the chip system.
  • the processor can be implemented in hardware or software.
  • the processor may be a logic circuit, an integrated circuit, or the like.
  • the processor may be a general-purpose processor implemented by reading software code stored in memory.
  • the memory may be integrated with the processor or may be provided separately from the processor, which is not limited by the embodiments of the present application.
  • The memory may be a non-transitory memory, such as a read-only memory (ROM), which may be integrated with the processor on the same chip or separately provided on different chips.
  • The embodiments of this application do not specifically limit the type of memory or the arrangement of the memory and the processor.
  • The chip system can be a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), or a system on chip (SoC); it can also be a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a microcontroller unit (MCU), a programmable logic device (PLD), or another integrated chip.
  • the computer program product includes: a computer program (which can also be called a code, or an instruction).
  • When the computer program is run, it causes the computer to perform the method performed by the electronic device 100 in any of the above embodiments.
  • This application also provides a computer-readable storage medium that stores a computer program (which may also be called a code, or an instruction).
  • When the computer program is run, the computer is caused to perform the method performed by the electronic device 100 in any of the above embodiments.
  • the processor in the embodiment of the present application may be an integrated circuit chip with signal processing capabilities.
  • each step of the above method embodiment can be completed through an integrated logic circuit of hardware in the processor or instructions in the form of software.
  • The above-mentioned processor can be a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • a general-purpose processor may be a microprocessor or the processor may be any conventional processor, etc.
  • the steps of the method disclosed in conjunction with the embodiments of the present application can be directly implemented by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other mature storage media in this field.
  • the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
  • the embodiment of the present application also provides a device.
  • The device may specifically be a component or module, and may include one or more processors and memories connected to each other, where the memory is used to store a computer program; when the computer program is executed by the one or more processors, the device is caused to execute the methods in each of the above method embodiments.
  • the devices, computer-readable storage media, computer program products or chips provided by the embodiments of the present application are all used to execute the corresponding methods provided above. Therefore, the beneficial effects it can achieve can be referred to the beneficial effects in the corresponding methods provided above, and will not be described again here.
  • The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • When software is used for implementation, the embodiments may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • When the computer program instructions are loaded and executed on a computer, the processes or functions described in this application are generated in whole or in part.
  • The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device.
  • The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (such as coaxial cable, optical fiber, or digital subscriber line) or wireless (such as infrared, radio, or microwave) means.
  • The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or data center, that integrates one or more available media.
  • the available media may be magnetic media (eg, floppy disk, hard disk, magnetic tape), optical media (eg, DVD), or semiconductor media (eg, solid state disk (SSD)), etc.


Abstract

This application discloses a content sharing method, a graphical interface, and a related apparatus. In the method, after entering a sharing mode, an electronic device can change the response behavior of one or more interface elements in a window, so that the electronic device can detect a user's drag operation acting on an interface element and trigger sharing of that interface element. In this way, developers do not need to declare in advance which applications or interface elements support sharing, or the content transmitted during sharing; after the electronic device enters the sharing mode, the response behavior of interface elements is changed automatically to realize sharing of those elements, which reduces developers' workload and expands the application scenarios of content sharing.

Description

内容分享方法、图形界面及相关装置
本申请要求于2022年07月08日提交中国专利局、申请号为202210801011.4、申请名称为“内容分享方法、图形界面及相关装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,尤其涉及内容分享方法、图形界面及相关装置。
背景技术
随着互联网的发展和电子设备的普及,电子设备上的应用程序越来越多,用户经常需要使用到分享操作,将一个应用程序的内容分享到其他地方,例如用户通过拖动一个文件夹中的文件到另一个文件夹,实现将该文件从一个文件夹分享到另一个文件夹中。
但是在通过拖拽操作实现对内容的分享的应用场景中,只有部分指定的应用程序才能够支持该拖拽操作,将其中的内容分享到其他地方。这是由于,开发人员需要提前对应用程序进行适配,声明该应用程序中能够进行拖动的内容,以及拖动过程中显示和传输的内容,该应用程序才能够最终响应于用户的拖拽操作,对其中的内容进行分享。
可见,为了实现应用程序的内容分享,开发人员只能提前对该应用程序进行手动适配,这不仅加大了开发人员的工作量,也使得能够实现内容分享的应用变得局限。因此,如何更好的实现内容分享,减少开发人员的工作量,是目前亟待解决的问题。
发明内容
本申请提供了内容分享方法、图形界面及相关装置,在该方法中,开发人员无需提前对应用程序进行拖拽分享的手动适配,即可实现应用的内容分享,减少了开发人员的工作量。
第一方面,本申请提供了一种内容分享方法,该方法包括:第一设备显示第一窗口,第一窗口包括一个或多个界面元素;第一设备检测到作用于第一窗口的第一操作;在第一设备检测到第一操作之后,第一设备检测到作用于一个或多个界面元素中第一界面元素的拖拽操作;响应于拖拽操作,第一设备在第二窗口中显示第一界面元素对应的第二界面元素,或者,第一设备将第一界面元素的传输内容发送给第二设备;其中,第一设备检测到第一操作之前,作用于第一界面元素的拖拽操作不用于触发第一设备在第二窗口中显示第二界面元素或将传输内容发送给第二设备。
在上述内容分享方法中,提供了一种分享模式,电子设备可以在检测到第一操作之后,进入该分享模式,在该分享模式下,电子设备可以自动更改界面元素的响应行为,使得在该分享模式下,界面元素能够通过用户的拖拽操作从一个窗口分享到另一个窗口,或者将该界面元素的传输内容分享给其他设备。这样,减少了开发人员手动适配应用实现拖拽分享的工作量,扩展了内容分享的应用场景,提升了用户的体验感。
在一些实施例中,电子设备在显示第一窗口时,可以同时显示有第二窗口,这时,电子设备可以直接实现界面元素的传输内容在两个同时显示的窗口之间的分享。
在另一些实施例中,电子设备在显示第一窗口时,未显示第二窗口,在电子设备检测拖拽操作之后,再触发显示第二窗口。这样,电子设备在实现对内容的跨窗口分享的同时,也实现了窗口的切换。
结合第一方面,在一些实施方式中,第一设备在第二窗口中显示第一界面元素对应的第二界面元素,或者,第一设备将第一界面元素对应的传输内容发送给第二设备之前,方法还包括:第一设备显示跟随拖拽操作的移动轨迹进行移动的第一界面元素的截图。
也就是说,在用户发起拖拽操作,拖动界面元素的过程中,电子设备可以跟随用户手指的移动轨迹显示该界面元素的截图,提升拖拽分享的趣味性。
结合第一方面,在一些实施方式中,第一窗口和第二窗口属于同一个应用或不同应用。
电子设备可以实现不同应用间的内容分享,即将一个应用的内容拖拽分享到另一个应用中,或者,电子设备可以实现同一个应用内,不同页面间的内容分享,即将应用的一个页面中的内容拖拽分享到另一个页面中。
结合第一方面,在一些实施方式中,第二界面元素包括:传输内容或传输内容的标识。
结合第一方面,在一些实施方式中,标识为图标,或,第一界面元素的截图。
结合第一方面,在一些实施方式中,第一操作用于触发第一设备进入第一模式,第一设备检测到作用于第一窗口的第一操作之后,方法还包括:
第一设备在第一窗口中显示提示信息,提示信息用于指示第一设备已进入第一模式。
结合第一方面,在一些实施方式中,第一提示信息用于突出显示第一界面元素。
示例性地,该第一提示信息可以表现为更改第一界面元素的显示效果,增加额外的信息等等,例如,该显示效果可以包括:位置、大小、颜色、亮度、透明度、饱和度、阴影等等静态效果,以及界面元素抖动的动态效果等等,该额外的信息可以表现为界面元素的边框,界面元素右上角的图标等等。这样,用户可以通过该第一提示信息得知当前显示的窗口中能够进行拖拽的界面元素。
结合第一方面,在一些实施方式中,第一提示信息包括第一界面元素的边框。
结合第一方面,在一些实施方式中,第一设备检测到作用于第一窗口的第一操作之后,方法还包括:第一设备停止显示第一窗口中的第三界面元素。
电子设备可以仅控制部分界面元素进入分享模式,即仅部分界面元素支持拖拽分享,这时,在分享模式下,不支持拖拽分享的界面元素可以不在第一提示信息提示的范围内,即该界面元素不会存在动画效果或显示效果的更改,也不会增加额外的信息,或者,在分享模式下,电子设备可以不显示该不支持拖拽分享的界面元素。这样,可以避免部分存在安全隐私信息的应用,由于内容的分享而泄露用户的隐私信息,或者,避免部分不重要的界面元素对用户需要执行拖拽分享的界面元素造成干扰。
结合第一方面,在一些实施方式中,拖拽操作具体为:从第一界面元素所在的位置到指定位置的滑动操作。
结合第一方面,在一些实施方式中,第一窗口为第一应用的窗口,第二窗口为第二应用的窗口,该指定位置可以是指第二应用的图标所在的位置;或者,该指定位置可以是指第二设备的图标所在的位置,或,第一联系人的图标所在的位置,其中,第二设备为第一联系人使用的设备。
进一步地,该第二应用的图标可以显示在包括多个应用的图标的列表内,或者,该第二设备的图标可以显示在包含多个设备图标的列表内,或者,该第一联系人的图标可以显示在包含多个联系人图标的列表内。这样,用户可以根据自己的需求将传输内容分享给指定应用或指定设备或指定联系人,提高用户的可操作性。
结合第一方面,在一些实施方式中,第二窗口显示在第一用户界面中,第一用户界面中还包括第一窗口。
也就是说,电子设备在将传输内容分享到第二窗口之后,可以同时显示第一窗口和第二窗口。这样,在内容分享成功后,用户可以同时查看到内容分享方和内容接收方的显示内容。
结合第一方面,在一些实施方式中,第一界面元素包括一个或多个界面元素。
也就是说,电子设备可以通过一次拖拽操作实现对一个或多个界面元素的拖拽分享,当电子设备通过一次拖拽操作实现对多个界面元素的拖拽分享,可以快速实现用户对多个内容的分享,便捷用户的操作。
结合第一方面,在一些实施方式中,第一界面元素包括N个界面元素,N≥2,且N为正整数,第一设备检测到作用于一个或多个界面元素中第一界面元素的拖拽操作之前,方法还包括:第一设备检测到作用于N个界面元素的选择操作。
结合第一方面,在一些实施方式中,第一设备检测到作用于第一窗口的第一操作之后,方法还包括:第一设备更改或创建第一窗口中的M个界面元素的响应行为,使得第一设备能够响应于作用在M个界面元素中第一界面元素的拖拽操作,在第二窗口中显示标识或将传输内容发送给第二设备,其中,M≥1,且M为正整数。
也就是说,电子设备可以在分享模式下,自动更改界面元素的响应行为,避免了开发人员手动声明支持分享的应用或界面元素的麻烦,减少了开发人员的工作量。
结合第一方面,在一些实施方式中,第一设备检测到作用于一个或多个界面元素中第一界面元素的拖拽操作之后,方法还包括:
第一设备从第一界面元素的信息中获取传输内容。
也就是说,电子设备可以在分享模式下,自动根据界面元素的信息确定分享过程中所需的传输内容,避免了开发人员手动声明传输内容的麻烦,减少了开发人员的工作量。
结合第一方面,在一些实施方式中,方法由第一设备的系统单元执行,系统单元与第一窗口所属的应用是第一设备的不同模块。
在一些实施例中,该系统单元可以位于第一设备的框架层。也就是说,可以将该分享模式定义成系统层级的拖拽分享模式,这样,在该系统下的任意一个应用都可以响应于第一操作,进入分享模式,实现应用的内容分享,扩大了内容分享的应用场景,提升了用户对拖拽分享的体验感。
第二方面,本申请实施例提供了一种电子设备,其特征在于,包括存储器,一个或多个处理器,以及一个或多个程序;一个或多个处理器在执行一个或多个程序时,使得电子设备实现如第一方面或第一方面的任意一种实施方式所描述的方法。
第三方面,本申请实施例提供了一种计算机可读存储介质,包括指令,其特征在于,当指令在电子设备上运行时,使得电子设备实现如第一方面或第一方面的任意一种实施方式所描述的方法。
附图说明
图1为本申请实施例提供的电子设备100的硬件结构示意图;
图2A-图2G、图3A-图3F、图4A-图4D为本申请实施例提供的一些用户界面;
图5为本申请实施例提供的电子设备100的软件结构示意图;
图6为本申请实施例提供的电子设备100的软件结构中,内部各模块之间的交互流程图;
图7为本申请实施例提供的窗口、控件和布局的一种树形结构示意图;
图8为本申请实施例提供的用户界面10中的部分布局和控件的示意图;
图9为本申请实施例提供的内容分享方法的流程示意图。
具体实施方式
下面将结合附图对本申请实施例中的技术方案进行清楚、详尽地描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;文本中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况,另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为暗示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征,在本申请实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
本申请以下实施例中的术语“用户界面(user interface,UI)”,是应用程序或操作系统与用户之间进行交互和信息交换的介质接口,它实现信息的内部形式与用户可以接受形式之间的转换。用户界面是通过java、可扩展标记语言(extensible markup language,XML)等特定计算机语言编写的源代码,界面源代码在电子设备上经过解析,渲染,最终呈现为用户可以识别的内容。用户界面常用的表现形式是图形用户界面(graphic user interface,GUI),是指采用图形方式显示的与计算机操作相关的用户界面。它可以是在电子设备的显示屏中显示的文本、图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、Widget等可视的界面元素。
本申请实施例提供了一种内容分享方法,该方法包括:电子设备在第一窗口中显示一个或多个界面元素,在电子设备进入分享模式后,电子设备可以检测到用户作用于该一个或多个界面元素中目标界面元素的拖拽操作,根据该目标界面元素确定传输内容,将该传输内容分享到第二窗口中,在第二窗口中显示该传输内容或该传输内容对应的标识,或者,将该传输内容分享到其他设备中,从而实现对内容的分享。
其中,界面元素是指用户界面中满足用户交互要求的一系列元素,包括:图片、文本、图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、Widget等等控件或者这些控件的组合。
在进入分享模式后,电子设备可以更改或创建当前显示的一个或多个界面元素的响应行为,使得电子设备在进入分享模式后,可以检测到作用于该界面元素的分享操作,触发针对该界面元素的分享。相应的,在电子设备进入分享模式之前,电子设备针对该界面元素的同一操作不会触发任何行为,或者,该同一个操作可用于触发电子设备执行针对该界面元素的其他行为,该行为不同于分享该界面元素。例如,该界面元素为一张图片,在电子设备进入到分享模式之前,对该图片的点击操作可用于触发显示该图片的高清图,在进入到分享模式之后,对该图片的点击操作可用于触发分享该图片。这样,开发人员无需手动声明应用程序中能够进行拖动的内容,电子设备能够通过检测是否进入分享模式,来自动调整各界面元素的响应行为,使得电子设备能够检测到作用于界面元素的分享操作,实现对该界面元素的分享。
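上述"同一操作在进入分享模式前后触发不同响应行为"的机制,可以用如下示意性代码帮助理解(类名、方法名均为本文示例假设,并非本申请的实际实现):

```java
// 示意:同一点击操作在进入分享模式前后,触发的响应行为不同
public class ResponseBehaviorDemo {
    private boolean shareMode = false;

    // 进入或退出分享模式时,自动更改界面元素的响应行为
    public void setShareMode(boolean on) {
        this.shareMode = on;
    }

    // 分享模式前:点击查看高清图;分享模式后:点击触发分享
    public String onClick(String element) {
        if (shareMode) {
            return "share:" + element;
        }
        return "viewHd:" + element;
    }

    public static void main(String[] args) {
        ResponseBehaviorDemo demo = new ResponseBehaviorDemo();
        System.out.println(demo.onClick("picture"));  // viewHd:picture
        demo.setShareMode(true);
        System.out.println(demo.onClick("picture"));  // share:picture
    }
}
```

该示例仅体现"响应行为随模式切换而改变"这一点,开发人员无需为界面元素单独声明分享行为。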
在电子设备检测到作用于目标界面元素的分享操作后,可以自动根据该目标界面元素确定传输内容,并根据该传输内容实现针对该目标界面元素的分享。
其中,传输内容可以包括文本、图片、语音、表格、视频、文件等等内容。例如,当界面元素为文本控件时,传输内容可以为一段文字,该文字可以为文本控件中显示的文字,也可以包含除文本控件中显示的文字之外的其他文字。又例如,当界面元素为图片控件时,传输内容可以为该图片控件对应的一张高清图,其中,图片控件中显示的内容可以为该高清图的一部分内容,其清晰度也可以低于该高清图。又例如,当界面元素为文件图标时,传输内容可以为一个文件。可以看出,电子设备自动获取界面元素对应的内容作为传输内容,这样,开发人员无需手动声明分享过程中的传输内容,电子设备可以自动根据用户选中的界面元素,将该界面元素包含的内容作为传输内容。
另外,在电子设备将传输内容分享到第二窗口之后,电子设备可以在第二窗口中显示该传输内容或该传输内容对应的标识,该标识可以为该传输内容的展现形式,例如,该标识可以为目标界面元素的截图,也可以为预设的图标,本申请实施例对该标识不做限制。
可以看出,本申请实施例提供的内容分享方法,提供了一种分享模式,在该分享模式下,电子设备可以自动更改应用的界面元素的响应行为,实现应用对拖拽分享的自适配,任意应用的内容在该分享模式下都可以实现分享,开发人员无需提前声明支持分享的应用或界面元素,以及分享过程中的传输内容,也能够实现用户对应用的内容分享,减少了开发人员的工作量,扩展了内容分享的应用场景,提升了用户的体验感。
图1示出了电子设备100的硬件结构示意图。
电子设备100可以是手机、平板电脑、桌面型计算机、膝上型计算机、手持计算机、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本,以及蜂窝电话、个人数字助理(personal digital assistant,PDA)、增强现实(augmented reality,AR)设备、虚拟现实(virtual reality,VR)设备、人工智能(artificial intelligence,AI)设备、可穿戴式设备、车载设备、智能家居设备和/或智慧城市设备,本申请实施例对该电子设备的具体类型不作特殊限制。
电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
在一些实施例中,处理器110可用于在电子设备100进入分享模式后,更改或创建当前显示的一个或多个界面元素的响应行为,根据用户的操作,从一个或多个界面元素中找到目标界面元素,并根据该目标界面元素确定传输内容。具体关于更改或创建界面元素的响应行为,确定目标界面元素,确定传输内容的描述可以参见后续方法实施例,这里先不展开。
处理器110中还可以设置存储器,用于存储指令和数据。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,显示屏194,摄像头193,和无线通信模块160等供电。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号解调以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。
在一些实施例中,显示屏194可用于显示用户界面,包括第一用户界面、第一窗口、第二窗口等等与内容分享有关的用户界面,具体关于该用户界面的描述可以参见后续图2A-图2G、图3A-图3F、图4A-图4D的相关内容,这里先不展开。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。
摄像头193用于捕获静态图像或视频。
内部存储器121可以包括一个或多个随机存取存储器(random access memory,RAM)和一个或多个非易失性存储器(non-volatile memory,NVM)。
外部存储器接口120可以用于连接外部的非易失性存储器,实现扩展电子设备100的存储能力。外部的非易失性存储器通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部的非易失性存储器中。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。
耳机接口170D用于连接有线耳机。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。
气压传感器180C用于测量气压。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。电子设备100使用光电二极管检测来自附近物体的红外反射光。电子设备100可以利用接近光传感器180G检测用户手持电子设备100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。
触摸传感器180K,也称“触控器件”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。
在一些实施例中,触摸传感器180K可用于检测用户的分享操作,并将该用户操作传递给处理器110,以便处理器110触发电子设备100进入分享模式,或者,触发电子设备100分享界面元素等等。
骨传导传感器180M可以获取振动信号。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。
下面结合图2A-图2G、图3A-图3F、图4A-图4D介绍本申请实施例提供的一些用户界面。
图2A-图2G示出了电子设备100进行应用间的内容分享时,涉及到的一些用户界面。
图2A示例性示出了电子设备100启动娱乐类应用后,该娱乐类应用提供的用户界面10。该娱乐类应用可用于为用户提供社交、聊天、看视频、听音乐等等娱乐服务,例如,该娱乐类应用可以是指微博应用。
如图2A所示,用户界面10可包括:状态栏101、第一菜单栏102、浏览区域103、第二菜单栏104。其中:
状态栏101可包括移动通信信号的一个或多个信号强度指示符、无线高保真(wireless fidelity,Wi-Fi)信号的一个或多个信号强度指示符、电池状态指示符以及时间指示符。
第一菜单栏102可包括一个或多个选项,电子设备100可以检测到作用于该选项的操作,启动该选项对应的功能。示例性地,该第一菜单栏102可包括:拍照选项、关注选项、推荐选项、同城选项、更多选项。其中,拍照选项可用于启动拍照功能,关注选项可用于触发电子设备100在浏览区域103中显示用户关注的娱乐内容,推荐选项可用于触发电子设备100在浏览区域103中显示该娱乐类应用推荐的娱乐内容,同城选项可用于触发电子设备100在浏览区域103中显示用户所在城市的其他用户,在该娱乐类应用上发布的娱乐内容,更多选项可用于触发电子设备100显示其他更多的隐藏功能,例如发布文字、图片、视频等形式的娱乐内容等等。
浏览区域103可用于展示不同用户在该娱乐类应用上发布的娱乐内容。其中,针对一个用户发布的娱乐内容,浏览区域103可以显示用户的头像、名称、发布的娱乐内容,以及针对该娱乐内容的分享选项、评论选项、点赞选项等。如图2A所示,浏览区域103中可以包括图片控件103A,该图片控件103A显示为图片的缩略图,电子设备100可以检测作用于该图片控件103A的用户操作,例如点击操作,触发展示该图片控件103A对应的图片,该图片可以为一张高清大图,其显示的内容可以多于图片控件103A中的缩略图显示的内容,和/或,该图片的清晰度高于图片控件103A中的缩略图的清晰度。
第二菜单栏104可包括一个或多个选项,电子设备100可以检测到作用于该选项的操作,启动该选项对应的功能。示例性地,该第二菜单栏104可包括:首页选项、发现选项、消息选项、本地选项。这多个选项可用于触发电子设备100在用户界面10中显示该娱乐类应用提供的不同页面,其中,当首页选项、发现选项、消息选项和本地选项分别处于选中状态时,电子设备100在用户界面10中显示首页页面,发现页面、消息页面、本地页面。示例性地,图2A所示的用户界面10可以为首页选项处于选中状态时,电子设备100显示的内容。
示例性地,电子设备100可以在检测到用户作用于该娱乐类应用的应用图标的用户操作,例如点击操作后,触发显示如图2A所示的用户界面10。或者,电子设备100也可以通过检测到用户用于启动该娱乐类应用的语音指令,触发显示如图2A所示的用户界面10,本申请实施例对电子设备100显示如图2A所示的用户界面10的触发方式不作限制。
应理解,本申请以电子设备100启动娱乐类应用为例,来描述本申请实施例提供的内容分享方法,本申请实施例对电子设备100启动的应用不作限制,在本申请其他实施例中,电子设备100还可以显示音乐应用、聊天类应用、办公类应用等等应用程序提供的用户界面。另外,图2A所示的用户界面10只是示例性举例,娱乐类应用提供的用户界面可以包含更多或更少的控件,该用户界面10不构成对本申请实施例的限制。
如图2A所示,当电子设备100检测到作用于用户界面10的用户操作,例如快速三击操作,响应于该操作,电子设备100进入分享模式。
在分享模式下,电子设备100可以检测到作用于当前显示的内容的分享操作,触发对该分享操作作用的界面元素的分享,例如分享到其他应用、分享给其他设备等等。
可以理解的是,当分享操作为拖拽操作时,该分享模式还可以被称为拖拽模式,本申请实施例对该分享模式的名称不作限制。
本申请以下实施例中,在智能手机等电子设备的“分享模式”开启的条件下,当该电子设备检测到用户的分享操作时,电子设备可以触发针对该分享操作作用的界面元素的分享。“分享模式”可以是电子设备100提供的一种服务和功能,可以支持电子设备100切换应用,实现应用间的数据分享或设备间的数据分享。
可选地,在电子设备100进入分享模式后,电子设备100可以在当前显示的界面元素的周围显示边框,该边框可用于提示用户当前电子设备100进入了分享模式,电子设备100可以检测到作用于这些边框包围的界面元素的操作,触发对该界面元素的分享。具体的,电子设备100可以遍历当前显示内容中包含的界面元素,确定这些界面元素的尺寸(例如长度和高度)和位置信息,并根据该尺寸和位置信息在这些界面元素周围绘制边框。
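上述"遍历界面元素、依据其尺寸和位置信息绘制边框"的过程,可以用如下示意性代码帮助理解(Element等类型为示例假设;其中不支持分享的元素不生成边框):

```java
import java.util.ArrayList;
import java.util.List;

// 示意:遍历窗口中的界面元素,按其位置与尺寸计算需要绘制的边框矩形
public class BorderDemo {
    static class Element {
        final int left, top, width, height;
        final boolean shareable;   // 该元素是否支持分享
        Element(int left, int top, int width, int height, boolean shareable) {
            this.left = left; this.top = top;
            this.width = width; this.height = height;
            this.shareable = shareable;
        }
    }

    // 仅为支持分享的元素生成与其位置、尺寸一致的边框 {left, top, width, height}
    static List<int[]> computeBorders(List<Element> elements) {
        List<int[]> borders = new ArrayList<>();
        for (Element e : elements) {
            if (e.shareable) {
                borders.add(new int[]{e.left, e.top, e.width, e.height});
            }
        }
        return borders;
    }

    public static void main(String[] args) {
        List<Element> elements = new ArrayList<>();
        elements.add(new Element(0, 100, 360, 80, true));    // 例如某个图片控件
        elements.add(new Element(0, 0, 360, 40, false));     // 例如不支持分享的菜单栏
        System.out.println(computeBorders(elements).size()); // 1
    }
}
```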
示例性地,在电子设备100进入分享模式后,电子设备100可以显示如图2B所示的用户界面10。其中,图2B中的(a)、(b)、(c)分别示出了三种电子设备100进入分享模式后,可能显示的用户界面10。
如图2B中(a)所示,相比于图2A所示的用户界面10,图2B所示的用户界面10中显示的每一个界面元素的周围都显示有一个矩形框,其中,该界面元素可以包括第一菜单栏102以及第一菜单栏102中的拍照选项、关注选项、推荐选项、同城选项、更多选项,以及浏览区域103中的用户的头像、名称、发布的娱乐内容,以及针对该娱乐内容的分享选项、评论选项、点赞选项等,以及第二菜单栏104、第二菜单栏104中的首页选项、发现选项、消息选项、本地选项等等。
需要注意的是,在本申请实施例中,界面元素可以是指单独的一个控件,包括文本控件、图片控件、按钮控件、表格控件等等,例如图2B中(a)所示的图片控件103A,也可以是指多个控件的组合,例如图2B中(a)所示的第一菜单栏102。本申请实施例对此不做限制。
可以理解的是,这些界面元素周围的边框除了为图2B所示的矩形框之外,还可以为圆形框、菱形框、不规则框或与该界面元素的外形贴合的线框,本申请实施例对该边框的形状不作限制。另外,这些边框除了在电子设备100进入分享模式后,一直处于显示状态外,电子设备100还可以交替显示这些边框,或者间隔一段时间显示这些边框等等。例如,在电子设备100进入分享模式之后,电子设备100可以按照从上 到下的顺序,首先显示第一菜单栏102中的边框,再显示浏览区域103中的边框,最后显示第二菜单栏104中的边框,然后在重新按从上到下的顺序显示。本申请实施例对边框的显示规则或显示时间不作限制。
可选的,在电子设备100进入分享模式后,电子设备100除了在界面元素的周围显示边框外,还可以更改当前显示内容的显示效果,该显示效果包括:位置、大小、颜色、亮度、透明度、饱和度、阴影等等。这样,用户可以根据当前显示内容的显示效果的改变,感知到当前电子设备100已进入分享模式。
可选的,在电子设备100进入分享模式后,电子设备100还可以显示提示信息,该提示信息用于提示用户当前电子设备100已进入分享模式,例如,电子设备100可以在如图2B所示的用户界面10的底端显示提示信息:当前已进入分享模式,请选择需要分享的内容。
可以理解的是,本申请实施例对电子设备100进入分享模式后,当前显示内容的更改效果不作限制。
另外,在电子设备100进入分享模式后,针对某些不支持分享的界面元素,其周围可以不显示边框,或者,该界面元素直接隐藏。示例性地,假设图2A中的第一菜单栏102和第二菜单栏104中的界面元素不支持分享,电子设备100可以显示如图2B中(b)或(c)所示的用户界面10。
如图2B中(b)所示,该用户界面10中仍显示有第一菜单栏102、浏览区域103、第二菜单栏104。但是,相比于图2B中的(a),电子设备100仅在浏览区域103中显示有边框。这样,用户可以通过仅在浏览区域103中显示的边框,了解到仅浏览区域103中的内容支持分享,第一菜单栏102和第二菜单栏104中的内容不支持分享。
如图2B中(c)所示,该用户界面10中仅显示有浏览区域103。同样地,用户可以通过电子设备100进入分享模式后,显示内容从图2A到图2B中(c)的变化,了解到仅浏览区域103中的内容支持分享,第一菜单栏102和第二菜单栏104中的内容不支持分享。
另外,电子设备100在进入分享模式后,可以在检测到用户的指定操作(例如左滑操作)后,退出该分享模式。例如,电子设备100检测到作用于图2B中(c)所示的用户界面10的左滑操作时,电子设备100退出分享模式,显示如图2A所示的用户界面。
需要注意的是,当电子设备100进入分享模式后,电子设备100也可以更改当前显示的用户界面,例如返回该应用的上一级用户界面,或者,切换到其他应用的用户界面,且更改用户界面后,电子设备100仍然可以继续处于分享模式,电子设备100可以检测到作用于该更改后的用户界面中的界面元素的分享操作,触发针对该界面元素的分享。
如图2C所示,电子设备100可以检测到作用于如图2B中(a)所示的图片控件103A的分享操作,触发对图片控件103A的分享。其中,在电子设备100检测到作用于该图片控件103A的分享操作之后,电子设备100可以获取该图片控件103A对应的传输内容,以便电子设备100将该传输内容分享到其他应用。示例性地,该分享操作可以表现为图2C中的手势1到手势2,到图2D中的手势3,到图2E中的手势4,到图2F中的手势5的拖拽操作。在手势变化的过程中,电子设备100显示的内容可以不同。
可选地,响应于该分享操作,电子设备100可以通过对图片控件103A进行截图,获得一张与图片控件103A相同的截图图片103B。如图2C所示,该截图图片103B可以在用户触发分享操作后,跟随用户作用于显示屏上的触摸点进行移动。或者,跟随用户作用于显示屏上的触摸点移动的图片也可以为图片控件103A对应的图片,该图片的清晰度可以高于图片控件103A的截图,和/或,显示内容也可以多于图片控件103A的截图中的显示内容。本申请实施例对用户拖动过程中显示的内容不做限制。
可选地,如果电子设备100进入分享模式后,电子设备100显示的界面元素的周围显示有边框,那么在电子设备100开始检测到该分享操作时,电子设备100可以停止显示该边框。
如图2D所示,当用户作用于图片控件103A的分享操作,其在显示屏中的触摸点靠近显示屏的底端时,电子设备100可以在显示屏的底端显示如图2D所示的应用列表105,该应用列表105用于显示一个或多个应用图标。示例性地,应用列表105可包括:第一图标105A、第二图标105B以及第三图标105C。其中,第一图标105A可用于触发启动短信应用,第二图标105B可用于触发启动设置应用,第三图标105C可用于触发启动图库应用。这时,截图图片103B靠近用户界面10的底端。
需要注意的是,应用列表105中显示的应用图标所对应的应用可以为电子设备100使用频率高的应用,或者,电子设备100最近一段时间使用的应用,或者,电子设备100后台运行的应用等等,本申请实施例对应用列表105中显示的应用图标,与电子设备100的关联不作限制。
可以理解的是,本申请实施例不限制应用列表105显示的位置,例如,应用列表105可以显示在用户界面的左侧,也可以显示在用户界面的右侧。示例性地,电子设备100可以在检测到用户对截图图片103B的拖拽操作移动到用户界面的左侧后,触发在用户界面的左侧显示该应用列表105。另外,本申请实施例对电子设备100触发显示应用列表105的时机不作限制,例如,电子设备100可以在进入分享模式后,触发在该用户界面10中显示该应用列表105。
如图2E-图2F所示,当电子设备100检测到用户作用于图片控件103A的拖拽操作,作用在第一图标105A所在的范围时,示例性地,如图2E所示,第一图标105A与截图图片103B上下重叠时,电子设备100显示如图2F中(a)所示的用户界面20,该用户界面20为短信应用提供的用户界面,或电子设备100显示如图2F中(b)所示的用户界面30,该用户界面30同时包括娱乐类应用提供的用户界面10和短信应用提供的用户界面20,电子设备100可以以分屏显示的方式同时显示该用户界面10中的内容和用户界面20中的内容。其中,该用户界面20可以为电子设备100历史开启短信应用时,显示的用户界面,本申请实施例对应用间分享内容时,接收方应用显示的用户界面不做限制。
可选地,当用户的拖拽操作作用在第一图标105A所在的范围时,电子设备100可以更改该第一图标105A的显示效果,该显示效果可包括图标大小、图标颜色、图标位置等等。对比图2D和图2E可以看出,在用户的拖拽操作作用在第一图标105A所在的范围时,电子设备100可以放大第一图标105A的图标大小,从而通过该大小的变化,提示用户该图片控件103A对应的图片可以分享到第一图标105A对应的应用内。
可选地,电子设备100可以在用户的拖拽操作作用在第一图标105A所在的范围内,并保持一段时间,例如1s后,显示如图2F中(a)或(b)所示的用户界面。
如图2F中(a)所示,用户界面20可包括信息展示区域201、信息输入区域202。其中,信息展示区域201用于显示用户和其他人交流的信息,即电子设备100与其他设备通信的信息,信息输入区域202用于触发输入信息。
如图2F中(b)所示,用户界面30可包括区域301和区域302,其中,区域301用于显示娱乐类应用提供的用户界面,区域302用于显示短信应用提供的用户界面。这样,用户可以同时查看到两个应用提供的用户界面,用户可以同时查看到图片的来源方和图片的接受方,帮助用户更清晰的查看到图片的分享过程。
可以理解的是,电子设备100以分屏显示的方式同时显示两个应用提供的内容时,不限于上述提及的上下分屏方式,例如,电子设备100还可以以左右分屏,或者悬浮窗口的方式显示两个应用提供的内容。
如图2F中(a)-图2G所示,当用户将截图图片103B拖动到图2F中(a)所示的信息输入区域202时,电子设备100可以在用户界面20中显示如图2G所示的附件窗口203,该附件窗口203可用于显示用户待发送的文件、图片、语音等等。
如图2G所示,附件窗口203可包括:图片203A、删除图标203B。其中,图片203A可以为图2C所示的图片控件103A对应的图片。
也就是说,电子设备100检测到用户对图片控件103A的拖拽操作,可以触发电子设备100将图片控件103A对应的内容从娱乐类应用分享到短信应用中。
需要注意的是,用户的拖拽操作作用的界面元素与电子设备100分享的内容可以不相同,电子设备100显示的界面元素只是电子设备100分享的内容的展现形式。如图2C所示,电子设备100检测到作用于图片控件103A的拖拽操作时,电子设备100分享的内容为图片。
应理解,电子设备100显示的界面元素与其对应的内容并不存在必然的关联。例如,当电子设备100显示的界面元素可以为一张图片,其对应的内容可以为一段文字,又例如,当电子设备100显示的界面元素为一个词语时,其对应的内容可以为一段网址。
另外,当电子设备100检测到作用于信息输入区域202中的发送图标202A的用户操作,响应于该操作,电子设备100可以将图片203A发送给其他设备,并在信息展示区域201中显示该图片203A。
另外,当用户作用于图片控件103A的分享操作,其在显示屏中的触摸点靠近显示屏的底端时,电子设备100也可以不显示应用列表105,直接显示如图2F所示的用户界面,即直接切换到另外一个应用提供的用户界面,该应用可以为用户使用频率最高的应用,也可以为电子设备100最近使用的应用,也可以为电子设备100后台运行的应用,或者,也可以为电子设备100预先设置的用来显示用户分享的内容的应用,例如图片浏览应用等等。例如,电子设备100直接从图2C所示的用户界面10,切换到如图2F所示的用户界面。
应理解,本申请实施例对分享操作不作限制,该分享操作可以为上述图2C-图2F的拖拽操作,也可以为点击操作,例如当电子设备100检测到作用于如图2B中(a)所示的图片控件103A的点击操作,响应于该操作,电子设备100可以在用户界面10的底端显示如图2D所示的应用列表105,当电子设备100检测到作用于应用列表105中的第一图标105A的点击操作时,响应于该操作,电子设备100可以将该图片控件103A对应的内容分享到短信应用中,即显示如图2G所示的用户界面20。
图3A-图3F示出了电子设备100进行设备间多个内容的分享时,涉及到的一些用户界面。
图3A示出了电子设备100进入分享模式时,显示的示例性用户界面10,具体关于该用户界面的描述可以参见前述图2B中(a)的相关描述,这里不再赘述。
如图3A所示,当电子设备100检测到作用于图片控件103A的选中操作,例如,点击操作,响应于该操作,电子设备100可以选中该图片控件103A,将该图片控件103A确定为待分享的内容,并更改该图片控件103A的显示效果,该显示效果可以包括:颜色、大小、饱和度、透明度等等。示例性地,该显示效果可以参见图3B所示的图片控件103A。
如图3B所示,图片控件103A的背景颜色比图3A中的图片控件103A的背景颜色更深。
可选地,选中后的图片控件103A可以显示有动画效果,例如抖动效果。或者,电子设备100还可以产生震动反馈等等。
如图3B所示,用户界面10中还包括图片控件103C,当电子设备100检测到作用于图片控件103C的选中操作,例如,点击操作,响应于该操作,电子设备100可以选中该图片控件103C,将该图片控件103C确定为待分享的内容,并更改该图片控件103C的显示效果。
如图3C所示,图片控件103C的背景颜色比图3B中的图片控件103C的背景颜色更深。从图3C可以看出,图片控件103A和图片控件103C当前都处于选中状态。
在图片控件103A和图片控件103C都处于选中状态的情况下,如图3D所示,电子设备100可以检测到作用于图片控件103A或图片控件103C的分享操作,例如拖拽操作,该拖拽操作可以表现为如图3D所示的手势1到手势2的滑动过程。另外,在手势的滑动过程中,电子设备100可以显示截图图片103D,该截图图片103D可以跟随手势的滑动轨迹进行移动,该截图图片103D可以由图片控件103A和图片控件103C的截图组合或叠加得到。
在电子设备100检测到用户的分享操作,其手势滑动到用户界面的底端时,例如检测到用户的分享操作中的手势2时,电子设备100可以切换到另一个应用,显示如图3E所示的用户界面20。示例性地,该用户界面20可以为短信应用提供的用户界面,具体关于该用户界面20的描述可以参见前述图2F的相关描述,这里不再赘述。
如图3E-图3F所示,当用户将截图图片103D拖动到图3E所示的信息输入区域202时,电子设备100可以在用户界面20中显示如图3F所示的附件窗口203,该附件窗口203用于显示用户待发送的文件、图片、语音等等。
如图3F所示,附件窗口203可包括:图片203C、图片203D。其中,图片203C可以为图3C中选中的图片控件103A对应的图片,图片203D可以为图3D中选中的图片控件103C对应的图片。
从图3A-图3F可以看出,电子设备100在进入分享模式后,可以允许用户选择多个界面元素,同时对这多个界面元素进行分享。这样,当用户存在多个内容需要分享时,可以通过选中多个内容,一次性完成这些内容的分享,提高分享效率,同时也在内容分享过程中便捷了用户的操作。
图4A-图4D示出了电子设备100进行设备间的内容分享时,涉及到的一些用户界面。
当电子设备100检测到作用于如图2B中(a)所示的图片控件103A的拖拽操作,响应于该操作,电子设备100可以生成并显示截图图片103B,该截图图片103B跟随用户的拖拽操作进行移动。当用户的拖拽操作在屏幕中的触摸点靠近用户界面的底端时,电子设备100可以在用户界面的底端显示如图4A所示的设备列表106,该设备列表106用于显示一个或多个设备图标。示例性地,设备列表106可包括:第一图标106A、第二图标106B、第三图标106C。其中,第一图标106A可用于触发将图片控件103A对应的图片发送给设备1,第二图标106B可用于触发将图片控件103A对应的图片发送给设备2,第三图标106C可用于触发将图片控件103A对应的图片发送给设备3。
需要注意的是,设备列表106中显示的设备图标所对应的设备可以为电子设备100建立有连接关系(例如有线连接关系或无线连接关系)的设备,或者,与电子设备100同属于一个账号或一个账号群组下的设备等等,本申请实施例对设备列表106中显示的设备图标,与电子设备100的关联不作限制。
可以理解的是,本申请实施例不限制设备列表106显示的位置,例如,设备列表106可以显示在用户界面的左侧,也可以显示在用户界面的右侧。具体关于设备列表106的显示位置的描述可以参考前述关于应用列表105的显示位置的相关描述,这里不再赘述。
如图4B-图4C所示,当电子设备100检测到用户作用于图片控件103A的拖拽操作,作用在第二图标106B所在的范围时,即如图4B所示,第二图标106B与截图图片103B上下重叠时,电子设备100在用户界面10中显示如图4C所示的提示信息107,该提示信息107用于提示用户已将图片控件103A对应的图片发送给第二图标106B对应的设备2。
可选地,当用户的拖拽操作作用在第二图标106B所在的范围时,电子设备100可以更改该第二图标106B的显示效果,该显示效果可包括:图标大小、图标颜色、图标位置等等。对比图4A和图4B可以看出,在用户的拖拽操作作用在第二图标106B所在的范围时,电子设备100可以更改第二图标106B的图标颜色,从而通过该颜色的变化,提示用户该图片控件103A对应的图片可以分享给第二图标106B对应的设备2。
可选地,电子设备100可以在用户的拖拽操作作用在第二图标106B所在的范围内,并保持一段时间,例如1s后,将图片控件103A对应的图片分享给第二图标106B对应的设备2,并显示如图4C所示的提示信息107。
示例性地,在电子设备100将图片控件103A对应的图片发送给设备2之后,设备2可以显示如图4D所示的用户界面40。
如图4D所示,用户界面40可以为设备2显示的用于应用程序菜单的示例性用户界面。其中,用户界面40可包括窗口401。该窗口401可用于显示电子设备100发送的图片。该窗口401可包括取消选项401A、保存选项401B、复制选项401C、图片401D。取消选项401A可用于触发取消获取电子设备100发送的图片,保存选项401B可用于触发将电子设备100发送的图片保存到本地,复制选项401C可用于触发复制电子设备100发送的图片,在设备2复制该图片之后,设备2可以在显示其他输入窗口时,检测到用户的粘贴操作,将该图片粘贴在该输入窗口中,例如,设备2在显示备忘录时,检测到用户的长按操作,触发显示粘贴选项,在检测到用户对该粘贴选项的确认操作后,触发将复制的图片粘贴在该备忘录中。图片401D中显示有电子设备100发送的图片,该图片可以为图4A所示的图片控件103A对应的图片。
可以理解的是,不限于在上述提及的用户界面40中显示该窗口401,当设备2在显示应用提供的用户界面时,检测到电子设备100发送的图片,则设备2可以在该应用提供的用户界面中显示该窗口401。
从图2A-图2G、图3A-图3F、图4A-图4D可以看出,在分享模式下,电子设备100可以检测到用户作用于界面元素的分享操作,将该界面元素分享到其它应用或其他设备。另外,图2A-图2G、图4A-图4D仅示出了电子设备100对图片的分享过程,应理解,本申请实施例对分享的内容不做限制,例如,在分享模式下,电子设备100可以检测到用户作用于文字控件的分享操作,将该文字控件对应的内容(例如文字)分享到其他应用或其他设备。
电子设备可以是搭载iOS、Android、Microsoft或者其它操作系统的便携式终端设备,例如手机、平板电脑、可穿戴设备等,还可以是具有触敏表面或触控面板的膝上型计算机(Laptop)、具有触敏表面或触控面板的台式计算机等非便携式终端设备。电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本发明实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。
图5是本申请实施例的电子设备100的软件结构示意图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图5所示,应用程序包可以包括应用A和应用B等等应用程序,示例性地,应用A和应用B可以为相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图5所示,应用程序框架层可以包括系统窗口框架、系统视图框架、系统拖拽服务等。
系统窗口框架用于为应用程序提供窗口。
系统视图框架用于管理和显示视图,并管理控件的响应行为。电子设备100的显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
系统拖拽服务用于生成和管理拖拽窗口,并根据用户的分享操作(例如拖拽操作)更改窗口的显示位置。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
下面以图2A-图2G所示的用户界面为一个具体的例子,来详细描述电子设备100的软件结构中的各模块之间的交互过程。
其中,图2A-图2G示出了电子设备100在分享模式下,检测到作用于界面元素的拖拽操作时,将该界面元素对应的内容从一个应用分享到另一个应用时,涉及到的用户界面。
图6为本申请实施例提供的电子设备100的软件结构中,内部各模块之间的交互流程图。
如图6所示,本申请实施例提供的内容分享方法涉及到电子设备100的软件结构中的应用A、应用B、系统窗口框架、系统视图框架、系统拖拽服务。具体关于系统窗口框架、系统视图框架、系统拖拽服务的描述可以参见前述图5中的相关内容,这里不再赘述。
如图6所示,软件结构中的各模块之间的交互可包括:
阶段一(S101-S102):启动应用A
S101.应用A检测到启动操作。
电子设备100可以检测到作用于应用A的启动操作,该启动操作可用于触发启动应用A。例如,该启动操作可以为作用于应用A的图标的点击操作。示例性地,该应用A可以是指图2A的相关内容中提及的娱乐类应用。
可以理解的是,本申请实施例对应用A不作限制。
S102.应用A启动应用A的窗口的显示。
响应于该启动操作,电子设备100可以显示应用A的窗口。应理解,该窗口中可以包括一个或多个界面元素,这一个或多个界面元素通过排列组合,或上下叠加的方式构成用户界面显示在电子设备100的显示屏上。其中,一个用户界面中可以包括一个或多个窗口。界面元素可以分为控件和布局,其中布局是一种特殊的控件,一个布局中可以包含其他布局或控件,控件中不能有其他布局或控件。示例性地,该应用A的用户界面可以是指图2A所示的用户界面10。
图7为本申请实施例提供的窗口、控件和布局的一种树形结构示意图。
窗口是所有显示内容的根,可以在窗口中构建View树,View树描述了用户界面中包含的各控件和布局的叠加和排列关系。从图7可以看出,布局1位于窗口中,布局1中可包括布局21、布局22和布局23等等,布局1中包括控件31,布局22中包括控件32,布局23中包括控件33。
示例性地,图8为本申请实施例提供的用户界面10中的部分布局和控件的示意图。从图8可以看出,控件11和控件12位于布局1中。
阶段二(S103-S107):进入分享模式
S103.应用A检测到进入分享模式的操作。
电子设备100可以在显示应用A提供的用户界面时,检测到作用于该用户界面的操作,该操作可用于触发电子设备100进入分享模式。在该分享模式下,电子设备100可以更改电子设备100当前显示的用户界面,并更改当前显示的用户界面中的界面元素的响应行为。具体可参见后续步骤S106和步骤S107。示例性地,该操作可以为如图2A所示的快速三击操作。
应理解,当用户的分享操作为拖拽操作时,分享模式还可以被称为拖拽模式,本申请实施例对该名称不作限制。
S104.应用A将操作的指示信息发送给系统窗口框架。
响应于该操作,电子设备100通过应用A将该操作的指示信息发送给系统窗口框架。
S105.系统窗口框架根据指示信息向系统视图框架发送渲染请求。
电子设备100可以通过系统窗口框架向系统视图框架发送渲染请求。该渲染请求用于触发更改当前的显示内容,即对当前显示的用户界面使用分享模式下的渲染。其中,该显示内容的更改可用于提示用户电子设备100已进入分享模式。
S106.系统视图框架对应用A的窗口使用分享模式下的渲染。
电子设备100可以响应于该渲染请求,通过系统视图框架,对当前显示的窗口使用分享模式下的渲染,即显示分享模式下的该窗口。示例性地,该分享模式下的窗口可以是指在当前显示的该窗口中,在各界面元素的周围显示边框。该边框可以是指如图2B中所示的各界面元素周围的边框。
可以理解的是,分享模式下的渲染不仅限于在各界面元素的周围显示边框,还可以更改界面元素的显示效果,显示文字提示信息等等,具体可参见后续图9所示的步骤S204的相关内容,这里先不赘述。另外,也可以不更改当前的显示内容,此时,步骤S105-S106为可选的步骤。
S107.系统视图框架更改或创建应用A的窗口中的界面元素的响应行为。
电子设备100可以通过系统视图框架,更改或创建应用A的窗口中的界面元素的响应行为。在界面元素的响应行为更改或创建后,电子设备100能够响应于作用在该界面元素的拖拽操作,触发对该界面元素的分享。
具体地,系统视图框架可以调整界面元素对指定操作的响应行为,该指定操作可以是指拖拽操作,该响应行为可以是指对该界面元素的分享。
可以理解的是,该指定操作不限于拖拽操作,本申请实施例对此不作限制。
阶段三(S108-S117):内容分享
S108.应用A检测到拖拽操作。
在进入分享模式之后,电子设备100可以检测到作用于应用A的窗口中的界面元素的拖拽操作。该拖拽操作可以为作用于显示屏的触摸操作,示例性地,该拖拽操作可以是指如图2D-图2F所示的用户的手指作用于显示屏的连续拖拽操作。
可以理解的是,步骤S108-S117以分享操作为拖拽操作为例,描述了应用间的内容分享的部分过程,分享操作不限于拖拽操作,还可以为点击操作,长按并拖拽的操作等等,具体而言,应用A检测到何种形式的分享操作会触发内容的分享,与步骤S107中系统视图框架调整的指定操作一致。
S109.应用A根据拖拽操作生成触摸事件。
电子设备100可以根据该拖拽操作,通过应用A生成触摸事件。
可以理解的是,当拖拽操作为对显示屏的触摸操作时,应用A根据该拖拽操作生成的输入事件为触摸事件。在一些实施例中,当拖拽操作为用户通过鼠标触发的操作时,应用A根据该拖拽操作生成的输入事件为鼠标事件。本申请实施例对该输入事件的类型不作限制。
其中,输入事件中可以包括事件类型、坐标、时间等等信息。其中,事件类型可以包括down事件、move事件、up事件,down事件表示一次用户手势的开始,up事件表示一次用户手势的结束,move事件表示一次用户手势的过程。一次用户手势触发的输入事件可以包括一次down事件,多次move事件和一次up事件。其中,该输入事件中的事件类型指示了用户的操作具体为拖拽操作、点击操作还是长按并拖动的操作等等。坐标是指分享操作作用在显示屏上的位置,时间是指用户触发分享操作的时间。
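上述"一次手势对应一次down事件、多次move事件和一次up事件"的事件序列,可以用如下示意性代码帮助理解(类型与字段命名均为示例假设):

```java
import java.util.ArrayList;
import java.util.List;

// 示意:一次拖拽手势产生的输入事件序列:一次down、若干次move、一次up
public class InputEventDemo {
    enum Type { DOWN, MOVE, UP }

    static class Event {
        final Type type;
        final int x, y;      // 操作作用在显示屏上的坐标
        final long time;     // 用户触发操作的时间
        Event(Type type, int x, int y, long time) {
            this.type = type; this.x = x; this.y = y; this.time = time;
        }
    }

    // 校验事件序列是否构成一次完整手势:以down开始,以up结束,中间全为move
    static boolean isCompleteGesture(List<Event> events) {
        if (events.size() < 2) return false;
        if (events.get(0).type != Type.DOWN) return false;
        if (events.get(events.size() - 1).type != Type.UP) return false;
        for (int i = 1; i < events.size() - 1; i++) {
            if (events.get(i).type != Type.MOVE) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        List<Event> gesture = new ArrayList<>();
        gesture.add(new Event(Type.DOWN, 100, 500, 0));
        gesture.add(new Event(Type.MOVE, 100, 450, 16));
        gesture.add(new Event(Type.MOVE, 100, 400, 32));
        gesture.add(new Event(Type.UP, 100, 400, 48));
        System.out.println(isCompleteGesture(gesture)); // true
    }
}
```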
S110.应用A将触摸事件发送给系统窗口框架。
电子设备100可以通过应用A,将触摸事件发送给系统窗口框架。
S111.系统窗口框架将触摸事件发送给系统视图框架。
电子设备100可以通过系统窗口框架将触摸事件发送系统视图框架。
S112.系统视图框架根据触摸事件触发对目标界面元素的截图,并将该截图确定为拖拽过程中显示的内容。
目标界面元素为电子设备100开始检测到拖拽操作时,该拖拽操作作用的界面元素。例如,目标界面元素可以是指用户发起拖拽操作时,用户的手指开始触摸到显示屏,该触摸点指向的界面元素。
电子设备100可以通过系统视图框架,根据触摸事件触发对目标界面元素的截图,并将该截图确定为拖拽过程中显示的内容。其中,系统视图框架可以通过遍历当前应用A的窗口中的界面元素,根据该触摸事件中包含的down事件对应的坐标信息,找到该拖拽操作作用的目标界面元素,并根据步骤S107中,对该目标界面元素调整后的响应行为,判断当前作用于该目标界面元素的操作是否为指定操作,如果是,则触发执行调整后的响应。
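上述"根据down事件的坐标遍历查找目标界面元素"的过程,可以用如下示意性代码帮助理解(Element等类型为示例假设):

```java
import java.util.ArrayList;
import java.util.List;

// 示意:根据down事件的坐标,在窗口的界面元素中查找拖拽操作作用的目标界面元素
public class HitTestDemo {
    static class Element {
        final String name;
        final int left, top, width, height;
        Element(String name, int left, int top, int width, int height) {
            this.name = name;
            this.left = left; this.top = top;
            this.width = width; this.height = height;
        }
        boolean contains(int x, int y) {
            return x >= left && x < left + width && y >= top && y < top + height;
        }
    }

    // 遍历窗口中的界面元素,返回down事件坐标命中的元素;无命中则返回null
    static Element findTarget(List<Element> elements, int x, int y) {
        for (Element e : elements) {
            if (e.contains(x, y)) {
                return e;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        List<Element> elements = new ArrayList<>();
        elements.add(new Element("pictureControl", 20, 300, 150, 150));
        Element target = findTarget(elements, 80, 380);
        System.out.println(target == null ? "none" : target.name); // pictureControl
    }
}
```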
电子设备100可以在检测到用户作用于目标界面元素的拖拽操作后,在拖拽过程中显示该目标界面元素的截图。
示例性地,该目标界面元素可以是指如图2C所示的图片控件103A,即为图2C所示的手势1指向的界面元素。
可以理解的是,步骤S112为可选的步骤。
S113.系统视图框架获取针对该目标界面元素的传输内容。
电子设备100可以通过系统视图框架获取该目标界面元素的内容,将该内容确定为分享过程传输的内容。例如,当目标界面元素对应的内容为图片时,电子设备100分享的内容即为该图片。
其中,系统视图框架可以通过以下两种方式获取到针对该目标界面元素的传输内容:
1)从目标界面元素本身携带的信息中获取该传输内容
具体地,系统视图框架可以通过该目标界面元素对外的接口,获取到该目标界面元素本身携带的信息。
示例性地,对于Text控件,开发者可以通过该Text控件的setstring接口,设置该Text控件对应的数据。在获取该Text控件的传输内容时,可以通过getstring接口,获取到该Text控件对应的数据,并将该数据作为该Text控件的传输内容。
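方式1)中通过控件对外接口读取数据的过程,可以用如下示意性代码帮助理解(setString/getString对应上文所述的setstring/getstring接口,类名为示例假设):

```java
// 示意:通过控件对外提供的接口,读取控件携带的数据作为分享的传输内容
public class TextControlDemo {
    static class TextControl {
        private String data;

        // 开发者通过set接口设置控件对应的数据
        public void setString(String s) {
            this.data = s;
        }

        // 分享时通过get接口读取该数据
        public String getString() {
            return data;
        }
    }

    // 系统视图框架将控件携带的数据确定为传输内容
    static String resolveTransferContent(TextControl control) {
        return control.getString();
    }

    public static void main(String[] args) {
        TextControl control = new TextControl();
        control.setString("Henry");
        System.out.println(resolveTransferContent(control)); // Henry
    }
}
```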
2)从预先适配的配置文件中获取该传输内容
控件的配置文件指示了控件的属性、布局、大小、位置等等信息。开发者可以在配置控件的配置文件时,提前将该控件的传输内容写入该配置文件,从而提前适配该分享模式。这样,在需要获取该控件的传输内容时,可以从该控件的配置文件中查找该控件对应的传输内容。
如下示出了一个Text控件的配置文件的部分示例性代码:
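原文此处的示例代码在公开文本中缺失,以下为依据下文对android:dragcontext属性的说明示意性补出的片段(控件的其余属性均为示例假设):

```xml
<TextView
    android:id="@+id/text_henry"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Henry"
    android:dragcontext="Henry" />
```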
其中,android:dragcontext=“Henry”为开发者为了适配该分享模式,提前在配置文件中添加的代码。当需要获取该控件的传输内容时,系统视图框架可以通过该提前添加的内容,确定该Text控件的传输内容为文本“Henry”。
应理解,传输内容是界面元素对应的内容,而界面元素是传输内容面向用户的一种展现形式。电子设备100传输的内容可以包括:文本、图片、语音、表格、视频、文件等等内容,本申请实施例对该传输的内容不做限制。
S114.系统视图框架将目标界面元素的截图和传输内容发送给系统拖拽服务。
电子设备100可以通过系统视图框架将目标界面元素的截图和传输内容发送给系统拖拽服务。
可以理解的是,当步骤S112为可选的步骤时,系统视图框架可以仅将传输内容发送给系统拖拽服务。
S115.系统拖拽服务生成拖拽窗口,并在拖拽窗口中显示截图。
电子设备100可以在用户拖拽过程中,跟随用户作用在显示屏上的触摸点,显示该目标界面元素的截图。具体地,电子设备100可以通过系统拖拽服务生成一个拖拽窗口,并在该拖拽窗口显示该目标界面元素的截图,并且,系统拖拽服务可以根据用户的拖拽操作作用在显示屏上的触摸点的位置,同步移动该拖拽窗口,达到目标界面元素的截图跟随用户的拖拽操作移动的效果。示例性地,参见图2C-图2F,该拖拽过程中显示的截图可以为截图图片103B。
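步骤S115中拖拽窗口随触摸点同步移动的逻辑,可以用如下示意性代码帮助理解(类名与坐标计算方式均为示例假设):

```java
// 示意:系统拖拽服务根据触摸点位置同步移动拖拽窗口,使目标界面元素的截图跟随手指
public class DragWindowDemo {
    static class DragWindow {
        int x, y;                      // 拖拽窗口左上角当前位置
        final int offsetX, offsetY;    // 触摸点相对窗口左上角的固定偏移

        DragWindow(int offsetX, int offsetY) {
            this.offsetX = offsetX;
            this.offsetY = offsetY;
        }

        // 每收到一次move事件,按最新触摸点更新窗口位置
        void onTouchMove(int touchX, int touchY) {
            this.x = touchX - offsetX;
            this.y = touchY - offsetY;
        }
    }

    public static void main(String[] args) {
        DragWindow window = new DragWindow(20, 20);
        window.onTouchMove(120, 300);
        System.out.println(window.x + "," + window.y); // 100,280
    }
}
```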
可以理解的是,步骤S115为可选的步骤。
S116.系统拖拽服务将传输内容发送给应用B。
电子设备100可以通过系统拖拽服务将传输内容发送给应用B。
其中,应用B可以为预先设置的指定应用,例如桌面、备忘录应用等等。或者,应用B可以为预设规则下确定的应用,例如用户最近历史开启的应用,用户使用频率最高的应用等等。或者,应用B可以为用户的分享操作选择的应用,例如,当分享操作为拖拽操作,在电子设备100检测到用户作用于目标界面元素到应用B的应用图标的拖拽操作时,系统拖拽服务可以将传输内容发送给应用B。本申请实施例对应用B不作限制。
S117.应用B在应用B的窗口中显示传输内容或该传输内容的标识。
在电子设备100将传输内容发送给应用B之后,可以在应用B的该窗口中显示该传输内容或该传输内容的标识。在本申请实施例中,该传输内容和该传输内容的标识还可以被称为第二界面元素。具体关于界面元素的解释可以参见前述内容。
例如,当传输内容为图片或文字时,应用B可以启动应用B的窗口的显示,并直接在该窗口中显示该图片或文字。又例如,当传输内容为视频时,应用B可以启动应用B的窗口的显示,并在该窗口中显示该视频的播放界面,该播放界面中可以显示有该视频的一帧图像,该播放界面中可以包括播放控件,该播放控件可用于触发播放该视频。又例如,当传输内容为语音时,应用B可以启动应用B的窗口的显示,并在该窗口中显示该语音的播放图标,该播放图标可用于触发播放该语音。
应理解,该传输内容的标识为该传输内容的展现形式,其展现形式可以表现为预设的图标或者目标界面元素的截图等等,本申请实施例对此不做限制。
示例性地,该应用B的窗口可以是指如图2G所示的用户界面20,在该用户界面中显示的传输内容或传输内容的标识可以是指图2G所示的图片203A。
另外,需要注意的是,应用B的窗口可以在步骤S117之前由应用B启动显示,也可以在应用B获取到传输内容之后,在步骤S117中,由应用B先触发对应用B的窗口的显示,再在应用B的窗口中显示传输内容或该传输内容的标识。
可以理解的是,系统拖拽服务除了将传输内容发送给应用B之外,还可以发送给其他设备,从而实现设备间的内容分享。另外,上述步骤S101-S117能够实现应用A到应用B的内容分享,应理解,应用A和应用B除了为不同的应用之外,也可以为相同的应用,本申请实施例对此不作限制。从步骤S101-S117可以看出,电子设备100通过应用程序框架层中的系统窗口框架、系统视图框架以及系统拖拽服务来实现应用A到应用B的内容分享。其中,通过系统视图框架来自动调整界面元素对拖拽操作的响应行为,确定拖拽过程中的显示内容和传输内容,实现了系统层级的内容分享效果。这样,开发者无需单独对某一个应用进行适配,手动声明应用程序中能够进行拖动的内容,以及拖动过程中显示和传输的内容,即可实现应用间的内容分享,减少了开发者的工作量,扩大了内容分享的应用场景。
图9为本申请实施例提供的内容分享方法的流程示意图。
如图9所示,该方法包括:
S201.电子设备100显示第一窗口。
该第一窗口可以为第一应用的窗口。电子设备100可以检测到用户作用于第一应用的应用图标的操作,例如,点击操作,响应于该操作,显示第一应用的第一窗口。示例性地,该第一窗口中的显示内容可以是指图2A所示的用户界面10。
该第一窗口中可以包括一个或多个界面元素,界面元素是指用户界面中满足用户交互要求的一系列元素,包括:图片、文本、图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、Widget等等控件,以及这些控件的组合。
具体实现中,电子设备100可以通过应用A检测到启动操作,启动应用A的用户界面的显示。其中,第一应用可以是指应用A,第一窗口中的显示内容可以是指该应用A的用户界面,具体可以参见前述步骤S101-S102的相关描述。
S202.电子设备100检测到作用于第一窗口的进入分享模式的操作。
该操作可以是指作用于第一窗口的操作。示例性地,该操作可以是指图2A所示的操作,例如快速三击操作。或者,该操作还可以是指作用于下拉菜单中开启分享模式的操作,本申请实施例对该操作不作限制。在本申请实施例中,该操作还可以被称为第一操作,该操作用于触发电子设备100进入分享模式。
在进入分享模式后,电子设备100可以调整该第一窗口中的一个或多个界面元素的响应行为,使得电子设备100能够检测到作用于界面元素的分享操作,触发对该界面元素的分享。具体可以参见后续步骤S203的描述,这里先不展开。另外,在进入分享模式后,电子设备100可以对该第一窗口启动分享模式下的渲染,即更改显示内容,从可视化的角度提醒用户当前已经进入分享模式。具体可参见后续步骤S204的描述,这里先不展开。
换句话说,针对同一个操作,在进入分享模式之前和进入分享模式之后,电子设备100对同一个界面元素的响应行为不同,或者,在进入分享模式之前,该界面元素不存在响应行为。
该操作可以是指上述步骤S103提及的操作。具体可参见前述步骤S103的相关描述。
另外,进一步地,电子设备100可以仅控制该第一窗口中的部分界面元素进入分享模式,或者,电子设备100可以不响应于该进入分享模式的操作。这样,可以仅允许第一窗口中的部分界面元素支持分享,禁止另一部分的界面元素进行分享。这样,当部分存在安全隐私信息的应用不希望进行应用内的内容分享时,开发者可以自行调整支持和不支持分享的内容,为用户的安全隐私提供保障。
其中,电子设备100可以提供以下三种级别的禁止分享:
1)窗口级别
电子设备100禁止整个用户界面进入分享模式。
也就是说,在电子设备100检测到该作用于第一窗口的进入分享模式的操作后,电子设备100不会进入分享模式。
2)布局级别
电子设备100可以禁止整个用户界面中的部分区域进入分享模式,该部分区域中可以包括多个控件。
也就是说,在电子设备100检测到该作用于第一窗口的进入分享模式的操作后,电子设备100可以仅控制第一窗口中的第一区域进入分享模式。那么在进入分享模式之后,电子设备100仅调整该第一窗口的第一区域内的界面元素的响应行为,仅对该第一窗口中,第一区域的内容启动分享模式下的渲染。
3)控件级别
电子设备100可以禁止用户界面中的部分控件进入分享模式。
也就是说,电子设备100检测到该作用于第一窗口的进入分享模式的操作后,电子设备100可以仅控制第一控件之外的其他界面元素进入分享模式。那么在进入分享模式之后,电子设备100仅调整第一窗口中除第一控件之外的界面元素的响应行为,仅对该第一窗口中除第一控件之外的内容启动分享模式下的渲染。
示例性地,开发者在自定义应用中禁止分享的内容时,可以通过在上述三种级别对应的XML配置文件中设置“enablesystemdrag”属性来分别实现对不同级别的禁止分享。
如下示出了一个布局级别的配置文件的部分示例性代码:
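原文此处的示例代码在公开文本中缺失,以下为依据下文对android:enablesystemdrag属性的说明示意性补出的片段(布局的其余属性均为示例假设):

```xml
<LinearLayout
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:orientation="vertical"
    android:enablesystemdrag="false">
    <!-- 该布局及其包含的控件均不进入分享模式 -->
</LinearLayout>
```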

其中,android:enablesystemdrag=“false”为开发者为禁止该布局进入分享模式,在布局的配置文件中添加的代码。
可以理解的是,本申请实施例对步骤S201和步骤S202的先后执行顺序不作限制,例如,电子设备100可以先检测到进入分享模式的操作,再显示该第一窗口。
S203.电子设备100更改或创建该第一窗口中的界面元素的响应行为。
界面元素的响应行为是指电子设备100检测到作用于在该界面元素的指定操作后,所执行的响应。
电子设备100更改或创建该第一窗口中的界面元素的响应行为,是为了使得电子设备100在进入分享模式之后,检测到作用于第一窗口中的界面元素的分享操作时,电子设备100执行的响应为触发该界面元素的分享。
电子设备100更改界面元素的响应行为是指,在进入分享模式之前,电子设备100可以检测到作用于该界面元素,与该分享操作相同的操作,执行一定的响应,在进入分享模式之后,电子设备100将该响应更改为触发该界面元素的分享。
电子设备100创建界面元素的响应行为是指,在进入分享模式之前,当电子设备100检测到作用于该界面元素,与该分享操作相同的操作时,电子设备100不会执行任何响应,在进入分享模式之后,电子设备100可以将该界面元素的响应确定为触发该界面元素的分享。
具体实现中,电子设备100可以通过系统视图框架更改或创建该第一窗口中的界面元素的响应行为。具体可参见前述步骤S107的相关描述,这里不再赘述。
应理解,电子设备100可以仅更改或创建第一窗口中的部分界面元素的响应行为。该部分界面元素可以为开发者设置的允许进入分享模式的界面元素。在进入分享模式后,电子设备100可以调整该第一窗口中的M个界面元素的响应行为,使得电子设备100能够响应于作用在该M个界面元素中的目标界面元素的分享操作,触发对该目标界面元素的分享。其中,M≥1,且M为正整数。具体关于仅部分界面元素进入分享模式的描述可参见前述步骤S202的相关内容,这里不再赘述。
S204.电子设备100显示分享模式下的第一窗口。
电子设备100响应于该操作,进入分享模式,对该第一窗口启动分享模式下的渲染,即显示分享模式下的第一窗口。该分享模式下的第一窗口不同于进入分享模式之前的第一窗口。具体地,电子设备100可以在进入分享模式后,在该第一窗口中显示提示信息(例如第一提示信息),该提示信息用于指示电子设备100已进入分享模式。
该提示信息可以表现为:
1)新增信息
也就是说,响应于进入分享模式的操作,电子设备100可以在该第一窗口中增加信息。通过该进入分享模式前后,第一窗口中信息的变化来提醒用户当前已经进入分享模式。
示例性地,该提示信息可以包括,进入分享模式之后,对第一窗口中的各界面元素新增的边框。如图2B中(a)所示,用户界面10中的第一菜单栏102以及第一菜单栏102中的拍照选项、关注选项、推荐选项、同城选项、更多选项,以及浏览区域103中的用户的头像、名称、发布的娱乐内容,以及针对该娱乐内容的分享选项、评论选项、点赞选项等,以及第二菜单栏104、第二菜单栏104中的首页选项、发现选项、消息选项、本地选项等等界面元素,都分别由一个矩形边框包围在内。
具体实现中,电子设备100进入分享模式之后,可以遍历第一窗口中的界面元素,确定各界面元素的尺寸和位置信息,并根据该尺寸和位置信息确定各界面元素的边框尺寸和位置,并在第一窗口中显示各界面元素的边框。例如,假设电子设备100获取到第一界面元素的长度L和宽度W,以及该第一界面元素在第一窗口中的位置,则电子设备100可以在进入分享模式之后,在该位置上显示一个长度为L,宽度为W的矩形边框。
其中,该边框可用于提示用户,当前电子设备100已进入分享模式,进一步,该边框还可用于提示用户,电子设备100可以检测到作用于边框包围的界面元素的分享操作,触发对该界面元素的分享。
其中,电子设备100可以通过以下两种方式确定各界面元素的尺寸和位置信息:
a)从界面元素的配置文件中获取各界面元素的尺寸和位置信息
这种情况下,界面元素的配置文件中可以包含该界面元素的尺寸和位置信息。电子设备100可以在遍历第一窗口中的界面元素的过程中,从各界面元素的配置文件中获取该界面元素的尺寸和位置信息。
b)根据第一窗口中各界面元素的布局计算各界面元素的尺寸和位置信息
这种情况下,电子设备100需要根据第一窗口中各界面元素的摆放情况和排列方式,以及各界面元素之间的相对关系(例如位置关系、大小关系等等),来计算得到各界面元素的尺寸和位置信息。例如,假设存在两个界面元素:第一控件和第二控件,其中,已知第一控件的长度为X,第二控件的长度为第一控件的Y倍,则可以根据该第二控件与第一控件的长度关系,计算得到第二控件的长度为XY。又例如,假设已知三个控件以三等分的布局方式并排在一个窗口内,则可以根据该窗口的大小和位置来相对确定这三个控件的大小和位置。
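方式b)中根据相对关系计算尺寸的两个例子,可以用如下示意性代码帮助理解(方法名与参数均为示例假设):

```java
// 示意:根据界面元素之间的相对关系计算各元素的尺寸
public class LayoutSizeDemo {
    // 已知第一控件长度为x,第二控件长度为第一控件的y倍
    static int secondControlLength(int x, int y) {
        return x * y;
    }

    // count个控件以等分方式并排时,每个控件的宽度由窗口宽度确定
    static int equalSplitWidth(int windowWidth, int count) {
        return windowWidth / count;
    }

    public static void main(String[] args) {
        System.out.println(secondControlLength(100, 3)); // 300
        System.out.println(equalSplitWidth(360, 3));     // 120
    }
}
```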
可以理解的是,电子设备100确定各界面元素的尺寸和位置信息的方式不限于上述两种,例如电子设备100可以结合上述两种方式,一部分界面元素可以直接从配置文件中获取该界面元的尺寸和位置信息,另一部分的界面元素可以结合各界面元素的布局来计算界面元素的尺寸和位置信息,本申请实施例对此不作限制。
可以理解的是,本申请实施例对该第一提示信息的表现形式不作限制,该第一提示信息可以表现为图形,图标,也可以表现为文字等等。例如,在进入分享模式后,电子设备100可以在第一窗口的底端显示“当前已进入分享模式,请选择需要分享的内容”的文字提示。又例如,在进入分享模式后,电子设备100在第一窗口的每个界面元素中显示一个分享图标。
2)界面元素的显示效果发生改变
也就是说,电子设备100可以在进入分享模式后,更改第一窗口中的各界面元素的显示效果。
该显示效果可以包括位置、大小、颜色、透明度、阴影、饱和度、亮度等等显示上的静态效果,也可以包括例如抖动的动态效果。例如,在进入分享模式之后,电子设备100可以降低第一窗口中各界面元素的饱和度。又例如,在进入分享模式之后,电子设备100可以抖动显示第一窗口中的各界面元素。
可以理解的是,分享模式下的第一窗口,与进入分享模式之前的第一窗口之间的区别,不限于上述提及的两种,本申请实施例对此不作限制。具体实现中,电子设备100可以通过系统视图框架,对当前显示的第一窗口使用分享模式下的渲染,具体可参见前述步骤S106的相关内容,这里不再赘述。
In addition, when the first window contains interface elements that do not support sharing, the electronic device 100 may display the first window in sharing mode in one or both of the following ways:
1) The electronic device 100 applies sharing-mode rendering only to the interface elements that support sharing
That is, after the electronic device 100 enters sharing mode, it may display the prompt information only in the regions of the interface elements that support sharing, for example displaying borders only around those elements, or it may change the display effect only of the elements that support sharing.
For example, referring to (b) of FIG. 2B, the interface elements in the first menu bar 102 and the second menu bar 104 have no borders, while the interface elements in the browsing area 103 do.
It can be seen that, in addition to prompting the user that the electronic device 100 has entered sharing mode, the prompt information displayed in the first window can also highlight the interface elements in the first window that are eligible for sharing mode.
2) The electronic device 100 displays only the interface elements that support sharing
That is, after entering sharing mode, the electronic device 100 may display only the interface elements that support sharing and stop displaying those that do not, for example a third interface element.
For example, referring to (a) of FIG. 2B, the electronic device 100 displays only the content in the browsing area 103.
It should be understood that step S204 is optional.
S205. The electronic device 100 detects a sharing operation acting on the first interface element in the first window.
Because the electronic device 100 has set the response behavior of each interface element in the first window to the sharing operation, when the electronic device 100 detects a sharing operation acting on the first interface element in the first window, it can respond to that operation and trigger sharing of the first interface element.
The sharing operation may be a drag operation. For example, the first interface element may be the screenshot image 103B shown in FIG. 2C, and the sharing operation may be the drag operation shown in FIG. 2C to FIG. 2F, which may include the drag from gesture 1 to gesture 2 shown in FIG. 2C, the drag from gesture 2 to gesture 3 shown in FIG. 2D, the drag from gesture 3 to gesture 4 shown in FIG. 2E, and the drag-end operation (lift operation) of gesture 5 shown in FIG. 2F. As another example, the sharing operation may be the drag operation shown in FIG. 4A and FIG. 4B, which may include the drag from gesture 1 to gesture 2 shown in FIG. 4A and the drag from gesture 2 to gesture 3 shown in FIG. 4B. Further, the drag operation may be a slide operation from the position of the first interface element to a specified position.
Further, the first interface element may include one or more interface elements. When the first interface element includes multiple interface elements, the sharing operation can trigger sharing of all of them; in that case the sharing operation may also include selection operations acting on the interface elements together with the drag operation. For example, the first interface element may include the picture control 103A and the picture control 103C shown in FIG. 3B, and the sharing operation may include the tap on the picture control 103A shown in FIG. 3A, the tap on the picture control 103C shown in FIG. 3B, and the drag from gesture 1 to gesture 2 shown in FIG. 3D.
It should be understood that the embodiments of this application place no restriction on the sharing operation: it may be a single operation or a series of operations, and it may be a drag operation, a tap operation, or a long-press-and-drag operation.
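A drag-based sharing operation of the kind described, a slide from the position of an interface element to a specified position, can be recognized with a simple hit test. This sketch uses hypothetical tuple-based element bounds `(x, y, w, h)` and is only an illustration of the recognition logic, not the embodiment's gesture pipeline.

```python
def hit_test(elements, point):
    """Return the element whose bounds contain `point`, if any."""
    for e in elements:
        x, y, w, h = e
        if x <= point[0] < x + w and y <= point[1] < y + h:
            return e
    return None

def is_share_drag(elements, start, end, target):
    """A drag counts as a sharing operation when it starts on an
    interface element and ends at the specified target position."""
    return hit_test(elements, start) is not None and end == target

elements = [(0, 0, 100, 100)]
print(is_share_drag(elements, (10, 10), (50, 400), (50, 400)))  # True
```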
S206. The electronic device 100 obtains the transfer content corresponding to the first interface element.
After detecting the sharing operation acting on the first interface element, the electronic device 100 may obtain the transfer content for that element.
The electronic device 100 may obtain the transfer content of the first interface element in two ways: 1) from the information carried by the first interface element itself, or 2) from a pre-adapted configuration file. For a detailed description of these two ways, see step S113 in FIG. 6 above, which is not repeated here.
In a specific implementation, the electronic device 100 may obtain the transfer content for the first interface element through the system view framework; see the description of step S113 above, which is not repeated here.
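The two acquisition paths described above, reading the transfer content from the element's own information and falling back to a pre-adapted configuration file, can be sketched as follows. The dictionary-based element and configuration representations (and the `img_103B` key) are assumptions introduced for illustration only.

```python
def get_transfer_content(element, config):
    """Obtain the transfer content for an element: first from the
    information the element itself carries; otherwise from a
    pre-adapted configuration file, keyed here by element id."""
    if element.get("content") is not None:
        return element["content"]
    return config.get(element["id"])

config = {"img_103B": "file:///pictures/103B.png"}
print(get_transfer_content({"id": "img_103B", "content": None}, config))
# falls back to the configuration file entry
```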
S207. The electronic device 100 triggers sharing of the transfer content.
The sharing of the transfer content by the electronic device 100 may include the following two cases:
1) The electronic device 100 shares the transfer content between windows
That is, the electronic device 100 may share the transfer content from one window to another. Specifically, in response to the sharing operation, the electronic device 100 may display, in a second window, a second interface element corresponding to the first interface element, where the second interface element includes the transfer content or an identifier of the transfer content. The identifier may be an icon or a screenshot of the first interface element; for a description of the identifier, see the related content above, which is not repeated here.
For example, the transfer content or its identifier may be the picture 203A, or the pictures 203C and 203D.
Further, sharing between windows can be divided into the following two kinds:
a) Sharing within an application
In this case, the first window and the second window belong to the same application. For example, the first window and the second window may display the content of different pages of that application.
b) Sharing between applications
In this case, the first window and the second window belong to different applications. For example, the first window is a window of a first application and the second window is a window of a second application.
In this way, the electronic device 100 can implement drag-based sharing of content both within an application and between applications, improving the flexibility of content sharing.
In addition, it should be noted that when displaying the second window, the electronic device 100 may stop displaying the first window; in this way, the electronic device 100 completes a window switch at the same time as the content sharing. For example, the second window may be the user interface 20 shown in FIG. 2G. Alternatively, the electronic device 100 may keep displaying the first window on the same interface while displaying the second window; in this way, the electronic device 100 can show both the sender and the receiver of the content while sharing between windows. For example, this user interface may be the user interface 30 shown in (b) of FIG. 2F, in which the first window is displayed in area 301 and the second window in area 302.
2) The electronic device 100 shares the transfer content between devices
That is, the electronic device 100 may share the transfer content from the electronic device 100 to another device (for example, a second device). The other device may be a device that has established a connection with the electronic device 100, or a device under the same account or the same group as the electronic device 100.
In some embodiments, the sharing operation may be a drag operation, and before the electronic device 100 shares the transfer content corresponding to the first interface element to another window or another device, the electronic device 100 may display a screenshot of the first interface element that moves with the drag operation.
Specifically, when the electronic device 100 detects the sharing operation acting on the first interface element, it may capture a screenshot of the first interface element and then display that screenshot, which can move along the trajectory of the user's sharing operation. For example, the screenshot may be the screenshot image 103B shown in FIG. 2D and FIG. 2E, the screenshot image 103D shown in FIG. 3D, or the screenshot image 103B shown in FIG. 4B.
In some embodiments, when the electronic device 100 detects the sharing operation, it may trigger display of multiple application or device icons that can receive the transfer content; after detecting the user's selection of a target application or target device icon, it shares the transfer content to that target. In this way, the user can choose the target application or target device that receives the transfer content according to their own needs, increasing the user's control.
Further, when the sharing operation is a drag operation and the electronic device 100 shares between applications, the electronic device 100 may trigger display of the receiver of the transfer content when it detects that the drag has moved from the position of the first interface element to a specified position (for example, the bottom of the display), or when it detects a specified movement trajectory of the drag (for example, downward movement). Taking an application list as the receiver as an example, the list contains multiple application icons; further, when the drag operation runs from the position of the first interface element to the position of one of the application icons in the list, sharing of the transfer content corresponding to the first interface element to that application is triggered. For example, the application list may be the application list 105 shown in FIG. 2D or FIG. 2E, and the device icons may be those in the device list 106 shown in FIG. 4A.
It should be understood that the receiver of the transfer content displayed by the electronic device 100 is not limited to the application list mentioned above. The electronic device 100 may also trigger display of the receiver of the transfer content after detecting the operation of entering sharing mode.
For example, the receiver may also be a device list, which may display icons of multiple devices. When the drag operation runs from the position of the first interface element to the position of one of the device icons in the list, the electronic device 100 may trigger sharing of the transfer content corresponding to the first interface element to the device corresponding to that icon. In this case, the electronic device 100 may be called the first device, and the device corresponding to the icon may be called the second device.
As another example, the receiver may also be a contact list, which may display icons of multiple contacts. When the drag operation runs from the position of the first interface element to the position of one of the contact icons in the list, the electronic device 100 may trigger sharing of the transfer content corresponding to the first interface element to the device used by that contact. In this case, the electronic device 100 may be called the first device, and the device used by the contact may be called the second device. A contact may be a phone contact prestored in the electronic device 100, a contact in a specified application (for example, the WeChat application), a contact in an account group, and so on; the embodiments of this application place no restriction on the contact, nor on the form of the receiver.
When the drag operation is specifically a slide operation from the position of the first interface element to a specified position, the specified position may be the position of one of the application icons in the application list mentioned above, in which case the electronic device 100 may trigger sharing of the transfer content between windows and display the transfer content or its identifier in that application's window; or the specified position may be the position of one of the device icons in the device list, or of one of the contact icons in the contact list, in which case the electronic device 100 may trigger sending of the transfer content corresponding to the first interface element to another device.
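Resolving the end position of the drag against an application list, device list, or contact list, as described above, amounts to hit-testing the drop point against the icons of each receiver kind. This sketch introduces hypothetical icon tables (name mapped to `(x, y, w, h)` bounds) purely for illustration; the embodiment's actual receiver UI is not specified at this level.

```python
def resolve_drop_target(drop_point, app_icons, device_icons, contact_icons):
    """Map the end position of a drag to a receiver: an application
    (window-to-window sharing), a device, or a contact's device
    (device-to-device sharing). Returns (kind, name) or None."""
    def find(icons):
        for name, (x, y, w, h) in icons.items():
            if x <= drop_point[0] < x + w and y <= drop_point[1] < y + h:
                return name
        return None

    for kind, icons in (("app", app_icons), ("device", device_icons),
                        ("contact", contact_icons)):
        name = find(icons)
        if name is not None:
            return kind, name
    return None

apps = {"Notes": (0, 400, 80, 80)}
devices = {"Tablet": (100, 400, 80, 80)}
print(resolve_drop_target((120, 420), apps, devices, {}))  # ('device', 'Tablet')
```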
In addition, the first interface element may include one or more interface elements, so the electronic device 100 can complete sharing of one or more interface elements with a single sharing operation. When the first interface element includes multiple (for example N, where N≥2 and N is a positive integer) interface elements, the electronic device 100 may, before detecting the sharing operation on these elements, detect a selection operation on them. For example, the selection operation may be the selection of the picture control 103A shown in FIG. 3A and the selection of the picture control 103C shown in FIG. 3B, with the first interface element including the picture controls 103A and 103C. It can be seen that with the content sharing method provided by the embodiments of this application, the user can quickly enter sharing mode and, while in sharing mode, quickly share content between different applications or different devices, which broadens the application scenarios of content sharing and makes the user's operations more convenient.
In the embodiments of this application, steps S201 to S207 above may be executed by a system unit of the electronic device 100, which may be located at the framework layer of the electronic device 100; in other words, the system unit and the application that owns the first window are different modules of the electronic device 100. That is, the sharing mode can be defined as a system-level drag-sharing mode, so that any application running on the system can enter sharing mode in response to the first operation and share its content, broadening the application scenarios of content sharing and improving the user's drag-sharing experience. For a description of the framework layer of the electronic device 100, see the related content in FIG. 5 above, which is not repeated here.
It should be understood that the steps in the above method embodiments may be completed by integrated logic circuits of hardware in a processor or by instructions in software form. The method steps disclosed in the embodiments of this application may be directly executed by a hardware processor, or by a combination of hardware and software modules in the processor.
This application also provides an electronic device, which may include a memory and a processor, where the memory may be used to store a computer program and the processor may be used to invoke the computer program in the memory so that the electronic device performs the method performed by the electronic device 100 in any of the above embodiments.
This application also provides a chip system, which includes at least one processor for implementing the functions involved in the method performed by the electronic device 100 in any of the above embodiments.
In one possible design, the chip system further includes a memory for storing program instructions and data; the memory may be located inside or outside the processor.
The chip system may consist of a chip, or may include a chip and other discrete components.
Optionally, there may be one or more processors in the chip system. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general-purpose processor that operates by reading software code stored in a memory.
Optionally, there may also be one or more memories in the chip system. The memory may be integrated with the processor or provided separately from it; the embodiments of this application place no limitation on this. For example, the memory may be a non-transitory memory such as a read-only memory (ROM), and may be integrated on the same chip as the processor or provided on a different chip; the embodiments of this application place no specific limitation on the type of memory or on the arrangement of the memory and the processor.
For example, the chip system may be a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a micro controller unit (MCU), a programmable logic device (PLD), or another integrated chip.
This application also provides a computer program product, which includes a computer program (also called code or instructions); when the computer program is run, it causes a computer to perform the method performed by the electronic device 100 in any of the above embodiments.
This application also provides a computer-readable storage medium storing a computer program (also called code or instructions); when the computer program is run, it causes a computer to perform the method performed by the electronic device 100 in any of the above embodiments.
It should be understood that the processor in the embodiments of this application may be an integrated circuit chip with signal processing capability. In an implementation process, the steps of the above method embodiments may be completed by integrated logic circuits of hardware in the processor or by instructions in software form. The processor may be a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of this application. The general-purpose processor may be a microprocessor or any conventional processor. The steps of the methods disclosed in the embodiments of this application may be directly executed by a hardware decoding processor, or by a combination of hardware and software modules in the decoding processor. The software module may reside in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory; the processor reads the information in the memory and completes the steps of the above methods in combination with its hardware.
In addition, the embodiments of this application also provide an apparatus, which may specifically be a component or a module and may include one or more connected processors and a memory, where the memory is used to store a computer program. When the computer program is executed by the one or more processors, the apparatus performs the methods in the above method embodiments.
The apparatus, computer-readable storage medium, computer program product, and chip provided by the embodiments of this application are all used to perform the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of those methods, which are not repeated here.
The embodiments of this application may be combined arbitrarily to achieve different technical effects.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to this application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired (for example, coaxial cable, optical fiber, or digital subscriber line) or wireless (for example, infrared, radio, or microwave) means. The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or a data center integrating one or more available media. The available media may be magnetic media (for example, floppy disks, hard disks, or magnetic tapes), optical media (for example, DVDs), or semiconductor media (for example, solid state disks (SSDs)).
Those of ordinary skill in the art can understand that all or part of the procedures in the methods of the above embodiments may be completed by a computer program instructing relevant hardware; the program may be stored in a computer-readable storage medium, and when executed may include the procedures of the above method embodiments. The aforementioned storage media include various media that can store program code, such as a ROM, a random access memory (RAM), a magnetic disk, or an optical disc.
In summary, the above is merely an embodiment of the technical solution of the present invention and is not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, improvement, and the like made according to the disclosure of the present invention shall be included within the protection scope of the present invention.

Claims (19)

  1. A content sharing method, wherein the method comprises:
    a first device displays a first window, the first window comprising one or more interface elements;
    the first device detects a first operation acting on the first window;
    after the first device detects the first operation, the first device detects a drag operation acting on a first interface element among the one or more interface elements;
    in response to the drag operation, the first device displays, in a second window, a second interface element corresponding to the first interface element, or the first device sends transfer content of the first interface element to a second device;
    wherein, before the first device detects the first operation, a drag operation acting on the first interface element does not trigger the first device to display the second interface element in the second window or to send the transfer content to the second device.
  2. The method according to claim 1, wherein before the first device displays the second interface element corresponding to the first interface element in the second window, or the first device sends the transfer content corresponding to the first interface element to the second device, the method further comprises:
    the first device displays a screenshot of the first interface element that moves along the movement trajectory of the drag operation.
  3. The method according to claim 1 or 2, wherein the first window and the second window belong to a same application or to different applications.
  4. The method according to any one of claims 1-3, wherein the second interface element comprises: the transfer content or an identifier of the transfer content.
  5. The method according to claim 4, wherein the identifier is an icon or a screenshot of the first interface element.
  6. The method according to any one of claims 1-5, wherein the first operation is used to trigger the first device to enter a first mode, and after the first device detects the first operation acting on the first window, the method further comprises:
    the first device displays first prompt information in the first window, the first prompt information being used to indicate that the first device has entered the first mode.
  7. The method according to claim 6, wherein the first prompt information is used to highlight the first interface element.
  8. The method according to claim 7, wherein the first prompt information comprises a border of the first interface element.
  9. The method according to any one of claims 1-8, wherein after the first device detects the first operation acting on the first window, the method further comprises:
    the first device stops displaying a third interface element in the first window.
  10. The method according to any one of claims 1-9, wherein the drag operation is specifically: a slide operation from the position of the first interface element to a specified position.
  11. The method according to claim 10, wherein
    the first window is a window of a first application, the second window is a window of a second application, and the specified position is the position of an icon of the second application,
    or,
    the specified position is the position of an icon of the second device, or the position of an icon of a first contact, wherein the second device is a device used by the first contact.
  12. The method according to any one of claims 1-11, wherein the second window is displayed in a first user interface, and the first user interface further comprises the first window.
  13. The method according to any one of claims 1-12, wherein the first interface element comprises one or more interface elements.
  14. The method according to any one of claims 1-13, wherein the first interface element comprises N interface elements, N≥2, and N is a positive integer, and before the first device detects the drag operation acting on the first interface element among the one or more interface elements, the method further comprises:
    the first device detects a selection operation acting on the N interface elements.
  15. The method according to any one of claims 1-14, wherein after the first device detects the first operation acting on the first window, the method further comprises:
    the first device modifies or creates the response behavior of M interface elements in the first window, so that the first device can, in response to a drag operation acting on a first interface element among the M interface elements, display the identifier in the second window or send the transfer content to the second device, wherein M≥1 and M is a positive integer.
  16. The method according to any one of claims 1-15, wherein after the first device detects the drag operation acting on the first interface element among the one or more interface elements, the method further comprises:
    the first device obtains the transfer content from the information of the first interface element.
  17. The method according to any one of claims 1-16, wherein the method is executed by a system unit of the first device, and the system unit and the application to which the first window belongs are different modules of the first device.
  18. An electronic device, comprising a memory, one or more processors, and one or more programs; when the one or more processors execute the one or more programs, the electronic device is caused to implement the method according to any one of claims 1 to 17.
  19. A computer-readable storage medium comprising instructions, wherein when the instructions are run on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 17.
PCT/CN2023/105191 2022-07-08 2023-06-30 Content sharing method, graphical interface, and related apparatus WO2024008017A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210801011.4A CN117406874A (zh) 2022-07-08 2022-07-08 Content sharing method, graphical interface, and related apparatus
CN202210801011.4 2022-07-08

Publications (1)

Publication Number Publication Date
WO2024008017A1 true WO2024008017A1 (zh) 2024-01-11

Family

ID=89454378


Country Status (2)

Country Link
CN (1) CN117406874A (zh)
WO (1) WO2024008017A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100070899A1 (en) * 2008-09-12 2010-03-18 Meebo, Inc. Techniques for sharing content on a web page
CN105849712A (zh) * 2013-10-23 2016-08-10 Samsung Electronics Co., Ltd. Method and device for transmitting data, and method and device for receiving data
CN106489129A (zh) * 2016-09-29 2017-03-08 Beijing Xiaomi Mobile Software Co., Ltd. Content sharing method and apparatus
CN111367457A (zh) * 2020-03-09 2020-07-03 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Content sharing method and apparatus, and electronic device


Also Published As

Publication number Publication date
CN117406874A (zh) 2024-01-16


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23834784

Country of ref document: EP

Kind code of ref document: A1