WO2022052677A1 - Interface display method and electronic device - Google Patents

Interface display method and electronic device

Info

Publication number
WO2022052677A1
Authority
WO
WIPO (PCT)
Prior art keywords
snapshot
window
size
electronic device
user
Prior art date
Application number
PCT/CN2021/110459
Other languages
English (en)
French (fr)
Inventor
阚彬
许嘉
蔺振超
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2022052677A1 publication Critical patent/WO2022052677A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality

Definitions

  • the embodiments of the present application relate to the technical field of terminals, and in particular, to an interface display method and an electronic device.
  • the display screen of an electronic device displays the window 01 of the first APP and the window 02 of the second APP.
  • the window 01 contains the image 1, and the user can drag the image 1 in the window 01 to the window 02.
  • the electronic device provides no feedback on the drag effect, which makes the user's operation experience poor.
  • the embodiments of the present application provide an interface display method and an electronic device, which can provide timely feedback of the drag effect to the user while the user performs the dragging operation.
  • in a first aspect, an embodiment of the present application provides an interface display method, applied to an electronic device whose screen displays a first window and a second window, including: receiving a drag instruction input by a user, where the drag instruction is used to instruct the electronic device to drag the first object in the first window into the second window; if the application program corresponding to the second window supports the file type of the first object, displaying a first snapshot of the first object; and if the application program corresponding to the second window does not support the file type of the first object, displaying a second snapshot of the first object, where the second snapshot includes a first identifier used to indicate that the second window does not accept the first object.
  • the electronic device involved in the embodiments of the present application supports the display of multiple windows.
  • the aforementioned first window may be the window of the first APP
  • the second window may be the window of the second APP.
  • the user drags the first object in the first window into the second window, which essentially triggers the electronic device to copy the first object running in the first APP into the second APP.
  • "the user drags the first object in the first window into the second window" in the embodiments of the present application means that the touch position of the user's finger on the screen enters the second window.
  • the second window can receive the first object.
  • the electronic device may notify the user of the reception status of the first object by the second window by displaying different snapshots.
  • the electronic device can display the first object differently according to whether the application program corresponding to the second window supports the file type of the first object, thereby providing timely and accurate feedback for the user's dragging process.
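The decision described in this first aspect can be sketched in a platform-agnostic way. The following Python sketch is illustrative only; the dictionary-based result, the MIME-type strings, and the name `choose_snapshot` are assumptions, not the patent's actual implementation:

```python
def choose_snapshot(dragged_file_type, target_supported_types, third_snapshot):
    """Decide which snapshot to display when the drag enters the second window.

    Returns the plain snapshot (the "first snapshot") when the target APP
    supports the dragged file type, or a snapshot carrying a not-accepted
    identifier (the "second snapshot") otherwise.
    """
    if dragged_file_type in target_supported_types:
        return {"image": third_snapshot, "identifier": None}        # first snapshot
    return {"image": third_snapshot, "identifier": "not-accepted"}  # second snapshot

# Example: a hypothetical image editor window that only accepts pictures
editor_types = {"image/png", "image/jpeg"}
ok = choose_snapshot("image/png", editor_types, "snap")
rejected = choose_snapshot("text/plain", editor_types, "snap")
```

In a real system the supported-type check would be answered by the receiving application itself rather than by a hard-coded set.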
  • displaying the second snapshot of the first object includes: adding a layer to the third snapshot of the first object to obtain the second snapshot, where the third snapshot is generated after the electronic device receives the drag instruction and before the electronic device determines that the user-triggered position enters the second window.
  • the layer is used to present to the user the display effect that the second window cannot receive the first object.
  • the layer may be gray or dark. In this way, in the process of dragging the first object by the user, the dragging effect can be fed back to the user in time.
  • displaying the second snapshot of the first object includes: adding a first identifier to the third snapshot of the first object to obtain the second snapshot, where the third snapshot is generated after the electronic device receives the drag instruction and before the electronic device determines that the user-triggered position enters the second window.
  • the first identifier is used to present to the user the display effect that the second window cannot receive the first object. In this way, during the process of dragging the first object, the dragging effect can be fed back to the user in time.
  • the first identifier may be a graphic mark or a label
  • the first object includes at least one object
  • the label is used to indicate the total number of first objects not received by the second window.
  • the first identifier may be a specific symbol, for example, a quantity symbol, which indicates the total quantity of first objects not received by the second window.
  • the first identifier may be gray or dark.
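One way to realize the gray/dark layer together with the quantity identifier is a uniform darkening of the snapshot's pixels plus an attached count label. A minimal sketch, where the pixel-list representation and the name `make_second_snapshot` are illustrative assumptions:

```python
def make_second_snapshot(third_snapshot_pixels, rejected_count, dim=0.5):
    """Build the "second snapshot": the third snapshot covered by a gray/dark
    layer, plus a quantity identifier showing how many objects the second
    window does not receive.

    `third_snapshot_pixels` is a list of (r, g, b) tuples standing in for a
    real bitmap; the dim factor darkens every pixel uniformly.
    """
    dimmed = [(int(r * dim), int(g * dim), int(b * dim))
              for (r, g, b) in third_snapshot_pixels]
    return {"pixels": dimmed, "label": str(rejected_count)}

# Two-pixel toy bitmap, three rejected objects
snap = make_second_snapshot([(200, 100, 50), (255, 255, 255)], rejected_count=3)
```

A production implementation would composite a semi-transparent overlay on the drag shadow instead of rewriting pixels one by one.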
  • displaying the first snapshot of the first object includes: after determining that the third snapshot of the first object enters the second window, acquiring a first size of the third snapshot and the size of the second window, where the third snapshot is generated after the electronic device receives the drag instruction and before the electronic device determines that the user-triggered position enters the second window; and adjusting the third snapshot from the first size to a second size to obtain the first snapshot, where the ratio of the second size to the size of the second window is greater than or equal to a first threshold and less than or equal to a second threshold, both thresholds being greater than 0 and less than 1.
  • the electronic device may adjust the size of the first snapshot based on the size of the second window, so that the visual effect of the snapshot in the second window is optimal. With this implementation manner, the electronic device can enable the user to preview the size of the received first object during the dragging process of the user, thereby further improving the user's operation experience.
  • adjusting the third snapshot from the first size to the second size to obtain the first snapshot includes: reducing the third snapshot from the first size to the second size to obtain the first snapshot; or enlarging the third snapshot from the first size to the second size to obtain the first snapshot.
  • the electronic device can adaptively adjust the size of the snapshot according to the size of the receiving window during the dragging process by the user, so that the user can preview the size of the received object, thereby further improving the user's operating experience.
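The threshold-bounded resizing can be illustrated with a toy calculation. The sketch below treats sizes as scalar areas and picks example thresholds of 0.2 and 0.4; the patent only requires both thresholds to lie strictly between 0 and 1, so these values are assumptions:

```python
def fit_snapshot(snapshot_size, window_size, t1=0.2, t2=0.4):
    """Scale the third snapshot (first size) to a second size whose ratio to
    the second window's size lies in [t1, t2], shrinking or enlarging as
    needed. Sizes are treated as scalar areas for simplicity.
    """
    ratio = snapshot_size / window_size
    if ratio < t1:                      # too small: enlarge up to the lower bound
        return t1 * window_size
    if ratio > t2:                      # too large: reduce down to the upper bound
        return t2 * window_size
    return snapshot_size                # already within the acceptable band

small = fit_snapshot(snapshot_size=10, window_size=1000)   # enlarged to 200
large = fit_snapshot(snapshot_size=900, window_size=1000)  # reduced to 400
good = fit_snapshot(snapshot_size=300, window_size=1000)   # unchanged
```

Clamping to the band's edges is one simple policy; an implementation could equally scale to any value inside the band.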
  • the second window includes an icon of a second object
  • displaying the first snapshot corresponding to the first object includes: after determining that the third snapshot of the first object enters the second window, acquiring the first size of the third snapshot, the size of the second window, and a third size of the icon; adjusting the third snapshot from the first size to a fourth size to obtain the first snapshot, and adjusting the icon from the third size to the fourth size, where the ratio of twice the fourth size to the size of the second window is greater than or equal to a first threshold and less than or equal to a second threshold, both thresholds being greater than 0 and less than 1; and arranging the first snapshot and the resized icon according to the position triggered by the user.
  • the electronic device can adaptively adjust the size and arrangement order of each icon according to the size of the second window and the number of objects in the second window, so as to present a preview display effect to the user, so that the The user can see the display effect of the size and arrangement order of the received objects in advance, so that the user's operation experience can be further improved.
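The two-item case (the dragged snapshot plus the icon already in the second window) constrains twice the common "fourth size" to the same threshold band. A hedged sketch, choosing the midpoint of the band as one simple policy (the midpoint choice and the scalar-size model are assumptions, not stated in the patent):

```python
def fit_two_items(window_size, t1=0.2, t2=0.4):
    """Pick the common "fourth size" for the dragged snapshot and the icon
    already in the second window: twice that size, relative to the window
    size, must fall within [t1, t2]. The midpoint of the band is used here.
    """
    target_total = (t1 + t2) / 2 * window_size  # combined footprint of both items
    return target_total / 2                     # per-item "fourth size"

size = fit_two_items(window_size=1000)  # about 150 per item: 2 * 150 / 1000 = 0.3
```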
  • the first object includes at least one object, and displaying the first snapshot corresponding to the first object further includes: adding a second identifier to the third snapshot to obtain the first snapshot, where the second identifier is used to indicate the total number of first objects that can be received by the second window.
  • the method further includes: generating a third snapshot of the first object, where the first object includes at least one object; and adding a third identifier to the third snapshot, where the third identifier is used to indicate the total number of the first objects.
  • the electronic device further supports the user in dragging at least two objects at a time.
  • the electronic device may identify the total number of objects dragged by the user. With this implementation manner, the electronic device can present more detailed dragging effect information to the user in the process of feeding back the dragging effect, so that the user's operation experience can be further improved.
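Attaching the third identifier for a multi-object drag amounts to pairing a representative thumbnail with the object count. A minimal illustration; the names and the thumbnail-of-the-first-object choice are hypothetical:

```python
def make_drag_snapshot(objects):
    """Generate the third snapshot for a multi-object drag: the first
    object's thumbnail plus a third identifier giving the total count.
    """
    return {"thumbnail": objects[0], "count_badge": len(objects)}

snap = make_drag_snapshot(["photo1.png", "photo2.png", "doc.txt"])
```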
  • an embodiment of the present application provides an electronic device, and the electronic device has a function of implementing the behavior of the electronic device in the above method.
  • the functions can be implemented by hardware, or can be implemented by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the above functions.
  • the structure of the above electronic device includes a processor, a receiver, and a display, and the processor is configured to enable the electronic device to perform the corresponding functions in the above method.
  • the receiver is used by the above electronic device to receive the various instructions input by the user.
  • the display is used to realize the display of the snapshot by the electronic device.
  • the electronic device may also include a memory, coupled to the processor, that holds the program instructions and data necessary for the electronic device.
  • the present application provides a computer storage medium, where instructions are stored in the computer storage medium, and when the instructions are executed on a computer, the computer can execute some or all of the steps of the interface display method in the first aspect and its various possible implementations.
  • the present application provides a computer program product that, when run on a computer, enables the computer to execute some or all of the steps of the interface display method in the first aspect and its various possible implementations.
  • FIG. 1 is a schematic interface diagram of a dragging scene provided by an embodiment of the present application.
  • FIG. 2A is a schematic diagram of an exemplary hardware structure of an electronic device 100 provided by an embodiment of the present application.
  • FIG. 2B is a schematic diagram of an exemplary software architecture of the electronic device 100 according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an exemplary signal interaction inside an Android (Android) operating system provided by an embodiment of the present application;
  • FIG. 5 is a schematic interface diagram of a first exemplary dragging scene provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an interface after the user's finger leaves the screen in the dragging scene shown in FIG. 5 according to an embodiment of the present application.
  • FIG. 7 is a schematic interface diagram of a second exemplary dragging scene provided by an embodiment of the present application.
  • FIG. 8 is a schematic interface diagram of a third exemplary dragging scene provided by an embodiment of the present application.
  • FIG. 9 is a schematic interface diagram of a fourth exemplary dragging scene provided by an embodiment of the present application.
  • FIG. 10 is a schematic interface diagram of a fifth exemplary dragging scene provided by an embodiment of the present application.
  • FIG. 11 is a schematic interface diagram of a sixth exemplary dragging scene provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of an exemplary interface of a collaborative office scenario provided by an embodiment of the present application.
  • FIG. 13 is a schematic interface diagram based on the exemplary dragging scene of FIG. 12 provided by an embodiment of the present application;
  • FIG. 14A is a schematic diagram of an exemplary composition of an electronic device 140 provided by an embodiment of the present application.
  • FIG. 14B is a schematic diagram of an exemplary structure of an electronic device 141 according to an embodiment of the present application.
  • the terms first, second, etc. may be used to describe other types of objects in the same way, and will not be repeated here.
  • the term "and/or" describes an association relationship between related objects and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, A and B exist at the same time, or B exists alone, where A and B may be singular or plural.
  • the character "/" generally indicates that the associated objects are an "or" relationship.
  • the embodiment of the present application provides an interface display method.
  • the screen of the electronic device displays the first window and the second window at the same time, the user can drag and drop the first object in the first window into the second window, and according to the first window and the second window Whether the application program corresponding to the two windows supports the file type of the first object, the first object is displayed differently, so as to provide timely and accurate feedback for the user's dragging process.
  • a document in the first window is dragged to the second window, and if the application program corresponding to the second window supports the file type of the document, the document is highlighted in the second window. If the application program corresponding to the second window does not support the file type of the document, the document is displayed in gray in the second window.
  • the electronic device may be a portable electronic device that also includes other functions such as personal digital assistant and/or music player functions, such as a cell phone, a tablet computer, or a wearable electronic device with wireless communication capabilities (e.g., a smart watch).
  • portable electronic devices include, but are not limited to, portable electronic devices running various operating systems.
  • the above-mentioned portable electronic device may also be other portable electronic devices, such as a laptop computer (Laptop) or the like. It should also be understood that, in some other embodiments, the above-mentioned electronic device may not be a portable electronic device, but a desktop computer.
  • FIG. 2A shows a schematic structural diagram of the electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, an antenna, a wireless communication module 140, an audio module 150, a sensor module 160, a motor 170, a display screen 180, and the like.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), and the like. Different processing units may be independent devices, or may be integrated in one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110.
  • the controller can generate an operation control signal according to the instruction operation code and the timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the electronic device 100 .
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 may be respectively coupled to the touch sensor 160B and the like through different I2C bus interfaces.
  • the processor 110 can couple the touch sensor 160B through the I2C interface, so that the processor 110 and the touch sensor 160B communicate through the I2C bus interface, so as to realize the touch function and the drag function of the electronic device 100 .
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 140 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 140 through the UART interface to implement the Bluetooth function.
  • the audio module 150 can transmit audio signals to the wireless communication module 140 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect peripheral devices such as the processor 110 and the display screen 180 .
  • the MIPI interface includes a display serial interface (display serial interface, DSI) and the like.
  • the processor 110 communicates with the display screen 180 through a DSI interface to implement the display function of the electronic device 100 to present a dragging display effect to the user.
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 183, the display screen 180, the wireless communication module 140, the audio module 150, the sensor module 160, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the wireless communication function of the electronic device 100 may be implemented by an antenna, a wireless communication module 140, a baseband processor, and the like.
  • Antennas are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization. For example, an antenna can be multiplexed as a diversity antenna for a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the wireless communication module 140 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 140 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 140 receives electromagnetic waves via the antenna, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 140 can also receive the signal to be sent from the processor 110, frequency modulate and amplify it, and then convert it into electromagnetic waves for radiation through the antenna.
  • the antenna of the electronic device 100 is coupled with the wireless communication module 140 so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the WLAN wireless communication solution provided by the wireless communication module 140 may also enable the electronic device to communicate with a device in the network (eg, a device cooperating with the electronic device 100 ). In this way, data transmission can be performed between the electronic device and the cooperating device.
  • the electronic device 100 implements a display function through a GPU, a display screen 180, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 180 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change interface display effects.
  • the display screen 180 may include a display and a touch device.
  • the display is used to output display content to the user, for example, the first snapshot and the second snapshot involved in the embodiment of the present application, the first identifier included in the second snapshot, and the display effect previewed before the user leaves the hand.
  • the touch device is used to receive a drag operation input by the user on the display screen 180 .
  • the display screen 180 is used to display a user interface (user interface, UI) involved in the embodiments of the present application.
  • the display screen 180 includes a display panel, and the display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the display screen 180 when the display panel adopts materials such as OLED, AMOLED, FLED, etc., the display screen 180 may be bent.
  • that the display screen 180 can be bent means that the display screen 180 can be bent to any angle at any position and maintained at that angle; for example, the display screen 180 can be folded in half from the middle, either left-to-right or up-and-down.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save data such as music, photos, videos, etc. in an external memory card.
  • Internal memory 121 may be used to store one or more computer programs including instructions.
  • the processor 110 may execute the above-mentioned instructions stored in the internal memory 121, thereby causing the electronic device 100 to execute the interface display method, various functional applications, and data processing provided in some embodiments of the present application.
  • the internal memory 121 may include a storage program area and a storage data area. Wherein, the stored program area may store the operating system; the stored program area may also store one or more application programs (such as gallery, contacts, etc.) and the like.
  • the storage data area may store data created during the use of the electronic device 100 (such as the number of objects dragged by the user, etc.) and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the audio module 150 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 150 may also be used to encode and decode audio signals. In some embodiments, the audio module 150 may be provided in the processor 110 , or some functional modules of the audio module 150 may be provided in the processor 110 .
  • the audio module 150 may include a speaker, a microphone, a headphone jack, and the like.
  • the sensor module 160 may include a pressure sensor 160A, a touch sensor 160B.
  • the pressure sensor 160A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 160A may be disposed on the display screen 180 .
  • the capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor 160A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 180, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 160A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 160A.
  • the touch sensor 160B may also be referred to as a touch panel or touch sensitive surface.
  • the touch sensor 160B may be disposed on the display screen 180 , and the touch sensor 160B and the display screen 180 form a touch screen, also referred to as a "touch screen".
  • the touch sensor 160B is used to detect touch operations and lift-off operations acting on or near it.
  • Touch sensor 160B may communicate the detected touch operation to processor 110 to determine the type of touch event.
  • the electronic device 100 can calculate the position touched by the user and the position where the user lifts the finger according to the detection signal of the touch sensor 160B, and can also determine and recognize the user's drag operation according to the continuous change of the touched position. Further, the electronic device 100 may provide visual outputs related to the aforementioned operations (touch operations, lift-off operations, and drag operations) through the display screen 180 .
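The drag recognition described above, deriving a drag from the continuous change of the touched position, can be sketched as follows. This is a minimal illustration under assumptions, not the device's actual implementation; the class name, the slop threshold, and the `Point` type are all hypothetical.

```java
import java.util.List;

// Illustrative sketch: classify a sequence of touch positions as a drag once
// the touch point moves farther than a slop threshold from the touch-down point.
// The threshold value of 24 px is an assumption, not from the embodiment.
public class DragDetector {
    static final double DRAG_THRESHOLD_PX = 24.0;

    public record Point(double x, double y) {}

    public static boolean isDrag(List<Point> touchTrack) {
        if (touchTrack.size() < 2) return false;
        Point down = touchTrack.get(0); // where the finger first touched down
        for (Point p : touchTrack) {
            double dx = p.x() - down.x(), dy = p.y() - down.y();
            if (Math.hypot(dx, dy) > DRAG_THRESHOLD_PX) return true;
        }
        return false;
    }
}
```

A track that stays within the slop distance would be treated as a touch rather than a drag.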
  • the sensor module 160 may further include a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, an ambient light sensor, a bone conduction sensor, and the like.
  • Motor 170 can generate vibrating cues.
  • the motor 170 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 170 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 180 .
  • Different application scenarios for example: time reminder, receiving information, alarm clock, games, etc.
  • the touch vibration feedback effect can also support customization.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of the electronic device 100 .
  • FIG. 2B is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include applications (APPs) such as camera, gallery, mailbox, Bluetooth, memo, music, video, and file management.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a view system, a drag and drop manager, a content provider, a resource manager, a notification manager, and the like.
  • the functional modules of the application framework layer may be integrated into the processor 110 illustrated in FIG. 2A , and the functions of the application framework layer in this embodiment may be implemented by the hardware processor 110 illustrated in FIG. 2A .
  • a window manager is used to manage window programs.
  • the window manager can obtain the size of the display screen 180, determine whether there is a status bar, lock the screen, take a screenshot, and so on.
  • the window manager can also manage the distribution of each APP in the application layer and the window layout of each APP, so as to realize the function of displaying two APP windows on the display screen 180 .
  • the window manager has functions such as identifying the file types supported by the APP, so that the window manager can determine whether the APP can support the file type of the object dragged by the user.
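The window manager's check of whether an APP supports the file type of a dragged object can be modeled as a lookup in a per-APP set of supported types. The sketch below is an assumption for illustration only; the registry contents and all names are hypothetical, not part of the embodiment.

```java
import java.util.Map;
import java.util.Set;

// Hypothetical sketch: each APP registers the file types it supports, and a
// dragged object is accepted only if its file type appears in that set.
public class FileTypeRegistry {
    // Illustrative data; real support would come from each APP's declaration.
    private final Map<String, Set<String>> supportedTypes = Map.of(
            "gallery", Set.of("image", "video"),
            "memo", Set.of("text", "image"));

    public boolean supports(String appName, String fileType) {
        return supportedTypes.getOrDefault(appName, Set.of()).contains(fileType);
    }
}
```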
  • the view system includes visual interface elements, such as interface elements that display text, interface elements that display images, and so on.
  • the view system can be used to build the display interface of the app.
  • a display interface can consist of one or more views. For example, it includes the display interface of various APP icons, etc.
  • the view system can also build snapshots of dragged objects.
  • attributes of the snapshot include, for example, the size of the snapshot, an identifier, and the like, and the identifier may include layers, marks, and the like.
  • the drag manager may determine the position touched by the user and the snapshot of the corresponding object based on the detection signal reported by the touch sensor 160B. Furthermore, the drag manager can control the corresponding snapshot to move on the display screen 180 along with the position touched by the user, so as to realize the drag function.
  • FIG. 3 illustrates a signal interaction diagram inside the Android operating system.
  • the window manager may control the window of the first APP and the window of the second APP to be displayed on the display screen 180 in response to the user's operation instruction.
  • the view system draws a snapshot of the object corresponding to the touch signal.
  • the drag manager controls the corresponding snapshot to move along with the user's gesture track.
  • the window manager can detect the location of the snapshot on the display screen 180 . After detecting that the aforementioned snapshot is located within the window range of the second APP, the window manager may determine whether the second APP supports the file type corresponding to the snapshot.
  • the window manager transmits the judgment result to the view system. Furthermore, the view system adds a layer to the aforementioned snapshot based on the judgment result, so as to feed back to the user the display effect of the corresponding object received by the second APP through the color and brightness of the layer. In other embodiments, the view system may also add icons to the aforementioned snapshots to indicate the number of objects dragged by the user. In other embodiments, the view system may also draw the size of the snapshots that the second APP is allowed to receive according to the size of the second APP window, so that the size of the corresponding snapshot is adapted to the size of the second APP window.
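The view system's handling of the judgment result can be sketched as a function that leaves the snapshot unchanged when the target APP supports the file type, and overlays a gray layer plus a rejection mark otherwise. All names and field choices below are hypothetical assumptions, not the patent's implementation.

```java
// Hypothetical sketch: decorate a snapshot according to the window manager's
// judgment. A supported type keeps the snapshot as-is (the "first snapshot");
// an unsupported type gets a gray layer and a mark (the "second snapshot").
public class SnapshotDecorator {
    public record Snapshot(String content, String layer, String mark) {}

    public static Snapshot decorate(Snapshot s, boolean targetSupportsType) {
        if (targetSupportsType) {
            return s; // displayed unchanged
        }
        // gray layer plus a mark signalling that the window rejects the object
        return new Snapshot(s.content(), "gray", "not-accepted");
    }
}
```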
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the resource manager provides various resources for the application, such as localization strings, icons, images, layout files, video files, and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (media library), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer at least includes a display driver, a camera driver, an audio driver, a sensor driver, and the like, which are not limited in this embodiment of the present application.
  • the electronic device 100 may display the first window and the second window through the display screen 180 . Based on this, as shown in FIG. 4 , the electronic device 100 receives a drag instruction input by the user, where the drag instruction is used to instruct to drag the first object in the first window into the second window. If the application program corresponding to the second window supports the file type of the first object, the electronic device 100 displays the first snapshot of the first object. If the application program corresponding to the second window does not support the file type of the first object, the electronic device 100 displays a second snapshot of the first object, where the second snapshot includes a first identifier used to indicate that the second window does not accept the first object. In this way, the electronic device 100 can feed back the dragging effect to the user while the user drags the object, thereby improving the user's operation experience.
  • the first window displayed by the electronic device 100 may be a window of a first APP, and the second window may be a window of a second APP.
  • the above-mentioned first object may include text, images, audio files, and the like.
  • the dragged element displayed by the electronic device 100 may be a snapshot of the dragged object.
  • after receiving the drag instruction, the electronic device 100 generates a third snapshot of the first object, and the third snapshot moves with the user's drag track. After the third snapshot is dragged into the second window and before the user lifts the finger, the electronic device 100 may process the third snapshot to obtain the first snapshot or the second snapshot.
  • the snapshot can be the layer of the interface icon of the dragged object, and the content of the snapshot is the same as the interface icon of the dragged object.
  • the size and aspect ratio of the snapshot may be the same as the size and aspect ratio of the interface icon of the corresponding object, such as the size and aspect ratio of the image 13 in FIG. 5 and the size and aspect ratio of the snapshot 131. same.
  • the size and aspect ratio of the snapshot may also be different from the size and aspect ratio of the interface icon of the corresponding object, for example, the size and aspect ratio of the image 23 in FIG. 7 and the size of the snapshot 231 and different aspect ratios.
  • in some embodiments, the above-mentioned first identifier includes a layer; in other embodiments, it includes a mark; in still other embodiments, it includes both a layer and a mark. The layer may be gray or dark, and the mark may be a specific symbol, so that the displayed layer and/or mark reminds the user that the application program corresponding to the second window cannot receive the first object.
  • the first object in the first APP window is dragged into the second APP window, which essentially triggers the electronic device 100 to copy the first object in the first APP into the second APP.
  • if the second APP supports the file type of the first object, the electronic device 100 can copy the first object to the second APP. If the second APP does not support the file type of the first object, or the corresponding window interface of the second APP cannot receive the first object, the electronic device 100 does not need to copy the first object to the second APP.
  • the electronic device 100 may control the snapshot of the first object to return to the window of the first APP, and hide the corresponding snapshot.
  • in one example, the first object is an image and the second APP is a mailbox APP. The electronic device displays the sending address window and the mailbox content window of the mailbox APP, wherein the sending address window cannot receive the first object, but the mailbox content window can receive the first object.
  • in another example, the first object is a video and the second APP is an instant messaging APP.
  • the scenarios involved in the embodiments of the present application are all implementation scenarios before the user lifts the finger, and can also be described as scenarios in which the user's finger touches the screen of the electronic device.
  • the user's finger touching the screen of the electronic device may mean that the user's finger actually contacts the screen of the mobile phone 10; alternatively, when the distance between the user's finger and the screen of the electronic device is less than 0.5 millimeters (mm), the finger may also be regarded as touching the screen of the electronic device.
  • the distance between the finger and the screen may be determined by the touch sensitivity of the mobile phone.
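The touch determination above (actual contact, or a hover distance under 0.5 mm) reduces to a simple predicate. The class and method names below are illustrative assumptions; only the 0.5 mm figure comes from the embodiment.

```java
// Hypothetical sketch: a finger is treated as "touching" the screen either on
// actual contact or when its hover distance is below 0.5 mm.
public class TouchProximity {
    static final double TOUCH_DISTANCE_MM = 0.5; // from the embodiment's description

    public static boolean isTouching(boolean inContact, double hoverDistanceMm) {
        return inContact || hoverDistanceMm < TOUCH_DISTANCE_MM;
    }
}
```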
  • the removal of the user's finger from the screen of the electronic device may be regarded as the user lifting off.
  • the user interface (UI) involved in the embodiments of the present application is a medium interface for interaction and information exchange between an application program or an operating system and a user; it realizes the conversion between an internal form of information and a form acceptable to the user.
  • the user interface of an application is source code written in specific computer languages such as Java and extensible markup language (XML).
  • the interface source code is parsed and rendered on the terminal device, and finally presented as content that the user can recognize.
  • controls, also known as widgets, are the basic elements of the user interface; typical controls include toolbars, menu bars, text boxes, buttons, and scroll bars (scrollbar), etc.
  • the attributes and contents of the above interface elements are defined by tags or nodes.
  • XML specifies the text type of interface elements through nodes such as <Textview>, <ImgView>, and <VideoView>.
  • a node corresponds to an interface element or attribute in the interface, and the node is presented as user-visible content after parsing and rendering.
  • applications such as hybrid applications, often contain web pages in their interface.
  • a web page, also known as a page can be understood as a special control embedded in an application interface.
  • a web page is source code written in specific computer languages, such as hypertext markup language (HTML), cascading style sheets (CSS), and JavaScript (JS).
  • the source code of the web page can be loaded and displayed as user-identifiable content by a browser or a web page display component similar in function to a browser.
  • the specific content contained in a web page is also defined by tags or nodes in the source code of the web page. For example, HTML defines the elements and attributes of web pages through <p>, <img>, <video>, and <canvas>.
  • a graphical user interface (GUI) refers to the visual interface elements displayed on the display screen 180 of the electronic device, such as icons, windows, and controls, where the controls may include icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, and the like.
  • the electronic device 100 presents different display effects according to different reception conditions of the corresponding object by the second APP.
  • the "drag process” described in the embodiments of the present application refers to the operation process from after the user touches the object in the first APP window to before the user leaves the hand.
  • FIG. 5 takes the mobile phone 10 as an example, and a group of GUIs are used to illustrate the dragging process and the interface display effect of the user.
  • the GUI is a display interface when the user performs a drag operation.
  • the screen of the mobile phone 10 displays a first window 11 and a second window 12 .
  • the first window 11 includes an image 13 therein.
  • the view system of the mobile phone 10 draws a snapshot of the image 13 to obtain the snapshot 131 shown in (a) of FIG. 5 .
  • the snapshot 131 is an exemplary implementation of the aforementioned third snapshot.
  • the touch sensor of the mobile phone 10 continues to detect signals. Accordingly, the drag manager of the mobile phone 10 can control the snapshot 131 to move along the user's drag track based on the signals detected by the touch sensor, so as to implement the drag function.
  • the mobile phone 10 can receive a split-screen operation input by the user and, in response, divide the screen into two areas. Afterwards, in response to the user's operation of opening the APPs, the mobile phone 10 displays the first window 11 of the first APP in one of the two areas, and displays the second window 12 of the second APP in the other area. Not detailed here.
  • the first APP and the second APP may be any APP installed on the mobile phone 10 .
  • the first APP is, for example, a gallery.
  • the first window 11 is a window of the gallery
  • the second APP is, for example, a memo.
  • the second window 12 is a window of the memo. No limit here.
  • the window manager of the mobile phone 10 may determine that the snapshot 131 has entered the second window 12 according to the position detected by the touch sensor. After that, the window manager can obtain the file type (ie, the image type) of the image 13 according to the information of the corresponding object (ie, the image 13 ) of the snapshot 131 . Further, the window manager detects whether the file type supported by the second APP (ie, the APP corresponding to the second window 12 ) includes an image. If the file type supported by the second APP contains images, the window manager may transmit the "contained" judgment result to the view system. The view system may not do anything with the snapshot 131.
  • the GUI shown in (a) of FIG. 5 is updated to the GUI shown in (b) of FIG. 5 .
  • the processor of the cell phone 10 may copy the image 13 to the second window 12 and the view system draws the copied image of the image 13 within the second window 12 . If the file type supported by the second APP does not contain images, the window manager may transmit the judgment result of "not contained" to the view system.
  • the view system adds interface elements to snapshot 131 , resulting in snapshot 132 .
  • the GUI shown in (a) of FIG. 5 is updated to the GUI shown in (c) of FIG. 5 .
  • the mobile phone 10 usually determines the position where the user drags the snapshot by detecting the touch position of the user's finger on the screen. Based on this, in the above embodiment, in the scenario where the position determined by the detection signal of the touch sensor enters the second window 12 , the window manager determines that the snapshot 131 is located in the second window 12 .
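Determining which window the dragged snapshot is in thus amounts to a point-in-rectangle hit test on the finger's touch position. The following is a minimal sketch under assumed names and coordinate conventions, not the actual window manager code.

```java
// Hypothetical sketch of the window manager's hit test: the snapshot is "in"
// whichever window's rectangle contains the finger's touch position.
public class WindowHitTest {
    public record Rect(double left, double top, double right, double bottom) {
        public boolean contains(double x, double y) {
            return x >= left && x < right && y >= top && y < bottom;
        }
    }

    // Returns the index of the first window containing (x, y), or -1 if none.
    public static int windowAt(Rect[] windows, double x, double y) {
        for (int i = 0; i < windows.length; i++) {
            if (windows[i].contains(x, y)) return i;
        }
        return -1;
    }
}
```

With two side-by-side split-screen windows, the result tells the window manager when the snapshot has entered the second window.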
  • the "supported file type" of the second APP may refer to a type based on the software framework of the second APP that can normally display, run normally, and implement corresponding functions in response to operation instructions.
  • "implementing a corresponding function in response to an operation instruction” means that after the object is dragged to the window of the second APP, the second APP can perform the operation of the corresponding function on the object based on its own function attributes.
  • the second window 12 is, for example, a chat window of a social APP, and the social APP supports the file type of the image 13. Further, after the image 13 is dragged to the chat window and the user lifts the finger, the social APP can send the duplicate image of the image 13 to the contact corresponding to the chat window.
  • in other embodiments, the second window is a window for email content, and the mailbox APP can insert the duplicate image of the image 13 into the mail as an attachment.
  • the above-mentioned "unsupported file type" of the second APP may refer to a type based on the software framework of the second APP that cannot be displayed normally, cannot run normally, or is limited in performing corresponding functions.
  • the aforementioned interface elements may include layer marks, labels and other marks.
  • the GUI is a display interface when the second APP supports images.
  • the view system does not perform any operation on the snapshot 131 before the user lifts the finger.
  • the snapshot 131 is normally displayed in the second window 12 so that the user can receive the image 13 by visually perceiving the second APP.
  • the first snapshot of the image 13 is implemented as a snapshot 131 in this embodiment.
  • the GUI is a display interface when the second APP does not support images.
  • the view system can add a gray layer and a mark 14 on top of the snapshot 131 to update the snapshot 131 to the snapshot 132 .
  • compared with the bright color of the snapshot 131 , the snapshot 132 is dark or gray, and the upper right corner of the snapshot 132 includes the mark 14, so that the user can visually perceive that the second APP does not receive the image 13.
  • the second snapshot of the image 13 is implemented as the snapshot 132 in this embodiment, and correspondingly, the gray layer and the mark 14 are the first identifier of the second snapshot.
  • the mobile phone 10 may also output a sound to notify the user that the second APP does not support objects whose file type is image.
  • the mobile phone 10 may further notify the user through vibration that the second APP does not support objects whose file type is image.
  • the mobile phone 10 can present different interface display effects to the user based on the receiving APP's support for the file type of the dragged object. In this way, when the user drags an object, the mobile phone 10 can feed back the drag effect to the user, thereby improving the user's operation experience.
  • Scenario 2 illustrates an interface display method of the electronic device 100 in the process that the user drags at least two objects at a time.
  • FIG. 7 still takes the mobile phone 10 as an example, and a set of GUIs are used to illustrate the interface display effect of the user dragging at least two objects.
  • the GUI is a display interface when the user selects the object to be dragged.
  • the screen of the mobile phone 10 displays a first window 21 and a second window 22 .
  • the first window 21 includes an image 23 and text 24 .
  • the mobile phone 10 may, for example, display a check box 230 at the lower right corner of the image 23 and a check box 240 at the lower right corner of the text 24 .
  • in response to the user clicking the check box 230 and the check box 240 , the mobile phone 10 receives a selection instruction that associates the image 23 and the text 24 , and in turn selects the image 23 and the text 24 . After that, upon receiving the drag instruction input by the user, the mobile phone 10 executes the drag function, and the GUI shown in (a) of FIG. 7 is updated to the GUI shown in (b) of FIG. 7 .
  • the mobile phone 10 can recognize the corresponding instruction as the aforementioned dragging instruction.
  • a GUI as shown in (b) of FIG. 7 is marked, and the GUI is a display interface when the user performs a drag operation.
  • the view system of the mobile phone 10 generates a snapshot 231 of the image 23, a snapshot 241 of the text 24, and a logo 25.
  • the content of the logo 25 may be the total number of objects dragged by the user; in this example, the content of the logo 25 is 2.
  • the view system displays the two snapshots as a stack, and highlights the area of the logo 25 in the upper right corner of the stacked snapshots.
  • the drag manager of the mobile phone 10 can control the snapshot 231 , the snapshot 241 , and the logo 25 to move along the user's movement track to implement the drag function. Further, after detecting that the user drags the snapshot 231 and the snapshot 241 to the second window 22, the window manager detects whether the APP corresponding to the second window 22 supports objects whose file type is image, and whether it supports objects whose file type is text. After that, the view system adds interface elements to the snapshot 231 and the snapshot 241 based on the detection results of the window manager. Accordingly, the GUI shown in (b) of FIG. 7 is updated to the GUI shown in (c) of FIG. 7 .
  • the view system may only draw snapshots of the corresponding at least two objects, and display the snapshots of the corresponding at least two objects in a stack without adding a flag indicating the total number of objects.
  • the two snapshots can be scaled to the same size (see (b) in FIG. 7) to present the display effect shown, so that the user's perception experience during the stacked display is better.
  • the view system may generate a snapshot of the same size as the image 23 itself, and generate a snapshot of the same size as the text 24 itself, after which, the two snapshots are not scaled, but are displayed directly stacked. No limit here.
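The proportional scaling mentioned above, making stacked snapshots a common size, can be sketched as fitting each snapshot into a shared target box while preserving its aspect ratio. The record and method names below are assumptions for illustration.

```java
// Hypothetical sketch: scale a snapshot to fit inside a target box while
// preserving its aspect ratio, so stacked snapshots share a common footprint.
public class SnapshotScaler {
    public record Size(double w, double h) {}

    public static Size fitInto(Size snapshot, Size target) {
        // uniform scale factor chosen so neither dimension overflows the box
        double scale = Math.min(target.w() / snapshot.w(), target.h() / snapshot.h());
        return new Size(snapshot.w() * scale, snapshot.h() * scale);
    }
}
```

For example, a wide 200x100 snapshot fitted into a 100x100 box becomes 100x50, keeping its 2:1 aspect ratio.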
  • the GUI is a display interface in which the user drags the snapshot 231 and the snapshot 241 to the second window 22 .
  • the APP corresponding to the second window 22 supports, for example, objects whose file type is image, but does not support objects whose file type is text.
  • the view system sets a flag 26 in the upper right corner of the snapshot 231 to obtain the snapshot 232 , and adds a gray layer on the interface of the snapshot 241 , and sets a flag 27 in the upper right corner of the snapshot 241 to obtain the snapshot 242 .
  • the flag 26 is highlighted, and the content of the flag 26 is, for example, the total number of objects that the APP corresponding to the second window 22 can receive.
  • the mark 27 is gray, and the content of the mark 27 is, for example, the total number of objects that the APP corresponding to the second window 22 does not receive.
  • the view system controls the snapshots 231 and 241 displayed in a stack in (b) of FIG. 7 to become the snapshots 232 and 242 tiled on the interface, so as to display the snapshot 232 and the snapshot 242 to the user.
  • the upper right area of the snapshot 232 includes a highlighted sign 26, the content of which is "1", indicating that the second window 22 can receive one of the above two objects.
  • the snapshot 242 is dark or gray, and the upper right area of the snapshot 242 includes a dark or gray mark 27 whose content is "1", indicating that the second window 22 does not receive one of the two objects.
  • the view system may only add a gray layer and a logo 27 on the snapshot 241 , and do nothing to the snapshot 231 .
  • in this case, the view system can tile and display, on the interface, the snapshot 231 and the snapshot obtained after adding the gray layer and the logo 27 to the snapshot 241.
  • the snapshot after adding the gray layer and the logo 27 is similar to the snapshot 242 shown in (c) of FIG. 7 , and will not be described in detail here.
  • the view system can control the snapshots of the at least three objects to be divided into two groups for display, wherein the snapshots of the objects that can be received by the second APP form one group, at least one snapshot in the group is displayed in a stack, and the total number of objects that the second APP can receive is displayed; the snapshots of the objects not received by the second APP form another group, at least one snapshot in this group can be displayed in a stack, showing the total number of objects not received by the second APP.
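The grouping just described can be modeled as partitioning the dragged objects by whether the target APP accepts each one, with each group's size providing the number shown on its stacked snapshots. All names in this sketch are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: split dragged objects into an accepted group and a
// rejected group; the group sizes feed the numeric badges on the stacks.
public class SnapshotGrouper {
    public record Item(String name, boolean accepted) {}
    public record Groups(List<Item> accepted, List<Item> rejected) {}

    public static Groups partition(List<Item> dragged) {
        List<Item> acc = new ArrayList<>(), rej = new ArrayList<>();
        for (Item it : dragged) {
            if (it.accepted()) acc.add(it); else rej.add(it);
        }
        return new Groups(acc, rej);
    }
}
```

For four dragged objects of which two are accepted, both badges would read 2, matching the "2 and 2" split in the four-object scenario.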
  • the GUI is a display interface for the user to drag and drop four objects.
  • the view system controls the stacked display of the snapshots of the four objects, and sets a highlighted mark whose content is the number 4 in the upper right corner area of the stacked snapshots (not shown in the figure).
  • the window manager determines that the second window 32 can receive two of the objects and cannot receive another two objects. Further, as shown in FIG.
  • the view system controls the stacked display of the two snapshots that the second window 32 can receive, and marks a highlighted mark 33 in the upper right corner of the stacked-snapshot interface; the number indicated by the mark 33 is "2".
  • the view system adds a gray layer to the two snapshots that the second window 32 does not receive, then controls the stacked display of these two snapshots, and marks a gray mark 34 in the upper right area of the stacked-snapshot interface; the number indicated by the mark 34 is "2".
  • for at least three dragged objects, the view system can also divide their snapshots into two groups for display: the snapshots of the objects that the second APP can receive form one group, in which at least one snapshot is tiled and displayed with the sequence number of each snapshot shown in its upper right area; the snapshots of the objects that the second APP does not receive form another group, in which at least one snapshot can be tiled and displayed, likewise with the sequence number of each snapshot shown in its upper right area.
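The two grouping strategies above (stacked groups as in FIG. 8, tiled groups as in FIG. 9) share the same partition step. A minimal sketch in Python (the function and names are hypothetical, not from the patent):

```python
def group_snapshots(objects, is_receivable):
    """Partition dragged objects into a receivable group and a
    non-receivable group; each group is later rendered (stacked or
    tiled) with a badge showing its total count."""
    received = [obj for obj in objects if is_receivable(obj)]
    rejected = [obj for obj in objects if not is_receivable(obj)]
    return {
        "received": {"items": received, "badge": len(received)},
        "rejected": {"items": rejected, "badge": len(rejected)},
    }
```

For example, dragging four objects of which two are images receivable by the second APP yields badges of 2 and 2, matching the signs 33 and 34 described for FIG. 8.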
  • the GUI is another display interface for the user to drag and drop four objects.
  • the view system displays the snapshots of the four objects stacked, and highlights a sign containing the number 4 in the upper right corner area of the stacked snapshots (not shown in the figure).
  • the window manager determines that the second window 42 can receive two of the objects and cannot receive the other two objects.
  • the view system controls the two snapshots that the second window 42 can receive to be tiled and displayed horizontally from the user's viewing angle.
  • the view system adds a gray layer to the two snapshots that the second window 42 does not receive, and tiles the two gray-layered snapshots horizontally, from the user's viewing angle, below the two snapshots that the second window 42 can receive.
  • the sign 45 containing the number 1 in the upper right corner area of the snapshot arranged on the left, and the sign 46 containing the number 2 in the upper right area of the snapshot arranged on the right, are both gray.
  • FIG. 7 to FIG. 9 are only schematic descriptions, and do not limit the embodiments of the present application.
  • the user may also drag more or fewer objects.
  • other display effects can also be presented.
  • the signs containing numbers in the above-mentioned embodiments may be set in the lower right corner area of the snapshot. This embodiment of the present application does not limit this.
  • the mobile phone 10 can present more detailed drag effect information to the user in the process of feedback of the drag effect, so that the user's operation experience can be further improved.
  • the size of the snapshot of the dragged object drawn by the view system is, for example, the first size.
  • the electronic device 100 can adaptively adjust the snapshot of the object from the first size to a second size according to the size of the second window, so that the snapshot fits completely within the window and the ratio of the snapshot size to the window size achieves the best visual effect.
  • the electronic device 100 may reduce the snapshot from the first size to the second size. Exemplarily, as shown in the scene illustrated in (a) of FIG. 10 . In other embodiments, the electronic device 100 may enlarge the snapshot from the first size to the second size. Exemplarily, as shown in the scenario illustrated in (b) of FIG. 10 .
  • the electronic device 100 may adjust the size of the snapshot so that the ratio of the snapshot size to the window size is greater than or equal to the first threshold and less than or equal to the second threshold, so that the visual effect of the snapshot in the second window is optimal.
  • both the first threshold and the second threshold are greater than 0 and less than 1.
  • the first threshold is, for example, 50%
  • the second threshold is, for example, 70%. No limit here.
  • the above-mentioned “size” may include two parameters, a pixel in the x-axis direction (pixel, px) and a pixel in the y-axis direction.
  • the pixels in the x-axis direction of the snapshot are referred to as the "width” of the snapshot
  • the pixels in the y-axis direction of the snapshot are referred to as the "height" of the snapshot.
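With "size" defined as the (width, height) pixel pair above, the best-visual-effect condition can be written as an area-ratio test. A hedged sketch (the threshold defaults of 50% and 70% are only the example values given above):

```python
def ratio_in_range(snapshot, window, t1=0.50, t2=0.70):
    """Return True when the snapshot-to-window area ratio lies in [t1, t2]."""
    snap_w, snap_h = snapshot  # width = pixels along x, height = pixels along y
    win_w, win_h = window
    ratio = (snap_w * snap_h) / (win_w * win_h)
    return t1 <= ratio <= t2
```

With the figures used in FIG. 10, the resized snapshot 54 passes this test in window 52 (about 63%), while the unresized snapshot 51 would not (90%).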
  • the GUI is a display interface for reducing the snapshot size of the mobile phone 10 .
  • the view system of the mobile phone 10 draws a snapshot of the image 50 to obtain a snapshot 51 .
  • Snapshot 51 is, for example, 1080px wide and 500px high.
  • the second APP includes, for example, a window 52 and a window 53 , and the user drags and drops the snapshot 51 into the window 52 , for example.
  • the window 52 is, for example, 1500px wide and 400px high.
  • after detecting that the user drags the snapshot 51 to the window 52 of the second APP and determining that the second APP supports objects whose file type is image, the view system of the mobile phone 10 adjusts the width of the snapshot 51 to, for example, 1000px and its height to, for example, 380px, obtaining the snapshot 54, so that the area of the snapshot 54 occupies 63% of the area of the window 52. In this way, in response to the user's hands-off operation, the mobile phone 10 may generate a new image according to the width and height of the snapshot 54.
  • the first snapshot of image 50 is implemented as snapshot 54 in this embodiment.
  • the GUI is a display interface for the mobile phone 10 to enlarge the snapshot size.
  • the user drags the snapshot 51 into the window 53, for example.
  • the window 53 is, for example, 1500px wide and 1200px high.
  • the view system of the mobile phone 10 adjusts the width of the snapshot 51 to, for example, 1200px and its height to, for example, 900px, obtaining the snapshot 55, so that the area of the snapshot 55 occupies 60% of the area of the window 53.
  • the mobile phone 10 can generate a new image according to the width and height of the snapshot 55 .
  • the first snapshot of image 50 is implemented as snapshot 55 in this embodiment.
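The two adjustments above can be checked numerically; the small helper below (a hypothetical name, not part of the patent) reproduces the 63% and 60% coverage figures quoted for snapshot 54 and snapshot 55:

```python
def area_percent(snapshot, window):
    """Percentage of the window area covered by the snapshot, rounded
    to the nearest whole percent."""
    (snap_w, snap_h), (win_w, win_h) = snapshot, window
    return round(100 * snap_w * snap_h / (win_w * win_h))

shrink = area_percent((1000, 380), (1500, 400))    # snapshot 54 in window 52
enlarge = area_percent((1200, 900), (1500, 1200))  # snapshot 55 in window 53
```

`shrink` evaluates to 63 and `enlarge` to 60, both inside the example 50%–70% band.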
  • the view system may also add a highlighted mark to the snapshot 54 and the snapshot 55 , and the content of the mark is, for example, 1.
  • FIG. 10 is only a schematic description, and the dimensions shown in (a) of FIG. 10 and (b) of FIG. 10 do not limit the embodiments of the present application.
  • if the electronic device 100 is implemented as a device with a larger screen, such as a mobile phone with a folding screen, the size of the above-mentioned window and the size of the snapshot can be larger. This is not detailed here.
  • the mobile phone 10 can adaptively adjust the size of the snapshot according to the size of the receiving window during the dragging process by the user, so that the user can preview the size of the received object, thereby further improving the user's operating experience.
  • the third scenario describes an interface display method of the electronic device 100 when the second APP has already received the first object. If the second APP continues to receive a second object and further objects, the electronic device 100 may display a preview effect of each object according to the size of the window and the number of objects already received. The electronic device 100 may also display a preview of the relative positions of the snapshot of the second object and the icon of the first object according to the position triggered during the user's drag.
  • the electronic device 100 may adjust the size of each object's interface element in the window such that the sum of the sizes of all object icons in the window accounts for a proportion of the window size that is greater than or equal to the first threshold and less than or equal to the second threshold, to show the user a preview of the interface after the user releases the drag.
  • the first threshold and the second threshold are as described in the above-mentioned embodiments, and are not repeated here.
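One way to satisfy the all-icons-within-thresholds condition above is to split a target fraction of the window width evenly among the received objects. The sketch below assumes full-height icons tiled left to right and aims at the midpoint of the [first threshold, second threshold] band; all names are illustrative:

```python
def layout_icons(win_w, win_h, count, t1=0.5, t2=0.7):
    """Return one (width, height) per icon so that the summed icon area,
    as a fraction of the window area, lies inside [t1, t2]."""
    target = (t1 + t2) / 2           # aim for the middle of the band
    icon_w = target * win_w / count  # full-height icons tiled horizontally
    total_fraction = count * icon_w * win_h / (win_w * win_h)
    assert t1 <= total_fraction <= t2
    return [(icon_w, win_h)] * count
```

For the 1500px by 1300px window 60 of FIG. 11 holding two objects, each icon gets a width of 450px at full height, covering 60% of the window in total.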
  • in scenario 4, taking the mobile phone 10 as an example, the interface display method of the electronic device 100 is described through an implementation scenario of dragging two objects, in combination with a set of GUIs shown in FIG. 11.
  • the GUI is an exemplary display interface of the mobile phone 10 .
  • the second APP has received the first object 61 , and the icon of the first object 61 is shown in the second window 60 .
  • the second window 60 is, for example, 1500px wide and 1300px high.
  • the icon of the first object 61 is, for example, 1200px wide and 1100px high.
  • the GUI is an exemplary preview display interface of the mobile phone 10 in the process of the user dragging the second object to the second APP.
  • the window manager of the mobile phone 10 detects that the position touched by the user is on the right side of the first object 61; further, the view system of the mobile phone 10 can reduce the width of the icon of the first object 61 to 720px, keeping the height unchanged, to obtain the snapshot 63.
  • the view system also adjusts the width of the snapshot 62 to 720px and its height to 1300px to obtain the snapshot 64, and sets the snapshot 64 to the right of the snapshot 63 to present the preview interface of the first object 61 and the second object, showing the user the display effect in advance.
  • the view system arranges and displays the copied content of the second object on the right side of the first object according to the preview display effect shown in (b) in FIG. 11, and makes the size of the icon of the second object and the size of the icon of the first object as described above.
  • the GUI is another exemplary preview display interface of the mobile phone 10 during the process of the user dragging the second object to the second APP.
  • the window manager of the mobile phone 10 detects that the position touched by the user is on the left side of the first object 61, and the view system sets the snapshot 64 on the left side of the snapshot 63, correspondingly adjusting the size of the icon of the first object 61 and the size of the snapshot 63, to present another preview display interface of the first object 61 and the second object.
  • the view system arranges and displays the copied content of the second object on the left side of the first object according to the preview display effect shown in (c) in FIG. 11 , and makes the size of the icon of the second object and the size of the first object icon as previously described.
  • FIG. 11 is only a schematic description, and does not constitute a limitation to the embodiments of the present application.
  • the icons or snapshots of the above two objects may also be arranged according to the upper and lower positional relationship, which is not limited here.
  • the mobile phone 10 can adaptively adjust the size of the icons of the received objects, and the size of the snapshot being dragged in, according to the total number of objects received by the second APP. The mobile phone 10 can also adjust the arrangement order of the icons in response to the position touched by the user's finger. This is not described in detail here.
  • the mobile phone 10 can adaptively adjust the size and arrangement order of each icon according to the size of the receiving window and the number of objects that have been received during the dragging process by the user, so that the user can preview the size and order of the received objects. Arrange the order, so as to further improve the user's operating experience.
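The reordering in (b) and (c) of FIG. 11 reduces to choosing an insertion slot from the horizontal touch position. A minimal sketch (coordinates and names are assumptions for illustration):

```python
def insertion_index(touch_x, icon_centers_x):
    """Index at which the incoming snapshot is placed among the icons
    already in the window: before the first icon whose horizontal
    center lies to the right of the touch position."""
    for i, center_x in enumerate(icon_centers_x):
        if touch_x < center_x:
            return i
    return len(icon_centers_x)
```

With the first object's icon centered at x=750, a touch at x=300 gives slot 0 (to its left, as in (c) of FIG. 11), while a touch at x=1200 gives slot 1 (to its right, as in (b)).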
  • although any one of the above scenarios 1 to 4 is described by taking a mobile phone with a non-flexible screen as an example, the embodiments illustrated in these scenarios are also applicable to a mobile phone with a flexible screen. This is not repeated here.
  • FIG. 5 to FIG. 11 are only schematic descriptions, and do not limit the embodiments of the present application.
  • the electronic device 100 may also determine the user's drag operation according to an instruction entered by the user through a touchpad or a mouse.
  • the electronic device 100 may also notify the user through other interface display effects.
  • the mobile phone 10 can also display a dialog box whose content may be, for example, the reminder message "x cannot receive the image 13", where "x" is the name of the second APP. This is not limited here.
  • the first device may be wirelessly connected to the second device; after that, in response to the collaborative office setting input by the user, the interface of the first device may, for example, display the collaboration window of the second device. Furthermore, the first device can localize the applications of the second device, so that the user can operate the second device on the first device side.
  • the operation interface of the notebook computer 30 is as shown in the GUI in FIG. 13.
  • the GUI shown in (a) of FIG. 13 is a display interface of the notebook computer after the collaborative office connection between the notebook computer and the mobile phone is established.
  • the screen of the notebook computer 30 displays the application program of the notebook computer 30 and the collaboration interface 71 of the mobile phone.
  • the collaborative interface of the mobile phone includes, for example, interface elements such as APP icons included in the main interface of the mobile phone.
  • the notebook computer receives the instruction of the user to click the application program in the notebook computer, and can display the window of the corresponding application program on the screen of the notebook computer.
  • the notebook computer receives the instruction of the user to click on the APP in the collaboration interface 71 , and can display the window of the corresponding APP in the collaboration interface 71 .
  • the GUI shown in (a) of FIG. 13 is updated to the GUI shown in (b) of FIG. 13 .
  • the GUI shown in (b) of FIG. 13 is a display interface that displays two windows for the notebook computer.
  • the screen of the notebook computer displays the mailbox window 72 and the gallery window 73 , wherein the window 73 is displayed in the collaboration interface 71 , and at least one image is displayed in the window 73 .
  • the notebook computer can execute the embodiment illustrated in any one of the first to fourth scenarios.
  • the notebook computer can also respond to the user's operation instruction of dragging any object from the window 72 to the window 73, and execute the embodiment illustrated in any one of scenarios 1 to 4. This is not repeated here.
  • FIG. 12 to FIG. 13 are only schematic descriptions, and do not limit the embodiments of the present application.
  • in response to the user clicking to open other applications, the notebook computer may also display the windows of those applications.
  • the display interface may differ from the above-mentioned embodiments; for example, the collaborative office interface may be black, and more windows may be displayed. This is not limited here.
  • the electronic device can process the snapshot of an object according to whether the target APP supports the object's file type, the size of the target APP's window, and the like, to present different display effects. In this way, the electronic device can show the user, during the drag, whether the target APP will receive the object, thereby improving the user's experience.
  • the above embodiments have introduced various solutions of the interface display method provided by the present application from the perspective of the hardware structure of the electronic device and the actions performed by each software and hardware.
  • Those skilled in the art should readily appreciate that the processing steps described in conjunction with the embodiments disclosed herein, such as receiving a drag instruction, detecting whether the APP corresponding to the second window supports the file type of the dragged object, and displaying different snapshots, can be implemented not only in the form of hardware but also in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and the design constraints of the technical solution. Skilled artisans may use different methods for each specific application to implement the described functions, but such implementations should not be considered beyond the scope of the embodiments of the present application.
  • the above-mentioned electronic device 100 may implement the above-mentioned corresponding functions in the form of functional modules.
  • the electronic device may include a receiving module, a processing module and a display module.
  • the receiving module may be used to perform the receiving of instructions and information in any of the embodiments illustrated in FIG. 4 to FIG. 13 .
  • the display module can be used to perform the display of windows and snapshots in any of the above-mentioned embodiments illustrated in FIGS. 4 to 13 .
  • the processing module may be configured to perform operations other than the reception of instructions and information and the display of windows and snapshots in any of the above-described embodiments illustrated in FIGS. 4 to 13 .
  • the electronic device 140 includes a receiver 1401 , a processor 1402 and a display 1403 .
  • the receiver 1401 can perform the reception of instructions and information in any of the embodiments illustrated in FIG. 4 to FIG. 13 .
  • the display 1403 can be used to perform the display of windows and snapshots in any of the above-mentioned embodiments illustrated in FIGS. 4 to 13 .
  • the processor 1402 may be configured to perform operations other than the reception of instructions and information and the display of windows and snapshots in any of the above-described embodiments illustrated in FIGS. 4 to 13 .
  • the display 1403 displays a first window and a second window.
  • the receiver 1401 may be configured to receive a drag instruction input by the user, where the drag instruction is used to instruct the electronic device to drag the first object in the first window to the second window Inside.
  • the processor 1402 may be configured to control the display to display the first snapshot of the first object when the application corresponding to the second window supports the file type of the first object.
  • the processor 1402 may also be configured to control the display to display a second snapshot of the first object when the application program corresponding to the second window does not support the file type of the first object, where the second snapshot includes a first identifier used to indicate that the second window does not accept the first object.
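The receiver/processor/display split above amounts to a small dispatch on file-type support. A hedged Python sketch (the data structures are illustrative, not the device's actual API):

```python
def choose_snapshot(file_type, supported_types, base_snapshot):
    """Decide what the display shows while the drag hovers over the
    second window: the first snapshot when the type is supported,
    otherwise the second snapshot carrying the do-not-accept identifier."""
    if file_type in supported_types:
        return {"snapshot": base_snapshot, "identifier": None}
    return {"snapshot": base_snapshot, "identifier": "not-accepted"}
```

The processor would invoke such a decision when the receiver reports that the drag position has entered the second window, and hand the result to the display.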
  • FIG. 14A describes the electronic device of the present application from the perspective of independent functional entities.
  • functional entities that run independently may be integrated into one hardware entity.
  • the electronic device 141 may include a processor 1411 , a transceiver 1412 and a memory 1413 .
  • the electronic device 141 in this embodiment of the present application may correspond to the electronic device involved in FIG. 4 , the mobile phone 10 in the methods illustrated in FIGS. 5 to 11 , and the notebook computer in the embodiments illustrated in FIGS. 12 and 13 .
  • the transceiver 1412 is used for receiving the instructions executed by the electronic device described in FIG. 4 to FIG. 13
  • the memory 1413 can be used for storing codes
  • the processor 1411 is used for executing the codes stored in the memory 1413, so as to implement the processing performed by the electronic device described in FIG. 4 to FIG. 13 other than receiving instructions and displaying snapshots; this is not repeated here.
  • the embodiment of the present application also provides a computer storage medium corresponding to the electronic device, wherein the computer storage medium provided in the electronic device can store a program, and when the program is executed, it can implement the methods provided in FIG. 4 to FIG. 13 . Part or all of the steps in each embodiment of the interface display method.
  • the storage medium in any device may be a magnetic disk, an optical disk, a read-only memory (ROM) or a random access memory (RAM), and the like.
  • One or more of the above modules or units may be implemented in software, hardware or a combination of both.
  • the software exists in the form of computer program instructions and is stored in the memory, and the processor can be used to execute the program instructions and implement the above method flow.
  • the processor may include, but is not limited to, at least one of the following: a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a microcontroller (MCU), an artificial intelligence processor, or other types of computing devices that run software; each computing device may include one or more cores for executing software instructions to perform operations or processing.
  • the processor can be built in a SoC (system on chip) or an application specific integrated circuit (ASIC), or can be an independent semiconductor chip.
  • the internal processing of the processor may further include necessary hardware accelerators, such as field programmable gate array (FPGA), PLD (Programmable Logic Device) , or a logic circuit that implements dedicated logic operations.
  • the hardware can be any one or any combination of a CPU, a microprocessor, a DSP, an MCU, an artificial intelligence processor, an ASIC, an SoC, an FPGA, a PLD, a dedicated digital circuit, a hardware accelerator, or a non-integrated discrete device, which may or may not run the necessary software to perform the above method flow.
  • When the above modules or units are implemented using software, they can be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present invention are generated.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (eg, coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (eg, infrared, radio, microwave) manner.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that includes an integration of one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVD), or semiconductor media (eg, Solid State Disk (SSD)), and the like.
  • the sequence numbers of the processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments.


Abstract

This application relates to the field of electronic technologies, and discloses an interface display method and an electronic device. The screen of the electronic device involved in this application displays a first window and a second window. The electronic device receives a drag instruction for dragging a first object in the first window into the second window. When the application corresponding to the second window supports the file type of the first object, the electronic device displays a first snapshot of the first object. When the application corresponding to the second window does not support the file type of the first object, the electronic device displays a second snapshot of the first object. The first snapshot is different from the second snapshot. In this way, while the user is dragging the first object, the electronic device can present to the user a display effect indicating whether the target window accepts the object, thereby improving the user experience.

Description

Interface display method and electronic device
This application claims priority to Chinese Patent Application No. 202010940657.1, filed with the China National Intellectual Property Administration on September 9, 2020 and entitled "Interface display method and electronic device", which is incorporated herein by reference in its entirety.
Technical Field
The embodiments of this application relate to the field of terminal technologies, and in particular, to an interface display method and an electronic device.
Background
As the display screens of electronic devices grow larger, the number of application (APP) windows that can be displayed on a screen at the same time also increases, and techniques for exchanging content between APP windows have been derived from this. In the interface shown in FIG. 1, the display screen of the electronic device displays a window 01 of a first APP and a window 02 of a second APP; window 01 contains an image 1, and the user can drag the image 1 from window 01 into window 02. Usually, while the user drags the image 1 to window 02, the electronic device gives no feedback on the drag effect, which makes for a poor user experience.
Summary
The embodiments of this application provide an interface display method and an electronic device that can feed back the drag effect to the user in a timely manner while the user performs a drag operation.
In a first aspect, an embodiment of this application provides an interface display method applied to an electronic device whose screen displays a first window and a second window. The method includes: receiving a drag instruction input by a user, the drag instruction instructing the electronic device to drag a first object in the first window into the second window; if the application corresponding to the second window supports the file type of the first object, displaying a first snapshot of the first object; and if the application corresponding to the second window does not support the file type of the first object, displaying a second snapshot of the first object, the second snapshot containing a first identifier used to indicate that the second window does not accept the first object.
The electronic device involved in the embodiments of this application supports the display of multiple windows. The first window may be a window of a first APP, and the second window may be a window of a second APP. Dragging the first object from the first window into the second window essentially triggers the electronic device to copy the first object run by the first APP into the second APP. In the embodiments of this application, "the user drags the first object in the first window into the second window" means that the position touched by the user's finger on the screen enters the second window. On this basis, when the second APP supports the file type of the first object, the second window can receive the first object; when the second APP does not support the file type, the second window cannot receive it. In the embodiments of this application, before the user lifts the finger, the electronic device can inform the user of whether the second window accepts the first object by displaying different snapshots.
It can be seen that with this implementation, the electronic device can display the first object differently depending on whether the application corresponding to the second window supports its file type, thereby providing timely and accurate feedback during the user's drag operation.
In a possible design, displaying the second snapshot of the first object includes: adding a layer to a third snapshot of the first object to obtain the second snapshot, where the third snapshot is generated after the electronic device receives the drag instruction and before it determines that the position triggered by the user enters the second window. The layer is used to present to the user the display effect that the second window cannot receive the first object; in some embodiments, the layer may be gray or dark. In this way, the drag effect can be fed back to the user in time while the user drags the first object.
In a possible design, displaying the second snapshot of the first object includes: adding a first sign to a third snapshot of the first object to obtain the second snapshot, where the third snapshot is generated after the electronic device receives the drag instruction and before it determines that the position triggered by the user enters the second window. The sign is used to present to the user the display effect that the second window cannot receive the first object. In this way, the drag effect can be fed back to the user in time while the user drags the first object.
In a possible design, the first sign is a graphic mark or a number label, the first object includes at least one object, and the number label is used to indicate the total number of first objects that the second window does not receive. In some embodiments, the first sign may be a specific symbol; in other embodiments, it may be a count symbol indicating the total number of first objects the second window does not receive. In this embodiment, the first sign may be gray or dark. With this implementation, the electronic device can present more detailed drag-effect information to the user while feeding back the drag effect, further improving the user's operation experience.
In a possible design, displaying the first snapshot of the first object includes: after determining that a third snapshot of the first object enters the second window, obtaining a first size of the third snapshot and the size of the second window, where the third snapshot is generated after the electronic device receives the drag instruction and before it determines that the position triggered by the user enters the second window; and adjusting the third snapshot from the first size to a second size to obtain the first snapshot, where the ratio of the second size to the size of the second window is greater than or equal to a first threshold and less than or equal to a second threshold, both thresholds being greater than 0 and less than 1. The electronic device can adjust the size of the first snapshot based on the size of the second window so that the snapshot achieves the best visual effect in the second window. With this implementation, the electronic device lets the user preview the size of the received first object during the drag, further improving the user's operation experience.
In a possible design, adjusting the third snapshot from the first size to the second size to obtain the first snapshot includes: reducing the third snapshot from the first size to the second size to obtain the first snapshot; or enlarging the third snapshot from the first size to the second size to obtain the first snapshot. With this implementation, the electronic device can adaptively adjust the size of the snapshot to the size of the receiving window during the drag, so that the user can preview the size of the received object, further improving the user's operation experience.
In a possible design, the second window includes an icon of a second object, and displaying the first snapshot corresponding to the first object includes: after determining that a third snapshot of the first object enters the second window, obtaining the first size of the third snapshot, the size of the second window, and a third size of the icon; adjusting the third snapshot from the first size to a fourth size to obtain the first snapshot, and adjusting the icon from the third size to the fourth size, where twice the fourth size accounts for a proportion of the size of the second window that is greater than or equal to a first threshold and less than or equal to a second threshold, both thresholds being greater than 0 and less than 1; and arranging the first snapshot and the resized icon according to the position triggered by the user. When the second window already contains another object, the electronic device can adaptively adjust the size and arrangement order of the icons according to the size of the second window and the number of objects in it, presenting a preview so that the user sees in advance the size and arrangement order of the received objects, further improving the user's operation experience.
In a possible design, the first object includes at least one object, and displaying the first snapshot corresponding to the first object further includes: adding a second identifier to the third snapshot to obtain the first snapshot, where the second identifier is used to indicate the total number of first objects that the second window can receive. With this implementation, the electronic device can present more detailed drag-effect information to the user while feeding back the drag effect, further improving the user's operation experience.
In a possible design, after receiving the drag instruction input by the user and before displaying the first snapshot or the second snapshot, the method further includes: generating a third snapshot of the first object, where the first object includes at least one object; and adding a third identifier to the third snapshot, where the third identifier is used to indicate the total number of first objects. In the embodiments of this application, the electronic device also supports dragging at least two objects at once; accordingly, during the drag, the electronic device can mark the total number of dragged objects. With this implementation, the electronic device can present more detailed drag-effect information to the user while feeding back the drag effect, further improving the user's operation experience.
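The third identifier described above is simply a count stamped on the stacked third snapshot while the objects are dragged; a minimal sketch (the helper name is hypothetical):

```python
def third_identifier(dragged_objects):
    """Text of the third identifier: the total number of dragged
    objects, shown on the stacked snapshots during the drag."""
    return str(len(dragged_objects))
```

For the four-object drags of FIG. 7 and FIG. 9, this produces the sign "4" set in the corner area of the stacked snapshots.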
In a second aspect, an embodiment of this application provides an electronic device having the function of implementing the behavior of the electronic device in the foregoing method. The function may be implemented by hardware, or by hardware executing corresponding software; the hardware or software includes one or more modules corresponding to the function. In a possible design, the electronic device includes a processor, a receiver, and a display, where the processor is configured to perform the corresponding functions of the electronic device in the foregoing method, the receiver is used to receive the instructions input by the user, and the display is used to display the snapshots. The electronic device may further include a memory coupled to the processor, which stores the program instructions and data necessary for the electronic device.
In a third aspect, this application provides a computer storage medium storing instructions that, when run on a computer, cause the computer to perform some or all of the steps of the interface display method in the first aspect and its various possible implementations.
In a fourth aspect, this application provides a computer program product that, when run on a computer, causes the computer to perform some or all of the steps of the interface display method in the first aspect and its various possible implementations.
Brief Description of the Drawings
To describe the technical solutions of the embodiments of this application more clearly, the following briefly introduces the drawings used in the embodiments. Obviously, a person of ordinary skill in the art may derive other drawings from these drawings without creative effort.
FIG. 1 is a schematic interface diagram of a drag scenario according to an embodiment of this application;
FIG. 2A is a schematic diagram of an exemplary hardware structure of the electronic device 100 according to an embodiment of this application;
FIG. 2B is a schematic diagram of an exemplary software architecture of the electronic device 100 according to an embodiment of this application;
FIG. 3 is a schematic diagram of exemplary signal interaction within the Android operating system according to an embodiment of this application;
FIG. 4 is a flowchart of the interface display method according to an embodiment of this application;
FIG. 5 is a schematic interface diagram of a first exemplary drag scenario according to an embodiment of this application;
FIG. 6 is a schematic interface diagram after the user lifts the finger in the drag scenario illustrated in FIG. 5 according to an embodiment of this application;
FIG. 7 is a schematic interface diagram of a second exemplary drag scenario according to an embodiment of this application;
FIG. 8 is a schematic interface diagram of a third exemplary drag scenario according to an embodiment of this application;
FIG. 9 is a schematic interface diagram of a fourth exemplary drag scenario according to an embodiment of this application;
FIG. 10 is a schematic interface diagram of a fifth exemplary drag scenario according to an embodiment of this application;
FIG. 11 is a schematic interface diagram of a sixth exemplary drag scenario according to an embodiment of this application;
FIG. 12 is a schematic interface diagram of an exemplary collaborative office scenario according to an embodiment of this application;
FIG. 13 is a schematic interface diagram of an exemplary drag scenario based on FIG. 12 according to an embodiment of this application;
FIG. 14A is a schematic diagram of an exemplary composition of the electronic device 140 according to an embodiment of this application;
FIG. 14B is a schematic diagram of an exemplary structure of the electronic device 141 according to an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of this application are clearly described below with reference to the accompanying drawings.
The terms used in the following embodiments are only for the purpose of describing particular embodiments and are not intended to limit the embodiments of this application. As used in the specification and appended claims of the embodiments of this application, the singular forms "a", "an", "the", "above", "said", and "this" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that although the terms first, second, and so on may be used in the following embodiments to describe a class of objects, the objects should not be limited by these terms; the terms are only used to distinguish specific objects of that class. For example, the terms first and second may be used to describe windows, but the windows should not be limited by these terms, which are only used to distinguish different windows displayed on the display screen; the same applies to other classes of objects described with the terms first, second, and so on, and is not repeated here. In addition, the term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects.
The embodiments of this application provide an interface display method. When the screen of an electronic device displays a first window and a second window at the same time, the user can drag a first object in the first window into the second window, and the first object is displayed differently depending on whether the application corresponding to the second window supports its file type, thereby providing timely and accurate feedback during the user's drag operation.
For example, when a document in the first window is dragged to the second window: if the application corresponding to the second window supports the document's file type, the document is highlighted in the second window; if the application does not support the document's file type, the document is displayed grayed out in the second window.
The following describes the electronic device, user interfaces for such an electronic device, and embodiments for using such an electronic device.
In some embodiments, the electronic device may be a portable electronic device that also includes other functions such as a personal digital assistant and/or a music player, such as a mobile phone, a tablet computer, or a wearable electronic device with wireless communication capability (such as a smart watch). Exemplary embodiments of the portable electronic device include, but are not limited to, portable electronic devices running or other operating systems. The portable electronic device may also be another portable electronic device, such as a laptop. It should also be understood that in some other embodiments, the electronic device may not be a portable electronic device but a desktop computer.
FIG. 2A shows a schematic structural diagram of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, an antenna, a wireless communication module 140, an audio module 150, a sensor module 160, a motor 170, a display screen 180, and so on.
It can be understood that the structure illustrated in the embodiments of this application does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated into one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110.
The controller can generate operation control signals according to the instruction operation code and the timing signal, completing the control of fetching and executing instructions.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache, which can hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves the efficiency of the electronic device 100.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple groups of I2C buses and may be coupled to the touch sensor 160B and other components through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 160B through an I2C interface so that the processor 110 communicates with the touch sensor 160B through the I2C bus interface, realizing the touch and drag functions of the electronic device 100.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus that converts the data to be transmitted between serial and parallel communication. In some embodiments, the UART interface is typically used to connect the processor 110 and the wireless communication module 140. For example, the processor 110 communicates with the Bluetooth module in the wireless communication module 140 through the UART interface to realize the Bluetooth function. In some embodiments, the audio module 150 can transmit audio signals to the wireless communication module 140 through the UART interface, realizing the function of playing music through a Bluetooth headset.
The MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 180, and includes a display serial interface (DSI), etc. In some embodiments, the processor 110 and the display screen 180 communicate through the DSI interface to realize the display function of the electronic device 100, presenting the drag display effect to the user.
The GPIO interface can be configured through software, as either a control signal or a data signal. In some embodiments, the GPIO interface can be used to connect the processor 110 with the camera 183, the display screen 180, the wireless communication module 140, the audio module 150, the sensor module 160, and so on. The GPIO interface can also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, etc. The USB interface 130 can be used to connect a charger to charge the electronic device 100, or to transfer data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through them, and to connect other electronic devices such as AR devices.
It can be understood that the interface connection relationships between the modules illustrated in the embodiments of this application are only schematic and do not constitute a structural limitation on the electronic device 100. In other embodiments, the electronic device 100 may also adopt interface connection modes different from those of the above embodiments, or a combination of multiple interface connection modes.
The wireless communication function of the electronic device 100 can be realized through the antenna, the wireless communication module 140, the baseband processor, and so on.
The antenna is used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 can be used to cover a single communication frequency band or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization; for example, an antenna can be multiplexed as a diversity antenna of a wireless local area network. In some other embodiments, the antenna can be used in combination with a tuning switch.
The wireless communication module 140 can provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and so on. The wireless communication module 140 may be one or more devices integrating at least one communication processing module. It receives electromagnetic waves via the antenna, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 140 can also receive signals to be sent from the processor 110, perform frequency modulation and amplification on them, and convert them into electromagnetic waves for radiation via the antenna.
在一些实施例中,电子设备100的天线和无线通信模块140耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
在一些实施例中,无线通信模块140提供的WLAN无线通信的解决方案也可使得电子设备可以与网络中的设备(如与电子设备100协同的设备)通信。这样,电子设备便可以与协同设备之间进行数据传输。
电子设备100通过GPU,显示屏180,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏180和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变界面显示效果。在本申请实施例中,显示屏180中可包括显示器和触控器件。显示器用于向用户输出显示内容,例如,本申请实施例涉及的第一快照和第二快照,第二快照包含的第一标识,以及用户离手之前预览的显示效果等。触控器件用于接收用户在显示屏180上输入的拖拽操作。
显示屏180用于显示本申请实施例涉及的用户界面(user interface,UI)。显示屏180包括显示面板,显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。
在本申请的一些实施例中,当显示面板采用OLED、AMOLED、FLED等材料时,显示屏180可以被弯折。这里,显示屏180可以被弯折是指显示屏180可以在任意部位被弯折到任意角度,并可以在该角度保持,例如,显示屏180可以从中部左右对折,也可以从中部上下对折。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐、照片、视频等数据保存在外部存储卡中。
内部存储器121可以用于存储一个或多个计算机程序,该一个或多个计算机程序包括指令。处理器110可以通过运行存储在内部存储器121的上述指令,从而使得电子设备100执行本申请一些实施例中所提供的界面显示方法,以及各种功能应用以及数据处理等。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统;该存储程序区还可以存储一个或多个应用程序(比如图库、联系人等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如用户拖拽的对象的数量等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
音频模块150用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块150还可以用于对音频信号编码和解码。在一些实施例中,音频模块150可以设置于处理器110中,或将音频模块150的部分功能模块设置于处理器110中。音频模块150可以包括扬声器、麦克风和耳机接口等。
传感器模块160可以包括压力传感器160A,触摸传感器160B。
压力传感器160A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器160A可以设置于显示屏180。压力传感器160A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器160A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏180,电子设备100根据压力传感器160A检测所述触摸操作强度。电子设备100也可以根据压力传感器160A的检测信号计算触摸的位置。
触摸传感器160B,也可称触控面板或触敏表面。触摸传感器160B可以设置于显示屏180,由触摸传感器160B与显示屏180组成触摸屏,也称“触控屏”。触摸传感器160B用于检测作用于其上或附近的触摸操作和离手操作。触摸传感器160B可以将检测到的触摸操作传递给处理器110,以确定触摸事件类型。电子设备100可以根据触摸传感器160B的检测信号计算用户触摸的位置和用户的离手的位置,还可以根据用户触摸位置的连续变化,确定识别用户的拖拽操作。进而,电子设备100可以通过显示屏180提供与前述相关操作(触摸操作、离手操作和拖拽操作)相关的视觉输出。
传感器模块160还可以包括陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,环境光传感器,骨传导传感器等。
马达170可以产生振动提示。马达170可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏180不同区域的触摸操作,马达170也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
上述电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。
图2B是本申请实施例的电子设备100的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图2B所示,应用程序包可以包括相机,图库,邮箱,蓝牙,备忘录,音乐,视频,文件管理等应用程序(APP)。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图2B所示,应用程序框架层可以包括窗口管理器,视图系统,拖拽管理器,内容提供器,资源管理器,通知管理器等。其中,应用程序框架层各功能模块可以集成到图2A示意的处理器110中,本实施例中应用程序框架层的功能可以由图2A示意的硬件处理器110实现。
窗口管理器用于管理窗口程序。示例性的,窗口管理器可以获取显示屏180大小,判断是否有状态栏,锁定屏幕,截取屏幕等。窗口管理器还可以对应用程序层中各APP的分布,以及各APP的窗口布局进行管理,以实现显示屏180显示两个APP窗口的功能。此外,窗口管理器具备识别APP所支持的文件类型等功能,这样,窗口管理器能够确定APP是否能够支持用户拖入对象的文件类型。
视图系统包括可视界面元素,例如显示文字的界面元素,显示图像的界面元素等。视图系统可用于构建APP的显示界面。显示界面可以由一个或多个视图组成的。例如,包括各类APP图标的显示界面等。视图系统还可以构建被拖拽对象的快照。所述快照例如包括快照的尺寸,标识等,所述标识可以包括图层和标志等。
拖拽管理器可以基于触摸传感器160B上报的检测信号,确定用户触摸的位置以及相应对象的快照。进而,拖拽管理器可以控制相应快照随着用户触摸的位置在显示屏180移动,以实现拖拽功能。
示例性的,与图2B相似的,图3示意了Android操作系统内部的信号交互图。其中,窗口管理器可以响应用户的操作指令,控制第一APP的窗口和第二APP的窗口在显示屏180显示。在获取到触摸传感器160B上报的触摸信号和用户手势的移动指令之后,视图系统绘制触摸信号对应的对象的快照。并且,拖拽管理器控制相应快照随用户的手势轨迹移动。窗口管理器可以检测快照在显示屏180的位置。当检测到前述快照位于第二APP的窗口范围内之后,窗口管理器可以判断第二APP是否支持快照对应的文件类型。之后,窗口管理器将判断结果传输到视图系统。进而,视图系统基于判断结果为前述快照添加图层,以通过图层的颜色和亮度向用户反馈第二APP接收相应对象的显示效果。另一些实施例中,视图系统还可以为前述快照添加图标,以指示用户拖拽的对象的数量。另一些实施例中,视图系统还可以根据第二APP窗口的尺寸,绘制第二APP允许接收的快照的尺寸,以使相应快照的尺寸与第二APP窗口的尺寸相适应。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图像,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(media libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动等,本申请实施例对此不做任何限制。
为了便于理解,本申请以下实施例将以具有图2A和图2B所示结构的电子设备100为例,结合附图对本申请实施例提供的界面显示方法进行具体阐述。
本申请实施例中,电子设备100可以通过显示屏180显示第一窗口和第二窗口。基于此,如图4所示,电子设备100接收用户输入的拖拽指令,该拖拽指令用于指示将第一窗口内的第一对象拖拽到第二窗口内。若第二窗口对应的应用程序支持第一对象的文件类型,电子设备100显示第一对象的第一快照。若第二窗口对应的应用程序不支持第一对象的文件类型,电子设备100显示第一对象的第二快照,第二快照包含第一标识,该第一标识用于指示第二窗口不接受第一对象。这样,电子设备100能够在用户拖拽对象的过程中,即可向用户反馈拖拽效果,从而提高用户的操作体验。
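上述"根据第二窗口对应的应用程序是否支持第一对象的文件类型,决定显示第一快照还是第二快照"的判断过程,可以用下面一段与平台无关的Java代码勾勒。需要说明的是,这只是帮助理解的最小示意草图,其中的类名、方法名以及用字符串表示快照的方式均为示例性假设,并非本申请或Android框架的实际实现:

```java
import java.util.Set;

/** 示意:根据第二窗口对应APP支持的文件类型,决定显示第一快照还是第二快照。 */
class SnapshotSelector {
    /** 第二快照包含的"第一标识"(灰色图层和/或标志)的示意表示。 */
    static final String REJECT_MARK = "灰色图层+标志";

    /** supportedTypes:第二窗口对应APP支持的文件类型;fileType:第一对象的文件类型。 */
    static String selectSnapshot(Set<String> supportedTypes, String fileType) {
        if (supportedTypes.contains(fileType)) {
            return "第一快照";                    // 正常显示,表示第二窗口可接收第一对象
        }
        return "第二快照(" + REJECT_MARK + ")";  // 附加第一标识,表示第二窗口不接收第一对象
    }

    public static void main(String[] args) {
        Set<String> supported = Set.of("图像", "文本"); // 示例:第二APP支持图像和文本
        System.out.println(selectSnapshot(supported, "图像"));
        System.out.println(selectSnapshot(supported, "音频"));
    }
}
```

实际实现中,"第一标识"对应为叠加在快照之上的灰色图层和/或标志,而非字符串,此处仅示意判断分支。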
其中,电子设备100所显示的第一窗口可以是第一APP的窗口,第二窗口可以是第二APP的窗口。上述第一对象可以包括文本、图像、音频文件等。在用户拖拽过程中,电子设备100显示的被拖拽元素可以是被拖拽对象的快照。示例性的,在接收到拖拽指令之后,电子设备100生成第一对象的第三快照,该第三快照随用户的拖拽轨迹移动。在第三快照被拖拽到第二窗口内之后,且在用户离手之前,电子设备100可以对该第三快照处理,得到第一快照或者第二快照。
快照可以是被拖拽对象的界面图标的图层,快照的内容与被拖拽对象的界面图标相同。一些实施例中,快照的尺寸以及长宽比,与其对应对象的界面图标的尺寸和长宽比可以相同,如图5中的图像13的尺寸以及长宽比和快照131的尺寸以及长宽比相同。另一些实施例中,快照的尺寸以及长宽比,与其对应对象的界面图标的尺寸和长宽比也可以不同,例如,图7中的图像23的尺寸以及长宽比和快照231的尺寸以及长宽比不同。
一些实施例中,上述第一标识包括图层。另一些实施例中,上述第一标识包括标志。其他一些实施例中,上述第一标识包括图层和标志。图层可以是灰色或暗色,标志可以是一些特定的符号,以通过显示该图层和/或标志,来提醒第二窗口对应的应用程序不能接收第一对象。
本申请实施例中,第一APP窗口内的第一对象被拖拽到第二APP的窗口内,实质上是触发电子设备100将第一APP中的第一对象复制到第二APP内。基于此,若第二APP支持第一对象的文件类型,或者,第二APP的相应窗口界面能够接收第一对象,那么,电子设备100可以将第一对象复制到第二APP。若第二APP不支持第一对象的文件类型,或者,第二APP的相应窗口界面不能够接收第一对象,那么,电子设备100则无需将第一对象复制到第二APP。响应于用户离手(up)的操作指令,电子设备100可以控制第一对象的快照返回第一APP的窗口,以及隐藏相应快照。
需要说明的是,相同APP的不同界面或不同窗口,是否能够接收同一对象可能存在差异。例如,第一对象是图像,第二APP是邮箱APP。电子设备显示邮箱APP的发送地址窗口和邮箱内容窗口,其中发送地址窗口不能接收第一对象,但是邮箱内容窗口能够接收第一对象。再例如,第一对象是视频,第二APP是即时通信APP。电子设备显示即时通信APP的通讯录界面时,该通讯录界面不能接收第一对象,电子设备切换到显示即时通信APP的与某联系人的对话界面时,该对话界面能够接收第一对象。
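如上所述,能否接收某一文件类型可能以"APP+窗口/界面"为粒度,而不是以APP为粒度。以下Java草图用一个以"APP名/窗口名"为键的策略表,示意这种按窗口判定的思路;其中的键名格式与数据结构均为示例性假设,并非本申请的实际实现:

```java
import java.util.Map;
import java.util.Set;

/** 示意:以"APP名/窗口名"为粒度,判断某文件类型能否被接收。 */
class WindowAcceptPolicy {
    /** 键:"APP名/窗口名";值:该窗口可接收的文件类型集合(示例性假设)。 */
    private final Map<String, Set<String>> policy;

    WindowAcceptPolicy(Map<String, Set<String>> policy) {
        this.policy = policy;
    }

    /** 同一APP的不同窗口可以返回不同的判定结果。 */
    boolean accepts(String app, String window, String fileType) {
        return policy.getOrDefault(app + "/" + window, Set.of()).contains(fileType);
    }

    public static void main(String[] args) {
        // 对应说明书中的例子:邮箱APP的发送地址窗口不接收图像,邮件内容窗口接收图像
        WindowAcceptPolicy p = new WindowAcceptPolicy(Map.of(
                "邮箱/发送地址窗口", Set.of("文本"),
                "邮箱/邮件内容窗口", Set.of("文本", "图像")));
        System.out.println(p.accepts("邮箱", "发送地址窗口", "图像")); // false
        System.out.println(p.accepts("邮箱", "邮件内容窗口", "图像")); // true
    }
}
```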
另外,需要指出的是,本申请实施例涉及的场景,均是用户离手之前的实施场景,也可以描述为,用户的手指触摸着电子设备屏幕的场景。其中,用户的手指触摸着电子设备的屏幕,可以是用户的手指触摸到电子设备的屏幕,或者,用户的手指距离电子设备屏幕的距离小于0.5毫米(mm)时,也可以称为用户的手指触摸到电子设备的屏幕。实际实现中,用户的手指触摸电子设备的屏幕时,手指和屏幕的距离可以由电子设备的触摸灵敏度决定。用户的手指从电子设备的屏幕离开可以认为是用户离手(up)。
以下结合电子设备100的用户界面(user interface,UI),对本申请各实施例进行描述。
本申请实施例涉及的UI是应用程序或操作系统与用户之间进行交互和信息交换的介质接口,它实现信息的内部形式与用户可以接受的形式之间的转换。应用程序的用户界面是通过java、可扩展标记语言(extensible markup language,XML)等特定计算机语言编写的源代码,界面源代码在终端设备上经过解析,渲染,最终呈现为用户可以识别的内容,比如图像、文本、音视频文件、控件(control)等界面元素。控件(control)也称为部件(widget),是用户界面的基本元素,典型的控件有工具栏(toolbar)、菜单栏(menu bar)、文本框(text box)、按钮(button)、滚动条(scrollbar)等。上述界面元素的属性和内容是通过标签或者节点来定义的,比如XML通过<Textview>、<ImgView>、<VideoView>等节点来规定界面元素的文本类型。一个节点对应界面中一个界面元素或属性,节点经过解析和渲染之后呈现为用户可视的内容。此外,很多应用程序,比如混合应用(hybrid application)的界面中通常还包含有网页。网页,也称为页面,可以理解为内嵌在应用程序界面中的一个特殊的控件,网页是通过特定计算机语言编写的源代码,例如超文本标记语言(hyper text markup language,HTML),层叠样式表(cascading style sheets,CSS),java脚本(JavaScript,JS)等,网页源代码可以由浏览器或与浏览器功能类似的网页显示组件加载和显示为用户可识别的内容。网页所包含的具体内容也是通过网页源代码中的标签或者节点来定义的,比如HTML通过<p>、<img>、<video>、<canvas>来定义网页的元素和属性。
用户界面常用的表现形式是图形用户界面(graphic user interface,GUI),是指采用图形方式显示的与计算机操作相关的用户界面。它可以是在电子设备的显示屏180中显示的一个图标、窗口、控件等界面元素,其中控件可以包括图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、Widget等可视的界面元素。
场景一:
在场景一中,在用户将第一APP窗口内的对象拖拽到第二APP窗口内的拖拽过程中,电子设备100根据第二APP对相应对象不同的接收情况,呈现不同的显示效果。本申请实施例所述的“拖拽过程”,是指从用户触摸到第一APP窗口内的对象之后,到用户离手之前的操作过程。图5以手机10为例,通过一组GUI示意用户拖拽过程以及界面显示效果。
如图5中的(a)所示的GUI,该GUI为用户执行拖拽操作时的显示界面。如图5中的(a)所示,手机10的屏幕显示第一窗口11和第二窗口12。第一窗口11中包括图像13。在用户的手指触摸着图像13,并且开始在屏幕移动的过程中,手机10的视图系统绘制图像13的快照,得到图5中的(a)示意的快照131。本实施例中,快照131即为前述第三快照的一种示例性实现方式。在用户的手指触摸着快照131并持续移动过程中,手机10的触摸传感器持续检测到信号,相应的,手机10的拖拽管理器可以基于触摸传感器检测的信号,控制快照131沿着用户的移动轨迹移动,以实现拖拽功能。
需要指出的是,本实施例中,手机10可以接收用户输入的分屏操作,响应于该分屏操作,将屏幕划分为两个区域。之后,手机10响应于用户输入的开启APP的操作,在两个区域中的一个区域显示第一APP的窗口(即第一窗口11),以及在两个区域中的另一个区域显示第二APP的窗口(即第二窗口12)。此处不详述。其中,第一APP和第二APP可以是手机10安装的任意APP。第一APP例如是图库,相应的,第一窗口11是图库的窗口,第二APP例如是备忘录,相应的,第二窗口12是备忘录的窗口。此处不限制。
在用户将快照131拖拽到第二窗口12之后,且在用户离手之前,手机10的窗口管理器可以根据触摸传感器检测到的位置,确定快照131已经进入第二窗口12。之后,窗口管理器可以根据快照131对应对象(即图像13)的信息,获取图像13的文件类型(即图像类型)。进而,窗口管理器检测第二APP(即第二窗口12对应的APP)支持的文件类型中是否包含图像。若第二APP支持的文件类型中包含图像,窗口管理器可以将“包含”的判断结果传输到视图系统。视图系统可以不对快照131做任何操作。图5中的(a)所示的GUI更新为图5中的(b)所示的GUI。在用户离手之后,手机10的处理器可以将图像13复制到第二窗口12,且视图系统在第二窗口12内绘制图像13的复制图像。若第二APP支持的文件类型中不包含图像,窗口管理器可以将“不包含”的判断结果传输到视图系统。视图系统为快照131添加界面元素,得到快照132。图5中的(a)所示的GUI更新为图5中的(c)所示的GUI。
可以理解的是,手机10通常通过检测用户手指在屏幕的触摸位置,确定用户拖拽快照的位置。基于此,上述实施例中,在触摸传感器的检测信号确定的位置进入第二窗口12的场景中,窗口管理器确定快照131位于第二窗口12内。
可以理解的是,上述第二APP“支持的文件类型”可以指,基于第二APP的软件框架,能够正常显示、正常运行以及响应操作指令实现相应功能的类型。其中,“响应操作指令实现相应功能”是指当对象被拖拽到第二APP的窗口之后,第二APP基于自身的功能属性,可以对该对象执行相应功能的操作。例如,图6示意的GUI中,第二窗口12是社交APP的聊天窗口,该社交APP例如支持图像13的文件类型。进一步,当图像13被拖拽到图6中的聊天窗口,且检测到用户离手之后,该社交APP可以将图像13的复制图像发送给该聊天窗口对应的联系人。再如,若第二APP是邮箱APP,第二窗口是邮件内容的窗口。图像13被拖拽到第二窗口之后,邮箱APP可以将图像13的复制图像作为邮件的附件插入邮件。此处不详述。上述第二APP“不支持的文件类型”可以指,基于第二APP的软件框架,不能正常显示、不能正常运行或者执行相应功能受限的类型。
另外,前述界面元素可以包括图层标志以及标号等标志。
如图5中的(b)所示的GUI,该GUI为第二APP支持图像时的显示界面。如图5中的(b)所示,在用户离手之前,视图系统不对快照131执行任何操作。快照131在第二窗口12内正常显示,以使用户通过视觉感知第二APP能够接收图像13。其中,图像13的第一快照在本实施例中实现为快照131。
如图5中的(c)所示的GUI,该GUI为第二APP不支持图像时的显示界面。本场景中,在用户离手之前,视图系统可以在快照131之上添加灰色图层以及标志14,以将快照131更新为快照132。如图5中的(c)所示,快照132相较于快照131的亮色或者彩色,呈现为暗色或者灰色,且快照132的右上角部分包括指示标志14,以使用户通过视觉感知第二APP不接收图像13。其中,图像13的第二快照在本实施例中实现为快照132,相应的,灰色图层以及标志14即为第二快照的第一标识。
另一些实施例中,手机10呈现图5中的(c)所示GUI的过程中,还可以输出声音,以通知用户第二APP不支持文件类型为图像的对象。
另一些实施例中,手机10呈现图5中的(c)所示GUI的过程中,还可以通过震动通知用户第二APP不支持文件类型为图像的对象。
可见,采用本实现方式,在用户拖拽对象的过程中,手机10能够基于接收APP对被拖拽对象的文件类型的支持情况,向用户呈现不同的界面显示效果。这样,手机10在用户拖拽对象的过程中,即可向用户反馈拖拽效果,从而提高用户的操作体验。
场景二:
场景二示意了用户一次拖拽至少两个对象的过程中,电子设备100的界面显示方法。图7仍然以手机10为例,通过一组GUI示意用户拖拽至少两个对象的界面显示效果。
如图7中的(a)所示的GUI,该GUI为用户选择待拖拽对象时的显示界面。如图7中的(a)所示,手机10的屏幕显示第一窗口21和第二窗口22。第一窗口21中包括图像23和文本24。示例性的,响应于用户长按图像23或者文本24的操作,手机10例如可以在图像23的右下角显示复选框230,在文本24的右下角显示复选框240。响应于用户点击复选框230和复选框240,手机10接收到选择指令,该选择指令关联图像23和文本24,进而,手机10选中图像23和文本24。之后,在接收到用户输入的拖拽指令后,手机10执行拖拽功能,图7中的(a)所示的GUI更新为图7中的(b)所示的GUI。
需要指出的是,用户触摸着图像23和文本24中的任一对象移动,手机10均可以将相应指令识别为前述拖拽指令。
如图7中的(b)所示的GUI,该GUI为用户执行拖拽操作时的显示界面。如图7中的(b)所示,在用户的手指开始移动的过程中,手机10的视图系统生成图像23的快照231,文本24的快照241以及标志25,标志25的内容可以是用户拖拽的对象的总数量,本示例中,标志25的内容是2。进而,视图系统将该两个快照堆叠显示,且使标志25在堆叠的快照右上角的区域高亮显示。另外,在用户的手指开始移动的过程中,手机10的拖拽管理器可以控制快照231、快照241和标志25,沿着用户的移动轨迹移动,以实现拖拽功能。进一步的,在检测到用户将快照231和快照241拖拽到第二窗口22之后,窗口管理器检测第二窗口22对应的APP是否支持文件类型为图像的对象,以及第二窗口22对应的APP是否支持文件类型为文本的对象。之后,视图系统基于窗口管理器的检测结果,为快照231和快照241添加界面元素。相应的,如图7中的(b)所示的GUI更新为如图7中的(c)所示的GUI。
另一些实施例中,若用户一次拖拽至少两个对象,视图系统可以仅绘制相应至少两个对象的快照,以及堆叠显示相应至少两个对象的快照,而不添加指示对象总数量的标志。
需要指出的是,为了优化显示效果,视图系统生成图像23的快照以及文本24的快照之后,可以对两幅快照进行缩放处理,以使两幅快照的尺寸相同(如图7中的(b)所示的显示效果),从而在堆叠显示过程中,使得用户的观感体验较好。另一些实施例中,视图系统可以生成与图像23本身等尺寸的快照,以及生成与文本24本身等尺寸的快照,之后,不对两快照进行缩放处理,而直接堆叠显示。此处不限制。
如图7中的(c)所示的GUI,该GUI为用户将快照231和快照241拖拽到第二窗口22内的显示界面。本实施例中,第二窗口22对应的APP例如支持文件类型为图像的对象,但不支持文件类型为文本的对象。进而,视图系统在快照231右上角的区域设置标志26,得到快照232,以及在快照241的界面之上添加灰色图层,并在快照241右上角的区域设置标志27,得到快照242。标志26高亮显示,标志26的内容例如是第二窗口22对应的APP能够接收的对象的总数量。标志27呈灰色,标志27的内容例如是第二窗口22对应的APP不接收的对象的总数量。如图7中的(c)所示,视图系统控制图7中(b)堆叠显示的快照231和快照241,变为在界面上平铺显示的快照232和快照242,以向用户显示快照232的全部界面和快照242的全部界面。其中,快照232的右上角区域包含高亮显示的标志26,其内容是“1”,以表示第二窗口22能够接收上述两个对象中的1个对象。快照242呈现暗色或者灰色,快照242的右上角区域包含暗色或者灰色显示的标志27,其内容是“1”,以表示第二窗口22不接收上述两个对象中的1个对象。
另一些实施例中,在用户将快照231和快照241拖拽到第二窗口22内之后,视图系统可以仅在快照241之上添加灰色图层和标志27,对快照231不做任何处理。且视图系统可以将快照231和添加灰色图层和标志27之后的快照242,平铺在界面上显示。其中,添加灰色图层和标志27之后的快照,与图7中的(c)示意的快照242显示效果类似,此处不再详述。
另一些实施例中,若用户一次拖拽至少三个对象,在将该至少三个对象的快照拖拽到第二窗口内之后,视图系统可以控制该至少三个对象的快照分别以两个组显示,其中,第二APP能够接收的对象的快照形成一个组,该组中的至少一个快照堆叠显示,且显示第二APP能够接收的对象的总数量。第二APP不接收的对象的快照形成另一个组,该组中的至少一个快照可以堆叠显示,且显示第二APP不接收的对象的总数量。
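上述"将一次拖拽的多个对象按能否被第二APP接收分为两组,并分别显示数量标志"的逻辑,可以用如下Java草图示意;其中的方法名与返回值形式均为示例性假设,并非本申请的实际实现:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Set;

/** 示意:将一次拖拽的多个对象按第二APP能否接收分为两组,并得到两组的数量标志。 */
class SnapshotGrouper {
    /** 返回 {可接收数量, 不接收数量},分别对应高亮标志与灰色标志所显示的数字。 */
    static int[] group(List<String> fileTypes, Set<String> supported) {
        int accepted = 0;
        int rejected = 0;
        for (String t : fileTypes) {
            if (supported.contains(t)) {
                accepted++;   // 该快照进入"能够接收"的堆叠组
            } else {
                rejected++;   // 该快照添加灰色图层,进入"不接收"的堆叠组
            }
        }
        return new int[] { accepted, rejected };
    }

    public static void main(String[] args) {
        // 对应图8的场景:一次拖拽4个对象,其中2个可被第二窗口接收
        int[] badges = group(List.of("图像", "图像", "音频", "视频"), Set.of("图像"));
        System.out.println(Arrays.toString(badges)); // [2, 2]
    }
}
```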
示例性的,如图8所示的GUI,该GUI为用户拖拽四个对象的显示界面。如图8所示,响应于用户在第一窗口31内的拖拽操作,视图系统控制该四个对象的快照堆叠显示,并在堆叠显示的快照右上角区域设置高亮显示内容为数字4的标志(图中未示出)。响应于用户触摸的位置进入第二窗口32内(即,该四个对象被拖拽入第二窗口32)的操作,窗口管理器例如确定第二窗口32能够接收其中两个对象,不能接收另外两个对象。进而,如图8所示,视图系统控制第二窗口32能够接收的两个快照堆叠显示,并在该两个快照堆叠界面的右上角区域标注高亮显示的标志33,标志33示意的数字是“2”。另外,视图系统为第二窗口32不接收的两个快照添加灰色图层,之后,控制添加灰色图层之后的两个快照堆叠显示,并在该两个快照堆叠界面的右上角区域标注灰色显示的标志34,标志34示意的数字是“2”。
另一些实施例中,若用户一次拖拽至少三个对象,在将该至少三个对象的快照拖拽到第二窗口内之后,视图系统可以控制该至少三个对象的快照分别以两个组显示,其中,第二APP能够接收的对象的快照形成一个组,该组中的至少一个快照平铺显示,且每个快照的右上角区域显示该快照的顺序号。第二APP不接收的对象的快照形成另一个组,该组中的至少一个快照可以平铺显示,且每个快照的右上角区域显示该快照的顺序号。
示例性的,如图9所示的GUI,该GUI为用户拖拽四个对象的另一种显示界面。如图9所示,响应于用户在第一窗口41内的拖拽操作,视图系统控制该四个对象的快照堆叠显示,并在堆叠显示的快照右上角区域高亮显示内容为数字4的标志(图中未示出)。响应于用户触摸的位置进入第二窗口42内的操作,窗口管理器例如确定第二窗口42能够接收其中两个对象,不能接收另外两个对象。进而,如图9所示,视图系统控制第二窗口42能够接收的两个快照在用户视角的水平方向平铺显示。其中,排列在左边的快照右上角区域高亮显示包含数字1的标志43,排列在右边的快照右上角区域高亮显示包含数字2的标志44。另外,视图系统为第二窗口42不接收的两个快照添加灰色图层,并控制添加灰色图层之后的两个快照在第二窗口42能够接收的两个快照之下,在用户视角的水平方向平铺显示。其中,排列在左边的快照右上角区域包含数字1的标志45呈灰色,排列在右边的快照右上角区域包含数字2的标志46呈灰色。
可以理解的是,图7至图9仅是示意性描述,对本申请实施例不构成限制。在另一些实施例中,用户还可以拖拽更多或者更少的对象。此外,对象的快照被拖拽到第二APP的窗口之后,还可以呈现为其他显示效果。例如,上述实施例中包含数字的标志,可以设置在快照的右下角区域等。本申请实施例对此不限制。
采用本实现方式,手机10能够在反馈拖拽效果的过程中,向用户呈现更详细的拖拽效果信息,从而能够进一步提升用户的操作体验。
场景三:
本申请实施例中,视图系统绘制的被拖拽对象的快照的尺寸例如是第一尺寸。在场景三中,若第二APP能够接收被拖拽的对象,在检测到对象的快照进入第二APP的窗口(第二窗口)之后,电子设备100可以根据第二窗口的尺寸,适应性将快照由第一尺寸调整为第二尺寸,以使快照能够完全位于窗口之内,且使快照与窗口的尺寸比例达到最佳视觉效果。
实际实现中,一些实施例中,电子设备100可以将快照由第一尺寸缩小到第二尺寸。示例性的,如图10中的(a)示意的场景所示。另一些实施例中,电子设备100可以将快照由第一尺寸放大到第二尺寸。示例性的,如图10中的(b)示意的场景所示。
一些实施例中,若进入第二窗口的快照是第二APP接收的第一个对象的快照,电子设备100可以调整快照的尺寸,使得快照的尺寸占窗口的尺寸的比例大于或者等于第一阈值且小于或者等于第二阈值,以使快照在第二窗口的视觉效果达到最佳。其中,第一阈值和第二阈值均大于0且小于1。第一阈值例如是50%,第二阈值例如是70%。此处不限制。以下以手机10为例,结合图10示意的一组GUI对场景三的界面显示方法进行描述。
需要指出的是,上述“尺寸”可以包括x轴方向的像素(pixel,px)和y轴方向的像素两个参数。本申请实施例中,将快照x轴方向的像素称为快照“宽”,将快照y轴方向的像素称为快照的“高”。
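基于上述以像素表示的宽和高,下面的Java草图示意了一种把快照从第一尺寸调整为第二尺寸、使其面积占窗口面积的比例落入[第一阈值, 第二阈值]区间的计算方式。需要强调的是,这只是满足该约束的一种示例性做法:目标比例取区间中点0.6、以窗口边长的95%作为宽和高的上限,均为本示例的假设,并非本申请限定的实现:

```java
/** 示意:将快照从第一尺寸调整为第二尺寸,使其面积占窗口面积的比例落入阈值区间。 */
class SnapshotResizer {
    static final double MIN_RATIO = 0.50; // 第一阈值,示例取50%
    static final double MAX_RATIO = 0.70; // 第二阈值,示例取70%

    /** 返回 {第二尺寸的宽, 第二尺寸的高},单位px。 */
    static int[] resize(int snapW, int snapH, int winW, int winH) {
        double target = (MIN_RATIO + MAX_RATIO) / 2.0; // 取区间中点0.6作为目标面积比例(示例性假设)
        double s = Math.sqrt(target * winW * winH / ((double) snapW * snapH));
        // 宽和高分别封顶为窗口边长的95%,保证快照完全位于窗口之内(示例性假设,整数运算避免舍入歧义)
        int maxW = winW * 95 / 100;
        int maxH = winH * 95 / 100;
        int w = (int) Math.min(Math.round(snapW * s), maxW);
        int h = (int) Math.min(Math.round(snapH * s), maxH);
        return new int[] { w, h };
    }

    public static void main(String[] args) {
        int[] shrink = resize(1080, 500, 1500, 400);   // 对应图10中的(a):缩小
        int[] enlarge = resize(1080, 500, 1500, 1200); // 对应图10中的(b):放大
        System.out.println(shrink[0] + "x" + shrink[1] + ", " + enlarge[0] + "x" + enlarge[1]);
    }
}
```

以图10中的(a)的输入(1080×500px的快照、1500×400px的窗口)为例,该草图得到的第二尺寸面积占窗口面积约56%,落在50%~70%的阈值区间内;具体数值与说明书中1000×380px的示例不同,但满足同一约束。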
如图10中的(a)所示的GUI,该GUI为手机10缩小快照尺寸的显示界面。如图10中的(a)所示,响应于用户拖拽图像50的操作,手机10的视图系统绘制图像50的快照,得到快照51。快照51例如宽1080px,高500px。本实施例中,第二APP例如包括窗口52和窗口53,用户例如将快照51拖拽到窗口52内。窗口52例如宽1500px,高400px。在检测到用户将快照51拖拽到第二APP的窗口52,以及确定第二APP支持文件类型为图像的对象之后,手机10的视图系统将快照51的宽度例如调整为1000px,将快照51的高度例如调整为380px,得到快照54,使得快照54的面积占窗口52面积的63%。这样,响应于用户的离手操作,手机10可以按照快照54的宽度和高度生成新的图像。图像50的第一快照在本实施例中实现为快照54。
如图10中的(b)所示的GUI,该GUI为手机10放大快照尺寸的显示界面。本实施例中,用户例如将快照51拖拽到窗口53内。窗口53例如宽1500px,高1200px。如图10中的(b)所示,在检测到用户将快照51拖拽到第二APP的窗口53之后,手机10的视图系统将快照51的宽度例如调整为1200px,将快照51的高度例如调整为900px,得到快照55,使得快照55的面积占窗口53面积的60%。这样,响应于用户的离手操作,手机10可以按照快照55的宽度和高度生成新的图像。图像50的第一快照在本实施例中实现为快照55。
另一些实施中,视图系统还可以为快照54和快照55添加高亮显示的标志,标志的内容例如是1。
可以理解的是,图10仅是示意性描述,图10中的(a)和图10中的(b)所示的尺寸不构成对本申请实施例限制。在另一些实施例中,电子设备100若实现为屏幕更大的设备,例如折叠屏手机,上述窗口的尺寸和快照的尺寸可以更大。此处不详述。
采用本实现方式,手机10能够在用户拖拽过程中,根据接收窗口的尺寸适应性调整快照的尺寸,以使用户预览所接收的对象的尺寸,从而能够进一步提升用户的操作体验。
场景四:
场景三是第二APP接收到第一个对象的场景下,电子设备100的界面显示方法。若第二APP继续接收第二个对象以及更多对象,电子设备100可以根据窗口的尺寸和已接收的对象的数量,显示各个对象的预览效果。电子设备100还可以根据用户拖拽过程中触发的位置,显示第二对象的快照以及第一对象的图标的相对位置的预览效果。
示例性的,电子设备100可以以窗口内所有对象图标的尺寸之和,占窗口尺寸的比例大于或者等于第一阈值且小于或者等于第二阈值为条件,调整窗口内各对象界面元素的尺寸,以向用户显示用户离手之后的预览界面。第一阈值和第二阈值如上述实施例所述,此处不赘述。
场景四以手机10为例,结合图11示意的一组GUI,通过分别拖拽两个对象的实施场景,对电子设备100的界面显示方法进行描述。
如图11中的(a)所示的GUI,该GUI为手机10的一种示例性显示界面。如图11中的(a)所示,第二APP已经接收第一对象61,第一对象61的图标如第二窗口60中所示。本实施例中,第二窗口60例如宽1500px,高1300px。第一对象61的图标例如宽1200px,高1100px。在检测到用户将快照62拖拽到第二窗口60之内,且第二APP支持快照62对应的文件类型之后,在用户离手之前,图11中的(a)所示的GUI更新为图11中的(b)所示的GUI。
如图11中的(b)所示的GUI,该GUI为用户向第二APP拖拽第二个对象的过程中,手机10的一种示例性预览显示界面。如图11中的(b)所示,手机10的窗口管理器检测到用户触摸的位置在第一对象61的右侧,进而,手机10的视图系统可以将第一对象61的图标的宽度缩小到720px,高度保持不变,得到快照63。视图系统还将快照62的宽度调整为720px,高度调整为1300px,得到快照64,以及将快照64设置在快照63右侧,呈现出第一对象61与第二个对象的预览界面,以预先向用户展示显示效果。进而,在检测到用户离手之后,视图系统按照图11中的(b)示意的预览显示效果,将第二对象的复制内容排列在第一对象的右侧显示,且使得第二对象图标的尺寸和第一对象图标的尺寸如前所述。
另一些实施例中,如图11中的(c)所示的GUI,该GUI为用户向第二APP拖拽第二个对象的过程中,手机10的另一种示例性预览显示界面。如图11中的(c)所示,手机10的窗口管理器检测到用户触摸的位置在第一对象61的左侧,视图系统将快照64设置在快照63左侧,并按照图11中的(b)对应的尺寸,调整第一对象61的图标的尺寸和快照63的尺寸,呈现出第一对象61与第二个对象的另一种预览显示界面。进而,在检测到用户离手之后,视图系统按照图11中的(c)示意的预览显示效果,将第二对象的复制内容排列在第一对象的左侧显示,且使得第二对象图标的尺寸和第一对象图标的尺寸如前所述。
可以理解的是,图11仅是示意性描述,不构成对本申请实施例的限制。另一些实施例中,上述两个对象的图标或者快照,还可以按照上下的位置关系排列,此处不限制。
此外,其他一些实施例中,若用户继续向第二窗口60拖拽第三个及以上的对象,手机10可以根据第二APP所接收对象的总数量,适应性调整已接收的对象的图标的尺寸,以及待离手的快照的尺寸。且手机10可以响应于用户手指触摸的位置,调整各个图标的排列顺序。此处不再详述。
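场景四中"按第二APP已接收对象的数量等分窗口宽度,并根据用户触摸的位置决定新对象排在左侧还是右侧"的思路,可以用如下Java草图示意;其中"各对象总宽约占窗口宽度96%"等参数为本示例的假设,并非本申请限定的实现:

```java
import java.util.ArrayList;
import java.util.List;

/** 示意:窗口已接收对象时,等分窗口宽度,并按触摸位置决定新对象的排列位置。 */
class PreviewLayout {
    /** 示例性假设:窗口内n个对象等宽平铺,总宽约占窗口宽度的96%(整数运算)。 */
    static int itemWidth(int winW, int n) {
        return winW * 96 / 100 / n;
    }

    /** touchX在已有图标中心centerX左侧则插到左边(图11(c)),否则插到右边(图11(b))。 */
    static List<String> arrange(List<String> existing, String dragged, int touchX, int centerX) {
        List<String> out = new ArrayList<>(existing);
        if (touchX < centerX) {
            out.add(0, dragged); // 第二对象的快照排在第一对象图标左侧
        } else {
            out.add(dragged);    // 第二对象的快照排在第一对象图标右侧
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(itemWidth(1500, 2)); // 720,与图11中示意的720px宽度一致
        System.out.println(arrange(List.of("第一对象"), "第二对象", 1200, 750));
    }
}
```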
采用本实现方式,手机10能够在用户拖拽过程中,根据接收窗口的尺寸以及已经接收的对象的数量,适应性调整各个图标的尺寸和排列顺序,以使用户预览所接收的对象的尺寸和排列顺序,从而能够进一步提升用户的操作体验。
另外,虽然上述场景一至场景四的任一场景中,均以非柔性屏手机为例进行的描述,但上述各场景中示意的实施例,同样适用于柔性屏手机。此处不再赘述。
可以理解的是,图5至图11仅是示意性描述,对本申请实施例不构成限制。在其他一些实施例中,若电子设备100实现为其他设备,电子设备100还可以根据用户通过触摸板或者鼠标键入的指令,确定用户的拖拽操作。另外,第二窗口不接收对象(即第二APP不支持相应文件类型)的场景中,电子设备100还可以通过其他界面显示效果通知用户。例如,场景一中,若第二窗口不接收图像13,手机10还可以显示对话框,对话框的内容例如可以是“x无法接收图像13”的提醒信息,其中“x”是第二APP的名称。此处不限制。
以上是以一个电子设备通过分屏呈现两个APP的窗口为例进行的描述。另一些实施例中,在至少两个设备协同办公的场景中,本申请实施例的界面显示方法同样适用。
如图12所示的一种协同办公的场景中,第一设备可以与第二设备无线连接,之后,响应于用户输入的协同办公的设置,第一设备的界面上例如可以显示第二设备的协同窗口。进而,第一设备可以将第二设备的应用本地化,从而使得用户可以在第一设备端对第二设备进行操作。
示例性的,以第一设备是笔记本电脑、第二设备是手机为例,二者建立协同办公的连接之后,笔记本电脑30的操作界面如图13示意的GUI所示。
如图13中的(a)所示的GUI,该GUI为笔记本电脑与手机建立协同办公连接之后,笔记本电脑的显示界面。如图13中的(a)所示,笔记本电脑30的屏幕显示笔记本电脑30的应用程序,以及手机的协同界面71。手机的协同界面例如包括手机主界面包含的APP图标等界面元素。笔记本电脑接收到用户点击笔记本电脑中应用程序的指令,可以在笔记本电脑的屏幕上显示相应应用程序的窗口。笔记本电脑接收到用户点击协同界面71中APP的指令,可以在协同界面71内显示相应APP的窗口。例如,响应于用户点击笔记本电脑中的邮箱的指令,以及用户点击协同界面71中图库的指令,图13中的(a)所示的GUI更新为图13中的(b)所示的GUI。
如图13中的(b)所示的GUI,该GUI为笔记本电脑显示两个窗口的显示界面。如图13中的(b)所示,笔记本电脑的屏幕显示邮箱的窗口72和图库的窗口73,其中,窗口73在协同界面71内显示,窗口73内显示至少一幅图像。响应于用户从窗口73向窗口72拖拽任意图像的操作指令,笔记本电脑可以执行场景一至场景四中任一场景示意的实施例。同理,笔记本电脑还可以响应用户从窗口72向窗口73拖拽任意对象的操作指令,执行场景一至场景四中任一场景示意的实施例。此处不再赘述。
可以理解的是,图12至图13仅是示意性描述,对本申请实施例不构成限制。另一些实施例中,响应于用户点击开启其他应用程序,笔记本电脑还可以显示其他应用程序的窗口。另外,协同办公的设备若是其他设备,显示界面可以不同于上述实施例。例如,协同办公的界面还可以显示更多窗口。此处不限制。
综上,采用本申请实施例的实现方式,电子设备可以在用户拖拽对象的过程中,根据目标APP是否支持该对象的文件类型,以及目标APP窗口的尺寸等,对对象的快照进行处理,呈现不同的显示效果。这样,电子设备能够在用户拖拽过程当中,即可向用户呈现目标APP是否接收对象的显示效果,从而能够提高用户的使用体验。
上述实施例从电子设备的硬件结构,以及各软、硬件所执行的动作的角度对本申请提供的界面显示方法的各方案进行了介绍。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的接收拖拽指令、检测第二窗口对应的APP是否支持被拖拽对象的文件类型、显示不同快照等的处理步骤,本申请能够以硬件或者硬件和计算机软件相结合的形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请实施例的范围。
例如,上述电子设备100可以通过功能模块的形式来实现上述相应的功能。一些实施例中,电子设备可以包括接收模块、处理模块和显示模块。接收模块可以用于执行上述图4至图13示意的任意实施例中指令和信息的接收。显示模块可以用于执行上述图4至图13示意的任意实施例中窗口和快照的显示。处理模块可以用于执行上述图4至图13示意的任意实施例中除了指令和信息的接收以及窗口和快照的显示之外的操作。
可以理解的是,以上各个模块的划分仅仅是一种逻辑功能的划分,实际实现时,所述接收模块的功能可以集成到接收器实现,所述处理模块的功能可以集成到处理器实现,所述显示模块的功能可以集成到显示器实现。如图14A所示,电子设备140包括接收器1401,处理器1402和显示器1403。所述接收器1401可以执行上述图4至图13示意的任意实施例中指令和信息的接收。所述显示器1403可以用于执行上述图4至图13示意的任意实施例中窗口和快照的显示。所述处理器1402可以用于执行上述图4至图13示意的任意实施例中除了指令和信息的接收以及窗口和快照的显示之外的操作。
例如,所述显示器1403显示第一窗口和第二窗口。基于此,所述接收器1401可以用于接收用户输入的拖拽指令,所述拖拽指令用于指示所述电子设备将所述第一窗口内的第一对象拖拽到所述第二窗口内。所述处理器1402可以用于在所述第二窗口对应的应用程序支持所述第一对象的文件类型时,控制所述显示器显示所述第一对象的第一快照。所述处理器1402还可以用于在所述第二窗口对应的应用程序不支持所述第一对象的文件类型时,控制所述显示器显示所述第一对象的第二快照,所述第二快照包含第一标识,所述第一标识用于指示所述第二窗口不接受所述第一对象。
图14A是从独立功能实体的角度对本申请的电子设备进行描述。在另一种实施场景中,各独立运行的功能实体可以集成在一个硬件实体中。如图14B所示,本实施场景中,电子设备141可以包括处理器1411、收发器1412和存储器1413。
应理解,本申请实施例的电子设备141可对应图4中涉及的电子设备,图5至图11示意方法中的手机10,以及图12和图13示意的实施例中的笔记本电脑。其中,收发器1412用于执行图4至图13中所述电子设备执行的指令的接收,存储器1413可以用于存储代码,处理器1411用于执行存储器1413中存储的代码,实现图4至图13中所述电子设备执行的除了指令的接收和快照显示外的其它处理,在此不再赘述。
具体内容可以参考图4至图13对应的实施例的相关描述,此处不再赘述。
具体实现中,对应电子设备本申请实施例还提供一种计算机存储介质,其中,设置在电子设备中的计算机存储介质可存储有程序,该程序执行时,可实施包括图4至图13提供的界面显示方法的各实施例中的部分或全部步骤。任意设备中的存储介质均可为磁碟、光盘、只读存储记忆体(read-only memory,ROM)或随机存储记忆体(random access memory,RAM)等。
以上模块或单元的一个或多个可以软件、硬件或二者结合来实现。当以上任一模块或单元以软件实现的时候,所述软件以计算机程序指令的方式存在,并被存储在存储器中,处理器可以用于执行所述程序指令并实现以上方法流程。所述处理器可以包括但不限于以下至少一种:中央处理单元(central processing unit,CPU)、微处理器、数字信号处理器(DSP)、微控制器(microcontroller unit,MCU)、或人工智能处理器等各类运行软件的计算设备,每种计算设备可包括一个或多个用于执行软件指令以进行运算或处理的核。该处理器可以内置于SoC(片上系统)或专用集成电路(application specific integrated circuit,ASIC),也可是一个独立的半导体芯片。该处理器内除了用于执行软件指令以进行运算或处理的核之外,还可进一步包括必要的硬件加速器,如现场可编程门阵列(field programmable gate array,FPGA)、PLD(可编程逻辑器件)、或者实现专用逻辑运算的逻辑电路。
当以上模块或单元以硬件实现的时候,该硬件可以是CPU、微处理器、DSP、MCU、人工智能处理器、ASIC、SoC、FPGA、PLD、专用数字电路、硬件加速器或非集成的分立器件中的任一个或任一组合,其可以运行必要的软件或不依赖于软件以执行以上方法流程。
当以上模块或单元使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本发明实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘Solid State Disk(SSD))等。
应理解,在本申请实施例的各种实施例中,各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对实施例的实施过程构成任何限定。
本说明书的各个部分均采用递进的方式进行描述,各个实施例之间相同相似的部分互相参见即可,每个实施例重点介绍的都是与其他实施例不同之处。尤其,对于装置和系统实施例而言,由于其基本相似于方法实施例,所以描述的比较简单,相关之处参见方法实施例部分的说明即可。
以上所述的具体实施方式,对本发明的目的、技术方案和有益效果进行了进一步详细说明,所应理解的是,以上所述仅为本发明的具体实施方式而已,并不用于限定本发明的保护范围,凡在本发明的技术方案的基础之上,所做的任何修改、等同替换、改进等,均应包括在本发明的保护范围之内。

Claims (19)

  1. 一种界面显示方法,其特征在于,应用于电子设备,所述电子设备的屏幕显示第一窗口和第二窗口,所述方法包括:
    接收用户输入的拖拽指令,所述拖拽指令用于指示所述电子设备将所述第一窗口内的第一对象拖拽到所述第二窗口内;
    若所述第二窗口对应的应用程序支持所述第一对象的文件类型,显示所述第一对象的第一快照;
    若所述第二窗口对应的应用程序不支持所述第一对象的文件类型,显示所述第一对象的第二快照,所述第二快照包含第一标识,所述第一标识用于指示所述第二窗口不接受所述第一对象。
  2. 如权利要求1所述的方法,其特征在于,显示所述第一对象的第二快照,包括:
    为所述第一对象的第三快照添加图层,得到所述第二快照,所述第三快照在所述电子设备接收到所述拖拽指令之后,且在确定用户触发的位置进入所述第二窗口之前生成。
  3. 如权利要求1或2所述的方法,其特征在于,显示所述第一对象的第二快照,包括:
    为所述第一对象的第三快照添加第一标志,得到所述第二快照,所述第三快照在所述电子设备接收到所述拖拽指令之后,且在确定用户触发的位置进入所述第二窗口之前生成。
  4. 如权利要求3所述的方法,其特征在于,所述第一标志为图形标识,或者标号,所述第一对象包括至少一个对象,所述标号用于指示所述第二窗口不接收的第一对象的总数量。
  5. 如权利要求1所述的方法,其特征在于,显示所述第一对象的第一快照,包括:
    在确定所述第一对象的第三快照进入所述第二窗口之后,获取所述第三快照的第一尺寸和所述第二窗口的尺寸,所述第三快照在所述电子设备接收到所述拖拽指令之后,且在确定用户触发的位置进入所述第二窗口之前生成;
    将所述第三快照从所述第一尺寸调整为第二尺寸,得到所述第一快照,所述第二尺寸与所述第二窗口的尺寸的比值大于或者等于第一阈值,且小于或者等于第二阈值,所述第一阈值和所述第二阈值均大于0且小于1。
  6. 如权利要求5所述的方法,其特征在于,将所述第三快照从所述第一尺寸调整为第二尺寸,得到所述第一快照,包括:
    将所述第三快照由所述第一尺寸缩小到所述第二尺寸,得到所述第一快照;或者,
    将所述第三快照由所述第一尺寸放大到所述第二尺寸,得到所述第一快照。
  7. 如权利要求1所述的方法,其特征在于,所述第二窗口包括第二对象的图标,所述显示所述第一对象对应的第一快照,包括:
    在确定所述第一对象的第三快照进入所述第二窗口之后,获取所述第三快照的第一尺寸、所述第二窗口的尺寸和所述图标的第三尺寸;
    将所述第三快照从所述第一尺寸调整为第四尺寸,得到所述第一快照,以及将所述图标从所述第三尺寸调整为所述第四尺寸,二倍的所述第四尺寸占所述第二窗口的尺寸的比例大于或者等于第一阈值,且小于或者等于第二阈值,所述第一阈值和所述第二阈值均大于0且小于1;
    根据用户触发的位置排列所述第一快照和调整尺寸后的所述图标。
  8. 如权利要求5-7中任一项所述的方法,其特征在于,所述第一对象包括至少一个对象,显示所述第一对象对应的第一快照,还包括:
    为所述第三快照添加第二标识,得到所述第一快照,所述第二标识用于指示所述第二窗口能够接收的第一对象的总数量。
  9. 如权利要求1-8中任一项所述的方法,其特征在于,在接收用户输入的拖拽指令之后,在显示所述第一快照或所述第二快照之前,还包括:
    生成所述第一对象的第三快照,所述第一对象包括至少一个对象;
    为所述第三快照添加第三标识,所述第三标识用于指示所述第一对象的总数量。
  10. 一种电子设备,其特征在于,包括处理器,接收器和显示器,所述显示器显示第一窗口和第二窗口,
    所述接收器,用于接收用户输入的拖拽指令,所述拖拽指令用于指示所述电子设备将所述第一窗口内的第一对象拖拽到所述第二窗口内;
    所述处理器,用于在所述第二窗口对应的应用程序支持所述第一对象的文件类型时,控制所述显示器显示所述第一对象的第一快照;还用于在所述第二窗口对应的应用程序不支持所述第一对象的文件类型时,控制所述显示器显示所述第一对象的第二快照,所述第二快照包含第一标识,所述第一标识用于指示所述第二窗口不接受所述第一对象。
  11. 如权利要求10所述的电子设备,其特征在于,
    所述处理器,还用于为所述第一对象的第三快照添加图层,得到所述第二快照,所述第三快照在所述电子设备接收到所述拖拽指令之后,且在确定用户触发的位置进入所述第二窗口之前生成。
  12. 如权利要求10或11所述的电子设备,其特征在于,
    所述处理器,还用于为所述第一对象的第三快照添加第一标志,得到所述第二快照,所述第三快照在所述电子设备接收到所述拖拽指令之后,且在确定用户触发的位置进入所述第二窗口之前生成。
  13. 如权利要求12所述的电子设备,其特征在于,
    所述第一标志为图形标识,或者标号,所述第一对象包括至少一个对象,所述标号用于指示所述第二窗口不接收的第一对象的总数量。
  14. 如权利要求10所述的电子设备,其特征在于,
    所述处理器,还用于在确定所述第一对象的第三快照进入所述第二窗口之后,获取所述第三快照的第一尺寸和所述第二窗口的尺寸,所述第三快照在所述电子设备接收到所述拖拽指令之后,且在确定用户触发的位置进入所述第二窗口之前生成;
    所述处理器,还用于将所述第三快照从所述第一尺寸调整为第二尺寸,得到所述第一快照,所述第二尺寸与所述第二窗口的尺寸的比值大于或者等于第一阈值,且小于或者等于第二阈值,所述第一阈值和所述第二阈值均大于0且小于1。
  15. 如权利要求14所述的电子设备,其特征在于,
    所述处理器,还用于将所述第三快照由所述第一尺寸缩小到所述第二尺寸,得到所述第一快照;或者,
    将所述第三快照由所述第一尺寸放大到所述第二尺寸,得到所述第一快照。
  16. 如权利要求10所述的电子设备,其特征在于,
    所述处理器,还用于在确定所述第一对象的第三快照进入所述第二窗口之后,获取所述第三快照的第一尺寸、所述第二窗口的尺寸和所述图标的第三尺寸;
    所述处理器,还用于将所述第三快照从所述第一尺寸调整为第四尺寸,得到所述第一快照,以及将所述图标从所述第三尺寸调整为所述第四尺寸,二倍的所述第四尺寸占所述第二窗口的尺寸的比例大于或者等于第一阈值,且小于或者等于第二阈值,所述第一阈值和所述第二阈值均大于0且小于1;
    所述处理器,还用于根据用户触发的位置排列所述第一快照和调整尺寸后的所述图标。
  17. 如权利要求14-16中任一项所述的电子设备,
    所述处理器,还用于为所述第三快照添加第二标识,得到所述第一快照,所述第一对象包括至少一个对象,所述第二标识用于指示所述第二窗口能够接收的第一对象的总数量。
  18. 如权利要求10-17中任一项所述的电子设备,其特征在于,
    所述处理器,还用于生成所述第一对象的第三快照,所述第一对象包括至少一个对象;
    所述处理器,还用于为所述第三快照添加第三标识,所述第三标识用于指示所述第一对象的总数量。
  19. 一种计算机可读存储介质,其特征在于,包括计算机程序,当所述计算机程序在计算机上运行时,使得所述计算机执行如权利要求1至9中任一项所述的方法。
PCT/CN2021/110459 2020-09-09 2021-08-04 界面显示方法及电子设备 WO2022052677A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010940657.1 2020-09-09
CN202010940657.1A CN114237778A (zh) 2020-09-09 2020-09-09 界面显示方法及电子设备
