
Interface display method and electronic equipment

Info

Publication number: CN114237778A
Application number: CN202010940657.1A (filed by Huawei Technologies Co., Ltd.)
Authority: CN (China)
Legal status: Pending
Original language: Chinese (zh)
Inventors: 阚彬, 许嘉, 蔺振超
Current and original assignee: Huawei Technologies Co., Ltd.
Prior art keywords: snapshot, window, size, user, electronic device
Related application: PCT/CN2021/110459 (published as WO2022052677A1)

Classifications

    • G06F 9/451 - Execution arrangements for user interfaces (under G06F 9/00, arrangements for program control; G06F 9/44, arrangements for executing specific programs)
    • G06F 3/0485 - Scrolling or panning (under G06F 3/048, interaction techniques based on graphical user interfaces [GUI]; G06F 3/0484, control of specific functions or operations)
    • G06F 3/0486 - Drag-and-drop (under G06F 3/048; G06F 3/0484)
    • H04M 1/72403 - User interfaces specially adapted for cordless or mobile telephones, with means for local support of applications that increase the functionality


Abstract

The application relates to the technical field of electronics, and discloses an interface display method and an electronic device. A screen of the electronic device displays a first window and a second window. The electronic device receives a drag instruction for dragging a first object in the first window into the second window. When the application program corresponding to the second window supports the file type of the first object, the electronic device displays a first snapshot of the first object. When the application program corresponding to the second window does not support the file type of the first object, the electronic device displays a second snapshot of the first object. The first snapshot is different from the second snapshot. In this way, while the user drags the first object, the electronic device can show the user whether the target window will receive the object, which improves the user experience.

Description

Interface display method and electronic equipment
Technical Field
Embodiments of this application relate to the field of terminal technologies, and in particular, to an interface display method and an electronic device.
Background
As display screens of electronic devices grow larger, the number of application (APP) windows that can be displayed on a screen at the same time also increases, and technologies for content interaction between APP windows have emerged. As shown in fig. 1, a display screen of an electronic device displays a window 01 of a first APP and a window 02 of a second APP, the window 01 includes an image 1, and a user can drag the image 1 from the window 01 into the window 02. Generally, while the user drags the image 1 to the window 02, the electronic device gives no feedback about the drag effect, so the user's operation experience is poor.
Disclosure of Invention
Embodiments of this application provide an interface display method and an electronic device that can give the user timely visual feedback while a drag operation is being performed.
In a first aspect, an embodiment of this application provides an interface display method applied to an electronic device whose screen displays a first window and a second window. The method includes: receiving a drag instruction input by a user, where the drag instruction instructs the electronic device to drag a first object in the first window into the second window; if the application program corresponding to the second window supports the file type of the first object, displaying a first snapshot of the first object; and if the application program corresponding to the second window does not support the file type of the first object, displaying a second snapshot of the first object, where the second snapshot includes a first identifier indicating that the second window does not accept the first object.
The electronic device in this embodiment supports displaying a plurality of windows. The first window may be a window of a first APP, and the second window may be a window of a second APP. Dragging the first object from the first window into the second window essentially triggers the electronic device to copy the first object from the first APP into the second APP. In this embodiment, "dragging the first object into the second window" means that the position touched by the user's finger on the screen enters the second window. On this basis, when the second APP supports the file type of the first object, the second window can receive the first object; when the second APP does not support the file type, it cannot. Before the user lifts the finger, the electronic device can inform the user whether the second window will receive the first object by displaying different snapshots.
With this implementation, the electronic device displays the first object differently depending on whether the application program corresponding to the second window supports its file type, providing timely and accurate feedback during the user's drag.
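For illustration only, the following minimal Java sketch captures the decision described in this aspect. All types and helper methods here (FileObject, Snapshot, the MIME-type set) are hypothetical; the patent does not disclose concrete interfaces.

```java
// Minimal, self-contained sketch of the first-aspect decision logic.
// Everything here is a hypothetical illustration, not the patent's API.
import java.util.Set;

final class DragFeedbackSketch {

    record FileObject(String mimeType) {}

    interface Snapshot {
        Snapshot resizedFor(double windowArea);   // -> "first snapshot"
        Snapshot withRejectionMark();             // -> "second snapshot"
    }

    /** File types the app behind the second window declares it can accept. */
    static boolean supports(Set<String> acceptedTypes, FileObject obj) {
        return acceptedTypes.contains(obj.mimeType());
    }

    /** Called once the drag position enters the second window, before the user lifts the finger. */
    static Snapshot chooseSnapshot(Set<String> acceptedTypes, FileObject obj,
                                   Snapshot thirdSnapshot, double windowArea) {
        return supports(acceptedTypes, obj)
                ? thirdSnapshot.resizedFor(windowArea)  // window accepts: first snapshot
                : thirdSnapshot.withRejectionMark();    // window rejects: second snapshot
    }
}
```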
In one possible design, displaying a second snapshot of the first object includes: adding a layer to a third snapshot of the first object to obtain the second snapshot, where the third snapshot is generated after the electronic device receives the drag instruction and before it determines that the position touched by the user enters the second window. The layer presents to the user the display effect that the second window cannot receive the first object. In some embodiments, the layer may be gray or dark. In this way, the drag effect can be fed back to the user in time while the user drags the first object.
In one possible design, displaying a second snapshot of the first object includes: adding a first mark to a third snapshot of the first object to obtain the second snapshot, where the third snapshot is generated after the electronic device receives the drag instruction and before it determines that the position touched by the user enters the second window. The first mark presents to the user the display effect that the second window cannot receive the first object. In this way, the drag effect can be fed back to the user in time while the user drags the first object.
In one possible design, the first mark is a graphical identifier or a label, the first object includes at least one object, and the label indicates the total number of first objects that the second window does not receive. In some embodiments, the first mark may be a specific symbol; in other embodiments, it may be a number indicating that total count. In this embodiment, the first mark may be gray or dark. With this implementation, the electronic device presents more detailed drag-effect information to the user while feeding back the drag effect, which further improves the user's operation experience.
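The gray layer and the first mark could be composited onto the third snapshot as in the following sketch, which uses standard Android graphics calls; the darkening factor, mark shape, and placement are illustrative assumptions, not values from the patent.

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.ColorMatrix;
import android.graphics.ColorMatrixColorFilter;
import android.graphics.Paint;

// One possible way to build the "second snapshot": copy the third snapshot,
// darken it with a gray layer, and stamp a rejection mark in a corner.
final class RejectionSnapshot {

    static Bitmap make(Bitmap thirdSnapshot) {
        Bitmap out = thirdSnapshot.copy(Bitmap.Config.ARGB_8888, /* mutable= */ true);
        Canvas canvas = new Canvas(out);

        // Gray/dark layer: scale RGB down to signal "cannot receive".
        ColorMatrix dim = new ColorMatrix();
        dim.setScale(0.4f, 0.4f, 0.4f, 1f);
        Paint dimPaint = new Paint();
        dimPaint.setColorFilter(new ColorMatrixColorFilter(dim));
        canvas.drawBitmap(thirdSnapshot, 0, 0, dimPaint);

        // First mark: a "no entry" style circle in the top-right corner.
        Paint mark = new Paint(Paint.ANTI_ALIAS_FLAG);
        mark.setColor(Color.RED);
        mark.setStyle(Paint.Style.STROKE);
        mark.setStrokeWidth(out.getWidth() * 0.02f);
        float r = out.getWidth() * 0.08f;
        float cx = out.getWidth() - r * 1.5f, cy = r * 1.5f;
        canvas.drawCircle(cx, cy, r, mark);
        canvas.drawLine(cx - r * 0.7f, cy + r * 0.7f, cx + r * 0.7f, cy - r * 0.7f, mark);
        return out;
    }
}
```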
In one possible design, displaying a first snapshot of the first object includes: after determining that a third snapshot of the first object has entered the second window, obtaining a first size of the third snapshot and the size of the second window, where the third snapshot is generated after the electronic device receives the drag instruction and before it determines that the position touched by the user enters the second window; and adjusting the third snapshot from the first size to a second size to obtain the first snapshot, where the ratio of the second size to the size of the second window is greater than or equal to a first threshold and less than or equal to a second threshold, and both thresholds are greater than 0 and less than 1. The electronic device can thus adjust the size of the first snapshot based on the size of the second window so that the snapshot looks best in the second window. With this implementation, the user can preview, during the drag, the size at which the first object would be received, which further improves the user's operation experience.
In one possible design, adjusting the third snapshot from the first size to the second size to obtain the first snapshot includes: reducing the third snapshot from the first size to the second size; or enlarging the third snapshot from the first size to the second size. With this implementation, the electronic device adaptively scales the snapshot to the size of the receiving window during the drag so that the user can preview the size of the received object, which further improves the user's operation experience.
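As a worked example of the two designs above, the following sketch computes the uniform scale factor that brings the snapshot into the allowed range. It reads "size" as area; this reading and the sample thresholds are assumptions, since the patent does not say whether areas, widths, or heights are compared.

```java
// Sketch of the size adaptation in the two designs above.
final class SnapshotScaler {

    /**
     * Returns the uniform scale factor turning the first size into a second
     * size whose ratio to the window size lies in [tLow, tHigh],
     * with 0 < tLow <= tHigh < 1. A factor below 1 reduces the third
     * snapshot; a factor above 1 enlarges it.
     */
    static float scaleFor(float snapW, float snapH, float winW, float winH,
                          float tLow, float tHigh) {
        float ratio = (snapW * snapH) / (winW * winH);
        if (ratio < tLow) {                  // too small: enlarge
            return (float) Math.sqrt(tLow / ratio);
        }
        if (ratio > tHigh) {                 // too large: reduce
            return (float) Math.sqrt(tHigh / ratio);
        }
        return 1f;                           // already within range
    }
}
```

For example, with tLow = 0.2 and tHigh = 0.6, a snapshot covering 5% of the window area would be enlarged by a factor of 2, and one covering 90% would be reduced by a factor of about 0.82.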
In one possible design, the second window includes an icon of a second object, and displaying the first snapshot corresponding to the first object includes: after determining that the third snapshot of the first object has entered the second window, obtaining a first size of the third snapshot, the size of the second window, and a third size of the icon; adjusting the third snapshot from the first size to a fourth size to obtain the first snapshot, and adjusting the icon from the third size to the fourth size, where the ratio of twice the fourth size to the size of the second window is greater than or equal to a first threshold and less than or equal to a second threshold, and both thresholds are greater than 0 and less than 1; and arranging the first snapshot and the resized icon according to the position touched by the user. When the second window already contains other objects, the electronic device can adaptively adjust the size and arrangement order of each icon according to the size of the second window and the number of objects in it, presenting a preview so that the user can see in advance how the received objects will be sized and ordered, which further improves the user's operation experience.
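A sketch of this shared-size design follows, under the same area reading of "size" as above; picking the midpoint of the threshold band and the insertion rule for the arrangement are both assumptions.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the shared-size design: the incoming snapshot and the existing
// icon both get a "fourth size" chosen so that twice that size stays within
// [tLow, tHigh] of the window size; items are then ordered by drop position.
final class SharedSizeLayout {

    /** Midpoint of the allowed band: 2 * fourth / window lands at (tLow + tHigh) / 2. */
    static double fourthSize(double windowArea, double tLow, double tHigh) {
        return windowArea * (tLow + tHigh) / 4.0;
    }

    /** Items already in the window, plus the dragged one at the touched slot. */
    static List<String> arrange(List<String> existing, String dragged, int touchedSlot) {
        List<String> out = new ArrayList<>(existing);
        out.add(Math.min(Math.max(touchedSlot, 0), out.size()), dragged);
        return out;
    }
}
```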
In one possible design, the first object includes at least one object, and displaying the first snapshot corresponding to the first object further includes: adding a second identifier to the third snapshot to obtain the first snapshot, where the second identifier indicates the total number of first objects that the second window can receive. With this implementation, the electronic device presents more detailed drag-effect information to the user while feeding back the drag effect, which further improves the user's operation experience.
In one possible design, after receiving the drag instruction input by the user and before displaying the first snapshot or the second snapshot, the method further includes: generating a third snapshot of the first object, where the first object includes at least one object; and adding a third identifier to the third snapshot, where the third identifier indicates the total number of first objects. In this embodiment, the electronic device also supports the user dragging at least two objects at a time; accordingly, during the drag, the electronic device can indicate the total number of objects being dragged. With this implementation, the electronic device presents more detailed drag-effect information to the user, which further improves the user's operation experience.
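A count badge such as the third identifier could be drawn onto the third snapshot with standard Android graphics calls, as in the sketch below; badge placement, colors, and sizes are illustrative assumptions.

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;

// Sketch of the third identifier: when the user drags several objects at
// once, a numeric badge with the total count is drawn on the third snapshot.
final class CountBadge {

    static Bitmap withCount(Bitmap thirdSnapshot, int totalObjects) {
        Bitmap out = thirdSnapshot.copy(Bitmap.Config.ARGB_8888, true);
        Canvas canvas = new Canvas(out);

        float r = out.getWidth() * 0.10f;
        float cx = out.getWidth() - r, cy = r;

        Paint circle = new Paint(Paint.ANTI_ALIAS_FLAG);
        circle.setColor(Color.RED);
        canvas.drawCircle(cx, cy, r, circle);

        Paint text = new Paint(Paint.ANTI_ALIAS_FLAG);
        text.setColor(Color.WHITE);
        text.setTextSize(r * 1.2f);
        text.setTextAlign(Paint.Align.CENTER);
        // Center the digits vertically around the circle center.
        float baseline = cy - (text.descent() + text.ascent()) / 2f;
        canvas.drawText(String.valueOf(totalObjects), cx, baseline, text);
        return out;
    }
}
```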
In a second aspect, an embodiment of this application provides an electronic device that has the functions of implementing the behavior of the electronic device in the foregoing method. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions. In one possible design, the electronic device includes a processor, a receiver, and a display; the processor is configured to enable the electronic device to perform the corresponding functions of the method, the receiver is configured to receive the instructions input by the user, and the display is configured to display the snapshots. The electronic device may further include a memory, coupled to the processor, that stores the program instructions and data necessary for the electronic device.
In a third aspect, the present application provides a computer storage medium, where instructions are stored, and when the instructions are executed on a computer, the instructions cause the computer to perform some or all of the steps of the interface display method in the first aspect and various possible implementations of the first aspect.
In a fourth aspect, the present application provides a computer program product, which when run on a computer, causes the computer to perform some or all of the steps of the interface display method in the first aspect and various possible implementations of the first aspect.
Drawings
To describe the technical solutions of the embodiments of this application more clearly, the following briefly describes the accompanying drawings used in the embodiments. A person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.
Fig. 1 is an interface schematic diagram of a drag scene according to an embodiment of the present disclosure;
fig. 2A is a schematic diagram of an exemplary hardware structure of the electronic device 100 according to an embodiment of the present disclosure;
fig. 2B is a schematic diagram of an exemplary software architecture of the electronic device 100 according to an embodiment of the present application;
fig. 3 is a schematic diagram of exemplary signal interaction inside an Android operating system according to an embodiment of the present application;
FIG. 4 is a method flowchart of an interface display method according to an embodiment of the present disclosure;
FIG. 5 is an interface diagram of a first exemplary drag scenario provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of the interface after the user lifts the finger in the dragging scene illustrated in FIG. 5 according to an embodiment of the present application;
FIG. 7 is an interface diagram of a second exemplary drag scenario provided by an embodiment of the present application;
FIG. 8 is an interface diagram of a third exemplary drag scenario provided by an embodiment of the present application;
FIG. 9 is an interface diagram of a fourth exemplary drag scenario provided by an embodiment of the present application;
FIG. 10 is an interface diagram of a fifth exemplary drag scenario provided by an embodiment of the present application;
FIG. 11 is an interface diagram of a sixth exemplary drag scenario provided by an embodiment of the present application;
FIG. 12 is a schematic diagram of an exemplary interface of a collaborative office scenario provided by an embodiment of the present application;
FIG. 13 is an interface diagram based on the exemplary drag scenario of FIG. 12 provided by an embodiment of the present application;
fig. 14A is a schematic diagram of an exemplary composition of an electronic device 140 provided in an embodiment of the present application;
fig. 14B is a schematic structural diagram of an electronic device 141 according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be described below clearly with reference to the drawings in the embodiments of the present application.
The terminology used in the following embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to limit the embodiments of the present application. As used in the specification and the appended claims, the singular forms "a", "an", "the", and "said" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that although the terms first, second, etc. may be used in the following embodiments to describe a class of objects, the objects should not be limited by these terms; the terms are only used to distinguish particular objects of that class from one another. For example, the terms first, second, etc. may be used to describe windows, but the windows should not be limited by these terms; the terms only distinguish different windows displayed on a display screen. The following embodiments may use the terms first, second, etc. for other classes of objects in the same way, and details are not repeated. Furthermore, the term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects.
An embodiment of this application provides an interface display method. When a first window and a second window are displayed on a screen of an electronic device at the same time, a user can drag a first object in the first window into the second window, and the first object is displayed differently depending on whether the application program corresponding to the second window supports the file type of the first object, providing timely and accurate feedback during the user's drag.
For example, when a document in the first window is dragged to the second window, if the application program corresponding to the second window supports the file type of the document, the document's snapshot is highlighted in the second window; if it does not, the snapshot is displayed darkened in the second window.
Embodiments of an electronic device, user interfaces for such an electronic device, and processes for using such an electronic device are described below.
In some embodiments, the electronic device may be a portable electronic device that also includes other functionality such as personal digital assistant and/or music player functionality, for example, a cell phone, a tablet, or a wearable electronic device with wireless communication capability (e.g., a smart watch). Exemplary embodiments of portable electronic devices include, but are not limited to, devices running an operating system; a portable electronic device may also be, for example, a laptop computer. It should also be understood that in other embodiments the electronic device may not be portable, but may be a desktop computer.
Fig. 2A shows a schematic structural diagram of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, an antenna, a wireless communication module 140, an audio module 150, a sensor module 160, a motor 170, a display screen 180, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110.
The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the electronic device 100.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 160B, etc. via different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 160B via an I2C interface, such that the processor 110 and the touch sensor 160B communicate via an I2C bus interface to implement the touch and drag functions of the electronic device 100.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 140. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 140 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 150 may transmit the audio signal to the wireless communication module 140 through the UART interface, so as to realize the function of playing music through the bluetooth headset.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display screen 180. The MIPI interface includes a display serial interface (DSI) and the like. In some embodiments, the processor 110 and the display screen 180 communicate through a DSI interface to implement the display function of the electronic device 100 and present the drag display effect to the user.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 183, the display screen 180, the wireless communication module 140, the audio module 150, the sensor module 160, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, to transfer data between the electronic device 100 and a peripheral device, or to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The wireless communication function of the electronic device 100 may be implemented by an antenna, the wireless communication module 140, a baseband processor, and the like.
The antenna is used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover one or more communication bands. Different antennas may also be multiplexed to improve antenna utilization; for example, an antenna may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.
The wireless communication module 140 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 140 may be one or more devices integrating at least one communication processing module. The wireless communication module 140 receives electromagnetic waves via an antenna, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 140 may also receive signals to be transmitted from the processor 110, frequency-modulate and amplify them, and convert them into electromagnetic waves for radiation via an antenna.
In some embodiments, the antenna of the electronic device 100 and the wireless communication module 140 are coupled so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
In some embodiments, the solution for WLAN wireless communication provided by wireless communication module 140 may also enable the electronic device to communicate with devices in a network (e.g., devices that are cooperative with electronic device 100). Thus, the electronic device can perform data transmission with the cooperative device.
The electronic device 100 implements the display function via the GPU, the display screen 180, the application processor, and the like. The GPU is a microprocessor for image processing and connects the display screen 180 and the application processor. The GPU performs mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change interface display effects. In this embodiment, the display screen 180 may include a display and a touch device. The display outputs display content to the user, for example, the first snapshot and the second snapshot in this embodiment, the first identifier included in the second snapshot, and the display effect previewed before the user lifts the finger. The touch device receives drag operations input by the user on the display screen 180.
The display screen 180 is used to display a user interface (UI) in this embodiment. The display screen 180 includes a display panel, which may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
In some embodiments of the present application, when the display panel is made of OLED, AMOLED, FLED, or the like, the display screen 180 may be bent. Here, "may be bent" means that the display screen 180 can be bent at any position to any angle and held at that angle; for example, the display screen 180 may be folded left-right from the middle, or folded up-down from the middle.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, data such as music, photos, video, etc. are stored in an external memory card.
Internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may execute the above-mentioned instructions stored in the internal memory 121, so as to enable the electronic device 100 to execute the interface display method provided in some embodiments of the present application, and various functional applications, data processing, and the like. The internal memory 121 may include a program storage area and a data storage area. Wherein, the storage program area can store an operating system; the storage area may also store one or more application programs (e.g., gallery, contacts, etc.), etc. The storage data area may store data created during the use of the electronic device 100 (such as the number of objects dragged by the user, etc.), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The audio module 150 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 150 may also be used to encode and decode audio signals. In some embodiments, the audio module 150 may be disposed in the processor 110, or some functional modules of the audio module 150 may be disposed in the processor 110. The audio module 150 may include a speaker, a microphone, and a headphone interface, etc.
The sensor module 160 may include a pressure sensor 160A and a touch sensor 160B.
The pressure sensor 160A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, pressure sensor 160A may be disposed on display screen 180. The pressure sensor 160A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 160A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 180, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 160A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 160A.
Touch sensor 160B may also be referred to as a touch panel or touch-sensitive surface. The touch sensor 160B may be disposed on the display screen 180, and together they form a touchscreen, also called a "touch screen". The touch sensor 160B is used to detect touch operations and finger-lift operations on or near it, and may pass the detected operations to the processor 110 to determine the touch event type. The electronic device 100 can calculate the position the user touches and the position at which the user lifts the finger from the detection signal of the touch sensor 160B, and can recognize the user's drag operation from the continuous change of the touched position. Further, the electronic device 100 may provide visual output related to these operations (touch, finger-lift, and drag) through the display screen 180.
The sensor module 160 may also include a gyroscope sensor, a barometric sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, an ambient light sensor, a bone conduction sensor, and the like.
The motor 170 may generate a vibration cue. The motor 170 may be used for both an incoming call vibration prompt and a touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 170 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 180. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 2B is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 2B, the application package may include Applications (APP) such as camera, gallery, mailbox, bluetooth, memo, music, video, file management, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2B, the application framework layers may include a window manager, a view system, a drag manager, a content provider, a resource manager, a notification manager, and the like. The functional modules of the application framework layer may be integrated into the processor 110 illustrated in fig. 2A, and the functions of the application framework layer in this embodiment may be implemented by the hardware processor 110 illustrated in fig. 2A.
The window manager is used to manage window programs. Illustratively, the window manager may obtain the size of the display screen 180, determine whether there is a status bar, lock the screen, capture the screen, and the like. The window manager may also manage the distribution and window layout of each APP in the application layer, so that the display screen 180 can display two APP windows. In addition, the window manager can identify the file types supported by an APP, and can therefore determine whether an APP supports the file type of the object the user is dragging.
The view system includes visual interface elements, such as elements that display text and elements that display images, and may be used to build the display interface of an APP. A display interface may be composed of one or more views, for example, an interface including various APP icons. The view system may also construct the snapshot of a dragged object; a snapshot has attributes such as a size and an identifier, where the identifier may include a layer, a mark, and the like.
The drag manager may determine the location touched by the user and the snapshot of the corresponding object based on the detection signal reported by the touch sensor 160B. Further, the drag manager may control the corresponding snapshot to move on the display screen 180 along with the position touched by the user, so as to implement the drag function.
Illustratively, fig. 3, which parallels fig. 2B, shows the signal interaction inside the Android operating system. The window manager responds to a user's operation instruction and controls the window of the first APP and the window of the second APP to be displayed on the display screen 180. After obtaining the touch signal reported by the touch sensor 160B and the movement of the user's gesture, the view system draws a snapshot of the object corresponding to the touch signal, and the drag manager controls the snapshot to move along the user's gesture track. The window manager detects the position of the snapshot on the display screen 180. When it detects that the snapshot is within the window of the second APP, the window manager determines whether the second APP supports the file type corresponding to the snapshot and passes the result to the view system. The view system then adds a layer to the snapshot based on that result, using the layer's color and brightness to show the user whether the second APP will receive the corresponding object. In other embodiments, the view system may also add icons to the snapshot to indicate the number of objects being dragged, or may redraw the snapshot at the size the second APP allows so that the snapshot fits the size of the second APP's window.
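The following pseudocode summarizes this signal flow. The role interfaces are hypothetical and only mirror the modules in fig. 3; they are not actual Android framework classes.

```java
// Illustrative pseudocode of the signal flow described above.
final class DragFlowSketch {

    interface WindowManagerRole {
        boolean snapshotInsideSecondWindow(float x, float y);
        boolean secondAppSupports(String mimeType);
    }

    interface ViewSystemRole {
        void showAcceptedSnapshot(String mimeType);   // e.g. resized snapshot
        void showRejectedSnapshot(String mimeType);   // e.g. gray layer + mark
    }

    /** Drag-manager role: called for every reported touch position. */
    static void onDragLocation(WindowManagerRole wm, ViewSystemRole views,
                               float x, float y, String draggedMimeType) {
        if (!wm.snapshotInsideSecondWindow(x, y)) {
            return; // still outside the second window: keep the third snapshot
        }
        if (wm.secondAppSupports(draggedMimeType)) {
            views.showAcceptedSnapshot(draggedMimeType);
        } else {
            views.showRejectedSnapshot(draggedMimeType);
        }
    }
}
```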
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The resource manager provides various resources for the application, such as localized strings, icons, images, layout files, video files, and the like.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. Such as a notification manager used to inform download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, etc.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core libraries comprise two parts: one part is the function libraries that the Java language needs to call, and the other part is the Android core libraries.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs object life cycle management, stack management, thread management, security and exception management, garbage collection, and the like.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media libraries (media libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver, which are not limited in this embodiment of the present application.
For convenience of understanding, the following embodiments of the present application will specifically describe an interface display method provided by the embodiments of the present application by taking the electronic device 100 having the structure shown in fig. 2A and fig. 2B as an example, with reference to the accompanying drawings.
In this embodiment, the electronic device 100 may display the first window and the second window through the display screen 180. Based on this, as shown in fig. 4, the electronic device 100 receives a drag instruction input by a user, the drag instruction being used to instruct to drag a first object in a first window into a second window. If the application corresponding to the second window supports the file type of the first object, the electronic device 100 displays a first snapshot of the first object. If the application program corresponding to the second window does not support the file type of the first object, the electronic device 100 displays a second snapshot of the first object, where the second snapshot includes a first identifier, and the first identifier is used to indicate that the second window does not accept the first object. In this way, the electronic device 100 can feed back the dragging effect to the user in the process of dragging the object by the user, so as to improve the operation experience of the user.
The first window displayed by the electronic device 100 may be a window of a first APP, and the second window may be a window of a second APP. The first object may include text, an image, an audio file, and the like. During the drag, the dragged element displayed by the electronic device 100 may be a snapshot of the dragged object. Illustratively, after receiving the drag instruction, the electronic device 100 generates a third snapshot of the first object, which moves along the user's drag trajectory. After the third snapshot is dragged into the second window and before the user lifts the finger, the electronic device 100 may process the third snapshot to obtain the first snapshot or the second snapshot.
A snapshot may be a layer over the interface icon of the dragged object, with the same content as that icon. In some embodiments, the snapshot and the interface icon of its object may have the same size and aspect ratio, as with the image 13 and the snapshot 131 in fig. 5. In other embodiments, they may differ, as with the image 23 and the snapshot 231 in fig. 7.
In some embodiments, the first identifier includes a layer; in other embodiments, a mark; in still other embodiments, both a layer and a mark. The layer may be gray or dark, and the mark may be a specific symbol, so that displaying the layer and/or the mark reminds the user that the application corresponding to the second window cannot receive the first object.
In this embodiment of the application, dragging a first object from the window of the first APP into the window of the second APP essentially triggers the electronic device 100 to copy the first object from the first APP into the second APP. If the second APP supports the file type of the first object, or the corresponding window interface of the second APP can receive the first object, the electronic device 100 may copy the first object to the second APP. If the second APP does not support the file type of the first object, or its window interface cannot receive the first object, the electronic device 100 does not copy it; in response to the user lifting the finger (an up event), the electronic device 100 may move the snapshot of the first object back to the window of the first APP and hide it.
It should be noted that different interfaces or different windows of the same APP may differ in whether they can receive the same object. For example, if the first object is an image and the second APP is a mailbox APP, and the electronic device displays the mailbox APP's address window and mail body window, the address window cannot receive the first object but the mail body window can. As another example, if the first object is a video and the second APP is an instant messaging APP, the APP's address book interface cannot receive the first object, but the conversation interface with a contact can.
In addition, all scenes in the embodiments of this application occur before the user lifts the finger, that is, while the user's finger touches the screen of the electronic device. "Touching" may mean that the finger is in actual contact with the screen (for example, the screen of the mobile phone 10), or that the distance between the finger and the screen is less than 0.5 millimeter (mm); in practice, the distance at which a touch registers is determined by the touch sensitivity of the device. The user's finger leaving the screen may be regarded as the user lifting the finger (up).
Embodiments of the present application are described below with reference to a User Interface (UI) of the electronic device 100.
The UI in this embodiment is the medium through which an application program or the operating system interacts and exchanges information with the user; it converts between the internal form of information and a form acceptable to the user. The user interface of an application program is source code written in a specific computer language such as Java or the extensible markup language (XML); the interface source code is parsed and rendered on the terminal device and finally presented as content the user can recognize, such as images, text, audio and video files, and controls. A control, also called a component (widget), is a basic element of a user interface; typical controls include a toolbar, a menu bar, a text box, a button, and a scrollbar. The properties and contents of interface elements are defined by tags or nodes; for example, XML specifies the elements of an interface by nodes such as <Textview>, <ImgView>, and <VideoView>. A node corresponds to an interface element or attribute in the interface, and after parsing and rendering it is presented as user-visible content. In addition, the interfaces of many applications, such as hybrid applications, usually include web pages. A web page, also called a page, can be understood as a special control embedded in an application interface; it is source code written in a specific computer language, such as the hypertext markup language (HTML), cascading style sheets (CSS), or JavaScript (JS), and the web page source code can be loaded and displayed as user-recognizable content by a browser or by a web page display component with similar functionality. The specific content of a web page is likewise defined by tags or nodes in its source code; for example, HTML defines the elements and attributes of a web page by <p>, <img>, <video>, and <canvas>.
A commonly used presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, window, control, etc. displayed in the display screen 180 of the electronic device, where the control may include a visual interface element such as an icon, button, menu, tab, text box, dialog box, status bar, navigation bar, Widget, etc.
Scene one:
In the first scenario, while the user drags an object from a first APP window into a second APP window, the electronic device 100 presents different display effects according to how the second APP receives the corresponding object. The "dragging process" in this embodiment is the operation from the moment the user touches the object in the first APP window until the user lifts the finger. Taking the mobile phone 10 as an example, fig. 5 illustrates the user's drag and the interface display effects through a set of GUIs.
The GUI shown in (a) of fig. 5 is the display interface while the user performs a drag operation. As shown in fig. 5 (a), the screen of the mobile phone 10 displays a first window 11 and a second window 12, and the first window 11 includes an image 13. When the user's finger touches the image 13 and starts to move on the screen, the view system of the mobile phone 10 draws a snapshot of the image 13, resulting in the snapshot 131 illustrated in fig. 5 (a). In this embodiment, the snapshot 131 is an exemplary implementation of the third snapshot. While the user's finger keeps touching the snapshot 131 and continues to move, the touch sensor of the mobile phone 10 continuously detects signals; accordingly, the drag manager of the mobile phone 10 may control the snapshot 131 to move along the user's movement track based on the signals detected by the touch sensor, so as to implement the drag function.
It should be noted that, in the present embodiment, the mobile phone 10 may receive a screen-splitting operation input by the user and, in response, divide the screen into two areas. Thereafter, in response to the user's operation of opening an APP, the mobile phone 10 displays the first window 11 of the first APP in one of the two areas and the second window 12 of the second APP in the other area. This is not described in detail herein. The first APP and the second APP may be any APPs installed in the mobile phone 10. For example, the first APP is a gallery, and correspondingly the first window 11 is a window of the gallery; the second APP is, for example, a memo, and correspondingly the second window 12 is a window of the memo. This is not limited herein.
After the user drags the snapshot 131 into the second window 12, and before the user leaves the hand, the window manager of the mobile phone 10 may determine, according to the position detected by the touch sensor, that the snapshot 131 has entered the second window 12. Thereafter, the window manager can obtain the file type (i.e., image) of the image 13 according to the information of the object (i.e., the image 13) corresponding to the snapshot 131. Further, the window manager detects whether the file types supported by the second APP (i.e., the APP corresponding to the second window 12) include images. If they do, the window manager may transmit the determination result "supported" to the view system, and the view system performs no processing on the snapshot 131; the GUI shown in (a) of fig. 5 is updated to the GUI shown in (b) of fig. 5. After the user leaves the hand, the processor of the mobile phone 10 may copy the image 13 to the second window 12, and the view system draws the copy of the image 13 within the second window 12. If the file types supported by the second APP do not include images, the window manager may transmit the determination result "not supported" to the view system. The view system then adds an interface element to the snapshot 131, resulting in the snapshot 132, and the GUI shown in (a) of fig. 5 is updated to the GUI shown in (c) of fig. 5.
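The embodiments do not disclose source code for this file-type check. As a rough sketch only, on a platform with an Android-style drag-and-drop framework the check could be expressed as a MIME-type test inside a drag listener; the helper class, the view parameter, and the "image/*" pattern below are illustrative assumptions:

    import android.view.DragEvent;
    import android.view.View;

    public final class DropTargetHelper {
        // Attach a drag listener to the (hypothetical) root view of the
        // second APP's window. The MIME-type test stands in for the
        // window manager's "supported file type" check described above.
        public static void attach(View secondWindowView) {
            secondWindowView.setOnDragListener((View v, DragEvent event) -> {
                switch (event.getAction()) {
                    case DragEvent.ACTION_DRAG_STARTED:
                        // True if the dragged object's type is supported here.
                        return event.getClipDescription().hasMimeType("image/*");
                    case DragEvent.ACTION_DRAG_ENTERED:
                        // The snapshot entered the window before the user
                        // leaves the hand; the snapshot's appearance would
                        // be updated here (kept as-is, or gray layer + mark).
                        return true;
                    case DragEvent.ACTION_DROP:
                        // The user left the hand inside the window: the
                        // object (e.g. the image 13) would be copied here.
                        return true;
                    default:
                        return true;
                }
            });
        }
    }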
It will be appreciated that the mobile phone 10 typically determines where the user has dragged the snapshot by detecting the touch position of the user's finger on the screen. Based on this, in the above-described embodiment, when the position determined from the detection signal of the touch sensor enters the second window 12, the window manager determines that the snapshot 131 is located within the second window 12.
It can be understood that a "supported file type" of the second APP may refer to a type that, based on the software framework of the second APP, can be normally displayed, can run normally, and can respond to operation instructions to implement the corresponding functions. "Responding to operation instructions to implement the corresponding functions" means that after the object is dragged into the window of the second APP, the second APP can operate on the object according to its own functional attributes. For example, in the GUI illustrated in fig. 6, the second window 12 is a chat window of a social APP whose supported file types include that of the image 13. Accordingly, when the image 13 is dragged into the chat window in fig. 6 and it is detected that the user has left the hand, the social APP may send a copy of the image 13 to the contact corresponding to the chat window. For another example, if the second APP is a mailbox APP, the second window is a window of mail content; after the image 13 is dragged into the second window, the mailbox APP may insert a copy of the image 13 into the mail as an attachment. This is not described in detail herein. An "unsupported file type" of the second APP may refer to a type that, based on the software framework of the second APP, cannot be normally displayed, cannot run normally, or whose corresponding functions are limited.
In addition, the added interface element may include a layer (for example, a gray mask) and a mark such as a label.
The GUI shown in (b) of fig. 5 is the display interface when the second APP supports images. As shown in fig. 5 (b), the view system performs no processing on the snapshot 131 before the user leaves the hand, and the snapshot 131 is displayed normally within the second window 12, so that the user can visually perceive that the second APP will receive the image 13. In the present embodiment, the first snapshot of the image 13 is implemented as the snapshot 131.
The GUI shown in (c) of fig. 5 is the display interface when the second APP does not support images. In this scenario, before the user leaves the hand, the view system may add a gray layer and a mark 14 on top of the snapshot 131, so as to update the snapshot 131 to the snapshot 132. As shown in fig. 5 (c), the snapshot 132 appears dark or gray compared with the bright or colored snapshot 131, and the upper right corner of the snapshot 132 includes the mark 14, so that the user can visually perceive that the second APP will not receive the image 13. In the present embodiment, the second snapshot of the image 13 is implemented as the snapshot 132; accordingly, the gray layer and the mark 14 are the first identifier of the second snapshot.
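As a sketch of how the gray layer and the mark 14 might be composited onto the snapshot, assuming the snapshot is available as an Android bitmap; the color, alpha value, and badge geometry below are illustrative assumptions rather than values from the embodiments:

    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.Paint;

    public final class SnapshotDecorator {
        // Derive snapshot 132 from snapshot 131: draw a translucent gray
        // layer over the whole bitmap, then a round badge (the mark) in
        // the upper right corner.
        public static Bitmap addGrayLayerAndMark(Bitmap snapshot) {
            Bitmap result = snapshot.copy(Bitmap.Config.ARGB_8888, true);
            Canvas canvas = new Canvas(result);

            Paint gray = new Paint();
            gray.setColor(Color.argb(128, 80, 80, 80)); // translucent gray layer
            canvas.drawRect(0, 0, result.getWidth(), result.getHeight(), gray);

            Paint badge = new Paint(Paint.ANTI_ALIAS_FLAG);
            badge.setColor(Color.GRAY);
            float r = result.getWidth() * 0.06f;        // badge radius
            canvas.drawCircle(result.getWidth() - 1.5f * r, 1.5f * r, r, badge);
            return result;
        }
    }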
In other embodiments, while the mobile phone 10 presents the GUI shown in (c) of fig. 5, it may further output a sound to notify the user that the second APP does not support objects whose file type is image.

In other embodiments, while the mobile phone 10 presents the GUI shown in (c) of fig. 5, the user may further be notified by vibration that the second APP does not support objects whose file type is image.
Therefore, with this implementation, while the user drags an object, the mobile phone 10 can present different interface display effects based on whether the receiving APP supports the file type of the dragged object. In this way, the mobile phone 10 feeds the drag result back to the user during the drag itself, thereby improving the user's operation experience.
Scene two:
Scene two illustrates the interface display method of the electronic device 100 when the user drags at least two objects at a time. Fig. 7 again takes the mobile phone 10 as an example and illustrates, through a set of GUIs, the interface display effects of a user dragging at least two objects.
The GUI shown in (a) of fig. 7 is the display interface when the user selects the objects to be dragged. As shown in fig. 7 (a), the screen of the mobile phone 10 displays a first window 21 and a second window 22, and the first window 21 includes an image 23 and a text 24. Illustratively, in response to the user's long press on the image 23 or the text 24, the mobile phone 10 may display a check box 230 in the lower right corner of the image 23 and a check box 240 in the lower right corner of the text 24. In response to the user clicking the check boxes 230 and 240, the mobile phone 10 receives a selection instruction associating the image 23 with the text 24, and selects both the image 23 and the text 24. Thereafter, upon receiving a drag instruction input by the user, the mobile phone 10 executes the drag function, and the GUI shown in (a) of fig. 7 is updated to the GUI shown in (b) of fig. 7.
It should be noted that when the user touches any one of the image 23 and the text 24 and moves it, the mobile phone 10 may recognize the corresponding instruction as the aforementioned drag instruction.
The GUI shown in (b) of fig. 7 is the display interface while the user performs the drag operation. As shown in (b) of fig. 7, as the user's finger starts to move, the view system of the mobile phone 10 generates a snapshot 231 of the image 23, a snapshot 241 of the text 24, and a mark 25. The content of the mark 25 may be the total number of objects dragged by the user; in this example, it is 2. The view system then displays the two snapshots in a stack, with the mark 25 highlighted in the upper right corner area of the stacked snapshots. In addition, as the user's finger moves, the drag manager of the mobile phone 10 may control the snapshot 231, the snapshot 241, and the mark 25 to move along the user's movement track, so as to implement the drag function. Further, after detecting that the user has dragged the snapshot 231 and the snapshot 241 into the second window 22, the window manager detects whether the APP corresponding to the second window 22 supports objects whose file type is image and whether it supports objects whose file type is text. Thereafter, the view system adds interface elements to the snapshot 231 and the snapshot 241 based on the detection results of the window manager. Accordingly, the GUI shown in (b) of fig. 7 is updated to the GUI shown in (c) of fig. 7.
In other embodiments, if the user drags at least two objects at a time, the view system may only draw the snapshots of the at least two objects and display them in a stack, without adding a mark indicating the total number of objects.
It should be noted that, in order to optimize the display effect, after the view system generates the snapshot of the image 23 and the snapshot of the text 24, it may scale the two snapshots to the same size (the display effect shown in (b) of fig. 7), so that the stacked display looks better to the user. In other embodiments, the view system may generate snapshots with the same sizes as the image 23 and the text 24 themselves, and then display the snapshots directly in a stack without scaling. This is not limited herein.
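A minimal sketch of this same-size scaling, assuming the snapshots are Android bitmaps; taking the first snapshot's size as the common target is an assumption made here for illustration, not a choice fixed by the embodiments:

    import android.graphics.Bitmap;

    public final class StackScaler {
        // Scale the second snapshot to the first snapshot's size so the
        // two can be displayed in a neat stack.
        public static Bitmap[] toCommonSize(Bitmap first, Bitmap second) {
            int w = first.getWidth();
            int h = first.getHeight();
            return new Bitmap[] {
                    first,
                    Bitmap.createScaledBitmap(second, w, h, /* filter= */ true)
            };
        }
    }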
The GUI shown in (c) of fig. 7 is the display interface after the user drags the snapshot 231 and the snapshot 241 into the second window 22. In this embodiment, the APP corresponding to the second window 22, for example, supports objects whose file type is image but does not support objects whose file type is text. Accordingly, the view system sets a mark 26 in the upper right corner area of the snapshot 231 to obtain the snapshot 232, adds a gray layer on top of the snapshot 241, and sets a mark 27 in its upper right corner area to obtain the snapshot 242. The mark 26 is highlighted, and its content is, for example, the total number of objects that the APP corresponding to the second window 22 can receive. The mark 27 is gray, and its content is, for example, the total number of objects that the APP corresponding to the second window 22 cannot receive. As shown in (c) of fig. 7, the view system changes the stacked snapshots 231 and 241 of (b) in fig. 7 into the snapshots 232 and 242 tiled on the interface, so as to show the user the entire snapshot 232 and the entire snapshot 242. The upper right corner area of the snapshot 232 contains the highlighted mark 26 with content "1", indicating that the second window 22 can receive 1 of the two objects. The snapshot 242 appears dark or gray, and its upper right corner area contains the mark 27 displayed in dark or gray with content "1", indicating that the second window 22 cannot receive 1 of the two objects.
In other embodiments, after the user drags the snapshot 231 and the snapshot 241 into the second window 22, the view system may only add the gray layer and the mark 27 on top of the snapshot 241, without processing the snapshot 231, and may then tile the snapshot 231 and the snapshot with the gray layer and the mark 27 added on the interface. The snapshot after the gray layer and the mark 27 are added is similar in display effect to the snapshot 242 shown in fig. 7 (c), and is not described in detail here.
In other embodiments, if the user drags at least three objects at a time, after the snapshots of the at least three objects are dragged into the second window, the view system may control the snapshots to be displayed in two groups: the snapshots of the objects that the second APP can receive form one group, in which at least one snapshot is displayed in a stack together with the total number of objects that the second APP can receive; the snapshots of the objects that the second APP cannot receive form another group, in which at least one snapshot may be displayed in a stack together with the total number of objects that the second APP cannot receive.
Illustratively, the GUI shown in FIG. 8 is a display interface when a user drags four objects. As shown in fig. 8, in response to the user's drag operation in the first window 31, the view system controls the snapshots of the four objects to be displayed in a stack, and sets a highlighted mark whose content is the number 4 (not shown in the figure) in the upper right corner area of the stacked snapshots. In response to the touched position entering the second window 32 (i.e., the four objects being dragged into the second window 32), the window manager determines, for example, that the second window 32 can receive two of the objects and cannot receive the other two. Further, as shown in FIG. 8, the view system controls the two snapshots that the second window 32 can receive to be displayed in a stack, and marks a highlighted mark 33 in the upper right corner area of the stacked snapshots, the mark 33 indicating the numeral "2". In addition, the view system adds gray layers to the two snapshots that the second window 32 cannot receive, controls these two snapshots to be displayed in a stack, and marks a gray mark 34 in the upper right corner area of the stacked snapshots, the mark 34 indicating the numeral "2".
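A plain-Java sketch of this grouping step: the dragged objects are partitioned according to the window manager's file-type result, and the two group sizes supply the numbers shown by the highlighted mark 33 and the gray mark 34. The DraggedObject type is hypothetical:

    import java.util.ArrayList;
    import java.util.List;

    public final class DragGroups {
        public static final class DraggedObject {
            final String name;
            final boolean supported; // result of the file-type check
            DraggedObject(String name, boolean supported) {
                this.name = name;
                this.supported = supported;
            }
        }

        // Partition into the group the second APP can receive and the
        // group it cannot; the group sizes become the mark contents.
        public static int[] groupCounts(List<DraggedObject> dragged) {
            List<DraggedObject> receivable = new ArrayList<>();
            List<DraggedObject> rejected = new ArrayList<>();
            for (DraggedObject o : dragged) {
                (o.supported ? receivable : rejected).add(o);
            }
            return new int[] { receivable.size(), rejected.size() }; // e.g. {2, 2}
        }
    }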
In other embodiments, if the user drags at least three objects at a time, after the snapshots of the at least three objects are dragged into the second window, the view system may control the snapshots to be displayed in two groups: the snapshots of the objects that the second APP can receive form one group, in which at least one snapshot is tiled for display, with the sequence number of each snapshot shown in its upper right corner area; the snapshots of the objects that the second APP cannot receive form another group, in which at least one snapshot may likewise be tiled, with the sequence number of each snapshot shown in its upper right corner area.
Illustratively, the GUI shown in FIG. 9 is another display interface when a user drags four objects. As shown in fig. 9, in response to the user's drag operation in the first window 41, the view system controls the snapshots of the four objects to be displayed in a stack, and highlights a mark whose content is the number 4 (not shown in the figure) in the upper right corner area of the stacked snapshots. In response to the touched position entering the second window 42, the window manager determines, for example, that the second window 42 can receive two of the objects and cannot receive the other two. Further, as shown in fig. 9, the view system controls the two snapshots that the second window 42 can receive to be tiled in the horizontal direction from the user's perspective, where the upper right corner area of the snapshot arranged on the left highlights a mark 43 containing the number 1, and the upper right corner area of the snapshot arranged on the right highlights a mark 44 containing the number 2. In addition, the view system adds a gray layer to the two snapshots that the second window 42 cannot receive, and controls these two snapshots to be tiled horizontally, from the user's perspective, below the two snapshots that the second window 42 can receive. The mark 45 containing the number 1 in the upper right corner area of the snapshot arranged on the left is gray, and the mark 46 containing the number 2 in the upper right corner area of the snapshot arranged on the right is gray.
It is to be understood that fig. 7 to 9 are only schematic illustrations and do not limit the embodiments of the present application. In other embodiments, the user may drag more or fewer objects, and other display effects may be presented after the snapshots of the objects are dragged into the window of the second APP. For example, the marks containing numbers in the above embodiments may instead be set in the lower right corner area of the snapshots. The embodiments of the present application do not limit this.
With this implementation, the mobile phone 10 can present more detailed drag-result information to the user while feeding back the drag effect, which can further improve the user's operation experience.
Scene three:
In the embodiments of the present application, the size of the snapshot of the dragged object drawn by the view system is, for example, the first size. In scene three, if the second APP can receive the dragged object, then after detecting that the snapshot of the object has entered the window of the second APP (the second window), the electronic device 100 may adaptively adjust the snapshot from the first size to a second size according to the size of the second window, so that the snapshot is located entirely within the window and the size ratio between the snapshot and the window achieves the best visual effect.
In practical implementations, in some embodiments the electronic device 100 may reduce the snapshot from the first size to the second size, as in the scenario illustrated in fig. 10 (a). In other embodiments, the electronic device 100 may enlarge the snapshot from the first size to the second size, as in the scenario illustrated in fig. 10 (b).
In some embodiments, if the snapshot entering the second window is the snapshot of the first object received by the second APP, the electronic device 100 may adjust the size of the snapshot so that the ratio of the snapshot's size to the window's size is greater than or equal to a first threshold and less than or equal to a second threshold, so as to optimize the visual effect of the snapshot in the second window. Both the first threshold and the second threshold are greater than 0 and less than 1; the first threshold is, for example, 50%, and the second threshold is, for example, 70%. This is not limited herein. The following describes the interface display method of scene three by taking the mobile phone 10 as an example, with reference to the set of GUIs illustrated in fig. 10.
It should be noted that a "size" may include two parameters: the number of pixels (px) in the x-axis direction and the number of pixels in the y-axis direction. In the embodiments of the present application, the pixels in the snapshot's x-axis direction are referred to as the snapshot's "width", and the pixels in the snapshot's y-axis direction as its "height".
The GUI shown in fig. 10 (a) is a display interface in which the mobile phone 10 reduces the size of the snapshot. As shown in fig. 10 (a), in response to the user's operation of dragging the image 50, the view system of the mobile phone 10 draws a snapshot of the image 50, resulting in the snapshot 51, which is, for example, 1080 px wide and 500 px high. In this embodiment, the second APP includes, for example, a window 52 and a window 53, and the user drags the snapshot 51 into the window 52, which is, for example, 1500 px wide and 400 px high. After detecting that the user has dragged the snapshot 51 into the window 52 of the second APP and determining that the second APP supports objects whose file type is image, the view system of the mobile phone 10 adjusts the width of the snapshot 51 to, for example, 1000 px and its height to, for example, 380 px to obtain the snapshot 54, so that the area of the snapshot 54 occupies about 63% of the area of the window 52. In this way, in response to the user's leaving-hand operation, the mobile phone 10 may generate a new image with the width and height of the snapshot 54. In the present embodiment, the first snapshot of the image 50 is implemented as the snapshot 54.
The GUI shown in (b) of fig. 10 is a display interface in which the mobile phone 10 enlarges the size of the snapshot. In this embodiment, the user drags the snapshot 51 into the window 53, which is, for example, 1500 px wide and 1200 px high. As shown in (b) of fig. 10, after detecting that the user has dragged the snapshot 51 into the window 53 of the second APP, the view system of the mobile phone 10 adjusts the width of the snapshot 51 to, for example, 1200 px and its height to, for example, 900 px to obtain the snapshot 55, so that the area of the snapshot 55 occupies 60% of the area of the window 53. In this way, in response to the user's leaving-hand operation, the mobile phone 10 may generate a new image with the width and height of the snapshot 55. In the present embodiment, the first snapshot of the image 50 is implemented as the snapshot 55.
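The concrete resize rule is not disclosed by the embodiments. The following plain-Java sketch shows one way to bring the snapshot's area to between the first threshold (50%) and the second threshold (70%) of the window's area; it scales width and height uniformly and clamps the result to the window bounds, whereas the examples above adjust width and height independently, so its outputs only approximate the figures:

    public final class SnapshotScaler {
        static final double LOW = 0.50;    // first threshold
        static final double HIGH = 0.70;   // second threshold
        static final double TARGET = 0.60; // assumed target ratio within [LOW, HIGH]

        // Return the adjusted {width, height} of the snapshot.
        public static int[] fit(int snapW, int snapH, int winW, int winH) {
            double ratio = (double) (snapW * snapH) / ((double) winW * winH);
            double scale = 1.0;
            if (ratio < LOW || ratio > HIGH) {
                scale = Math.sqrt(TARGET * winW * winH / ((double) snapW * snapH));
            }
            // Never let the snapshot spill outside the window.
            scale = Math.min(scale, Math.min((double) winW / snapW,
                                             (double) winH / snapH));
            return new int[] { (int) (snapW * scale), (int) (snapH * scale) };
        }

        public static void main(String[] args) {
            int[] reduced = fit(1080, 500, 1500, 400);   // ~864 x 400, ~58% of window
            int[] enlarged = fit(1080, 500, 1500, 1200); // ~1499 x 694, ~58% of window
        }
    }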
In other implementations, the view system may also add highlighted marks to the snapshot 54 and the snapshot 55, the content of which is, for example, 1.
It is to be understood that fig. 10 is only a schematic illustration, and the dimensions shown in fig. 10 (a) and fig. 10 (b) do not limit the embodiments of the present application. In other embodiments, if the electronic device 100 is implemented as a device with a larger screen, such as a folding-screen mobile phone, the sizes of the window and the snapshot may be larger. This is not described in detail herein.
With this implementation, the mobile phone 10 can adaptively adjust the size of the snapshot according to the size of the receiving window during the user's drag, so that the user can preview the size of the object to be received, which can further improve the user's operation experience.
Scene four:
Scene three describes the interface display method of the electronic device 100 in a scenario where the second APP receives the first object. If the second APP goes on to receive a second object and further objects, the electronic device 100 may display the preview effect of each object according to the size of the window and the number of received objects. The electronic device 100 may further display a preview of the relative positions of the snapshot of the second object and the icon of the first object according to the position touched during the user's drag.
For example, the electronic device 100 may adjust the size of each object interface element in the window under the condition that the ratio of the sum of the sizes of all object icons in the window to the size of the window is greater than or equal to the first threshold and less than or equal to the second threshold, so as to show the user a preview of the interface after the hand leaves. The first threshold and the second threshold are as described in the above embodiments and are not repeated here.
Taking the mobile phone 10 as an example and with reference to the set of GUIs illustrated in fig. 11, the following describes the interface display method of the electronic device 100 in an implementation scenario in which two objects are dragged one after another.
The GUI shown in fig. 11 (a) is an exemplary display interface of the mobile phone 10. As shown in fig. 11 (a), the second APP has already received the first object 61, and the icon of the first object 61 is shown in the second window 60. In this embodiment, the second window 60 is, for example, 1500 px wide and 1300 px high, and the icon of the first object 61 is, for example, 1200 px wide and 1100 px high. After it is detected that the user has dragged the snapshot 62 into the second window 60 and that the second APP supports the file type corresponding to the snapshot 62, the GUI shown in (a) of fig. 11 is updated, before the user leaves the hand, to the GUI shown in (b) of fig. 11.
The GUI shown in fig. 11 (b) is an exemplary preview interface of the mobile phone 10 while the user drags the second object to the second APP. As shown in (b) of fig. 11, the window manager of the mobile phone 10 detects that the position touched by the user is on the right side of the first object 61; accordingly, the view system of the mobile phone 10 may narrow the width of the icon of the first object 61 to 720 px while keeping its height unchanged, resulting in the snapshot 63. The view system further adjusts the width of the snapshot 62 to 720 px and its height to 1300 px to obtain the snapshot 64, and places the snapshot 64 to the right of the snapshot 63, so as to present a preview of the first object 61 and the second object and show the user the display effect in advance. Further, after detecting that the user has left the hand, the view system arranges the copied content of the second object on the right side of the first object in accordance with the preview effect illustrated in (b) of fig. 11, with the sizes of the second object's icon and the first object's icon as described above.
In other embodiments, the GUI shown in fig. 11 (c) is another exemplary preview interface of the mobile phone 10 while the user drags the second object to the second APP. As shown in fig. 11 (c), the window manager of the mobile phone 10 detects that the position touched by the user is on the left side of the first object 61; the view system therefore places the snapshot 64 on the left side of the snapshot 63, resizing the icon of the first object 61 and the snapshot 63 to the corresponding sizes in fig. 11 (b), so as to present another preview of the first object 61 and the second object. Further, after detecting that the user has left the hand, the view system arranges the copied content of the second object on the left side of the first object in accordance with the preview effect illustrated in (c) of fig. 11, with the sizes of the second object's icon and the first object's icon as described above.
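A small sketch of this two-object preview layout: the available window width is split evenly between the existing icon and the incoming snapshot, and the touched position decides on which side of the icon the snapshot is placed. The gap value is an illustrative assumption chosen so that the numbers match the example above:

    public final class PreviewLayout {
        private static final int GAP = 60; // assumed spacing between the two elements, px

        // Width of each element after the window is split in two,
        // e.g. (1500 - 60) / 2 = 720 px as in fig. 11.
        public static int elementWidth(int windowWidth) {
            return (windowWidth - GAP) / 2;
        }

        // fig. 11 (b): touch to the right of the icon -> snapshot on the right;
        // fig. 11 (c): touch to the left of the icon  -> snapshot on the left.
        public static boolean placeSnapshotOnRight(int touchX, int iconCenterX) {
            return touchX > iconCenterX;
        }
    }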
It is to be understood that fig. 11 is a schematic illustration only and is not to be construed as limiting the embodiments of the present application. In other embodiments, the icons or snapshots of the two objects may also be arranged one above the other, which is not limited herein.
In addition, in some other embodiments, if the user continues to drag a third object and further objects into the second window 60, the mobile phone 10 may adaptively adjust the sizes of the icons of the received objects and the size of the snapshot before the hand leaves, according to the total number of objects received by the second APP. The mobile phone 10 may also adjust the arrangement order of the icons in response to the position touched by the user's finger. This is not described in detail herein.
With this implementation, during the user's drag the mobile phone 10 can adaptively adjust the size and arrangement order of each icon according to the size of the receiving window and the number of received objects, so that the user can preview the size and arrangement order of the objects to be received, which can further improve the user's operation experience.
In addition, although each of the above scenes one to four is described by taking a mobile phone with a non-flexible screen as an example, the embodiments illustrated in these scenes are also applicable to a mobile phone with a flexible screen. This is not described in detail herein.
It is to be understood that fig. 5 to 11 are only schematic illustrations and do not limit the embodiments of the present application. In other embodiments, if the electronic device 100 is implemented as another type of device, it may also determine the user's drag operation from instructions entered through a touch panel or a mouse. In addition, in a scenario where the second window cannot receive the object (that is, the second APP does not support the corresponding file type), the electronic device 100 may also notify the user through other interface display effects. For example, in scene one, if the second window cannot receive the image 13, the mobile phone 10 may further display a dialog box whose content may be, for example, the reminder "x cannot receive the image 13", where "x" is the name of the second APP. This is not limited herein.
The above is a description taking an example in which one electronic device presents two windows of APPs through split-screen. In other embodiments, the interface display method according to the embodiment of the present application is also applicable to a scenario in which at least two devices work cooperatively.
In one collaborative office scenario, as shown in fig. 12, a first device may be wirelessly connected to a second device; then, in response to the user's collaborative office setting, a collaboration window of the second device may be displayed on the interface of the first device. Further, the first device may present the applications of the second device locally, so that the user can operate the second device from the first device side.
For example, taking the first device as a notebook computer and the second device as a mobile phone, after the first device and the second device establish a connection for collaborative office, the operation interface of the notebook computer 30 is shown in the GUI illustrated in fig. 13.
The GUI shown in fig. 13 (a) is the display interface of the notebook computer after it establishes the collaborative office connection with the mobile phone. As shown in fig. 13 (a), the screen of the notebook computer 30 displays the application programs of the notebook computer 30 and the collaboration interface 71 of the mobile phone. The collaboration interface of the mobile phone contains interface elements such as the APP icons of the mobile phone's main interface. When the notebook computer receives an instruction in which the user clicks one of its own application programs, the window of the corresponding application program can be displayed on the notebook computer's screen; when it receives an instruction in which the user clicks an APP in the collaboration interface 71, the window of the corresponding APP can be displayed in the collaboration interface 71. For example, in response to the user's instruction to click a mailbox on the notebook computer and the user's instruction to click a gallery in the collaboration interface 71, the GUI shown in (a) of fig. 13 is updated to the GUI shown in (b) of fig. 13.
The GUI shown in (b) of fig. 13 is a display interface in which the notebook computer displays two windows. As shown in fig. 13 (b), the screen of the notebook computer displays a window 72 of the mailbox and a window 73 of the gallery, where the window 73 is displayed in the collaboration interface 71 and shows at least one image. In response to the user's operation instruction to drag any image from the window 73 into the window 72, the notebook computer may execute the embodiment illustrated in any one of scenes one to four. Similarly, the notebook computer may respond to the user's operation instruction to drag any object from the window 72 into the window 73 and execute the embodiment illustrated in any one of scenes one to four. This is not described in detail herein.
It is to be understood that fig. 12 to 13 are only schematic illustrations and do not limit the embodiments of the present application. In other embodiments, in response to the user clicking to open other applications, the notebook computer may also display the windows of those applications. In addition, if the devices in the collaborative office are other devices, the display interface may differ from the above embodiments; for example, the collaborative office interface may display more windows. This is not limited herein.
In summary, with the implementations of the embodiments of the present application, while the user drags an object, the electronic device can process the snapshot of the object and present different display effects according to whether the target APP supports the file type of the object and according to the size of the target APP's window. The electronic device can thus show the user, during the drag itself, whether the target APP will receive the object, which can improve the user's experience.
The above embodiments have introduced various aspects of the interface display method provided in the present application from the perspective of the hardware structure of the electronic device and the actions performed by each piece of software and hardware. Those skilled in the art should readily appreciate that the processing steps of receiving a dragging instruction, detecting whether the APP corresponding to the second window supports the file type of the dragged object, displaying different snapshots, and the like, which are described in connection with the embodiments disclosed herein, can be implemented not only in hardware, but also in a combination of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
For example, the electronic device 100 may implement the corresponding functions in the form of functional modules. In some embodiments, an electronic device may include a receiving module, a processing module, and a display module. The receiving module may be configured to perform the receiving of the instructions and information in any of the embodiments illustrated in fig. 4 to 13. The display module may be configured to perform the display of the window and the snapshot in any of the embodiments illustrated in fig. 4-13 above. The processing module may be configured to perform operations other than the receipt of instructions and information and the display of windows and snapshots in any of the embodiments illustrated in fig. 4-13 above.
It is understood that the above division of the modules is only a division of logical functions, and in actual implementation, the functions of the receiving module may be implemented by being integrated into a receiver, the functions of the processing module may be implemented by being integrated into a processor, and the functions of the display module may be implemented by being integrated into a display. As shown in fig. 14A, the electronic device 140 includes a receiver 1401, a processor 1402, and a display 1403. The receiver 1401 may perform the reception of instructions and information as described above in any of the embodiments illustrated in fig. 4-13. The display 1403 may be used to perform the display of the windows and snapshots in any of the embodiments illustrated in fig. 4-13 above. The processor 1402 may be used to perform operations other than the receipt of instructions and information and the display of windows and snapshots in any of the embodiments illustrated in fig. 4-13 described above.
For example, the display 1403 displays a first window and a second window. Based on this, the receiver 1401 may be configured to receive a drag instruction input by a user, where the drag instruction is used to instruct the electronic device to drag a first object in the first window into the second window. The processor 1402 may be configured to control the display to display a first snapshot of the first object when the application corresponding to the second window supports the file type of the first object. The processor 1402 may further be configured to control the display to display a second snapshot of the first object when the application program corresponding to the second window does not support the file type of the first object, where the second snapshot includes a first identifier, and the first identifier is used to indicate that the second window does not accept the first object.
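A compact sketch of this control flow follows; the Display interface and the snapshot parameters below are placeholders used only to illustrate the branching, and are not part of the embodiments:

    public final class SnapshotController {
        interface Display { void show(Object snapshot); }

        // Choose which snapshot the display shows, based on whether the
        // application of the second window supports the first object's
        // file type (the check performed by the processor).
        public static void onSnapshotEnteredSecondWindow(
                boolean fileTypeSupported,
                Object firstSnapshot,
                Object secondSnapshotWithIdentifier,
                Display display) {
            if (fileTypeSupported) {
                display.show(firstSnapshot);
            } else {
                display.show(secondSnapshotWithIdentifier); // includes the first identifier
            }
        }
    }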
Fig. 14A is a diagram illustrating an electronic device according to the present application from the perspective of a separate functional entity. In another implementation scenario, the functional entities running independently may be integrated into one hardware entity. As shown in fig. 14B, in this implementation scenario, the electronic device 141 may include a processor 1411, a transceiver 1412, and a memory 1413.
It should be understood that the electronic device 141 of the embodiment of the present application may correspond to the electronic device shown in fig. 4, the mobile phone 10 in the method shown in fig. 5 to 11, and the notebook computer in the embodiment shown in fig. 12 and 13. The transceiver 1412 is configured to execute the reception of the instruction executed by the electronic device illustrated in fig. 4 to 13, the memory 1413 may be configured to store a code, and the processor 1411 is configured to execute the code stored in the memory 1413, so as to implement other processing performed by the electronic device illustrated in fig. 4 to 13 except for the reception of the instruction and the snapshot display, which is not described herein again.
For specific content, reference may be made to the related description of the embodiments corresponding to fig. 4 to fig. 13, which is not repeated herein.
In a specific implementation, a computer storage medium is further provided corresponding to the electronic device in the embodiments of the present application, where the computer storage medium provided in the electronic device may store a program, and when the program is executed, part or all of the steps in each embodiment of the interface display method provided in fig. 4 to 13 may be implemented. The storage medium in any device may be a magnetic disk, an optical disk, a read-only memory (ROM), a Random Access Memory (RAM), or the like.
One or more of the above modules or units may be implemented in software, hardware or a combination of both. When any of the above modules or units are implemented in software, which is present as computer program instructions and stored in a memory, a processor may be used to execute the program instructions and implement the above method flows. The processor may include, but is not limited to, at least one of: various computing devices that run software, such as a Central Processing Unit (CPU), a microprocessor, a Digital Signal Processor (DSP), a Microcontroller (MCU), or an artificial intelligence processor, may each include one or more cores for executing software instructions to perform operations or processing. The processor may be built in an SoC (system on chip) or an Application Specific Integrated Circuit (ASIC), or may be a separate semiconductor chip. The processor may further include a necessary hardware accelerator such as a Field Programmable Gate Array (FPGA), a PLD (programmable logic device), or a logic circuit for implementing a dedicated logic operation, in addition to a core for executing software instructions to perform an operation or a process.
When the above modules or units are implemented in hardware, the hardware may be any one or any combination of a CPU, a microprocessor, a DSP, an MCU, an artificial intelligence processor, an ASIC, an SoC, an FPGA, a PLD, a dedicated digital circuit, a hardware accelerator, or a discrete device that is not integrated, which may run necessary software or is independent of software to perform the above method flows.
When the above modules or units are implemented using software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the invention are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website site, computer, server, or data center to another website site, computer, server, or data center via wired (e.g., coaxial cable, fiber optic, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave, etc.) means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that incorporates one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk (SSD)), among others.
It should be understood that, in the various embodiments of the present application, the size of the serial number of each process does not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments.
Each part of this specification is described in a progressive manner; for identical or similar parts among the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments. In particular, as to the apparatus and system embodiments, since they are substantially similar to the method embodiments, the description is relatively brief, and reference may be made to the description of the method embodiments where relevant.
The above-mentioned embodiments, objects, technical solutions and advantages of the present invention are further described in detail, it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made on the basis of the technical solutions of the present invention should be included in the scope of the present invention.

Claims (19)

1. An interface display method is applied to an electronic device, wherein a screen of the electronic device displays a first window and a second window, and the method comprises the following steps:
receiving a dragging instruction input by a user, wherein the dragging instruction is used for instructing the electronic equipment to drag a first object in the first window into the second window;
if the application program corresponding to the second window supports the file type of the first object, displaying a first snapshot of the first object;
and if the application program corresponding to the second window does not support the file type of the first object, displaying a second snapshot of the first object, wherein the second snapshot comprises a first identifier, and the first identifier is used for indicating that the second window does not accept the first object.
2. The method of claim 1, wherein displaying the second snapshot of the first object comprises:
adding a layer to a third snapshot of the first object to obtain the second snapshot, wherein the third snapshot is generated after the electronic device receives the dragging instruction and before a position triggered by a user is determined to enter the second window.
3. The method of claim 1 or 2, wherein displaying the second snapshot of the first object comprises:
and adding a first mark to a third snapshot of the first object to obtain the second snapshot, wherein the third snapshot is generated after the electronic equipment receives the dragging instruction and before a position triggered by a user is determined to enter the second window.
4. The method of claim 3, wherein the first mark is a graphic identifier or a label, the first object comprises at least one object, and the label indicates the total number of first objects not received by the second window.
5. The method of claim 1, wherein displaying the first snapshot of the first object comprises:
after determining that a third snapshot of the first object enters the second window, acquiring a first size of the third snapshot and a size of the second window, wherein the third snapshot is generated after the electronic device receives the dragging instruction and before a position triggered by a user is determined to enter the second window;
and adjusting the third snapshot from the first size to a second size to obtain the first snapshot, wherein the ratio of the second size to the size of the second window is greater than or equal to a first threshold and smaller than or equal to a second threshold, and both the first threshold and the second threshold are greater than 0 and smaller than 1.
6. The method of claim 5, wherein adjusting the third snapshot from the first size to a second size, resulting in the first snapshot, comprises:
reducing the third snapshot from the first size to the second size to obtain the first snapshot; or,
and enlarging the third snapshot from the first size to the second size to obtain the first snapshot.
7. The method of claim 1, wherein the second window includes an icon of a second object, and wherein displaying the first snapshot corresponding to the first object comprises:
after determining that the third snapshot of the first object enters the second window, acquiring a first size of the third snapshot, a size of the second window, and a third size of the icon;
adjusting the third snapshot from the first size to a fourth size to obtain the first snapshot, and adjusting the icon from the third size to the fourth size, wherein a ratio of twice the fourth size to the size of the second window is greater than or equal to a first threshold and less than or equal to a second threshold, and both the first threshold and the second threshold are greater than 0 and less than 1;
and arranging the first snapshot and the icon after the size adjustment according to the position triggered by the user.
8. The method of any of claims 5-7, wherein the first object includes at least one object, the displaying a first snapshot corresponding to the first object, further comprising:
and adding a second identifier to the third snapshot to obtain the first snapshot, wherein the second identifier is used for indicating the total number of the first objects which can be received by the second window.
9. The method of any of claims 1-8, wherein, after receiving the drag instruction input by the user and before displaying the first snapshot or the second snapshot, the method further comprises:
generating a third snapshot of the first object, the first object comprising at least one object;
adding a third identifier to the third snapshot, wherein the third identifier is used for indicating the total number of the first objects.
10. An electronic device comprising a processor, a receiver, and a display, the display displaying a first window and a second window,
the receiver is used for receiving a dragging instruction input by a user, and the dragging instruction is used for instructing the electronic equipment to drag a first object in the first window into the second window;
the processor is configured to control the display to display a first snapshot of the first object when the application program corresponding to the second window supports the file type of the first object; and the processor is further configured to control the display to display a second snapshot of the first object when the application program corresponding to the second window does not support the file type of the first object, where the second snapshot includes a first identifier, and the first identifier is used to indicate that the second window does not accept the first object.
11. The electronic device of claim 10,
the processor is further configured to add a layer to a third snapshot of the first object to obtain the second snapshot, where the third snapshot is generated after the electronic device receives the drag instruction and before a position triggered by a user is determined to enter the second window.
12. The electronic device of claim 10 or 11,
the processor is further configured to add a first mark to a third snapshot of the first object to obtain the second snapshot, where the third snapshot is generated after the electronic device receives the drag instruction and before a position triggered by a user is determined to enter the second window.
13. The electronic device of claim 12,
the first mark is a graphic identifier or a label, the first object comprises at least one object, and the label is used for indicating the total number of the first objects which are not received by the second window.
14. The electronic device of claim 10,
the processor is further configured to obtain a first size of a third snapshot of the first object and a size of the second window after determining that the third snapshot enters the second window, where the third snapshot is generated after the electronic device receives the drag instruction and before determining that a position triggered by a user enters the second window;
the processor is further configured to adjust the third snapshot from the first size to a second size to obtain the first snapshot, where a ratio of the second size to the size of the second window is greater than or equal to a first threshold and is less than or equal to a second threshold, and both the first threshold and the second threshold are greater than 0 and less than 1.
15. The electronic device of claim 14,
the processor is further configured to reduce the third snapshot from the first size to the second size, so as to obtain the first snapshot; or,
and enlarging the third snapshot from the first size to the second size to obtain the first snapshot.
16. The electronic device of claim 10,
the processor is further configured to, after it is determined that a third snapshot of the first object enters the second window, obtain a first size of the third snapshot, a size of the second window, and a third size of an icon of a second object included in the second window;
the processor is further configured to adjust the third snapshot from the first size to a fourth size to obtain the first snapshot, and adjust the icon from the third size to the fourth size, where a ratio of twice the fourth size to the size of the second window is greater than or equal to a first threshold and smaller than or equal to a second threshold, and both the first threshold and the second threshold are greater than 0 and smaller than 1;
the processor is further configured to arrange the first snapshot and the resized icon according to a user-triggered position.
17. The electronic device of any one of claims 14-16,
the processor is further configured to add a second identifier to the third snapshot to obtain the first snapshot, where the first object includes at least one object, and the second identifier is used to indicate a total number of first objects that can be received by the second window.
18. The electronic device of any of claims 10-17,
the processor is further configured to generate a third snapshot of the first object, the first object comprising at least one object;
the processor is further configured to add a third identifier to the third snapshot, where the third identifier is used to indicate a total number of the first objects.
19. A computer-readable storage medium, comprising a computer program which, when run on a computer, causes the computer to perform the method of any one of claims 1 to 9.
CN202010940657.1A 2020-09-09 2020-09-09 Interface display method and electronic equipment Pending CN114237778A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010940657.1A CN114237778A (en) 2020-09-09 2020-09-09 Interface display method and electronic equipment
PCT/CN2021/110459 WO2022052677A1 (en) 2020-09-09 2021-08-04 Interface display method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010940657.1A CN114237778A (en) 2020-09-09 2020-09-09 Interface display method and electronic equipment

Publications (1)

Publication Number Publication Date
CN114237778A true CN114237778A (en) 2022-03-25

Family

ID=80632595

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010940657.1A Pending CN114237778A (en) 2020-09-09 2020-09-09 Interface display method and electronic equipment

Country Status (2)

Country Link
CN (1) CN114237778A (en)
WO (1) WO2022052677A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114895815A (en) * 2022-06-17 2022-08-12 维沃移动通信有限公司 Data processing method and electronic equipment
CN117724780A (en) * 2023-07-03 2024-03-19 荣耀终端有限公司 Information acquisition method

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118502625A (en) * 2022-10-24 2024-08-16 荣耀终端有限公司 Information display method, electronic device and readable storage medium
CN115794413B (en) * 2023-01-09 2024-05-14 荣耀终端有限公司 Memory processing method and related device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100010072A (en) * 2008-07-22 2010-02-01 엘지전자 주식회사 Controlling method of user interface for multitasking of mobile devices
WO2018120533A1 (en) * 2016-12-27 2018-07-05 华为技术有限公司 Multi-screen display method and apparatus
CN110727382A (en) * 2019-09-06 2020-01-24 华为技术有限公司 Split-screen display method and electronic equipment
CN111221453A (en) * 2019-10-31 2020-06-02 华为技术有限公司 Function starting method and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
JP5362307B2 * | 2008-09-30 | 2013-12-11 | FUJIFILM Corporation | Drag and drop control device, method, program, and computer terminal
CN103279269B * | 2013-05-31 | 2016-03-02 | Huawei Technologies Co., Ltd. | Data interaction method and apparatus between applications, and terminal device
CN104216607B * | 2013-09-05 | 2017-10-20 | Hou Jintao | Icon dragging method and system for an HTML5-based virtual operating system
CN104932796B * | 2015-06-02 | 2018-05-08 | Wuxi Tianmai Juyuan Media Technology Co., Ltd. | Control method and device for component drag and drop
CN109782976B * | 2019-01-15 | 2020-12-22 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | File processing method, device, terminal and storage medium
CN110221759A * | 2019-05-31 | 2019-09-10 | Guangzhou Shiyuan Electronic Technology Co., Ltd. | Element dragging method and device, storage medium and interactive intelligent panel

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
KR20100010072A (en) * | 2008-07-22 | 2010-02-01 | LG Electronics Inc. | Controlling method of user interface for multitasking of mobile devices
WO2018120533A1 (en) * | 2016-12-27 | 2018-07-05 | Huawei Technologies Co., Ltd. | Multi-screen display method and apparatus
CN110727382A (en) * | 2019-09-06 | 2020-01-24 | Huawei Technologies Co., Ltd. | Split-screen display method and electronic equipment
CN111221453A (en) * | 2019-10-31 | 2020-06-02 | Huawei Technologies Co., Ltd. | Function starting method and electronic equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Liu Juntao's blog: "Win10 navigation pane cannot move files; Win10 File Explorer cannot drag files to a drive in the left pane", pages 1-5, retrieved from the Internet: https://www.cnblogs.com/lovebing/p/7396237.html *
Baidu Jingyan: "Copying, pasting, and deleting files", pages 1-5, retrieved from the Internet: https://jingyan.baidu.com/article/fa4125acd7d1e628ad709211.html *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
CN114895815A (en) * | 2022-06-17 | 2022-08-12 | Vivo Mobile Communication Co., Ltd. | Data processing method and electronic equipment
CN117724780A (en) * | 2023-07-03 | 2024-03-19 | Honor Device Co., Ltd. | Information acquisition method

Also Published As

Publication Number | Publication Date
WO2022052677A1 (en) | 2022-03-17

Similar Documents

Publication | Title
WO2020211709A1 (en) Method and electronic apparatus for adding annotation
WO2021159922A1 (en) Card display method, electronic device, and computer-readable storage medium
CN105683895B (en) User terminal device for providing user interaction and method thereof
KR102220085B1 (en) Operating Method For Multi-Window And Electronic Device supporting the same
US11687235B2 (en) Split-screen method and electronic device
CN110119296B (en) Method for switching parent page and child page and related device
JP2024020334A (en) Screen taking-in method and related device
CN111966252A (en) Application window display method and electronic equipment
CN114237778A (en) Interface display method and electronic equipment
EP3964937A1 (en) Method for generating user profile photo, and electronic device
WO2022062898A1 (en) Window display method and device
WO2021008334A1 (en) Data binding method, apparatus, and device of mini program, and storage medium
CN114816167A (en) Application icon display method, electronic device and readable storage medium
WO2022213831A1 (en) Control display method and related device
CN115661301A (en) Method for adding annotations, electronic device, storage medium and program product
CN115801943A (en) Display method, electronic device, and storage medium
WO2024125301A1 (en) Display method and electronic device
EP4421630A1 (en) Window display method and electronic device
WO2024087808A1 (en) Interface display method and electronic device
WO2023160455A1 (en) Object deletion method and electronic device
WO2024193402A1 (en) Display method and electronic device
WO2024187796A1 (en) Method and apparatus for magnification in display interface
CN118444867A (en) Display method and related device
CN115774511A (en) Annotating method and electronic equipment
KR20210022027A (en) Operating Method For Multi-Window And Electronic Device supporting the same

Legal Events

Code | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination