CN115185440A - Control display method and related equipment

Control display method and related equipment

Info

Publication number
CN115185440A
Authority
CN
China
Prior art keywords
interface
electronic device
target
control
controls
Prior art date
Legal status
Granted
Application number
CN202110372085.6A
Other languages
Chinese (zh)
Other versions
CN115185440B (en)
Inventor
周雪怡
徐杰
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110372085.6A
Priority to PCT/CN2022/083215 (published as WO2022213831A1)
Publication of CN115185440A
Application granted
Publication of CN115185440B
Status: Active

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 — Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 — Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 — Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 — Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0486 — Drag-and-drop
    • G06F 3/14 — Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 — Controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 9/00 — Arrangements for program control, e.g. control units
    • G06F 9/06 — Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 — Arrangements for executing specific programs
    • G06F 9/451 — Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a control display method and related equipment. The method includes: displaying a first interface, where the first interface includes a target object, the target object includes a first target point, and the first target point is located at a first position of the first interface; receiving and responding to a first drag operation directed at the first target point, determining a first area where the first position is located, and determining that the first target point is located at a second position of the first interface; if it is detected that the second position is in the first area and the duration of the first drag operation is greater than a first duration threshold, displaying a target control at a third position of the first interface; receiving and responding to a second drag operation directed at the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position, where the target frame includes N controls; and receiving and responding to a click operation on one of the N controls, displaying a second interface. The embodiments of the present application can improve the user's operation experience.

Description

Control display method and related equipment
Technical Field
The present application relates to the field of communications technologies, and in particular, to a control display method and a related device.
Background
With the improvement of people's living standards, electronic devices such as mobile phones, tablets, and computers can be seen everywhere in daily life. When using a smart device, a user often performs a drag operation to move an object such as a window or a file displayed in the device interface to a certain position. The movement may be a short- or long-distance movement within the current interface, or a movement across devices or across screens.
To implement cross-device or cross-screen movement, the electronic device often triggers the display of corresponding function controls in response to the user's drag operation on a window, and the user then needs to keep dragging the window until a certain function control is selected. As described above, during the drag the user often needs to keep holding the selected window without releasing it until it has been moved onto the desired function control. Where a long drag path is required, and especially when operating with a touch pad, this greatly increases the user's operation burden and reduces operation efficiency and experience. In addition, a window or file is moved within the current interface far more frequently than across devices or screens; if every drag of a window triggered the display of the corresponding function controls, the user's actual operation would suffer visual interference and the use experience would be reduced.
Therefore, how to improve the user's operation experience is an urgent problem to be solved.
Disclosure of Invention
Embodiments of the present application provide a control display method and related equipment to improve the user's operation experience.
In a first aspect, an embodiment of the present application provides a control display method, which is applied to a first electronic device, and may include:
displaying a first interface; wherein the first interface comprises a target object comprising a first target point, the first target point being located at a first location of the first interface; receiving and responding to a first dragging operation aiming at the first target point on the target object, determining a first area where the first position is located, and determining that the first target point is located at a second position of the first interface; if the second position is detected to be in the first area and the duration of the first dragging operation is longer than a first duration threshold, displaying a target control at a third position of the first interface; receiving and responding to a second drag operation aiming at the first target point, determining that the first target point is positioned at a fourth position of the first interface, and displaying a target frame at the third position; the fourth position is in a second area where the third position is located; the target frame comprises N controls, wherein N is an integer greater than or equal to 1; and receiving and responding to the click operation of one control in the N controls, and displaying a second interface.
With the method provided by the first aspect, the electronic device can determine the user's operation intention by detecting the duration and the movement range of the drag operation. If the user quickly drags a window or a file (i.e., the target object) away from its home position within a short time, the user may be considered to want only to move the window or file within the interface, and the display of the control for sharing or cross-screen (switching-screen) display (i.e., the target control, also called the transfer gate) need not be triggered. Conversely, if the user has not dragged the window or file out of a certain range around the home position (i.e., the first area) within a period of time, the user may be considered to want cross-device sharing or cross-screen display, and the target control can then be triggered and displayed. After the target control is displayed, the user can continue dragging the window or file until it touches the target control; the electronic device then displays a target frame, which may include controls for sharing to specific devices, controls for switching-screen display, and the like. Finally, the user clicks the corresponding control in the target frame as required to achieve the corresponding function. In the prior art, by contrast, when a user performs a drag operation on a window or a file, the control for cross-screen display or sharing is triggered very easily even when the user has no such intention and only wants to move the window or file within the interface, which causes visual interference with the user's actual operation and reduces the user's operation experience; the method of this aspect avoids such interference.
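As a non-authoritative illustration of the intent detection described above, the following Kotlin sketch shows one way the decision to display the target control could be made. The class name, the circular shape of the first area, and the threshold values are assumptions for illustration; the embodiments do not prescribe any particular implementation.

```kotlin
import kotlin.math.hypot

// A minimal sketch of the drag-intent logic described above. All names and
// threshold values are illustrative assumptions, not taken from the patent.
class DragIntentDetector(
    private val firstAreaRadiusPx: Float = 120f,      // assumed radius of the "first area"
    private val firstDurationThresholdMs: Long = 500L // assumed "first duration threshold"
) {
    private var startX = 0f
    private var startY = 0f
    private var startTimeMs = 0L

    /** Called when the first drag operation begins at the first position. */
    fun onDragStart(x: Float, y: Float, nowMs: Long) {
        startX = x; startY = y; startTimeMs = nowMs
    }

    /**
     * Called as the first target point moves to the second position. Returns
     * true when the target control (transfer gate) should be displayed: the
     * point is still inside the first area and the drag has already lasted
     * longer than the first duration threshold.
     */
    fun shouldShowTargetControl(x: Float, y: Float, nowMs: Long): Boolean {
        val insideFirstArea = hypot(x - startX, y - startY) <= firstAreaRadiusPx
        val longEnough = nowMs - startTimeMs > firstDurationThresholdMs
        return insideFirstArea && longEnough
    }
}
```

A fast drag out of the first area thus never satisfies shouldShowTargetControl, which is exactly the case where the user only wants to move the object within the interface.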
In a possible implementation manner, the first electronic device includes a first display screen and M second display screens, and the first interface and the second interface are interfaces displayed on the first display screen; the N controls comprise M switching screen display controls which are in one-to-one correspondence with the M second display screens, and M is an integer greater than or equal to 1; the receiving and responding to the click operation for one of the N controls, and displaying a second interface specifically includes: receiving and responding to the click operation aiming at the ith switching screen display control in the M switching screen display controls, and displaying a second interface, wherein the target object is not included in the second interface; a third interface displayed on an ith second display screen corresponding to the ith switching screen display control comprises the target object; i is an integer greater than or equal to 1 and less than or equal to M.
In one possible implementation, the N controls include a full screen display control; the receiving and responding to the click operation for one of the N controls, and displaying a second interface, specifically including: receiving and responding to the click operation aiming at the full-screen display control, and displaying a second interface; the second interface comprises the target object, and the size of the target object in the second interface is larger than that of the target object in the first interface.
In a possible implementation manner, the N controls include K sharing controls in one-to-one correspondence with K second electronic devices; the receiving and responding to the click operation for one of the N controls, and displaying a second interface, specifically including: receiving and responding to a click operation aiming at a jth sharing control in the K sharing controls, sending the target object to jth second electronic equipment corresponding to the jth sharing control, and displaying a second interface; wherein the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
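To make the three click-handling implementations above concrete, the following Kotlin sketch dispatches a click on one of the N controls; the sealed class and the three platform hooks (moveToDisplay, enlargeToFullScreen, sendToDevice) are hypothetical names, since the embodiments do not specify an API.

```kotlin
// Illustrative sketch only: dispatching a click on one of the N controls in
// the target frame. All type and function names are assumptions.
sealed class TargetFrameControl {
    data class SwitchScreen(val displayIndex: Int) : TargetFrameControl() // one per second display screen
    object FullScreen : TargetFrameControl()
    data class Share(val deviceId: String) : TargetFrameControl()         // one per second electronic device
}

fun onControlClicked(control: TargetFrameControl, targetObject: Any) {
    when (control) {
        is TargetFrameControl.SwitchScreen ->
            moveToDisplay(targetObject, control.displayIndex) // target object leaves the current interface
        is TargetFrameControl.FullScreen ->
            enlargeToFullScreen(targetObject)                 // object shown larger in the second interface
        is TargetFrameControl.Share ->
            sendToDevice(targetObject, control.deviceId)      // second interface stays the same as the first
    }
}

// Hypothetical platform hooks; their implementations depend on the device.
fun moveToDisplay(obj: Any, displayIndex: Int) { /* ... */ }
fun enlargeToFullScreen(obj: Any) { /* ... */ }
fun sendToDevice(obj: Any, deviceId: String) { /* ... */ }
```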
In one possible embodiment, the method further comprises:
receiving and responding to a third dragging operation aiming at an x-th sharing control in the K sharing controls, forming a snapshot corresponding to the x-th sharing control, and, when detecting that the snapshot corresponding to the x-th sharing control is located in a third area where a y-th sharing control is located, generating a multi-device sharing control corresponding to the x-th second electronic device and the y-th second electronic device; the N controls further comprise the multi-device sharing control; x and y are integers which are greater than or equal to 1 and less than or equal to K.
In a possible implementation manner, the receiving and responding to a click operation on one of the N controls and displaying a second interface specifically includes: receiving and responding to a click operation aiming at the multi-device sharing control, respectively sending the target object to the xth second electronic device and the yth second electronic device corresponding to the multi-device sharing control, and displaying a second interface; wherein the second interface is the same as the first interface.
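The following Kotlin sketch illustrates, under assumed names and a simple rectangular hit test, how the snapshot of a dragged sharing control could be detected inside another sharing control's area (the third area) and the two merged into a multi-device sharing control; the geometry is an assumption, not a requirement of the embodiments.

```kotlin
// Illustrative sketch: merging the x-th and y-th sharing controls into a
// multi-device sharing control once the dragged snapshot enters the
// "third area". All names are assumptions.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

data class ShareControl(val deviceIds: List<String>, val bounds: Rect)

/**
 * Returns a merged multi-device sharing control when the dragged snapshot's
 * center falls inside another control's bounds, or null otherwise. Clicking
 * the merged control would send the target object to all of its devices.
 */
fun tryMerge(
    snapshotCenterX: Float,
    snapshotCenterY: Float,
    dragged: ShareControl,
    others: List<ShareControl>
): ShareControl? {
    val hit = others.firstOrNull {
        it !== dragged && it.bounds.contains(snapshotCenterX, snapshotCenterY)
    } ?: return null
    return ShareControl(dragged.deviceIds + hit.deviceIds, hit.bounds)
}
```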
In a possible implementation manner, the receiving and responding to the second drag operation for the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position specifically includes: receiving and responding to a second dragging operation aiming at the first target point, gradually reducing the size of the target object, and gradually increasing the size of the target control; determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; when the first target point is detected to be located in the second area where the third position is located, changing the color of the target control to a target color; the target color is used for prompting the user that the first target point is located in the second area where the third position is located so as to complete the second dragging operation.
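A minimal sketch of this visual feedback, assuming linear scaling and illustrative colors (neither the scale factors nor the color values come from the embodiments):

```kotlin
// Illustrative sketch: as the second drag progresses the target object
// shrinks, the target control grows, and the control changes to the target
// color once the drag point enters the second area. Values are assumptions.
data class Feedback(val objectScale: Float, val controlScale: Float, val controlColor: Long)

fun feedbackFor(dragProgress: Float, insideSecondArea: Boolean): Feedback {
    val p = dragProgress.coerceIn(0f, 1f)
    return Feedback(
        objectScale = 1.0f - 0.5f * p,  // target object gradually shrinks
        controlScale = 1.0f + 0.5f * p, // target control gradually grows
        // the target color prompts the user to release and complete the drag
        controlColor = if (insideSecondArea) 0xFF1E88E5 else 0xFF9E9E9E
    )
}
```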
In a possible embodiment, the direction from the first position to the third position is different from the direction from the first position to the second position.
In one possible embodiment, the target object is any one of a window, a file, a business card, and a link.
In a second aspect, an embodiment of the present application provides an electronic device, which is a first electronic device and includes a first display screen, a memory, and one or more processors; the first display screen and the memory are coupled to the one or more processors; the memory is configured to store computer program code, the computer program code including computer instructions; and the one or more processors are configured to invoke the computer instructions to cause the electronic device to perform: displaying a first interface; wherein the first interface comprises a target object comprising a first target point, the first target point being located at a first position of the first interface; receiving and responding to a first dragging operation aiming at the first target point on the target object, determining a first area where the first position is located, and determining that the first target point is located at a second position of the first interface; if the second position is detected to be in the first area and the duration of the first dragging operation is longer than a first duration threshold, displaying a target control at a third position of the first interface; receiving and responding to a second dragging operation aiming at the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; the fourth position is in a second area where the third position is located; the target frame comprises N controls, wherein N is an integer greater than or equal to 1; and receiving and responding to the click operation of one control in the N controls, and displaying a second interface.
In one possible implementation, the electronic device further includes M second display screens; the first display screen, the M second display screens, the memory, and the one or more processors are coupled; the first interface and the second interface are interfaces displayed on the first display screen; the N controls comprise M switching screen display controls which are in one-to-one correspondence with the M second display screens, and M is an integer greater than or equal to 1; the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
receiving and responding to the click operation aiming at the ith switching screen display control in the M switching screen display controls, and displaying a second interface, wherein the target object is not included in the second interface; a third interface displayed on an ith second display screen corresponding to the ith switching screen display control comprises the target object; i is an integer greater than or equal to 1 and less than or equal to M.
In one possible implementation, the N controls include a full screen display control; the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: receiving and responding to the click operation aiming at the full-screen display control, and displaying a second interface; the second interface comprises the target object, and the size of the target object in the second interface is larger than that of the target object in the first interface.
In a possible implementation manner, the N controls include K sharing controls corresponding to the K second electronic devices one to one; the one or more processors are further to invoke the computer instructions to cause the electronic device to perform: receiving and responding to a click operation aiming at a jth sharing control in the K sharing controls, sending the target object to jth second electronic equipment corresponding to the jth sharing control, and displaying a second interface; wherein the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
In one possible implementation, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: receiving and responding to a third dragging operation aiming at an x-th sharing control in the K sharing controls, forming a snapshot corresponding to the x-th sharing control, and, when detecting that the snapshot corresponding to the x-th sharing control is located in a third area where a y-th sharing control is located, generating a multi-device sharing control corresponding to the x-th second electronic device and the y-th second electronic device; the N controls further comprise the multi-device sharing control; x and y are integers which are greater than or equal to 1 and less than or equal to K.
In one possible implementation, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: receiving and responding to a click operation aiming at the multi-device sharing control, respectively sending the target object to the xth second electronic device and the yth second electronic device corresponding to the multi-device sharing control, and displaying a second interface; wherein the second interface is the same as the first interface.
In one possible implementation, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: receiving and responding to a second drag operation aiming at the first target point, gradually reducing the size of the target object, and gradually increasing the size of the target control; determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; when the first target point is detected to be located in the second area where the third position is located, changing the color of the target control to a target color; and the target color is used for prompting the user that the first target point is located in the second area where the third position is located so as to finish the second dragging operation.
In a possible embodiment, the direction from the first position to the third position is different from the direction from the first position to the second position.
In one possible embodiment, the target object is any one of a window, a file, a business card, and a link.
In a third aspect, an embodiment of the present application provides a control display apparatus, which is applied to a first electronic device, and may include:
the first display unit is used for displaying a first interface; wherein the first interface comprises a target object comprising a first target point, the first target point being located at a first location of the first interface;
a first determining unit, configured to receive and respond to a first drag operation for the first target point on the target object, determine a first area where the first location is located, and determine that the first target point is located at a second location of the first interface;
the second display unit is used for displaying a target control at a third position of the first interface if the second position is detected to be in the first area and the duration of the first dragging operation is greater than a first duration threshold;
the third display unit is used for receiving and responding to a second dragging operation aiming at the first target point, determining that the first target point is positioned at a fourth position of the first interface, and displaying a target frame at the third position; the fourth position is in a second area where the third position is located; the target frame comprises N controls, wherein N is an integer greater than or equal to 1;
and the fourth display unit is used for receiving and responding to the click operation of one of the N controls and displaying the second interface.
In a possible implementation manner, the first electronic device includes a first display screen and M second display screens, and the first interface and the second interface are interfaces displayed on the first display screen; the N controls comprise M switching screen display controls which are in one-to-one correspondence with the M second display screens, and M is an integer greater than or equal to 1; the fourth display unit is specifically configured to:
receiving and responding to the click operation aiming at the ith switching screen display control in the M switching screen display controls, and displaying a second interface, wherein the target object is not included in the second interface; a third interface displayed on an ith second display screen corresponding to the ith switching screen display control comprises the target object; i is an integer greater than or equal to 1 and less than or equal to M.
In one possible implementation, the N controls include a full screen display control; the fourth display unit is specifically configured to:
receiving and responding to the click operation aiming at the full-screen display control, and displaying a second interface; the second interface comprises the target object, and the size of the target object in the second interface is larger than that of the target object in the first interface.
In a possible implementation manner, the N controls include K sharing controls corresponding to the K second electronic devices one to one; the fourth display unit is specifically configured to:
receiving and responding to a click operation aiming at a jth sharing control in the K sharing controls, sending the target object to jth second electronic equipment corresponding to the jth sharing control, and displaying a second interface; wherein the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
In a possible embodiment, the apparatus further comprises:
the generating unit is used for receiving and responding to a third dragging operation aiming at an x-th sharing control in the K sharing controls, forming a snapshot corresponding to the x-th sharing control, and, when detecting that the snapshot corresponding to the x-th sharing control is located in a third area where a y-th sharing control is located, generating a multi-device sharing control corresponding to the x-th second electronic device and the y-th second electronic device; the N controls further comprise the multi-device sharing control; x and y are integers which are greater than or equal to 1 and less than or equal to K.
In a possible implementation manner, the fourth display unit is specifically configured to:
receiving and responding to a click operation aiming at the multi-device sharing control, respectively sending the target object to the xth second electronic device and the yth second electronic device corresponding to the multi-device sharing control, and displaying a second interface; wherein the second interface is the same as the first interface.
In a possible implementation manner, the third display unit is specifically configured to:
receiving and responding to a second dragging operation aiming at the first target point, gradually reducing the size of the target object, and gradually increasing the size of the target control; determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; when the first target point is detected to be located in the second area where the third position is located, changing the color of the target control to a target color; and the target color is used for prompting the user that the first target point is located in the second area where the third position is located so as to finish the second dragging operation.
In a possible embodiment, the direction from the first position to the third position is different from the direction from the first position to the second position.
In one possible embodiment, the target object is any one of a window, a file, a business card, and a link.
In a fourth aspect, an embodiment of the present application provides a computer storage medium for storing computer software instructions for a control display apparatus provided in the third aspect, where the computer software instructions include a program designed to execute the foregoing aspects.
In a fifth aspect, an embodiment of the present application provides a computer program, where the computer program includes instructions that, when executed by a computer, cause the computer to perform the procedure performed by the control display apparatus in the third aspect.
It should be appreciated that the description of technical features, technical solutions, advantages, or similar language in this specification does not imply that all of the features and advantages can be realized in any single embodiment. Rather, the description of a feature or advantage means that at least one embodiment includes the particular technical feature, technical solution, or advantage. Thus, descriptions of technical features, technical solutions, or advantages in this specification do not necessarily refer to the same embodiment. Furthermore, the technical features, technical solutions, and advantages described in the following embodiments may be combined in any suitable manner. A person skilled in the relevant art will recognize that an embodiment may be practiced without one or more of the specific technical features, technical solutions, or advantages of a particular embodiment. In other instances, additional technical features and advantages may be recognized in certain embodiments that are not present in all embodiments.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
Fig. 1 is a schematic diagram of cross-screen switching display.
Fig. 2a to 2d are schematic diagrams of a set of user interfaces.
Fig. 3 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application.
Fig. 4 is a block diagram of a software structure of an electronic device 100 according to an embodiment of the present disclosure.
Fig. 5 is a flowchart illustrating a control display method according to an embodiment of the present application.
Fig. 6a to 6c are schematic diagrams of a set of user interfaces provided by an embodiment of the present application.
Fig. 7 is a schematic diagram of a process of changing a transfer gate according to an embodiment of the present application.
Fig. 8 is a schematic diagram of another variation process of the transfer gate according to the embodiment of the present application.
Fig. 9a is a schematic diagram of a transfer gate and a button back plate according to an embodiment of the present application.
Fig. 9b is a schematic diagram of a button back plate according to an embodiment of the present application.
Fig. 10a to 10h are schematic diagrams of a set of user interfaces provided by an embodiment of the present application.
Fig. 11 is a schematic view of a transfer gate according to an embodiment of the present application.
Fig. 12 is a schematic view of another transfer gate provided in the embodiments of the present application.
Fig. 13a to 13e are schematic diagrams of a set of user interfaces provided by an embodiment of the present application.
Fig. 14a to 14c are schematic diagrams of a set of user interfaces provided by an embodiment of the present application.
Fig. 15a to 15e are schematic diagrams of a set of user interfaces provided by an embodiment of the present application.
Fig. 16a to 16d are schematic diagrams of a set of user interfaces provided by an embodiment of the present application.
Fig. 17a to 17d are schematic diagrams of a set of user interfaces provided by an embodiment of the present application.
Fig. 18 is a schematic structural diagram of a control display device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application will be described below with reference to the drawings.
The terms "first," "second," "third," and the like in the description, claims, and drawings of the present application are used to distinguish between different objects and not necessarily to describe a particular order. The terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the listed items.
The following introduces an exemplary electronic device, embodiments of user interfaces for such an electronic device, and methods for using such an electronic device. In some embodiments, the electronic device may be a portable electronic device that also includes other functions such as cross-screen display and cross-device sharing, for example a cell phone, a tablet, or a wearable electronic device with wireless communication capabilities (e.g., a smart watch). Exemplary embodiments of the portable electronic device include, but are not limited to, devices running the operating systems shown in the original as an image (Figure BDA0003009675400000061) or other operating systems. The portable electronic device may also be another portable electronic device, such as a laptop computer with a touch-sensitive surface or touch panel. It should also be understood that in other embodiments the electronic device may not be a portable electronic device but a desktop computer, a vehicle-mounted computer, or the like having a touch-sensitive surface or touch panel. It can be understood that the embodiments of the present application take a smart phone as an example for introduction, but are not limited to the smart phone; the device may also be another smart device with a communication function, such as a smart watch, a smart band, or virtual reality (VR) glasses.
The term "User Interface (UI)" in the specification, claims and drawings of the present application is a medium interface for interaction and information exchange between an application program or operating system and a user, and it implements conversion between an internal form of information and a form acceptable to the user. The user interface of the application program is a source code written by a specific computer language such as java, extensible markup language (XML), and the like, and the interface source code is analyzed and rendered on the terminal device, and finally presented as content that can be identified by the user, such as controls such as pictures, characters, buttons, and the like. Controls (controls), also called widgets, are basic elements of user interfaces, and typical controls are tool bars (toolbar), menu bars (menu bar), text boxes (text box), buttons (button), scroll bars (scrollbar), pictures, and texts. The properties and contents of the controls in the interface are defined by tags or nodes, such as XML defining the controls contained by the interface by nodes < Textview >, < ImgView >, < VideoView >, and the like. A node corresponds to a control or attribute in the interface, and the node is displayed as content visible to a user after being parsed and rendered. In addition, many applications, such as hybrid applications (hybrid applications), typically include web pages in their interfaces. A web page, also called a page, can be understood as a special control embedded in an application program interface, where the web page is a source code written by a specific computer language, such as hypertext markup language (HTML), cascading Style Sheets (CSS), java script (JavaScript, JS), etc., and the web page source code can be loaded and displayed as content recognizable to a user by a browser or a web page display component similar to a browser function. The specific content contained in the web page is also defined by tags or nodes in the source code of the web page, such as HTML, which defines elements and attributes of the web page by < p >, < img >, < video >, < canvas >.
A commonly used presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
First, some terms in the present application are explained so as to be easily understood by those skilled in the art.
(1) Drag and drop (drag and drop), also known as dragging, refers to the entire operation process of pressing an object with a touch pad, a mouse, or a finger on a laptop, desktop, tablet computer, or smart phone without releasing it, moving the touch pad, mouse, or finger in the plane, and then releasing the object. The drag operation is generally used to move an object such as a window or a file to a certain position; the movement may be a short- or long-distance movement within the current interface, or a movement across devices or screens. For example, referring to fig. 1, fig. 1 is a schematic diagram of cross-screen switching display. As shown in fig. 1, the electronic device may include two display screens, for example two display screens connected to one host computer, or the two display screens of a notebook computer or tablet computer, which is not limited in this embodiment of the present application. A corresponding drag operation may then be performed on the window shown in fig. 1 so that the window is switched between the first display screen and the second display screen for display; for details, reference may be made to existing solutions, which are not described here.
As described above, the prior art includes various technical solutions for implementing cross-device or cross-screen movement of a window or a file through a drag operation; the following takes as an example the app switch (app switch) proposed in products of related companies currently on the market.
Referring to fig. 2a to 2d, fig. 2a to 2d are schematic diagrams of a set of user interfaces. As shown in fig. 2a to 2d, the electronic device (a notebook computer is taken as an example in fig. 2a to 2d) may include a plurality of display screens, specifically a first display screen 01 and a second display screen 02. As shown in fig. 2a and 2b, the interface displayed on the first display screen 01 may include a window 03. When the user drags the window 03 in the window bar (title bar) of the window 03 using a mouse, a touch pad, or a finger (a finger is used in the figures; the dot in the figures marks the position touched by the finger, and the electronic device may have a touch screen function), that is, as shown in fig. 2b, when the electronic device receives the user's input operation 04a on the window 03, the electronic device displays a control combination 05 (also called a button combination, i.e., the app switch) in the moving direction of the drag operation. As shown in fig. 2c, the control combination 05 may include two controls 06a and 06b, where the control 06a may be a control for switching to the second display screen 02, and the control 06b may be a control for switching to the dual screens for continuous full-screen display. As shown in fig. 2c, the electronic device receives the user's input operation 04b (for example, on the basis of fig. 2b, the user keeps the finger pressed and continues dragging the window until the finger touches the control 06a), and in response to the input operation 04b, displays a secondary menu 07 related to the function of switching to the second display screen 02 below the control combination 05; at this time, the control combination 05 and the secondary menu 07 may be in a floating state. As shown in fig. 2d, the secondary menu 07 may include a control 07a (which may be used to switch to the left half of the second display screen 02 for display), a control 07b (which may be used to switch to the right half of the second display screen 02 for display), a control 07c (which may be used to switch to the left two-thirds of the second display screen 02 for display), and a control 07d (which may be used to switch to the right two-thirds of the second display screen 02 for display). As shown in fig. 2d, the electronic device receives the user's input operation 04c (for example, on the basis of fig. 2c, the user keeps the finger pressed and continues dragging the window until the finger touches the control 07a), and in response to the input operation 04c, switches the window 03 to the left half of the second display screen 02 for display.
However, although the above solution can implement dragging a window from the current screen to another screen for display, many problems remain when the solution is actually deployed in a product and used by users. First, the frequency with which a user drags a window over a small range within the current interface is much greater than the frequency with which screen switching is performed; as can be seen from the prior art described in fig. 2a to 2d, if a control appears on every window drag, great interference is inevitably caused to the user in both visual and operation experience. On this basis, the problems of the prior art may specifically include the following points:
1. As shown in fig. 2b, the app switch appears at a position that coincides with the window's moving direction, making it extremely easy for the user to trigger it by mistake. For example, when the user drags the window toward a target other than the app switch, the window inevitably passes through the control combination 05 on its moving path; the floating state of the control combination 05 is then inevitably triggered and the secondary menu 07 is displayed (as shown in fig. 2c), creating great visual interference for the user. The user subsequently needs to cancel the display of the control combination 05 and the secondary menu 07 through other operations (for example, clicking a blank area in the display interface of the first display screen 01, which is not specifically limited in this embodiment of the present application), which greatly increases the user's operation amount and operation burden and reduces the user's operation efficiency.
2. As shown in fig. 2b, the control combination 05 is large in size, so it easily blocks the user's actual operation target. For example, when the user drags the window toward a target other than the app switch, the large control combination 05 may block that target. For instance, when the user wants to drag the window to a screen edge in Windows (the "windows" operating system) to trigger the split-screen function, the electronic device triggers and displays the control combination 05 during the drag; the user then has to leave the original path to avoid the control combination 05 before performing the split-screen operation, which greatly increases the user's operation burden and reduces the user's operation efficiency.
3. As shown in fig. 2b, fig. 2c, and fig. 2d, the user finally achieves the goal of cross-screen display only by performing a long, precise drag operation, and the operation burden is especially heavy under touch pad and finger-touch operation. In a touch pad environment, because the touch pad's area is small, and some touch pads are mechanical structures, the user must press harder to keep dragging. As can be seen from fig. 2b, fig. 2c, and fig. 2d, the prior art requires the user to keep pressing the touch pad or mouse while moving, from dragging to evoke the control combination 05, to dragging onto the cross-screen display control 06a, to dragging onto the specific cross-screen display scheme control 07a, or to keep the finger touching the display screen while moving until the control 07a is touched and only then release. Therefore, as cross-screen display schemes or sharable devices increase, the number of controls increases, and the user's operation burden increases accordingly. In addition, under touch-screen operation, because the finger is large and the controls in the prior art are no larger than the finger, when the user moves onto the control area the user cannot see which control is actually selected; this easily causes misoperation and invalid operation and forces the user to operate more cautiously, greatly increasing the user's operation burden and reducing the user's operation efficiency.
Therefore, in order to solve the problem that the prior art fails to meet users' actual requirements, the technical problems actually to be solved by the present application may include the following aspects: reducing the excessive operation burden (especially when operating with a touch pad) and low operation efficiency of long-path dragging; and distinguishing whether the user intends to move an object within the current interface or across devices or screens, so as to provide a more efficient and quicker solution. To this end, based on the existing electronic device, the user's drag intention is identified by detecting the duration and movement range of the drag operation, and the transfer gate (i.e., the corresponding control) of the required target is displayed at a suitable position and time, thereby shortening the user's operation path and optimizing and improving the user's drag experience in terms of vision, interaction, and the like.
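Purely as an illustration of placing the transfer gate off the user's drag path (the embodiments only require that the direction from the first position to the third position differ from the drag direction), the following Kotlin sketch places it perpendicular to the drag; the offset distance and the perpendicular placement are assumptions.

```kotlin
import kotlin.math.PI
import kotlin.math.atan2
import kotlin.math.cos
import kotlin.math.sin

data class Point(val x: Float, val y: Float)

// Illustrative sketch: choose the third position (where the transfer gate is
// displayed) perpendicular to the user's drag direction so that an ordinary
// in-interface drag does not pass through it. The offset is an assumption.
fun transferGatePosition(first: Point, second: Point, offsetPx: Float = 300f): Point {
    val dragAngle = atan2(second.y - first.y, second.x - first.x)
    val gateAngle = dragAngle + (PI / 2).toFloat() // off the drag path
    return Point(
        first.x + offsetPx * cos(gateAngle),
        first.y + offsetPx * sin(gateAngle)
    )
}
```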
Hereinafter, exemplary electronic apparatuses involved in embodiments of the present application will be described in detail.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present disclosure, where the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and a Subscriber Identity Module (SIM) card interface 195. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be independent devices or may be integrated in one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and disposed in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, and the like. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The display screen 194 is used to display exemplary user interfaces provided in subsequent embodiments of the present application. The detailed description of the user interface may refer to the following.
In particular, the display screen 194 may be used to display target objects such as windows and files, and also to display related controls for cross-device and cross-screen displays, etc., and will not be described in detail herein.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, and the optical signal is converted into an electrical signal; the camera photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image, and can further optimize parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network processing unit. By drawing on the structure of biological neural networks, for example the transfer mode between neurons in the human brain, it can rapidly process input information and can also continuously learn by itself. Applications such as intelligent recognition of the electronic device 100 can be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into a sound signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic" or a "mike", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by moving the mouth close to the microphone 170C and speaking. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, when the user touches a position on the display screen 194 and, while maintaining the touch, moves from that position to another position, the electronic device 100 may further calculate the distance between the two positions, the duration of the moving operation, and the like according to the detection signal of the pressure sensor 180A.
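As a non-limiting illustration of the distance and duration calculation described above, the following Kotlin sketch records where and when a touch began and derives both quantities from the current point; the class and member names are hypothetical and not part of any platform API.

    import kotlin.math.hypot

    // Hypothetical helper: remembers the touch-down point (x0, y0) and time t0,
    // then computes the travel distance and the duration of the moving operation.
    class DragTracker(private val x0: Float, private val y0: Float, private val t0: Long) {
        fun distancePx(x1: Float, y1: Float): Float = hypot(x1 - x0, y1 - y0)
        fun durationMs(t1: Long): Long = t1 - t0
    }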
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100.
The air pressure sensor 180C is used to measure air pressure.
The magnetic sensor 180D includes a hall sensor.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode.
The ambient light sensor 180L is used to sense ambient light brightness.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, electronic device 100 implements a temperature processing strategy using the temperature detected by temperature sensor 180J.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch-controlled screen". The touch sensor 180K is used to detect a touch operation applied on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100 at a position different from that of the display screen 194.
For example: in the embodiment of the present application, the touch screen may detect a user operation for a file, a window, a link, or the like, where the user operation may be a drag operation for the file. The touch screen may detect a user operation for the control, where the user operation may be a click operation for the control, and the like, and the user operation may also have other implementation forms, which is not limited in the embodiment of the present application. For the specific implementation of the user operation, reference may be made to the detailed description of the subsequent method embodiment, which is not repeated herein.
In this embodiment, the processor 110 may trigger the display of the target control in the display screen 194 according to a certain rule in response to a user operation on a file, a window, a link, or the like. For a specific implementation of the policy and a specific implementation of the user operation received by the electronic device, reference may be made to the related description of the subsequent embodiments, which is not repeated herein.
The bone conduction sensor 180M can acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the bone mass vibrated by the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal, acquired by the bone conduction sensor 180M, of the bone mass vibrated by the vocal part, so as to implement a voice function. The application processor can parse heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects when it is applied to touch operations in different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
In summary, the electronic device 100 may be a smart wearable device, a smart phone, a tablet computer, a notebook computer, a desktop computer, and the like, which have the above functions, and this is not particularly limited in this embodiment.
The software system of the electronic device 100 may employ a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100. Referring to fig. 4, fig. 4 is a block diagram of a software structure of an electronic device 100 according to an embodiment of the present disclosure.
It should be understood that the software block diagram shown in the embodiment of the present application does not specifically limit the software structure of the electronic device 100.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 4, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 4, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and answered, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar and can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify download completion, give message alerts, and the like. The notification manager may also present notifications that appear in the form of a chart or scroll-bar text in the system's top status bar, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is sounded, the electronic device vibrates, an indicator light flashes, and the like.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide a fusion of the 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The software system shown in fig. 4 involves applications that use the sharing capability (such as the gallery and the file manager), an instant sharing module that provides the sharing capability, a print service and a print spooler service that provide the printing capability, an application framework layer that provides a print framework, a WLAN service and a Bluetooth service, and a kernel and underlying layers that provide the WLAN/Bluetooth capability and basic communication protocols.
Referring to fig. 5, fig. 5 is a flowchart illustrating a control display method according to an embodiment of the present disclosure. The method is applicable to the electronic device 100 described in fig. 3 and 4, and the electronic device 100 may be configured to support and execute the method steps S101 to S105 shown in fig. 5. As shown in fig. 5, a control display method provided in an embodiment of the present application may include:
step S101, displaying a first interface; the first interface comprises a target object, the target object comprises a first target point, and the first target point is located at a first position of the first interface.
Specifically, the electronic device 100 (i.e., the first electronic device) displays a first interface, which may include a target object; the target object may be, for example, a window, a file (which may include a document, a picture, a folder, and the like), a business card, or a link (e.g., a web page link), which is not specifically limited in this embodiment of the present application. Optionally, the target object may include a plurality of target points, among which a first target point may be included, and the first target point may be located at a first position of the first interface.
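For readability of the following steps, the entities of step S101 can be modeled as plain data types. The following is an illustrative Kotlin sketch only; the embodiment does not prescribe concrete types, and all names are assumptions.

    // Illustrative data model for the first interface's target object.
    data class Point(val x: Float, val y: Float)

    enum class TargetKind { WINDOW, FILE, BUSINESS_CARD, LINK }

    data class TargetObject(
        val kind: TargetKind,
        val firstTargetPoint: Point   // e.g. the point under the finger/cursor
    )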
Optionally, please refer to fig. 6a to 6c, and fig. 6a to 6c are schematic diagrams of a set of user interfaces provided in an embodiment of the present application. As shown in fig. 6a, taking the target object as a window as an example, the position of the finger/cursor (or referred to as a mouse) therein may be a first position of a first target point in the target object.
Step S102, receiving and responding to a first drag operation for a first target point on the target object, determining a first area where the first position is located, and determining that the first target point is located at a second position of the first interface.
Specifically, as shown in fig. 6b, the electronic device 100 receives and responds to a first drag operation for a first target point on the target object, determines a first area (e.g., the dashed circle in fig. 6b) where a first position (e.g., the original finger/cursor position in fig. 6b) is located, and determines that the first target point is located at a second position of the first interface. As shown in fig. 6b, the user drags the window toward the lower right.
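The check whether the first target point is still inside the first area can be sketched as follows, assuming a circular first area of radius r centered on the first position (the embodiment also allows ellipses, squares, and other shapes); all names are illustrative.

    import kotlin.math.hypot

    data class Point(val x: Float, val y: Float)

    // True if `current` (the second position) is still inside the circular
    // first area centered on `first` (the first position).
    fun isInFirstArea(first: Point, current: Point, radius: Float): Boolean =
        hypot(current.x - first.x, current.y - first.y) <= radius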
Step S103, if it is detected that the second position is within the first area and the duration of the first dragging operation is greater than the first duration threshold, displaying the target control at a third position of the first interface.
Specifically, the electronic device 100 may determine the operation intention of the user according to the speed, duration, range, and the like of the user's drag operation. If the electronic device 100 detects that the second position corresponding to the first drag operation is in the first area and the duration of the first drag operation is greater than the first duration threshold (e.g., 1 second, 2 seconds, 3 seconds, etc.), the target control is displayed at the third position of the first interface. Optionally, as shown in fig. 6b, when the position touched by the user's finger/cursor is still in the first area within the first duration threshold, it may be considered that the user intends to perform cross-device sharing or cross-screen display. At this time, the target control (i.e., the transfer gate in fig. 6b) may be displayed in a direction different from the movement track of the finger/cursor; that is, the direction in which the first position points to the third position is different from the direction in which the first position points to the second position (for example, as shown in fig. 6b, the target control may be displayed on the track opposite to the movement track of the finger/cursor). In this way, if the user does not actually want to perform cross-device sharing or cross-screen display but the target control is still triggered and displayed, the user is prevented from mistakenly touching a target control lying in the forward direction of the movement track, and the control is kept from occluding and visually interfering with the user's actual drag path. Conversely, as shown in fig. 6c, if the user quickly drags the window out of the first area within the first duration threshold, it may be considered that the user only wants to move the window within the interface and has no intention of cross-device sharing or cross-screen display; in this case the display of the target control (i.e., the transfer gate) is not triggered, which reduces the interference of an unnecessarily displayed target control with the user's actual operation and improves the user's operation experience and efficiency.
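The decision of step S103 and the different-direction placement of the target control can be sketched as follows; the 0.3f offset factor and all names are illustrative assumptions rather than values taken from the embodiment.

    data class Point(val x: Float, val y: Float)

    // Returns the third position at which to display the target control,
    // or null if the drag left the first area or the dwell was too short.
    fun targetControlPosition(
        first: Point, second: Point,
        dragDurationMs: Long, firstDurationThresholdMs: Long,
        inFirstArea: Boolean
    ): Point? {
        if (!inFirstArea || dragDurationMs <= firstDurationThresholdMs) return null
        // Place the control on a track different from the finger/cursor movement
        // (here the opposite direction, as in fig. 6b), so it neither occludes
        // nor is accidentally touched along the actual drag path.
        val dx = second.x - first.x
        val dy = second.y - first.y
        return Point(first.x - dx * 0.3f, first.y - dy * 0.3f)
    }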
Alternatively, the shape of the first area may be, in addition to the circle shown in fig. 6b and 6c, an ellipse, a square, a triangle, or the like, which is not specifically limited in this embodiment of the present application. In addition, compared with the control combination 05 in the prior art, the embodiment of the present application condenses the control combination into a single circular target control, which represents all the functions that may be triggered in the drag scene.
Optionally, since different electronic devices have different screen sizes, the user's operation intention may be further inferred, in an auxiliary manner, from the actual screen size of the product and the actually configured distance between the transfer gate and the initial position (i.e., the first position) of the drag operation, and the boundary range (i.e., the size of the first area) and the delay time (i.e., the first duration threshold) may be set accordingly.
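One possible way to derive these parameters from the device's screen, sketched under the assumption of simple linear/stepped rules (both constants are pure illustrations):

    // Larger screens tolerate a larger first area and a slightly longer delay.
    fun firstAreaRadiusPx(screenWidthPx: Int): Float = screenWidthPx * 0.05f

    fun firstDurationThresholdMs(screenDiagonalInches: Float): Long =
        if (screenDiagonalInches >= 10f) 2000L else 1000L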
Optionally, as shown in fig. 6b, if the user does not want to perform cross-device sharing or cross-screen display after the target control has been triggered and displayed, the finger or the mouse may be released at the third position; the electronic device receives the operation of the user releasing the finger or the mouse and, in response, cancels the display of the target control. Optionally, after the target control is displayed, if the user releases the finger or the mouse at the third position, the target control may instead continue to be displayed; in this case, if the user does not want to perform cross-device sharing or cross-screen display, the display of the target control may be cancelled by clicking a blank area of the first interface, and the like, which is not specifically limited in this embodiment of the present application.
Step S104, receiving and responding to a second drag operation aiming at a first target point, determining that the first target point is positioned at a fourth position of the first interface, and displaying a target frame at the third position; the fourth position is in the second area where the third position is located; the target box includes N controls.
Specifically, the electronic device 100 receives and responds to the second drag operation for the first target point, determines that the first target point is located at the fourth position of the first interface, and displays a target frame at the third position; the fourth position is in the second area where the third position is located; the target box includes N controls. Wherein N is an integer greater than or equal to 1.
Optionally, please refer to fig. 7; fig. 7 is a schematic diagram of a change process of a transfer gate according to an embodiment of the present application. As shown in fig. 7, the actual hot zone of the target control (i.e., the transfer gate in fig. 7) may be larger than the displayed size of the transfer gate. When the finger/cursor (i.e., the position of the first target point) moves into the hot zone along with the user's drag operation (or throughout the entire second drag operation before it enters the hot zone), the electronic device 100 may gradually enlarge the transfer gate and gradually shrink the window, so that the user receives visual feedback of dragging the target object into the transfer gate; this improves the user's operation experience and ensures that even a control that looks very small remains easy to operate. Then, as shown in fig. 7, when the user keeps dragging the window until the finger/cursor touches the transfer gate, that is, when the first target point is located in the second area where the target control is located, the transfer gate may be enlarged to the size of the actual hot zone, which needs to be larger than the size of the finger/cursor, so as to ensure that the user can identify the target control, improve the visual experience, and avoid misoperation. Alternatively, parameters such as the shape, transparency, and size of the transfer gate may be customized by the user through the electronic device 100.
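The progressive feedback of fig. 7 can be sketched as a distance-driven interpolation; the radii, scale limits, and names are illustrative assumptions.

    import kotlin.math.hypot

    data class Point(val x: Float, val y: Float)

    // As the first target point moves from `approachRadius` down to the hot
    // zone, the transfer gate grows (up to the hot-zone size) and the dragged
    // window shrinks. Assumes approachRadius > hotZoneRadius.
    // Returns (gateScale, windowScale).
    fun scalesFor(
        dragPoint: Point, gateCenter: Point,
        hotZoneRadius: Float, approachRadius: Float
    ): Pair<Float, Float> {
        val d = hypot(dragPoint.x - gateCenter.x, dragPoint.y - gateCenter.y)
        val t = (1f - (d - hotZoneRadius) / (approachRadius - hotZoneRadius))
            .coerceIn(0f, 1f)
        val gateScale = 1f + t           // 1x far away, 2x at the hot zone
        val windowScale = 1f - 0.5f * t  // 1x far away, 0.5x at the hot zone
        return gateScale to windowScale
    }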
Optionally, please refer to fig. 8; fig. 8 is a schematic diagram of another change process of a transfer gate according to an embodiment of the present application. As shown in fig. 8, when the user continues to drag the window until the finger/cursor touches the transfer gate, the transfer gate may change color as a cue, prompting the user that the finger or mouse may be released at this point to trigger a subsequent function, and so on.
Optionally, referring to fig. 9a, fig. 9a is a schematic view of a transfer gate and a button backplane according to an embodiment of the present disclosure. As shown in fig. 9a, when the transfer gate gives the color-change cue, the user may release the finger or the mouse (i.e., release the dragged target object) to complete the second drag operation, at which point the electronic device 100 may cancel the display of the transfer gate and display the button backplane at the position of the transfer gate. In the visual experience, the transfer gate thus appears to deform from the circle shown in fig. 9a into the button backplane (i.e., the target frame). As shown in fig. 9a, the button backplane may include a plurality of controls, for example, a control for switching to the left half of the second display screen, a control for switching to the right half of the second display screen, a full-screen display control, and the like. Therefore, compared with the prior art, with the embodiment of the present application the user no longer bears the burden of a long drag; the corresponding function can be realized by clicking a control called out after the hand is released. In addition, compared with the prior art, the embodiment of the present application can accommodate more function controls. Referring to fig. 9b, fig. 9b is a schematic diagram of a button backplane according to an embodiment of the present disclosure. As shown in fig. 9b, the space of the button backplane may be enlarged, and more controls (for example, button 1, button 2, button 3, and the like in fig. 9b) may be added to implement more functions, which is not specifically limited in this embodiment of the present application. Optionally, the number of transfer gates may be two or more, and different transfer gates may correspond to different button backplanes. For example, a user may custom-set two transfer gates through the electronic device 100: one transfer gate may correspond to a button backplane containing multiple sharing buttons, through which the dragged content can be shared with other devices, and the other transfer gate may correspond to a button backplane containing multiple switch-screen display buttons, through which the dragged content can be switched to another display screen for display, and the like, which is not specifically limited in this embodiment of the present application.
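The hand-off from transfer gate to button backplane on release can be sketched as below; the controls listed mirror fig. 9a, and all types and callbacks are assumptions for illustration.

    // One entry on the button backplane (target frame).
    data class BackplaneControl(val label: String, val action: () -> Unit)

    class TransferGateController(
        private val hideGate: () -> Unit,
        private val showBackplaneAt: (x: Float, y: Float, controls: List<BackplaneControl>) -> Unit
    ) {
        // Called when the user releases the finger/mouse while on the gate.
        fun onReleaseInsideHotZone(gateX: Float, gateY: Float) {
            val controls = listOf(
                BackplaneControl("Left half of second screen") { /* switch display */ },
                BackplaneControl("Right half of second screen") { /* switch display */ },
                BackplaneControl("Full screen") { /* switch display */ }
            )
            hideGate()
            showBackplaneAt(gateX, gateY, controls)
        }
    }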
And step S105, receiving and responding to the click operation of one control in the N controls, and displaying a second interface.
Specifically, the electronic device 100 receives and responds to a click operation on one of the N controls and displays the second interface. For example, the electronic device 100 may include a first display screen and M second display screens, where M is an integer greater than or equal to 1; the N controls may then include M switching screen display controls in one-to-one correspondence with the M second display screens, and may even include more controls to cover the different display schemes of each second display screen, which greatly enriches the user's choices and meets different user requirements. The electronic device 100 receives and responds to the user's click operation on such a control, and the target object can be switched to the corresponding second display screen for display, and the like. It should be noted that the first display screen and the second display screen in the drawings of the embodiment of the present application are only examples; the size of the second display screen may be smaller than that of the first display screen as shown in the drawings, or may be equal to or larger than that of the first display screen, and so on, which is not specifically limited in this embodiment of the present application. In addition, the terms "first" and "second" in "first display screen" and "second display screen" do not limit the specification, the primary/secondary relationship, or the priority of the display screens; they are only used to distinguish the display screen where the current target object is located from other display screens.
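Building one switching screen display control per second display screen can be sketched as follows; the Display type stands in for whatever handle the platform provides, and every name is an assumption.

    data class Display(val id: Int, val name: String)

    // One (label, action) pair per second display screen; clicking the i-th
    // control moves the target object to the i-th second display screen.
    fun buildSwitchControls(
        secondScreens: List<Display>,
        moveTo: (Display) -> Unit
    ): List<Pair<String, () -> Unit>> =
        secondScreens.map { screen ->
            val action: () -> Unit = { moveTo(screen) }
            "Show on ${screen.name}" to action
        }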
Optionally, please refer to fig. 10a to 10h; fig. 10a to 10h are schematic diagrams of a set of user interfaces provided in an embodiment of the present application. Taking the electronic device 100 as a notebook computer in a mouse/touch pad operating environment as an example, as shown in fig. 10a, the electronic device 100 may include two display screens, i.e., a first display screen and a second display screen. The electronic device 100 displays a first interface through the first display screen, where the first interface includes a target object (a window); the window may be a common browser window or a chat window, and the like, which is not specifically limited in this embodiment of the present application. As shown in fig. 10b and 10c, the user presses the mouse to select the window and starts to drag; if the electronic device 100 detects that the cursor is still located, within the first duration threshold, in the first area around the cursor's original position, the electronic device 100 may display the target control in the direction opposite to the cursor's movement. As shown in fig. 10d, when the electronic device 100 detects that the cursor moves into the hot zone of the target control, the target control may be enlarged and the target object (the window) may be reduced. As shown in fig. 10e, when the electronic device 100 detects that the cursor touches the target control, the target control may be controlled to change color (e.g., from the original gray to yellow, etc.) to indicate that the user may release the mouse/touch pad. As shown in fig. 10f, in response to the user's operation of releasing the mouse/touch pad, the electronic device 100 displays the target frame at the position where the target control was originally located and cancels the display of the target control; alternatively, the target frame may be displayed at any suitable position in the interface, and so on, which is not limited in this embodiment of the present application. Optionally, the target frame may include multiple controls that implement multiple corresponding functions; for example, the target frame may specifically include controls for two split screens/three split screens/full screen/continuous screen (top-bottom structure/left-right structure). The files/links/text and other content in the window may also be shared with other devices; for example, for a browser window, the links of all tabs opened in the browser may be directly shared with identifiable devices of the user or of others, including a mobile phone/a computer/a television/a tablet, and the like, which is not specifically limited in this embodiment of the present application. As shown in fig. 10g and 10h, the electronic device 100 receives and responds to the user's click operation on the control in fig. 10g for switching to the left half of the second display screen, displays a second interface through the first display screen, where the second interface does not include the target object, and displays a third interface through the second display screen, where the third interface includes the target object. That is, the target object (the window) is switched from the left half of the first display screen to the left half of the second display screen for display.
Optionally, various visual prompts are further designed in the embodiments of the present application to ensure the discoverability of the above functions. For example, please refer to fig. 11; fig. 11 is a schematic diagram of a transfer gate according to an embodiment of the present disclosure. As shown in fig. 11, the lightness of the interface background is used as a determination criterion: when the transfer gate (i.e., the target control) appears over a colored background, the color of the transfer gate is adjusted accordingly to ensure that the transfer gate remains identifiable and to improve the user's operation experience. For another example, please refer to fig. 12; fig. 12 is a schematic diagram of another transfer gate according to an embodiment of the present application. As shown in fig. 12, a ripple cue may be provided; on first use, a capsule-shaped prompt is adopted, on which a specific function hint can be marked; and when the cursor/finger moves to touch the transfer gate (when the transfer gate changes color), a cursor corner mark prompts the user that releasing the mouse/touch pad/finger will open the corresponding function controls, and so on. This ensures that, when seeing an abstract dot that conveys no specific intent by itself, the user can still recognize and understand its function.
Hereinafter, the operation steps in the embodiment of the present application are described in more detail and more intuitively through several application scenarios.
Application scenario one: referring to fig. 13a to 13e, fig. 13a to 13e are schematic diagrams of a set of user interfaces provided by an embodiment of the present application. As shown in fig. 13a, taking the electronic device 100 as a notebook computer as an example, the electronic device 100 displays a user interface 21, and the user interface 21 may include a plurality of files (e.g., file 11, file 12, file 13, file 14, etc.). As shown in fig. 13b and 13c, the electronic device 100 receives a drag operation of a user on the file 16, determines, in response to the drag operation, a first area 22 where the original position of the file 16 is located (the first area 22 may be the square shown in fig. 13b), and forms a snapshot of the file 16 that is dragged out along with the user's drag operation. Optionally, parameters such as the transparency and the size of the snapshot are not specifically limited in this embodiment; they may be system default values, or may be set by the user through the electronic device 100. When the electronic device 100 detects that the snapshot (or cursor) of the file 16 remains within the first area 22 for a certain length of time, the target control 23 is displayed in a direction different from that of the drag operation. As shown in fig. 13d, the electronic device 100 receives the user's drag operation on the snapshot of the file 16, gradually enlarges the target control 23 and gradually reduces the snapshot of the file 16 in response to the drag operation, and, when it detects that the snapshot of the file 16 is located on the target control 23 (or when the cursor is located on the target control 23), controls the target control 23 to change color, prompting the user that the mouse/touch pad can be released at this point to end the drag operation and trigger more functions. As shown in fig. 13e, when the electronic device 100 detects that the user releases the mouse/touch pad, the electronic device 100 may display the target frame 24 in the center of the interface (or at another suitable position) and cancel the display of the target control 23 (visually, the user perceives the target control 23 as deforming into the target frame 24). Optionally, the target frame 24 may be a floating window, a floating backplane, or the like, which is not specifically limited in this embodiment of the present application. As shown in fig. 13e, the target frame 24 may include a plurality of sharing controls, specifically a sharing control 25, a sharing control 26, a sharing control 27, a sharing control 28, a sharing control 29, and a sharing control 30; that is, the N controls may include a plurality of sharing controls in one-to-one correspondence with a plurality of other devices (i.e., second electronic devices). The electronic device 100 receives the user's click operation on the sharing control 28 and, in response, sends the file 16 to the other device "my mobile phone". Afterwards, the electronic device 100 may display the same interface as the user interface 21, or the electronic device 100 may cancel the display of the target frame 24 by receiving and responding to the user's click operation on a blank area of the interface, thereby displaying the same interface as the user interface 21. As shown in fig. 13e, the other device may receive the file 16.
The electronic device 100 and the other devices may establish connections with each other through a wired or wireless network (for example, wireless fidelity (WiFi), Bluetooth, or a mobile network), which is not limited in this embodiment of the present application.
It is understood that fig. 13a to 13e illustrate the process of synchronizing files from the computer side to the mobile phone side; the sharing operation between any electronic devices can likewise be designed this way. In addition, sharing of multimedia such as photos/videos and of content such as business cards/calendars/memos can also be included. Optionally, the target frame 24 may further include a shortcut control for sending the dragged content to a printer for printing, a shortcut control for moving the dragged content to a certain folder, and a shortcut control for sending the dragged content via an email, a message, or various third-party applications (e.g., a communication application, a short video application, etc.), which is not specifically limited in this embodiment of the present application.
In one possible embodiment, after the target control 23 is transformed into the target frame 24, the user may also perform a clustering operation on the controls within the target frame 24. For example, referring to figs. 14a to 14c, figs. 14a to 14c are schematic diagrams of a set of user interfaces provided by an embodiment of the present application. As shown in fig. 14a, the electronic device 100 receives the user's drag operation on the sharing control 28, forms a snapshot 28' of the sharing control 28 in response to the drag operation, and drags it out along with the user's drag operation. As shown in figs. 14b and 14c, in response to the user's drag operation on the snapshot 28', the electronic device 100 detects that the snapshot 28' is located above the sharing control 30; at this time the user releases the mouse/touch pad, and the electronic device 100 creates and displays a multi-device sharing control 31 corresponding to the sharing control 28 and the sharing control 30 (i.e., a device group including "my mobile phone" and "mobile phone xxx"). That is, the N controls may further include a multi-device sharing control for synchronously sharing with multiple other devices (i.e., second electronic devices). As shown in fig. 14c, the electronic device 100 receives the user's click operation on the multi-device sharing control 31 and, in response, may send the file 16 to the other devices "my mobile phone" and "mobile phone xxx" respectively. Both "my mobile phone" and "mobile phone xxx" can receive the file 16, which greatly improves the file sharing efficiency and brings convenience to user operation.
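The clustering of figs. 14a to 14c amounts to merging two share targets into one group and fanning the content out to every member; a minimal Kotlin sketch follows, with all types and names assumed.

    data class ShareTarget(val deviceNames: List<String>)

    // Dropping one sharing control onto another yields a multi-device control.
    fun cluster(dragged: ShareTarget, dropTarget: ShareTarget): ShareTarget =
        ShareTarget(dragged.deviceNames + dropTarget.deviceNames)

    // Clicking the multi-device control sends the content to each device,
    // e.g. "my mobile phone" and "mobile phone xxx" in fig. 14c.
    fun shareToGroup(group: ShareTarget, send: (deviceName: String) -> Unit) =
        group.deviceNames.forEach(send)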
Optionally, after the multi-device sharing control 31 is created, the user may also drag another sharing control, for example, drag the sharing control 25 onto the multi-device sharing control 31 and release it, so as to create a control that can share with three devices at a time (that is, a device group including "my computer", "my mobile phone", and "mobile phone xxx"), further improving the sharing efficiency. As described above, this function is mainly aimed at sharing among two or more frequently used devices, for example quick sharing among colleagues of the same department, classmates, or friends travelling together, and brings convenience to user operation.
Optionally, the multi-device sharing control 31 (i.e., the device group including "my mobile phone" and "mobile phone xxx") may be saved in the background. The electronic device 100 may always display the multi-device sharing control 31 in the target frame 24; alternatively, when the electronic device 100 recognizes any device in the device group through Bluetooth or the like, the multi-device sharing control 31 may be displayed in the target frame 24, and, optionally, devices in the device group that are not recognized may be grayed out to prompt the user.
Application scenario two: with today's global search function, a user can only open the searched content and cannot perform shortcut operations on it. The embodiment of the present application further provides a scheme in which the user can call out a transfer gate (i.e., a target control) by dragging the searched content and then share that content. Referring to fig. 15a to 15e, fig. 15a to 15e are schematic diagrams of a set of user interfaces provided in an embodiment of the present application. As shown in fig. 15a, taking the electronic device 100 as a notebook computer as an example, the electronic device 100 displays a user interface 32; the user interface 32 may include a search window, and the search window may include a plurality of pieces of search content obtained by the global search function of the electronic device 100 according to a keyword search, which may be, for example, a file, a business card, a browser link, and the like. As shown in fig. 15b and 15c, the electronic device 100 receives the user's drag operation on the search content 1, forms a snapshot of the search content 1 in response to the drag operation, and drags it out along with the user's drag operation. When the electronic device 100 detects that the snapshot (or the cursor) of the search content 1 is still located, within a certain time period, in a certain area around the original position of the search content 1, the target control 23 is displayed in a direction different from that of the drag operation. As shown in fig. 15d, the electronic device 100 receives the user's drag operation on the snapshot of the search content 1, gradually enlarges the target control 23 and gradually reduces the snapshot of the search content 1 in response to the drag operation, and, when it detects that the snapshot of the search content 1 is located on the target control 23 (or when the cursor is located on the target control 23), controls the target control 23 to change color, prompting the user that the mouse/touch pad can be released at this point to end the drag operation and trigger more functions. As shown in fig. 15e, when the electronic device 100 detects that the user releases the mouse/touch pad, the electronic device 100 may display the target frame 24 in the center of the interface (or at another suitable position) and cancel the display of the target control 23. As shown in fig. 13e, the target frame 24 may include a plurality of sharing controls, which are not described herein again. The electronic device 100 receives the user's click operation on the sharing control 28 and, in response, sends the search content 1 to the other device "my mobile phone". As shown in fig. 15e, the other device may receive the search content 1 (the figure takes the file "graduation album" as an example). For details of application scenario two, reference may be made to the description of the embodiment corresponding to fig. 13a to 13e, which is not repeated herein.
Application scenario three: referring to fig. 16a to 16d, fig. 16a to 16d are schematic diagrams of a set of user interfaces provided in an embodiment of the present application. As shown in fig. 16a, taking the electronic device 100 as a tablet computer as an example, the electronic device 100 displays a user interface 33, and the user interface 33 may include applications (e.g., weather, music, video, application store, mail, gallery, etc.). The electronic device 100 receives the user's click operation 34 on the gallery application and, in response to the click operation 34, displays a user interface 35, where the user interface 35 may include multiple pictures. As shown in fig. 16b, the electronic device 100 receives the user's drag operation 36 on the picture 8; in response to the drag operation 36, when the electronic device 100 detects that the picture 8 (or the finger) is still located, within a certain time period, in a certain area around the position where the finger was located at the beginning of the drag operation 36, the target control 23 is displayed in a direction different from that of the drag operation 36. As shown in fig. 16c, the electronic device 100 receives the user's drag operation 37 on the picture 8, gradually enlarges the target control 23 and gradually reduces the picture 8 in response to the drag operation 37, and, when the electronic device 100 detects that the picture 8 is located on the target control 23 (or when the finger is located on the target control 23), controls the target control 23 to change color, prompting the user that the finger can be released at this point to end the drag operation 37. As shown in fig. 16d, when the electronic device 100 detects that the user releases the finger, the electronic device 100 may display the target frame 24 at the position of the target control 23 (or at another suitable position) and cancel the display of the target control 23. As shown in fig. 16d, the target frame 24 may include a plurality of sharing controls, which are not described herein again. The electronic device 100 receives the user's click operation 38 on the sharing control 26 and, in response to the click operation 38, sends the picture 8 to the other device "my computer", which may receive the picture 8. For details of application scenario three, reference may be made to the description of the embodiment corresponding to fig. 13a to 13e, which is not repeated herein.
Application scenario four: referring to fig. 17a to 17d, fig. 17a to 17d are schematic diagrams of a set of user interfaces provided in an embodiment of the present application. As shown in fig. 17a, taking the electronic device 100 as a smartphone as an example, the electronic device 100 displays a user interface 39; the user interface 39 may be a file-manager interface, which may include a plurality of files, documents, and the like (e.g., file 1, file 2, and the like, and document 1, document 2, document 3, and the like). As shown in fig. 17b, the electronic device 100 receives the user's drag operation 40 on the document 1; in response to the drag operation 40, when the electronic device 100 detects that the document 1 (or the finger) is still located, within a certain time period, in a certain area around the position where the finger was located at the beginning of the drag operation 40, the target control 23 is displayed in a direction different from that of the drag operation 40. As shown in fig. 17c, the electronic device 100 receives the user's drag operation 41 on the document 1, gradually enlarges the target control 23 and gradually reduces the document 1 in response to the drag operation 41, and, when the electronic device 100 detects that the document 1 is located on the target control 23 (or when the finger is located on the target control 23), controls the target control 23 to change color, prompting the user that the finger can be released at this point to end the drag operation 41. As shown in fig. 17d, when the electronic device 100 detects that the user releases the finger, the electronic device 100 may display the target frame 24 in the center of the interface (or at another suitable position) and cancel the display of the target control 23. As shown in fig. 17d, the target frame 24 may include a plurality of sharing controls, which are not described herein again. The electronic device 100 receives the user's click operation 42 on the sharing control 27 and, in response to the click operation 42, sends the document 1 to the other device "my tablet", which may receive the document 1. For some details of application scenario four, reference may be made to the description of the embodiment corresponding to fig. 13a to 13e, which is not repeated herein.
In summary, the embodiments of the present application provide a control display method that identifies the user's operation intention by detecting the duration and movement range of the user's drag operation, displays controls at the appropriate time, reduces visual interference with the user's operation, shortens the path of the drag operation, lightens the user's operation burden, and improves the user's operation efficiency. The embodiments of the present application also mine more visual expressions and interaction modes, and explore more draggable content and specific embodiment scenarios.
The method of the embodiments of the present application is explained in detail above, and the related apparatus of the embodiments of the present application is provided below.
Referring to fig. 18, fig. 18 is a schematic structural diagram of a control display apparatus according to an embodiment of the present application, where the control display apparatus 20 may include a first display unit 201, a first determining unit 202, a second display unit 203, a third display unit 204, and a fourth display unit 205, and may further include a generating unit 206. The details of each unit are as follows.
A first display unit 201 for displaying a first interface; wherein the first interface comprises a target object comprising a first target point, the first target point being located at a first position of the first interface;
a first determining unit 202, configured to receive and respond to a first drag operation for the first target point on the target object, determine a first area where the first position is located, and determine that the first target point is located at a second position of the first interface;
a second display unit 203, configured to display a target control at a third position of the first interface if it is detected that the second position is within the first area and the duration of the first dragging operation is greater than a first duration threshold;
a third display unit 204, configured to receive and respond to a second drag operation for the first target point, determine that the first target point is located at a fourth position of the first interface, and display a target frame at the third position; the fourth position is in a second area where the third position is located; the target frame comprises N controls, wherein N is an integer greater than or equal to 1;
and the fourth display unit 205 is configured to receive and respond to a click operation on one of the N controls, and display a second interface.
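As an orientation aid, the hypothetical Kotlin skeleton below shows one way the five display units and the generating unit could map onto method steps; the class and method bodies are stubs invented for illustration and do not come from the embodiments.

```kotlin
// Hypothetical skeleton: one method per unit of the control display apparatus.
class ControlDisplayApparatus {
    fun displayFirstInterface() =        // first display unit 201
        println("display first interface containing the target object")

    fun onFirstDrag() =                  // first determining unit 202
        println("determine first area and second position of the first target point")

    fun showTargetControl() =            // second display unit 203
        println("display target control at the third position")

    fun onSecondDrag() =                 // third display unit 204
        println("determine fourth position and display target frame with N controls")

    fun onControlClicked(index: Int) =   // fourth display unit 205
        println("control $index clicked: display second interface")

    fun onShareControlDropped() =        // generating unit 206
        println("generate multi-device sharing control")
}
```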
In one possible implementation, the first electronic device includes a first display screen and M second display screens, and the first interface and the second interface are interfaces displayed on the first display screen; the N controls comprise M switching screen display controls which are in one-to-one correspondence with the M second display screens, and M is an integer greater than or equal to 1; the fourth display unit 205 is specifically configured to:
receiving and responding to the click operation aiming at the ith switching screen display control in the M switching screen display controls, and displaying a second interface, wherein the target object is not included in the second interface; a third interface displayed on an ith second display screen corresponding to the ith switching screen display control comprises the target object; i is an integer greater than or equal to 1 and less than or equal to M.
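The screen-switch behavior can be pictured as moving the target object from the first screen's object list to the chosen second screen's list. The Kotlin sketch below is a minimal model under that assumption; Screen and switchToScreen are invented names, and real multi-display rendering is out of scope.

```kotlin
// Minimal model: clicking the i-th screen-switch control moves the target
// object from the first display's interface to the i-th second display's.
data class Screen(val name: String, val objects: MutableList<String> = mutableListOf())

fun switchToScreen(first: Screen, seconds: List<Screen>, i: Int, target: String) {
    require(i in 1..seconds.size) { "i must satisfy 1 <= i <= M" }
    first.objects.remove(target)       // the second interface omits the object
    seconds[i - 1].objects.add(target) // the third interface shows the object
}

fun main() {
    val first = Screen("first display", mutableListOf("window A"))
    val seconds = listOf(Screen("screen 1"), Screen("screen 2"))
    switchToScreen(first, seconds, i = 2, target = "window A")
    println("${seconds[1].name}: ${seconds[1].objects}") // screen 2: [window A]
}
```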
In one possible implementation, the N controls include a full screen display control; the fourth display unit 205 is specifically configured to:
receiving and responding to the click operation aiming at the full-screen display control, and displaying a second interface; the second interface comprises the target object, and the size of the target object in the second interface is larger than that of the target object in the first interface.
In a possible implementation manner, the N controls include K sharing controls corresponding to the K second electronic devices one to one; the fourth display unit 205 is specifically configured to:
receiving and responding to a click operation aiming at a jth sharing control in the K sharing controls, sending the target object to jth second electronic equipment corresponding to the jth sharing control, and displaying a second interface; wherein the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
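The one-to-one sharing controls can be modeled as a simple index into a device list. In the Kotlin sketch below, Device.receive is a hypothetical stand-in for whatever transport the devices actually use; only the indexing rule comes from the description.

```kotlin
// Minimal model: the j-th sharing control sends the target object to the
// j-th second electronic device; the displayed interface stays unchanged.
class Device(val name: String) {
    fun receive(obj: String) = println("$name received $obj") // hypothetical transport
}

fun onSharingControlClicked(devices: List<Device>, j: Int, target: String) {
    require(j in 1..devices.size) { "j must satisfy 1 <= j <= K" }
    devices[j - 1].receive(target)
}

fun main() {
    val devices = listOf(Device("my computer"), Device("my tablet"))
    onSharingControlClicked(devices, j = 1, target = "picture 8") // my computer received picture 8
}
```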
In a possible embodiment, the apparatus further comprises:
a generating unit 206, configured to receive and respond to a third dragging operation for an xth sharing control of the K sharing controls, form a snapshot corresponding to the xth sharing control, and generate a multi-device sharing control corresponding to the xth second electronic device and the yth second electronic device when it is detected that the snapshot corresponding to the xth sharing control is located in a third area where the yth sharing control is located; the N controls further comprise the multi-device sharing control; x and y are integers which are greater than or equal to 1 and less than or equal to K.
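The merge condition for the generating unit is essentially a hit test: drop the snapshot of one sharing control inside the area of another, and a combined control results. The Kotlin sketch below models just that test; the rectangle geometry and all names are assumptions.

```kotlin
// Minimal model: a multi-device sharing control is generated when the
// snapshot of the x-th control is dropped inside the y-th control's area.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

data class SharingControl(val deviceName: String, val bounds: Rect)
data class MultiDeviceControl(val deviceNames: List<String>)

fun tryMerge(dropX: Float, dropY: Float,
             xCtrl: SharingControl, yCtrl: SharingControl): MultiDeviceControl? =
    if (yCtrl.bounds.contains(dropX, dropY))
        MultiDeviceControl(listOf(xCtrl.deviceName, yCtrl.deviceName))
    else null
```

Clicking the resulting combined control would then fan the target object out to both devices, as described next.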
In a possible implementation manner, the fourth display unit 205 is specifically configured to:
receiving and responding to a click operation aiming at the multi-device sharing control, respectively sending the target object to the xth second electronic device and the yth second electronic device corresponding to the multi-device sharing control, and displaying a second interface; wherein the second interface is the same as the first interface.
In a possible implementation manner, the third display unit 204 is specifically configured to:
receiving and responding to a second dragging operation aiming at the first target point, gradually reducing the size of the target object, and gradually increasing the size of the target control; determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; when the first target point is detected to be located in the second area where the third position is located, changing the color of the target control to a target color; the target color is used for prompting the user that the first target point is located in the second area where the third position is located so as to complete the second dragging operation.
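The progressive resize and the color prompt can be expressed as a pure function of drag progress and a hover flag. The Kotlin sketch below uses a linear easing, which is an assumption; the embodiments only require that the object shrinks, the control grows, and the control changes to the target color once the first target point is inside the second area.

```kotlin
// Illustrative visual-feedback model for the second drag operation.
data class VisualState(
    var objectScale: Float = 1f,
    var controlScale: Float = 0.5f,
    var controlColor: String = "neutral"
)

fun updateVisuals(state: VisualState, progress: Float, overTargetControl: Boolean) {
    val t = progress.coerceIn(0f, 1f)
    state.objectScale = 1f - 0.5f * t    // gradually reduce the target object
    state.controlScale = 0.5f + 0.5f * t // gradually enlarge the target control
    // the target color prompts the user that releasing now completes the drag
    state.controlColor = if (overTargetControl) "target" else "neutral"
}
```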
In a possible embodiment, the direction in which the first position points towards the third position is different from the direction in which the first position points towards the second position.
In one possible embodiment, the target object is any one of a window, a file, a business card, and a link.
It should be noted that, for implementation of each unit in the embodiment of the present application, reference may also be made to the related description of the embodiment shown in fig. 3 to fig. 17d, and details are not repeated here.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative; the division of the units is only one type of logical functional division, and other divisions may be used in practice; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable storage medium if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such an understanding, the technical solutions of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, and may specifically be a processor in the computer device) to execute all or part of the steps of the above-mentioned methods of the embodiments of the present application. The storage medium may include various media capable of storing program code, such as a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present application.

Claims (19)

1. A control display method is applied to a first electronic device and comprises the following steps:
displaying a first interface; wherein the first interface comprises a target object comprising a first target point, the first target point being located at a first position of the first interface;
receiving and responding to a first dragging operation aiming at the first target point on the target object, determining a first area where the first position is located, and determining that the first target point is located at a second position of the first interface;
if the second position is detected to be in the first area and the duration of the first dragging operation is longer than a first duration threshold, displaying a target control at a third position of the first interface;
receiving and responding to a second dragging operation aiming at the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; the fourth position is in a second area where the third position is located; the target frame comprises N controls, wherein N is an integer greater than or equal to 1;
and receiving and responding to the click operation of one control in the N controls, and displaying a second interface.
2. The method of claim 1, wherein the first electronic device comprises a first display screen and M second display screens, and wherein the first interface and the second interface are interfaces displayed on the first display screen; the N controls comprise M switching screen display controls which are in one-to-one correspondence with the M second display screens, and M is an integer greater than or equal to 1; the receiving and responding to the click operation for one of the N controls, and displaying a second interface, specifically including:
receiving and responding to the click operation aiming at the ith switching screen display control in the M switching screen display controls, and displaying a second interface, wherein the target object is not included in the second interface; a third interface displayed on an ith second display screen corresponding to the ith switching screen display control comprises the target object; i is an integer greater than or equal to 1 and less than or equal to M.
3. The method of claim 1, wherein the N controls include a full screen display control; the receiving and responding to the click operation for one of the N controls, and displaying a second interface, specifically including:
receiving and responding to the click operation aiming at the full-screen display control, and displaying a second interface; the second interface comprises the target object, and the size of the target object in the second interface is larger than that of the target object in the first interface.
4. The method according to claim 1, wherein the N controls include K sharing controls corresponding to K second electronic devices one to one; the receiving and responding to the click operation for one of the N controls, and displaying a second interface, specifically including:
receiving and responding to a click operation aiming at a jth sharing control in the K sharing controls, sending the target object to jth second electronic equipment corresponding to the jth sharing control, and displaying a second interface; wherein the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
5. The method of claim 4, further comprising:
receiving and responding to a third dragging operation aiming at an x-th sharing control in the K sharing controls, forming a snapshot corresponding to the x-th sharing control, and generating a multi-device sharing control corresponding to an x-th second electronic device and a y-th second electronic device when detecting that the snapshot corresponding to the x-th sharing control is located in a third area where the y-th sharing control is located; the N controls further comprise the multi-device sharing control; x and y are integers which are greater than or equal to 1 and less than or equal to K.
6. The method according to claim 4, wherein the receiving and displaying a second interface in response to a click operation for one of the N controls comprises:
receiving and responding to a click operation aiming at the multi-device sharing control, respectively sending the target object to the xth second electronic device and the yth second electronic device corresponding to the multi-device sharing control, and displaying a second interface; wherein the second interface is the same as the first interface.
7. The method according to any one of claims 1 to 6, wherein the receiving and responding to a second drag operation for the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position, specifically comprises:
receiving and responding to a second drag operation aiming at the first target point, gradually reducing the size of the target object, and gradually increasing the size of the target control; determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; when the first target point is detected to be located in the second area where the third position is located, changing the color of the target control to a target color; the target color is used for prompting the user that the first target point is located in the second area where the third position is located so as to complete the second dragging operation.
8. The method of any of claims 1-7, wherein the direction in which the first position points to the third position is different from the direction in which the first position points to the second position.
9. The method according to any one of claims 1 to 8, wherein the target object is any one of a window, a file, a business card, and a link.
10. An electronic device, wherein the electronic device is a first electronic device comprising a first display screen, a memory, and one or more processors; the first display screen and the memory are coupled to the one or more processors; the memory is configured to store computer program code, the computer program code comprising computer instructions; and the one or more processors are configured to invoke the computer instructions to cause the electronic device to perform:
displaying a first interface; wherein the first interface comprises a target object comprising a first target point, the first target point being located at a first position of the first interface;
receiving and responding to a first dragging operation aiming at the first target point on the target object, determining a first area where the first position is located, and determining that the first target point is located at a second position of the first interface;
if the second position is detected to be in the first area and the duration of the first dragging operation is longer than a first duration threshold, displaying a target control at a third position of the first interface;
receiving and responding to a second dragging operation aiming at the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; the fourth position is in a second area where the third position is located; the target frame comprises N controls, wherein N is an integer greater than or equal to 1;
and receiving and responding to the clicking operation of one control in the N controls, and displaying a second interface.
11. The electronic device of claim 10, further comprising M second display screens; the first display screen, the M second display screens, the memory and the one or more processors are coupled; the first interface and the second interface are interfaces displayed on the first display screen; the N controls comprise M switching screen display controls which are in one-to-one correspondence with the M second display screens, and M is an integer greater than or equal to 1; the one or more processors are further to invoke the computer instructions to cause the electronic device to perform:
receiving and responding to the click operation aiming at the ith switching screen display control in the M switching screen display controls, and displaying a second interface, wherein the target object is not included in the second interface; a third interface displayed on an ith second display screen corresponding to the ith switching screen display control comprises the target object; i is an integer greater than or equal to 1 and less than or equal to M.
12. The electronic device of claim 10, wherein the N controls include a full screen display control; the one or more processors are further to invoke the computer instructions to cause the electronic device to perform:
receiving and responding to the click operation aiming at the full-screen display control, and displaying a second interface; the second interface comprises the target object, and the size of the target object in the second interface is larger than that of the target object in the first interface.
13. The electronic device according to claim 10, wherein the N controls include K sharing controls corresponding to K second electronic devices one to one; the one or more processors are further to invoke the computer instructions to cause the electronic device to perform:
receiving and responding to a click operation aiming at a jth sharing control in the K sharing controls, sending the target object to jth second electronic equipment corresponding to the jth sharing control, and displaying a second interface; wherein the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
14. The electronic device of claim 13, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
receiving and responding to a third dragging operation aiming at an x-th sharing control in the K sharing controls, forming a snapshot corresponding to the x-th sharing control, and generating a multi-device sharing control corresponding to an x-th second electronic device and a y-th second electronic device when detecting that the snapshot corresponding to the x-th sharing control is located in a third area where the y-th sharing control is located; the N controls further comprise the multi-device sharing control; x and y are integers which are greater than or equal to 1 and less than or equal to K.
15. The electronic device of claim 14, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
receiving and responding to a click operation aiming at the multi-device sharing control, respectively sending the target object to the xth second electronic device and the yth second electronic device corresponding to the multi-device sharing control, and displaying a second interface; wherein the second interface is the same as the first interface.
16. The electronic device of any of claims 10-15, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
receiving and responding to a second dragging operation aiming at the first target point, gradually reducing the size of the target object, and gradually increasing the size of the target control; determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; when the first target point is detected to be located in the second area where the third position is located, changing the color of the target control to a target color; the target color is used for prompting the user that the first target point is located in the second area where the third position is located so as to complete the second dragging operation.
17. The electronic device of any one of claims 10-16, wherein the direction in which the first position points to the third position is different from the direction in which the first position points to the second position.
18. The electronic device of any one of claims 10-17, wherein the target object is any one of a window, a file, a business card, and a link.
19. A computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-9.
CN202110372085.6A 2021-04-07 2021-04-07 Control display method and related equipment Active CN115185440B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110372085.6A CN115185440B (en) 2021-04-07 2021-04-07 Control display method and related equipment
PCT/CN2022/083215 WO2022213831A1 (en) 2021-04-07 2022-03-26 Control display method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110372085.6A CN115185440B (en) 2021-04-07 2021-04-07 Control display method and related equipment

Publications (2)

Publication Number Publication Date
CN115185440A (en) 2022-10-14
CN115185440B CN115185440B (en) 2024-05-10

Family

ID=83512323

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110372085.6A Active CN115185440B (en) 2021-04-07 2021-04-07 Control display method and related equipment

Country Status (2)

Country Link
CN (1) CN115185440B (en)
WO (1) WO2022213831A1 (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101615102A (en) * 2008-06-26 2009-12-30 鸿富锦精密工业(深圳)有限公司 Input method based on touch-screen
KR20160143135A (en) * 2015-06-04 2016-12-14 엘지전자 주식회사 Mobile terminal and method for controlling the same
JP6114792B2 (en) * 2015-09-16 2017-04-12 Kddi株式会社 User interface device capable of scroll control according to contact degree, image scrolling method, and program

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101452365A (en) * 2007-12-06 2009-06-10 Lg电子株式会社 Terminal and method of controlling the same
CN103324404A (en) * 2012-03-20 2013-09-25 宇龙计算机通信科技(深圳)有限公司 Method for moving icon and communication terminal
CN105653178A (en) * 2015-05-28 2016-06-08 宇龙计算机通信科技(深圳)有限公司 Information sharing method and apparatus
CN108228053A (en) * 2017-12-29 2018-06-29 努比亚技术有限公司 A kind of information sharing method, intelligent terminal and storage medium
WO2020006669A1 (en) * 2018-07-02 2020-01-09 华为技术有限公司 Icon switching method, method for displaying gui, and electronic device
CN109725789A (en) * 2018-12-27 2019-05-07 维沃移动通信有限公司 A kind of application icon archiving method and terminal device
CN109684110A (en) * 2018-12-28 2019-04-26 北京小米移动软件有限公司 Multimedia resource sharing method, device and storage medium
CN111443842A (en) * 2020-03-26 2020-07-24 维沃移动通信有限公司 Method for controlling electronic equipment and electronic equipment

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116680019A (en) * 2022-10-26 2023-09-01 荣耀终端有限公司 Screen icon moving method, electronic equipment, storage medium and program product
CN116820229A (en) * 2023-05-17 2023-09-29 荣耀终端有限公司 XR space display method, XR equipment, electronic equipment and storage medium
CN116820229B (en) * 2023-05-17 2024-06-07 荣耀终端有限公司 XR space display method, XR equipment, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2022213831A1 (en) 2022-10-13
CN115185440B (en) 2024-05-10

Similar Documents

Publication Publication Date Title
CN110471639B (en) Display method and related device
US11922005B2 (en) Screen capture method and related device
WO2021103981A1 (en) Split-screen display processing method and apparatus, and electronic device
WO2021129326A1 (en) Screen display method and electronic device
US20220342850A1 (en) Data transmission method and related device
US11921987B2 (en) System navigation bar display method, system navigation bar control method, graphical user interface, and electronic device
EP4131911A1 (en) Application interface interaction method, electronic device, and computer readable storage medium
CN110119296B (en) Method for switching parent page and child page and related device
CN110362244B (en) Screen splitting method and electronic equipment
WO2021104030A1 (en) Split-screen display method and electronic device
CN111669459B (en) Keyboard display method, electronic device and computer readable storage medium
CN111240547A (en) Interactive method for cross-device task processing, electronic device and storage medium
WO2021063098A1 (en) Touch screen response method, and electronic device
EP4261680A1 (en) Widget display method and electronic device
CN112068907A (en) Interface display method and electronic equipment
CN113949803A (en) Photographing method and electronic equipment
EP4163782A1 (en) Cross-device desktop management method, first electronic device, and second electronic device
WO2022213831A1 (en) Control display method and related device
CN113448658A (en) Screen capture processing method, graphical user interface and terminal
WO2022002213A1 (en) Translation result display method and apparatus, and electronic device
WO2023160455A1 (en) Object deletion method and electronic device
WO2023098417A1 (en) Interface display method and apparatus
WO2023207799A1 (en) Message processing method and electronic device
CN114356186A (en) Method for realizing dragging shadow animation effect and related equipment
CN117631939A (en) Touch input method, system, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant