CN115185440B - Control display method and related equipment


Info

Publication number: CN115185440B
Application number: CN202110372085.6A
Authority: CN (China)
Prior art keywords: interface, electronic device, target, control
Legal status: Active (granted)
Other versions: CN115185440A (en)
Other languages: Chinese (zh)
Inventors: 周雪怡, 徐杰
Original and current assignee: Huawei Technologies Co Ltd
Priority application: CN202110372085.6A; PCT application PCT/CN2022/083215 (published as WO2022213831A1)
Publication of application: CN115185440A; grant published as CN115185440B

Classifications

    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0486 Drag-and-drop
    • G06F 3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a control display method and related equipment. The method includes: displaying a first interface, where the first interface includes a target object, the target object includes a first target point, and the first target point is located at a first position of the first interface; receiving and responding to a first drag operation for the first target point, determining a first area in which the first position is located, and determining a second position at which the first target point is located on the first interface; if it is detected that the second position is within the first area and the duration of the first drag operation is greater than a first duration threshold, displaying a target control at a third position of the first interface; receiving and responding to a second drag operation for the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position, where the target frame includes N controls; and receiving and responding to a clicking operation for one of the N controls, displaying a second interface. Implementing the embodiments of the present application can improve the user's operation experience.

Description

Control display method and related equipment
Technical Field
The present application relates to the field of communications technologies, and in particular, to a control display method and related devices.
Background
With rising living standards, electronic devices such as mobile phones and tablet computers can be seen everywhere in daily life. When using a smart device, a user often performs a drag operation to move an object such as a window or a file displayed in the device interface to a certain position. The movement may be over a short or long distance within the current interface, or across devices or screens.
To implement movement across devices or screens, an electronic device typically triggers the display of corresponding function controls based on the user's drag operation on a window, and the user must then keep dragging the window until a function control is selected. As noted above, during the drag the user has to keep the window held, without releasing it, until it has been moved onto the desired function control. Where a long drag path is required, and especially when the user operates with a touch pad, this greatly increases the user's operation burden and reduces operation efficiency and operation experience. In addition, a window or file is dragged within the current interface far more often than it is moved across devices or screens; if the corresponding function controls are triggered and displayed on every drag, they visually interfere with the user's actual operation and degrade the user experience.
Therefore, how to improve the operation experience of the user is a problem to be solved.
Disclosure of Invention
The embodiments of the present application provide a control display method and related equipment to improve the user's operation experience.
In a first aspect, an embodiment of the present application provides a control display method, which is applied to a first electronic device, and may include:
Displaying a first interface, where the first interface includes a target object, the target object includes a first target point, and the first target point is located at a first position of the first interface; receiving and responding to a first drag operation for the first target point on the target object, determining a first area in which the first position is located, and determining a second position at which the first target point is located on the first interface; if it is detected that the second position is within the first area and the duration of the first drag operation is greater than a first duration threshold, displaying a target control at a third position of the first interface; receiving and responding to a second drag operation for the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position, where the fourth position is within a second area in which the third position is located, and the target frame includes N controls, N being an integer greater than or equal to 1; and receiving and responding to a clicking operation for one of the N controls, displaying a second interface.
With the method provided in the first aspect, the electronic device can infer the user's operation intention by detecting the duration and the movement range of the drag operation. If the user drags the window or file (i.e., the target object) away from its original position within a short time, the user can be presumed to want only to move the object within the interface, and the control for cross-device sharing or cross-screen (switched-screen) display (i.e., the target control, also called the transfer gate) need not be triggered. Conversely, if the user has not dragged the window or file out of a certain range around the original position (i.e., the first area) within a period of time, the user can be presumed to want to share the object across devices or display it across screens, and the target control can be triggered and displayed. After the target control is displayed, the user can keep dragging the window or file until it touches the target control; the electronic device then displays a target frame, which may include several controls for sharing to specific devices, several controls for switched-screen display, and the like. Finally, the user clicks the corresponding control in the target frame as needed to realize the corresponding function. In the prior art, by contrast, the controls for cross-screen display or sharing are triggered very easily whenever the user drags a window or file; when the user has no such intention and only wants to move the object within the interface, this causes visual interference with the user's actual operation and degrades the operation experience.
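To make the timing concrete, the following Kotlin sketch shows one way the duration-and-range test described above could be implemented. It is an illustration only; the class name, the radius of the first area, the first duration threshold, and the callback are all hypothetical, since the embodiments do not prescribe concrete values or APIs.

```kotlin
import kotlin.math.hypot

// Minimal sketch of the drag-intent detection described above (assumed values).
class DragIntentDetector(
    private val firstAreaRadius: Float = 48f,          // assumed extent of the "first area", in px
    private val firstDurationThresholdMs: Long = 500L, // assumed "first duration threshold"
    private val showTransferGate: () -> Unit,          // displays the target control at the third position
) {
    private var startX = 0f
    private var startY = 0f
    private var startTimeMs = 0L
    private var gateShown = false

    fun onDragStart(x: Float, y: Float, nowMs: Long) {
        startX = x; startY = y; startTimeMs = nowMs; gateShown = false
    }

    fun onDragMove(x: Float, y: Float, nowMs: Long) {
        if (gateShown) return
        val insideFirstArea = hypot(x - startX, y - startY) <= firstAreaRadius
        val longEnough = nowMs - startTimeMs > firstDurationThresholdMs
        // Second position still inside the first area after the threshold has
        // elapsed: presume a cross-screen or cross-device intention and show
        // the transfer gate. A quick drag out of the area shows nothing.
        if (insideFirstArea && longEnough) {
            gateShown = true
            showTransferGate()
        }
    }
}
```

In this reading, an ordinary in-interface move leaves the first area before the threshold elapses and never triggers the gate, which is exactly the filtering behavior described above.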
In a possible implementation, the first electronic device includes a first display screen and M second display screens, and the first interface and the second interface are interfaces displayed on the first display screen; the N controls include M switching screen display controls in one-to-one correspondence with the M second display screens, and M is an integer greater than or equal to 1; the receiving and responding to the clicking operation for one of the N controls and displaying a second interface specifically includes: receiving and responding to a clicking operation for an i-th switching screen display control among the M switching screen display controls, displaying a second interface, where the target object is not included in the second interface, and the target object is included in a third interface displayed on the i-th second display screen corresponding to the i-th switching screen display control; i is an integer greater than or equal to 1 and less than or equal to M.
In one possible implementation, the N controls include a full-screen display control; the receiving and responding to the clicking operation for one of the N controls and displaying a second interface specifically includes: receiving and responding to a clicking operation for the full-screen display control, and displaying a second interface, where the second interface includes the target object, and the size of the target object in the second interface is larger than the size of the target object in the first interface.
In a possible implementation, the N controls include K sharing controls in one-to-one correspondence with K second electronic devices; the receiving and responding to the clicking operation for one of the N controls and displaying a second interface specifically includes: receiving and responding to a clicking operation for a j-th sharing control among the K sharing controls, sending the target object to the j-th second electronic device corresponding to the j-th sharing control, and displaying a second interface, where the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
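Taken together, the three implementations above amount to a dispatch over the N controls in the target frame. The Kotlin sketch below models that dispatch; the sealed type and the three handler functions are hypothetical names for illustration, not the patent's actual interfaces.

```kotlin
// Hypothetical model of the N controls of the target frame and the click dispatch.
sealed class TargetFrameControl {
    data class SwitchScreen(val screenIndex: Int) : TargetFrameControl() // one per second display screen
    object FullScreen : TargetFrameControl()
    data class Share(val deviceId: String) : TargetFrameControl()        // one per second electronic device
}

fun onControlClicked(control: TargetFrameControl, target: String) = when (control) {
    is TargetFrameControl.SwitchScreen ->
        moveToScreen(target, control.screenIndex)  // second interface no longer contains the object
    TargetFrameControl.FullScreen ->
        enlargeToFullScreen(target)                // object shown larger in the second interface
    is TargetFrameControl.Share ->
        sendToDevice(target, control.deviceId)     // first interface stays unchanged
}

// Placeholder effects; a real device would call its window manager or transport layer.
fun moveToScreen(target: String, screen: Int) = println("move $target to screen $screen")
fun enlargeToFullScreen(target: String) = println("show $target full screen")
fun sendToDevice(target: String, device: String) = println("send $target to device $device")
```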
In one possible embodiment, the method further comprises:
Receiving and responding to a third drag operation for an x-th sharing control among the K sharing controls, forming a snapshot corresponding to the x-th sharing control, and, when the snapshot corresponding to the x-th sharing control is detected to be located in a third area in which a y-th sharing control is located, generating a multi-device sharing control corresponding to the x-th second electronic device and the y-th second electronic device; the N controls further include the multi-device sharing control; x and y are integers greater than or equal to 1 and less than or equal to K.
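A minimal sketch of this merging behavior follows, assuming a simple distance test for the "third area" and hypothetical names throughout.

```kotlin
import kotlin.math.hypot

// A sharing control carries the ids of the devices a click would share to.
data class ShareControl(val deviceIds: List<String>, val x: Float, val y: Float)

// Called when the snapshot of the dragged (x-th) sharing control is released.
// If it lands within mergeRadius of another (y-th) sharing control, the two are
// replaced by one multi-device sharing control; otherwise nothing changes.
fun mergeShareControls(
    dragged: ShareControl,
    controls: MutableList<ShareControl>,
    mergeRadius: Float = 40f, // assumed extent of the "third area"
): Boolean {
    val other = controls.firstOrNull {
        it !== dragged && hypot(dragged.x - it.x, dragged.y - it.y) <= mergeRadius
    } ?: return false
    controls.remove(dragged)
    controls.remove(other)
    controls.add(ShareControl(dragged.deviceIds + other.deviceIds, other.x, other.y))
    return true
}
```

A later click on the merged control would then send the target object to both devices at once, as the next implementation describes.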
In one possible implementation, the receiving and responding to the clicking operation for one of the N controls and displaying a second interface specifically includes: receiving and responding to a clicking operation for the multi-device sharing control, sending the target object to the x-th second electronic device and the y-th second electronic device corresponding to the multi-device sharing control, respectively, and displaying a second interface, where the second interface is the same as the first interface.
In one possible implementation, the receiving and responding to the second drag operation for the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position specifically includes: receiving and responding to the second drag operation for the first target point, gradually reducing the size of the target object and gradually increasing the size of the target control; determining that the first target point is located at the fourth position of the first interface, and displaying the target frame at the third position; and when the first target point is detected to be located in the second area in which the third position is located, changing the color of the target control to a target color, where the target color prompts the user that the first target point is located in the second area, so that the second drag operation can be completed.
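The sketch below illustrates one possible form of this feedback, computed from the distance between the drag point and the target control; the scaling curve, the radius of the second area, and the color values are assumptions rather than values taken from the embodiments.

```kotlin
import kotlin.math.hypot

data class Feedback(val objectScale: Float, val gateScale: Float, val gateColor: Long)

fun secondDragFeedback(
    dragX: Float, dragY: Float,      // current first target point
    gateX: Float, gateY: Float,      // third position (target control)
    startDistance: Float,            // distance to the gate when the second drag began
    secondAreaRadius: Float = 60f,   // assumed extent of the "second area"
): Feedback {
    val d = hypot(dragX - gateX, dragY - gateY)
    val progress = (1f - d / startDistance).coerceIn(0f, 1f)
    return Feedback(
        objectScale = 1f - 0.5f * progress,   // target object gradually shrinks
        gateScale = 1f + 0.5f * progress,     // target control gradually grows
        // Target color once the point is inside the second area, signalling
        // that the user may release to complete the second drag operation.
        gateColor = if (d <= secondAreaRadius) 0xFF1E90FF else 0xFF9E9E9E,
    )
}
```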
In one possible embodiment, the direction from the first position to the third position is different from the direction from the first position to the second position.
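One way to satisfy this condition is to place the third position off the drag path, for example perpendicular to the direction the user has been dragging, so the dragged object does not cross the target control by accident (the false-touch problem of the prior art discussed in the background). The placement rule below is a hypothetical illustration; the offset distance and the perpendicular choice are assumptions.

```kotlin
import kotlin.math.hypot

// Choose a third position (for the target control) whose direction from the
// first position differs from the first-to-second drag direction.
fun chooseThirdPosition(
    firstX: Float, firstY: Float,    // first position
    secondX: Float, secondY: Float,  // second position (current drag point)
    offset: Float = 200f,            // assumed distance from the first position
): Pair<Float, Float> {
    val dx = secondX - firstX
    val dy = secondY - firstY
    val len = hypot(dx, dy)
    if (len == 0f) return firstX + offset to firstY // no movement yet: fixed side
    // Unit vector perpendicular to the drag direction.
    val px = -dy / len
    val py = dx / len
    return firstX + px * offset to firstY + py * offset
}
```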
In one possible implementation, the target object is any one of a window, a file, a business card, and a link.
In a second aspect, an embodiment of the present application provides an electronic device, where the electronic device is a first electronic device and includes a first display screen, a memory, and one or more processors; the first display screen, the memory, and the one or more processors are coupled, the memory is configured to store computer program code, the computer program code includes computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to execute: displaying a first interface, where the first interface includes a target object, the target object includes a first target point, and the first target point is located at a first position of the first interface; receiving and responding to a first drag operation for the first target point on the target object, determining a first area in which the first position is located, and determining a second position at which the first target point is located on the first interface; if it is detected that the second position is within the first area and the duration of the first drag operation is greater than a first duration threshold, displaying a target control at a third position of the first interface; receiving and responding to a second drag operation for the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position, where the fourth position is within a second area in which the third position is located, and the target frame includes N controls, N being an integer greater than or equal to 1; and receiving and responding to a clicking operation for one of the N controls, displaying a second interface.
In a possible implementation, the electronic device further includes M second display screens, and the first display screen, the M second display screens, the memory, and the one or more processors are coupled; the first interface and the second interface are interfaces displayed on the first display screen; the N controls include M switching screen display controls in one-to-one correspondence with the M second display screens, and M is an integer greater than or equal to 1; the one or more processors are also configured to invoke the computer instructions to cause the electronic device to perform:
Receiving and responding to a clicking operation for an i-th switching screen display control among the M switching screen display controls, displaying a second interface, where the target object is not included in the second interface, and the target object is included in a third interface displayed on the i-th second display screen corresponding to the i-th switching screen display control; i is an integer greater than or equal to 1 and less than or equal to M.
In one possible implementation, the N controls include a full-screen display control; the one or more processors are also configured to invoke the computer instructions to cause the electronic device to perform: receiving and responding to a clicking operation for the full-screen display control, and displaying a second interface, where the second interface includes the target object, and the size of the target object in the second interface is larger than the size of the target object in the first interface.
In a possible implementation, the N controls include K sharing controls in one-to-one correspondence with K second electronic devices; the one or more processors are also configured to invoke the computer instructions to cause the electronic device to perform: receiving and responding to a clicking operation for a j-th sharing control among the K sharing controls, sending the target object to the j-th second electronic device corresponding to the j-th sharing control, and displaying a second interface, where the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
In one possible implementation, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: receiving and responding to a third drag operation for an x-th sharing control among the K sharing controls, forming a snapshot corresponding to the x-th sharing control, and, when the snapshot corresponding to the x-th sharing control is detected to be located in a third area in which a y-th sharing control is located, generating a multi-device sharing control corresponding to the x-th second electronic device and the y-th second electronic device; the N controls further include the multi-device sharing control; x and y are integers greater than or equal to 1 and less than or equal to K.
In one possible implementation, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: receiving and responding to a clicking operation for the multi-device sharing control, sending the target object to the x-th second electronic device and the y-th second electronic device corresponding to the multi-device sharing control, respectively, and displaying a second interface, where the second interface is the same as the first interface.
In one possible implementation, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: receiving and responding to the second drag operation for the first target point, gradually reducing the size of the target object and gradually increasing the size of the target control; determining that the first target point is located at the fourth position of the first interface, and displaying the target frame at the third position; and when the first target point is detected to be located in the second area in which the third position is located, changing the color of the target control to a target color, where the target color prompts the user that the first target point is located in the second area, so that the second drag operation can be completed.
In one possible embodiment, the direction from the first position to the third position is different from the direction from the first position to the second position.
In a possible implementation manner, the target object is any one of a window, a file, a business card and a link.
In a third aspect, an embodiment of the present application provides a control display apparatus, applied to a first electronic device, including:
The first display unit is used for displaying a first interface; the first interface comprises a target object, wherein the target object comprises a first target point, and the first target point is positioned at a first position of the first interface;
A first determining unit, configured to receive and respond to a first drag operation for the first target point on the target object, determine a first area where the first position is located, and determine that the first target point is located at a second position of the first interface;
The second display unit is used for displaying a target control at a third position of the first interface if the second position is detected to be in the first area and the duration of the first dragging operation is longer than a first duration threshold value;
A third display unit, configured to receive and respond to a second drag operation for the first target point, determine that the first target point is located at a fourth position of the first interface, and display a target frame at the third position, where the fourth position is within a second area in which the third position is located; the target frame includes N controls, where N is an integer greater than or equal to 1;
and the fourth display unit is used for receiving and responding to clicking operation for one of the N controls and displaying a second interface.
In a possible implementation manner, the first electronic device comprises a first display screen and M second display screens, and the first interface and the second interface are interfaces displayed on the first display screen; the N controls comprise M switching screen display controls which are in one-to-one correspondence with the M second display screens, and M is an integer greater than or equal to 1; the fourth display unit is specifically configured to:
Receiving and responding to a clicking operation for an i-th switching screen display control among the M switching screen display controls, displaying a second interface, where the target object is not included in the second interface, and the target object is included in a third interface displayed on the i-th second display screen corresponding to the i-th switching screen display control; i is an integer greater than or equal to 1 and less than or equal to M.
In one possible implementation, the N controls include a full screen display control; the fourth display unit is specifically configured to:
Receiving and responding to clicking operation for the full-screen display control, and displaying a second interface; the second interface comprises the target object, and the size of the target object in the second interface is larger than that of the target object in the first interface.
In a possible implementation, the N controls include K sharing controls in one-to-one correspondence with K second electronic devices; the fourth display unit is specifically configured to:
receiving and responding to a clicking operation for a j-th sharing control among the K sharing controls, sending the target object to the j-th second electronic device corresponding to the j-th sharing control, and displaying a second interface, where the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
In one possible embodiment, the apparatus further comprises:
The generating unit is configured to receive and respond to a third drag operation for an x-th sharing control among the K sharing controls to form a snapshot corresponding to the x-th sharing control, and, when the snapshot corresponding to the x-th sharing control is detected to be located in a third area in which a y-th sharing control is located, generate a multi-device sharing control corresponding to the x-th second electronic device and the y-th second electronic device; the N controls further include the multi-device sharing control; x and y are integers greater than or equal to 1 and less than or equal to K.
In a possible implementation manner, the fourth display unit is specifically configured to:
Receiving and responding to a clicking operation for the multi-device sharing control, sending the target object to the x-th second electronic device and the y-th second electronic device corresponding to the multi-device sharing control, respectively, and displaying a second interface, where the second interface is the same as the first interface.
In a possible implementation manner, the third display unit is specifically configured to:
receiving and responding to the second drag operation for the first target point, gradually reducing the size of the target object and gradually increasing the size of the target control; determining that the first target point is located at the fourth position of the first interface, and displaying the target frame at the third position; and when the first target point is detected to be located in the second area in which the third position is located, changing the color of the target control to a target color, where the target color prompts the user that the first target point is located in the second area, so that the second drag operation can be completed.
In one possible embodiment, the direction from the first position to the third position is different from the direction from the first position to the second position.
In one possible implementation, the target object is any one of a window, a file, a business card, and a link.
In a fourth aspect, an embodiment of the present application provides a computer storage medium storing computer software instructions for the control display apparatus provided in the third aspect, where the computer storage medium includes a program designed to execute the above aspects.
In a fifth aspect, an embodiment of the present application provides a computer program including instructions that, when executed by a computer, cause the computer to perform the flow performed by the control display apparatus in the third aspect described above.
It should be understood that the description of technical features, technical solutions, advantages, or similar language does not imply that all of the features and advantages may be realized with any single embodiment. Rather, the description of features or advantages is understood to mean that a particular feature, aspect, or advantage is included in at least one embodiment. Thus, descriptions of features, aspects, or advantages in this specification do not necessarily refer to the same embodiment. Furthermore, the technical features, technical solutions and advantages described in the respective embodiments below may be combined in any appropriate manner. Those skilled in the art will appreciate that an embodiment may be practiced without one or more of the specific features, aspects, or advantages of a particular embodiment. In other embodiments, additional features and advantages may be recognized in certain embodiments that may not be present in all embodiments.
Drawings
To describe the embodiments of the present application or the technical solutions in the background art more clearly, the following briefly introduces the drawings required for the embodiments of the present application or the background art.
Fig. 1 is a schematic diagram of a cross-screen switched display.
Figs. 2a to 2d are a set of user interface diagrams.
Fig. 3 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application.
Fig. 4 is a block diagram of the software architecture of an electronic device 100 according to an embodiment of the present application.
Fig. 5 is a flowchart of a control display method according to an embodiment of the present application.
Figs. 6a to 6c are a set of user interface diagrams provided by an embodiment of the present application.
Fig. 7 is a schematic diagram of a transfer gate change process according to an embodiment of the present application.
Fig. 8 is a schematic diagram of another transfer gate change process according to an embodiment of the present application.
Fig. 9a is a schematic diagram of a transfer gate and a button backplate according to an embodiment of the present application.
Fig. 9b is a schematic diagram of a button backplate according to an embodiment of the present application.
Figs. 10a to 10h are a set of user interface diagrams provided by embodiments of the present application.
Fig. 11 is a schematic diagram of a transfer gate according to an embodiment of the present application.
Fig. 12 is a schematic diagram of another transfer gate according to an embodiment of the present application.
Figs. 13a to 13e are a set of user interface diagrams provided by an embodiment of the present application.
Figs. 14a to 14c are a set of user interface diagrams provided by an embodiment of the present application.
Figs. 15a to 15e are a set of user interface diagrams provided by an embodiment of the present application.
Figs. 16a to 16d are a set of user interface diagrams provided by embodiments of the present application.
Figs. 17a to 17d are a set of user interface diagrams provided by embodiments of the present application.
Fig. 18 is a schematic structural diagram of a control display apparatus according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application. The terminology used in the description of the embodiments of the application is for the purpose of describing particular embodiments of the application only and is not intended to be limiting of the application.
The terms "first", "second", "third", and the like in the description, claims, and drawings are used to distinguish between different objects, not to describe a particular sequence or chronological order. The terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the present application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in this application refers to and encompasses any and all possible combinations of one or more of the listed items.
Embodiments of an electronic device, user interfaces for such an electronic device, and uses of such an electronic device are described below. In some embodiments, the electronic device may be a portable electronic device that also provides other functions such as cross-screen display and cross-device sharing, for example a cell phone, a tablet computer, or a wearable electronic device with wireless communication capability (e.g., a smart watch). Exemplary embodiments of portable electronic devices include, but are not limited to, devices running any of a variety of operating systems. The portable electronic device may also be another portable electronic device, such as a laptop computer with a touch-sensitive surface or touch panel. It should also be appreciated that in other embodiments the electronic device may not be a portable electronic device but, for example, a desktop computer or a vehicle-mounted computer having a touch-sensitive surface or touch panel. It will be appreciated that the embodiments of the present application take a smart phone as an example, but the application is not limited to smart phones and may apply to other smart devices with communication functions, such as a smart watch, a smart bracelet, or virtual reality (VR) glasses.
The term "User Interface (UI)" in the description and claims of the present application and in the drawings is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. The user interface of the application program is a source code written in a specific computer language such as java, extensible markup language (extensible markup language, XML) and the like, the interface source code is analyzed and rendered on the terminal equipment, and finally the interface source code is presented as content which can be identified by a user, such as a control of pictures, words, buttons and the like. Controls (controls), also known as parts (widgets), are basic elements of a user interface, typical controls being a toolbar (toolbar), menu bar (menu bar), text box (text box), button (button), scroll bar (scrollbar), picture and text. The properties and content of the controls in the interface are defined by labels or nodes, such as XML specifying the controls contained in the interface by nodes < Textview >, < ImgView >, < VideoView >, etc. One node corresponds to a control or attribute in the interface, and the node is rendered into visual content for a user after being analyzed and rendered. In addition, many applications, such as the interface of a hybrid application (hybrid application), typically include web pages. A web page, also referred to as a page, is understood to be a special control embedded in an application interface, the web page being source code written in a specific computer language, such as hypertext markup language (hyper text markup language, HTML), cascading style sheets (CASCADING STYLE SHEETS, CSS), java script (JavaScript, JS), etc., the web page source code being loadable and displayable as user identifiable content by a browser or web page display component similar to the browser functionality. The specific content contained in a web page is also defined by tags or nodes in the web page source code, such as HTML defines the elements and attributes of the web page by < p >, < img >, < video >, < canvas >.
A commonly used presentation form of a user interface is the graphical user interface (GUI), which is a user interface displayed graphically and related to computer operations. It may consist of interface elements such as icons, windows, and controls displayed on the display screen of the electronic device, where controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
First, some terms in the present application will be explained in order to be understood by those skilled in the art.
(1) Drag and drop (drag), or dragging, refers to the complete operation of clicking an object on a notebook, desktop, tablet, or smart phone with a touch pad, mouse, or finger, keeping it held without releasing, moving the touch pad, mouse, or finger in the plane, and then releasing. A drag operation is generally used to move an object such as a window or a file to a certain position; this may be a short-distance or long-distance movement within the current interface, or a movement across devices or screens. For example, referring to fig. 1, which is a schematic diagram of a cross-screen switched display: the electronic device may include two display screens, a first display screen and a second display screen, which may be two display screens connected to one host computer, or the two display screens of a notebook computer or tablet computer; this is not specifically limited in the embodiments of the present application. A corresponding drag operation may then be performed on the window shown in fig. 1, so that the window is switched between the first display screen and the second display screen; reference may be made to existing solutions, which are not detailed here.
As described above, the prior art includes various technical solutions for moving a window or a file across devices or screens through a drag operation; the following takes as an example APP SWITCHER (application switcher), a feature offered in products currently on the market.
Referring to figs. 2a to 2d, which are a set of user interface diagrams, the electronic device (the notebook computer in figs. 2a to 2d) may include a plurality of display screens, specifically a first display screen 01 and a second display screen 02. As shown in fig. 2a and fig. 2b, the interface displayed on the first display screen 01 may include a window 03. When the user drags the window 03 by its title bar using a mouse, a touch pad, or a finger (in the figures, a dot indicates the position touched by the finger; the electronic device may have a touch-screen function), that is, when the electronic device receives the user's input operation 04a for the window 03 as shown in fig. 2b, the electronic device displays a control combination 05 (also called a button combination, or APP SWITCHER) in the moving direction of the drag operation. As shown in fig. 2c, the control combination 05 may include two controls 06a and 06b, where the control 06a may be a control for switching the display to the second display screen 02, and the control 06b may be a control for switching to dual-screen, continuous full-screen display. As shown in fig. 2c, the electronic device receives an input operation 04b from the user (for example, on the basis of fig. 2b, the user keeps dragging the window without releasing until the finger touches the control 06a), and in response the electronic device displays, under the control combination 05, a secondary menu 07 whose functions relate to switching to the second display screen 02; at this point the control combination 05 and the secondary menu 07 may be in a hovered state. As shown in fig. 2d, the secondary menu 07 may include a control 07a (for switching the display to the left half of the second display screen 02), a control 07b (for the right half), a control 07c (for the left two-thirds of the screen), and a control 07d (for the right two-thirds). The electronic device then receives an input operation 04c from the user (for example, on the basis of fig. 2c, the user keeps dragging without releasing until the finger touches the control 07a), and in response switches the window 03 to be displayed in the left half of the second display screen 02.
However, although the above solution enables a window to be dragged from the current screen to another screen for display, it still presents a number of problems when it actually lands in a product. First, since a user drags a window within the current interface over a small range far more often than the user switches screens, it follows from the prior art described in figs. 2a to 2d that if the controls appear on every window drag, they inevitably disturb the user both visually and in operation experience. On this basis, the problems of the above prior art include the following:
1. As shown in fig. 2b, APP SWITCHER appears in a position aligned with the window's moving direction, which makes accidental touches highly likely. For example, even when the user drags the window without intending to use APP SWITCHER, the moving path inevitably passes through the control combination 05; its hover state is then triggered and the secondary menu 07 is displayed (as shown in fig. 2c), creating considerable visual interference. The user must then cancel the display of the control combination 05 and the secondary menu 07 through additional operations (for example, clicking a blank area of the display interface of the first display screen 01; this is not specifically limited in the embodiments of the present application), which greatly increases the user's operation amount and burden and reduces operation efficiency.
2. As shown in fig. 2b, the control combination 05 is large, so it easily blocks the user's actual operation target. For example, when the user drags the window without intending to use APP SWITCHER, the large control combination 05 is very likely to block the user's actual operation object. For instance, when a user wants to drag a window to the screen edge in Windows (a windowing operating system) to trigger the split-screen function, the electronic device triggers and displays the control combination 05 along the way; to avoid the control combination 05, the user must leave the original path to perform the split-screen operation, which greatly increases the user's operation burden and reduces operation efficiency.
3. As shown in figs. 2b, 2c, and 2d, the user must perform a long and precise drag operation to finally reach the cross-screen display target; especially with a touch pad or finger touch, the operation burden is heavy. In a touch-pad environment, because the touch pad area is small, and some touch pads are mechanical, the user must press hard to maintain the drag. For example, as can be seen from figs. 2b, 2c, and 2d, the prior art requires the user to keep the touch pad or mouse button pressed while moving from the control combination 05, to the cross-screen display control 06a, and then to the specific cross-screen display scheme control 07a, or to keep the finger on the display until it touches the control 07a before releasing. As the number of cross-screen display schemes or sharable devices grows, the number of controls grows with it, and so does the user's operation burden. Moreover, under touch-screen operation, a finger is larger than the controls of the prior art, so when the user moves over the control area, the user cannot see which control is actually selected; this very easily causes misoperation and invalid operation, forces the user to operate cautiously, greatly increases the operation burden, and reduces operation efficiency.
Therefore, to resolve the mismatch between current technology and users' actual needs, the technical problems actually to be solved by the present application include the following: avoiding the low operation efficiency caused by an excessive operation burden under long-path dragging (especially in an environment operated with a touch pad); and distinguishing whether the user intends to move within the current interface or across devices or screens, so as to provide a more efficient and quicker solution. To this end, the embodiments of the present application, building on existing electronic devices, identify the user's drag intention by detecting the duration and movement range of the drag operation, display the transfer gate (i.e., the corresponding control) for the required target at a proper position and time, shorten the user's operation path, and optimize the user's drag experience visually and interactively.
Hereinafter, exemplary electronic devices involved in the embodiments of the present application will be described in detail.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and a subscriber identity module (SIM) card interface 195. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charge management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In other embodiments, the power management module 141 may be provided in the processor 110. In still other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide solutions for wireless communication including 2G/3G/4G/5G applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify signals modulated by the modem processor and convert them into electromagnetic waves for radiation through the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate and amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The display 194 is used to display the exemplary user interfaces provided by subsequent embodiments of the present application. For a specific description of the user interfaces, refer to the following.
In particular, the display 194 may be used to display target objects such as windows and files, related controls for cross-device and cross-screen display, and the like, which are not described in detail here.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin color of the image, as well as parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and may process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network processing unit (Neural-network Processing Unit). By drawing on the structure of biological neural networks, for example, the transfer mode between human brain neurons, it can rapidly process input information and can continuously self-learn. Applications such as intelligent cognition of the electronic device 100, for example image recognition, face recognition, speech recognition, and text understanding, may be implemented through the NPU.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications and data processing of the electronic device 100 by executing the instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 100 is answering a call or a voice message, the voice may be heard by placing the receiver 170B close to the human ear.
Microphone 170C, also referred to as a "mike" or "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user may speak with the mouth close to the microphone 170C, inputting a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to enable collection of sound signals, noise reduction, identification of sound sources, directional recording functions, etc.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. A capacitive pressure sensor may comprise at least two parallel plates having a conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the position of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, when the user touches a certain position on the display 194 and, while maintaining the touch, moves from that position to another position, the electronic device 100 may also calculate the distance between the two positions, the duration of the movement operation, and the like, based on the detection signal of the pressure sensor 180A.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100.
The air pressure sensor 180C is used to measure air pressure.
The magnetic sensor 180D includes a hall sensor.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. It may also be used to recognize the posture of the electronic device, and is applied to applications such as landscape/portrait screen switching and pedometers.
A distance sensor 180F for measuring a distance.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode.
The ambient light sensor 180L is used to sense ambient light level.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is for detecting temperature. In some embodiments, the electronic device 100 performs a temperature processing strategy using the temperature detected by the temperature sensor 180J.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a position different from that of the display screen 194.
For example, in the embodiment of the present application, the touch screen may detect a user operation on a file, a window, a link, or the like, where the user operation may be a drag operation on the file. The touch screen may also detect a user operation on a control, where the user operation may be a click operation on the control; the above user operations may also have other implementation forms, which are not limited by the embodiment of the present application. For the specific implementation of the above user operations, refer to the detailed description of the following method embodiments, which is not repeated here.
In an embodiment of the present application, the processor 110 may, in response to a user operation on a file, window, link, or the like, trigger display of a target control on the display screen 194 according to a certain policy. For the specific implementation of the policy and of the user operations received by the electronic device, refer to the related description of the subsequent embodiments, which is not detailed here.
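For illustration only, the following is a minimal Kotlin sketch of how the touch input described above might be classified as a click or a drag before any such policy is applied; the class name, the callback, and the use of the system touch slop are assumptions of this sketch, not part of the claimed method.

```kotlin
import android.content.Context
import android.view.MotionEvent
import android.view.View
import android.view.ViewConfiguration
import kotlin.math.hypot

// Hypothetical sketch: classify touch input on a draggable item as a click or a drag.
class DraggableItemView(context: Context) : View(context) {
    private val touchSlop = ViewConfiguration.get(context).scaledTouchSlop
    private var downX = 0f
    private var downY = 0f
    private var dragging = false

    override fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                downX = event.x; downY = event.y; dragging = false
            }
            MotionEvent.ACTION_MOVE ->
                // Movement beyond the system touch slop is treated as a drag operation.
                if (hypot(event.x - downX, event.y - downY) > touchSlop) dragging = true
            MotionEvent.ACTION_UP ->
                if (dragging) onDragReleased() else performClick()  // drag vs. click
        }
        return true
    }

    private fun onDragReleased() { /* hand the completed drag to the policy layer */ }
}
```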
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of a vibrating bone block of the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure beat signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset to form a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone block of the vocal part obtained by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse heart rate information based on the blood pressure beat signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 can also correspond to different vibration feedback effects by the touch operation on different areas of the display 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate a charging state or a change in battery level, or to indicate a message, a missed call, a notification, and the like.
The SIM card interface 195 is used to connect a SIM card. A SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to make contact with or be separated from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously; the types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
In summary, the electronic device 100 may be a smart wearable device, a smart phone, a tablet computer, a notebook computer, a desktop computer, or the like, which is not particularly limited in the embodiment of the present application.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated. Referring to fig. 4, fig. 4 is a block diagram of a software structure of an electronic device 100 according to an embodiment of the application.
It should be understood that the software architecture illustrated in the embodiments of the present application does not constitute any limitation on the software architecture of the electronic device 100.
The layered architecture divides the software into several layers, each of which has a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 4, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 4, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar; it can be used to convey notification-type messages, which can automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message alerts, etc. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, an indicator light blinks, etc.
The Android runtime includes a core library and virtual machines. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The software system shown in fig. 4 involves applications (e.g., gallery, file manager) that use the sharing capability, an instant sharing module that provides the sharing capability, a print service and a print spooler that provide the printing capability, an application framework layer that provides a print framework, a WLAN service, and a Bluetooth service, and a kernel and bottom layer that provide WLAN and Bluetooth capabilities and basic communication protocols.
Referring to fig. 5, fig. 5 is a flowchart illustrating a control display method according to an embodiment of the present application. The method may be applied to the electronic device 100 described in fig. 3 and 4, and the electronic device 100 may be used to support and execute the method steps S101-S105 shown in fig. 5. As shown in fig. 5, a control display method provided by an embodiment of the present application may include:
Step S101, displaying a first interface; the first interface comprises a target object, wherein the target object comprises a first target point, and the first target point is located at a first position of the first interface.
Specifically, the electronic device 100 (i.e., the first electronic device) displays a first interface, where the first interface may include a target object; the target object may be, for example, a window, a file (which may include a document, a picture, a folder, etc.), a business card, a link (e.g., a web page link), and so on. Optionally, the target object may comprise a plurality of target points, including a first target point, which may be located at a first position of the first interface.
Optionally, referring to figs. 6a to 6c, figs. 6a to 6c are a set of user interface diagrams provided by an embodiment of the present application. As shown in fig. 6a, taking the target object being a window as an example, the position of the finger/cursor (i.e., the mouse pointer) may be the first position of the first target point in the target object.
Step S102, receiving and responding to a first drag operation for the first target point on the target object, determining the first area in which the first position is located, and determining the second position of the first interface at which the first target point is located.
Specifically, as shown in fig. 6b, the electronic device 100 receives and responds to a first drag operation for the first target point on the target object, determines the first area (e.g., the dashed circle in fig. 6b) in which the first position (e.g., the original finger/cursor position in fig. 6b) is located, and determines the second position of the first interface at which the first target point is located. As shown in fig. 6b, the user drags the window to move downward and to the right.
Step S103, if the second position is detected to be in the first area and the duration of the first drag operation is greater than the first duration threshold, displaying the target control at the third position of the first interface.
Specifically, the electronic device 100 may determine the user's operation intention according to the speed, duration, range, and the like of the drag operation. If the electronic device 100 detects that the second position corresponding to the first drag operation is in the first area and the duration of the first drag operation is greater than the first duration threshold (for example, 1 second, 2 seconds, 3 seconds, etc.), the target control is displayed at the third position of the first interface. Optionally, as shown in fig. 6b, when the position touched by the user's finger/cursor is still within the first area at the first duration threshold, the user may be considered to have an operation intention of cross-device sharing or cross-screen display. At this time, the target control (i.e., the transfer door in fig. 6b) may be displayed in a direction different from the movement track of the finger/cursor; that is, the direction from the first position to the third position is different from the direction from the first position to the second position (for example, as shown in fig. 6b, the target control may be displayed opposite to the movement track of the finger/cursor). Thus, if the user does not actually want to share across devices or display across screens but the target control is nevertheless triggered and displayed, the user is prevented from touching the target control by mistake along the forward direction of the movement track, and the control neither occludes nor visually interferes with the user's actual drag path. In contrast, as shown in fig. 6c, if the user drags the window out of the first area quickly within the first duration threshold, it may be considered that the user only wants to move the window within the interface and has no intention of cross-device sharing or cross-screen display, and the target control (i.e., the transfer door) is not triggered and displayed; this reduces the interference that an unnecessarily displayed target control would cause to the user's actual operation, and improves the user's operation experience and efficiency.
Alternatively, the shape of the first area may be, besides the circle shown in figs. 6b and 6c, an ellipse, a square, a triangle, etc., which is not particularly limited in the embodiment of the present application. In addition, compared with the control combination 05 in the prior art, the embodiment of the present application reduces the control combination to a single circular target control, which represents all the functions that may be triggered in the drag scene.
Optionally, because different electronic devices have different screen sizes, the boundary range (i.e., the size of the first area) and the delay duration (i.e., the first duration threshold) may be set according to the actual screen size of the product and the distance between the transfer door and the initial position (i.e., the first position) of the drag operation, to further assist in determining the user's operation intention.
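As a minimal sketch of the intent rule of steps S102 and S103 under the assumptions above, the following Kotlin code checks whether the first target point stays inside the first area for the first duration threshold and, if so, returns a third position opposite the movement direction; the radius factor, the default threshold, and the offset are illustrative assumptions only, and real values would be tuned per device as just described.

```kotlin
import kotlin.math.hypot

// Hypothetical sketch of the intent rule in steps S102-S103; all names and
// numbers are assumptions, not values fixed by the embodiment.
class DragIntentDetector(
    screenWidthPx: Float,
    private val firstDurationThresholdMs: Long = 1_000L,  // "first duration threshold"
) {
    // Scale the first area to the screen, as suggested for different devices.
    private val firstAreaRadiusPx = screenWidthPx * 0.05f

    private var startX = 0f; private var startY = 0f; private var startTimeMs = 0L

    fun onDragStart(x: Float, y: Float, nowMs: Long) {
        startX = x; startY = y; startTimeMs = nowMs
    }

    /** Returns the position at which to show the target control, or null. */
    fun onDragMove(x: Float, y: Float, nowMs: Long): Pair<Float, Float>? {
        val insideFirstArea = hypot(x - startX, y - startY) <= firstAreaRadiusPx
        val longEnough = nowMs - startTimeMs >= firstDurationThresholdMs
        if (!insideFirstArea || !longEnough) return null
        // Place the control opposite the movement direction so that it neither
        // occludes the drag path nor invites accidental touches.
        val dx = x - startX; val dy = y - startY
        val dist = hypot(dx, dy).coerceAtLeast(1f)
        val offsetPx = firstAreaRadiusPx * 3f
        return Pair(startX - dx / dist * offsetPx, startY - dy / dist * offsetPx)
    }
}
```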
Optionally, as shown in fig. 6b, if the user does not want to share across devices or display across screens after the target control has been triggered and displayed, the user may release the finger or mouse at this time, and the electronic device receives and responds to the release operation and cancels the display of the target control. Optionally, after the target control is displayed, if the user releases the finger or mouse at the third position, the target control may instead continue to be displayed; in this case, if the user does not want to share across devices or display across screens, the user may cancel the display of the target control by clicking a blank area of the first interface, and so on, which is not limited by the embodiment of the present application.
Step S104, receiving and responding to a second drag operation for the first target point, determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; the fourth position is in a second area in which the third position is located; the target frame includes N controls.
Specifically, the electronic device 100 receives and responds to the second drag operation for the first target point, determines that the first target point is located at the fourth position of the first interface, and displays the target frame at the third position; the fourth position is in the second area in which the third position is located; the target frame includes N controls, where N is an integer greater than or equal to 1.
Optionally, referring to fig. 7, fig. 7 is a schematic diagram illustrating a transfer door changing process according to an embodiment of the application. As shown in fig. 7, the actual hot zone of the target control (i.e., the transfer door in fig. 7) may be larger than the transfer door itself. When the finger/cursor (i.e., the position of the first target point) moves into the hot zone along with the user's drag operation (or throughout the part of the second drag operation before it enters the hot zone), the electronic device 100 may gradually enlarge the transfer door and gradually shrink the window, so that on visual feedback the user perceives the target object being dragged into the transfer door; this improves the user's operation experience and ensures that even a small object can be operated easily. Then, as shown in fig. 7, when the user continues to drag the window until the finger/cursor touches the transfer door, that is, when the first target point is located in the second area in which the target control is located, the transfer door may be enlarged to the size of the actual hot zone, which needs to be larger than the finger/cursor, so as to ensure that the user can identify the target control, improve visual perception, and avoid misoperation. Optionally, parameters such as the shape, transparency, and size of the transfer door may be customized by the user through the electronic device 100.
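The proportional feedback described for fig. 7 can be sketched in Kotlin as follows; the linear interpolation and the scale bounds are assumptions chosen only to illustrate "door grows, window shrinks" as the first target point approaches the hot zone.

```kotlin
// Hypothetical sketch of the visual feedback in fig. 7: as the dragged point
// approaches the center of the transfer door's hot zone, the door grows and
// the dragged window shrinks. The scale bounds are illustrative assumptions.
fun updateDragVisuals(
    distanceToDoorPx: Float,   // distance from the first target point to the door center
    hotZoneRadiusPx: Float,
): Pair<Float, Float> {
    // 0.0 at the hot-zone edge (or farther out), 1.0 at the door center.
    val t = (1f - distanceToDoorPx / hotZoneRadiusPx).coerceIn(0f, 1f)
    val doorScale = 1f + t * 1.5f      // door grows toward the hot-zone size
    val windowScale = 1f - t * 0.6f    // window shrinks, as if entering the door
    return doorScale to windowScale
}
```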
Optionally, referring to fig. 8, fig. 8 is a schematic diagram illustrating another transfer door changing process according to an embodiment of the application. As shown in fig. 8, when the user continues to drag the window until the finger/cursor touches the transfer door, the transfer door may change color to indicate to the user that the finger or mouse may be released at this time to trigger a subsequent function, and so on.
Optionally, referring to fig. 9a, fig. 9a is a schematic diagram of a transfer door and a button backplane according to an embodiment of the present application. As shown in fig. 9a, when the transfer door gives its color-change indication, the user may release the finger or mouse (i.e., release the dragged target object) to complete the second drag operation. At this time, the electronic device 100 may cancel the display of the transfer door and display the button backplane at the position where the transfer door was located, so that visually the user experiences the circular transfer door deforming into the button backplane (i.e., the target frame), as shown in fig. 9a. The button backplane may include a plurality of controls, for example, a control for switching to the left half of the second display screen for display, a control for switching to the right half of the second display screen for display, a full-screen display control, and so on. Compared with the prior art, the user no longer bears a long drag: after releasing the hand, the corresponding function can be achieved by clicking the called-up control. Moreover, compared with the prior art, the embodiment of the present application can accommodate more functional controls. Referring to fig. 9b, fig. 9b is a schematic diagram of a button backplane according to an embodiment of the application. As shown in fig. 9b, more controls (e.g., button 1, button 2, button 3, etc. in fig. 9b) may also be added by expanding the space of the button backplane, thereby implementing more functions, etc., which is not particularly limited in the embodiment of the present application. Alternatively, there may be two or more transfer doors, and different transfer doors may correspond to different button backplanes. For example, a user may customize two transfer doors through the electronic device 100: one transfer door may correspond to a button backplane containing a plurality of sharing buttons, through which the dragged content can be shared to other devices; the other transfer door may correspond to a button backplane containing a plurality of switch-screen display buttons, through which the dragged content can be switched to other display screens for display; and so on, which is not limited by the embodiment of the present application.
Step S105, receiving and responding to the clicking operation for one of the N controls, and displaying the second interface.
Specifically, the electronic device 100 receives a click operation on one of the N controls and, in response, displays the second interface. For example, the electronic device 100 may include a first display screen and M second display screens, where M is an integer greater than or equal to 1; the N controls may then include M switch-screen display controls corresponding one-to-one to the M second display screens, or even more controls when different display schemes for each second display screen are considered, which greatly enriches user choices and satisfies different user requirements. The electronic device 100 receives and responds to the user's click operation on a control, and can switch the target object to the corresponding second display screen for display, and so on. It should be noted that the first display screen and the second display screens in the drawings of the embodiments of the present application are merely examples; the second display screen may be smaller than the first display screen as shown in the drawings, or may be equal to or larger than the first display screen, and so on, which is not particularly limited in the embodiment of the present application. In addition, the "first" and "second" in the first display screen and the second display screen described in the embodiments of the present application do not limit the specifications, the primary/secondary relationship, or the priority of the display screens; they merely distinguish the display screen on which the current target object is located from the other display screens.
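As an illustrative sketch of step S105 in the multi-display case, the Kotlin code below builds one switch-screen control per second display and moves the target object on click; the data types and the moveWindowTo callback are hypothetical, since actual cross-display window movement is platform-specific.

```kotlin
// Hypothetical sketch: one switch-screen display control per second display.
data class SwitchScreenControl(val label: String, val displayId: Int)

fun buildSwitchControls(secondDisplayIds: List<Int>): List<SwitchScreenControl> =
    secondDisplayIds.mapIndexed { i, id -> SwitchScreenControl("Display ${i + 1}", id) }

// Clicking the i-th control switches the target object to the i-th second display.
fun onControlClicked(control: SwitchScreenControl, moveWindowTo: (Int) -> Unit) {
    moveWindowTo(control.displayId)
}
```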
Optionally, referring to figs. 10a to 10h, figs. 10a to 10h are schematic views of a set of user interfaces according to an embodiment of the present application. Taking the electronic device 100 as a notebook computer in a mouse/touch pad operating environment as an example, as shown in fig. 10a, the electronic device 100 may include two display screens, namely a first display screen and a second display screen. The electronic device 100 displays a first interface through the first display screen, where the first interface includes a target object (a window); the window may be a common browser window or a chat window, which is not specifically limited by the embodiment of the application. As shown in figs. 10b and 10c, when the user presses the mouse to select the window and starts to drag, if the electronic device 100 detects that the cursor is still located within the first area around the original cursor position within the first duration threshold, the electronic device 100 may display the target control in the direction opposite to the cursor movement. As shown in fig. 10d, when the electronic device 100 detects that the cursor has moved into the hot zone of the target control, the target control may be enlarged and the target object (the window) reduced. As shown in fig. 10e, when the electronic device 100 detects that the cursor touches the target control, it may control the target control to change color (e.g., from gray to yellow, etc.), indicating that the user may release the mouse/touch pad. As shown in fig. 10f, in response to the user releasing the mouse/touch pad, the electronic device 100 displays the target frame at the position of the original target control and cancels the display of the target control; alternatively, the target frame may be displayed at any suitable position in the interface, etc., which is not specifically limited by the embodiment of the present application. Optionally, the target frame may include a plurality of controls implementing a corresponding plurality of functions, for example, controls for two-way split screen/three-way split screen/full screen/continued screen (top-bottom structure/left-right structure); contents such as files/links/text in the window may also be shared to other devices. For example, for a browser window, all the tab links opened in the browser can be directly shared to identifiable devices of the user or of others, including mobile phones/computers/televisions/tablets, and the like, which is not specifically limited by the embodiment of the application. As shown in figs. 10g and 10h, the electronic device 100 receives and responds to the user's click operation on the control in fig. 10g for switching to the left half of the second display screen, displays a second interface through the first display screen, where the second interface does not include the target object, and displays a third interface through the second display screen, where the third interface includes the target object. That is, the target object (the window) is switched from the first display screen to the left half of the second display screen for display.
Optionally, the embodiment of the application also designs various visual prompts to ensure that the functions are discoverable. For example, referring to fig. 11, fig. 11 is a schematic diagram of a transfer door according to an embodiment of the application. As shown in fig. 11, the brightness of the interface background is used as a criterion: when the transfer door (i.e., the target control) appears over a colored background, the color of the transfer door is adjusted accordingly, ensuring the identifiability of the transfer door and improving the user's operation experience. For another example, referring to fig. 12, fig. 12 is a schematic diagram of another transfer door according to an embodiment of the application. As shown in fig. 12, the transfer door can be indicated by a ripple; on first use, a capsule-shaped prompt is adopted, and a specific function prompt can be marked on the capsule; when the cursor/finger moves to touch the transfer door (when the transfer door changes color), a cursor corner mark prompts the user that releasing the mouse/touch pad/finger will open the corresponding functional controls, and so on. Thus, even when the user sees only an abstract dot with no specific label, the user can still recognize it and know its function.
The following describes the operation steps of the embodiment of the present application in more detail and more intuitively through examples of a plurality of application scenarios.
Application scenario one: referring to figs. 13a to 13e, figs. 13a to 13e are a set of user interface diagrams according to an embodiment of the present application. As shown in fig. 13a, taking the electronic device 100 as a notebook computer as an example, the electronic device 100 displays a user interface 21, which may include a plurality of files (for example, file 11, file 12, file 13, file 14, etc.). As shown in figs. 13b and 13c, the electronic device 100 receives a drag operation of the user on the file 16, determines, in response to the drag operation, a first area 22 around the original position of the file 16 (the first area 22 may be square, as shown in fig. 13b), and forms a snapshot of the file 16, which is dragged along with the user's drag operation. Optionally, parameters such as the transparency and size of the snapshot are not particularly limited; they may be system defaults or may be customized by the user through the electronic device 100. When the electronic device 100 detects that the snapshot (or the cursor) of the file 16 remains within the first area 22 for a certain duration, the target control 23 is displayed in a direction different from that of the drag operation. As shown in fig. 13d, the electronic device 100 receives a drag operation of the user on the snapshot of the file 16 and, in response, gradually enlarges the target control 23 and gradually shrinks the snapshot of the file 16; when the electronic device detects that the snapshot of the file 16 is located on the target control 23 (or the cursor is located on the target control 23), it controls the target control 23 to change color, prompting the user to release the mouse/touch pad and end the drag operation so as to trigger more functions. As shown in fig. 13e, when the electronic device 100 detects that the user has released the mouse/touch pad, it may display the target frame 24 at the center of the interface (or another suitable position) and cancel the display of the target control 23 (visually, the user perceives the target control 23 deforming into the target frame 24). Alternatively, the target frame 24 may be a floating window, a floating backplane, or the like, which is not particularly limited in the embodiments of the present application. As shown in fig. 13e, the target frame 24 may include a plurality of sharing controls, specifically a sharing control 25, a sharing control 26, a sharing control 27, a sharing control 28, a sharing control 29, and a sharing control 30; that is, the N controls may include a plurality of sharing controls corresponding one-to-one to a plurality of other devices (i.e., second electronic devices). The electronic device 100 receives a click operation of the user on the sharing control 28 and, in response, sends the file 16 to the other device "my mobile phone"; afterwards, the electronic device 100 may display the same interface as the user interface 21, or the electronic device 100 may cancel the display of the target frame 24 by receiving and responding to a click operation of the user on a blank area of the interface, thereby displaying the same interface as the user interface 21. As shown in fig. 13e, the other device may receive the file 16.
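On Android, the dragged snapshot in figs. 13b and 13c could be realized with the platform's standard drag-shadow mechanism, as in the hedged sketch below (the View.startDragAndDrop API requires API level 24 or higher); the embodiment does not mandate this API, and the null ClipData stands in for whatever payload the file drag would actually carry.

```kotlin
import android.view.View

// Hypothetical sketch: the snapshot is a translucent copy of the file icon
// that follows the pointer, built with the standard drag-shadow mechanism.
fun startFileDrag(fileView: View) {
    val shadow = View.DragShadowBuilder(fileView)
    fileView.startDragAndDrop(
        /* data = */ null,          // placeholder for the file's drag payload
        /* shadowBuilder = */ shadow,
        /* myLocalState = */ null,
        /* flags = */ 0,
    )
}
```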
The electronic device 100 and the other devices may be connected to each other through a wired or wireless network (such as wireless fidelity (Wi-Fi), Bluetooth, or a mobile network), which is not specifically limited by the embodiment of the present application.
It will be appreciated that figs. 13a to 13e illustrate the process of synchronizing a file from the computer side to the mobile phone side; similarly, the sharing operation between any electronic devices may be designed in this way. In addition, sharing of multimedia such as photos/videos, and sharing of content such as business cards/calendars/memos, may also be included. Optionally, the target frame 24 may further include a shortcut control for sending the dragged content to a printer for printing, a shortcut control for moving the dragged content to a folder, a shortcut control for sending the dragged content via mail/messaging/various third-party applications (such as communication applications, short video applications, etc.), etc., which is not particularly limited by the embodiment of the present application.
In one possible implementation, after the target control 23 has deformed into the target frame 24, the user may also perform a clustering operation on the controls within the target frame 24. For example, referring to figs. 14a to 14c, figs. 14a to 14c are a set of user interface diagrams provided by an embodiment of the present application. As shown in fig. 14a, the electronic device 100 receives a drag operation of the user on the sharing control 28, forms a snapshot 28' of the sharing control 28 in response to the drag operation, and drags it along with the user's drag operation. As shown in figs. 14b and 14c, in response to the user's drag operation on the snapshot 28', the electronic device 100 detects that the snapshot 28' is located above the sharing control 30; when the user then releases the mouse/touch pad, a multi-device sharing control 31 corresponding to the sharing control 28 and the sharing control 30 (i.e., a device group containing "my mobile phone" and "mobile phone xxx") is created and displayed. That is, the N controls may further include a multi-device sharing control for synchronously sharing to a plurality of other devices (i.e., second electronic devices). As shown in fig. 14c, the electronic device 100 receives a click operation of the user on the multi-device sharing control 31 and, in response, may send the file 16 to the other device "my mobile phone" and the other device "mobile phone xxx", respectively. Both devices can receive the file 16, which can greatly improve file sharing efficiency and bring convenience to user operation.
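A minimal Kotlin sketch of the clustering operation of figs. 14a to 14c follows; the ShareTarget/ShareControl types and the sendTo callback are assumptions used only to show how dropping one sharing control onto another could yield a multi-device group whose click fans out to every device.

```kotlin
// Hypothetical sketch: dropping one sharing control onto another creates a
// multi-device group; clicking the group sends the object to every device.
data class ShareTarget(val name: String)

class ShareControl(val targets: MutableList<ShareTarget>) {
    val isGroup get() = targets.size > 1
}

fun mergeControls(dragged: ShareControl, dropTarget: ShareControl): ShareControl =
    // e.g. "my mobile phone" + "mobile phone xxx" -> one multi-device control
    ShareControl((dropTarget.targets + dragged.targets).toMutableList())

fun onShareClicked(control: ShareControl, file: String, sendTo: (ShareTarget, String) -> Unit) {
    control.targets.forEach { sendTo(it, file) }  // share to each device in turn
}
```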
Optionally, after the multi-device sharing control 31 has been created, the user may drag other sharing controls; for example, dragging the sharing control 25 onto the multi-device sharing control 31 and releasing it creates a control that can share to three devices at a time (i.e., a device group containing "my computer", "my mobile phone", and "mobile phone xxx"), further improving sharing efficiency. As described above, this function is mainly aimed at sharing among two or more commonly used devices, such as quick sharing among friends, colleagues in the same group, or companions traveling together, and brings convenience to user operation.
Optionally, the multi-device sharing control 31 (i.e., the device group containing "my mobile phone" and "mobile phone xxx") may be saved in the background. The electronic device 100 may always display the multi-device sharing control 31 in the target frame 24; or the multi-device sharing control 31 may be displayed in the target frame 24 when the electronic device 100 identifies any device in the device group via Bluetooth or the like; alternatively, devices in the group that are not identified may be grayed out to prompt the user.
Application scenario two: under today's global search function, a user can only open the searched content, and cannot perform quick operations on it. The embodiment of the application further provides a scheme in which the user can call out the transfer door (i.e., the target control) by dragging the searched content, and share that content. Referring to figs. 15a to 15e, figs. 15a to 15e are a set of user interface diagrams according to an embodiment of the present application. As shown in fig. 15a, taking the electronic device 100 as a notebook computer as an example, the electronic device 100 displays a user interface 32, which may include a search window containing a plurality of pieces of search content. The plurality of pieces of search content may be obtained by keyword search through the global search function of the electronic device 100, and may be, for example, a file, a business card, a browser link, and so on. As shown in figs. 15b and 15c, the electronic device 100 receives a drag operation of the user on the search content 1, forms a snapshot of the search content 1 in response to the drag operation, and drags it along with the user's drag operation. When the electronic device 100 detects that the snapshot (or the cursor) of the search content 1 remains within a certain area around the original position of the search content 1 for a certain duration, the target control 23 is displayed in a direction different from that of the drag operation. As shown in fig. 15d, the electronic device 100 receives a drag operation of the user on the snapshot of the search content 1 and, in response, gradually enlarges the target control 23 and gradually shrinks the snapshot of the search content 1; when the electronic device detects that the snapshot of the search content 1 is located on the target control 23 (or the cursor is located on the target control 23), it controls the target control 23 to change color, prompting the user to release the mouse/touch pad and end the drag operation so as to trigger more functions. As shown in fig. 15e, when the electronic device 100 detects that the user has released the mouse/touch pad, it may display the target frame 24 at the center of the interface (or another suitable position) and cancel the display of the target control 23. As shown in fig. 13e, the target frame 24 may include a plurality of sharing controls, which are not described again here. The electronic device 100 receives a click operation of the user on the sharing control 28 and, in response, sends the search content 1 to the other device "my mobile phone". As shown in fig. 15e, the other device may receive the search content 1 (shown as the file "graduation album" as an example). For some details of this application scenario, refer to the above description of the embodiments corresponding to figs. 13a to 13e, which is not repeated here.
Application scenario three: referring to figs. 16a to 16d, figs. 16a to 16d are a set of user interface diagrams according to an embodiment of the present application. As shown in fig. 16a, taking the electronic device 100 as a tablet computer as an example, the electronic device 100 displays a user interface 33, which may include applications (such as weather, music, video, an application store, mail, gallery, etc.). The electronic device 100 receives a click operation 34 of the user on the gallery application and, in response to the click operation 34, displays a user interface 35, which may include a plurality of pictures. As shown in fig. 16b, the electronic device 100 receives a drag operation 36 of the user on the picture 8; in response to the drag operation 36, when the electronic device 100 detects that the picture 8 (or the finger) is still located within a certain area around the finger's original position at the start of the drag operation 36 for a certain duration, the target control 23 is displayed in a direction different from that of the drag operation 36. As shown in fig. 16c, the electronic device 100 receives a drag operation 37 of the user on the picture 8 and, in response, gradually enlarges the target control 23 and gradually shrinks the picture 8; when the electronic device 100 detects that the picture 8 is located on the target control 23 (or the finger is located on the target control 23), it controls the target control 23 to change color, prompting the user to release the finger and end the drag operation 37. As shown in fig. 16d, when the electronic device 100 detects that the user has released the finger, it may display the target frame 24 at the position of the target control 23 (or another suitable position) and cancel the display of the target control 23. As shown in fig. 16d, the target frame 24 may include a plurality of sharing controls, which are not described again here. The electronic device 100 receives a click operation 38 of the user on the sharing control 26 and, in response to the click operation 38, sends the picture 8 to the other device "my computer", which may receive the picture 8. For some details of application scenario three, refer to the above description of the embodiments corresponding to figs. 13a to 13e, which is not repeated here.
Application scenario four: referring to figs. 17a to 17d, figs. 17a to 17d are a set of user interface diagrams according to an embodiment of the present application. As shown in fig. 17a, taking the electronic device 100 as a smartphone as an example, the electronic device 100 displays a user interface 39, which may be a resource manager interface and may include a plurality of files, documents, etc. (e.g., file 1, file 2, etc., as well as document 1, document 2, document 3, etc.). As shown in fig. 17b, the electronic device 100 receives a drag operation 40 of the user on the document 1; in response to the drag operation 40, when the electronic device 100 detects that the document 1 (or the finger) is still located within a certain area around the finger's original position at the start of the drag operation 40 for a certain duration, the target control 23 is displayed in a direction different from that of the drag operation 40. As shown in fig. 17c, the electronic device 100 receives a drag operation 41 of the user on the document 1 and, in response, gradually enlarges the target control 23 and gradually shrinks the document 1; when the electronic device 100 detects that the document 1 is located on the target control 23 (or the finger is located on the target control 23), it controls the target control 23 to change color, prompting the user to release the finger and end the drag operation 41. As shown in fig. 17d, when the electronic device 100 detects that the user has released the finger, it may display the target frame 24 at the center of the interface (or another suitable position) and cancel the display of the target control 23. As shown in fig. 17d, the target frame 24 may include a plurality of sharing controls, which are not described again here. The electronic device 100 receives a click operation 42 of the user on the sharing control 27 and, in response to the click operation 42, sends the document 1 to the other device "my tablet", which may receive the document 1. For some details of application scenario four, refer to the above description of the embodiments corresponding to figs. 13a to 13e, which is not repeated here.
In summary, the embodiments of the present application provide a control display method that can identify the user's operation intention by detecting the duration and movement range of the user's drag operation, and display the control at the appropriate time, thereby reducing visual interference with the user's operation while shortening the path of the drag operation, reducing the user's operation burden, and improving operation efficiency. The embodiments of the present application also explore richer visual expressions and interaction modes, as well as more draggable content and more specific embodiment scenarios.
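To make the mechanism concrete, the following is a minimal, non-normative Kotlin sketch of the drag-intention check described above: the target control is shown only when the drag has lasted longer than a threshold while staying inside the area around its starting position. All identifiers and the threshold values (holdRadiusPx, holdDurationMs) are assumptions made for illustration and are not part of this embodiment.

```kotlin
import kotlin.math.hypot

// Minimal sketch of the drag-intention check described above. All identifiers
// and threshold values here are illustrative assumptions, not part of this
// embodiment.
data class Point(val x: Float, val y: Float)

class DragIntentDetector(
    private val holdRadiusPx: Float = 48f,   // assumed radius of the "first area"
    private val holdDurationMs: Long = 500L, // assumed "first time threshold"
) {
    private var startPoint: Point? = null
    private var startTimeMs: Long = 0L

    fun onDragStart(p: Point, nowMs: Long) {
        startPoint = p
        startTimeMs = nowMs
    }

    // True when the target control should be displayed: the drag has lasted
    // longer than the threshold while the drag point stayed inside the area
    // around its starting position.
    fun shouldShowTargetControl(current: Point, nowMs: Long): Boolean {
        val start = startPoint ?: return false
        val withinStartArea =
            hypot(current.x - start.x, current.y - start.y) <= holdRadiusPx
        return withinStartArea && nowMs - startTimeMs > holdDurationMs
    }
}
```

In practice the radius and duration would be tuned per device and input method; the default values above merely make the check concrete.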
The foregoing details the method according to the embodiments of the present application, and the following provides relevant apparatuses according to the embodiments of the present application.
Referring to fig. 18, fig. 18 is a schematic structural diagram of a control display device according to an embodiment of the present application. The control display device 20 may include a first display unit 201, a first determining unit 202, a second display unit 203, a third display unit 204, and a fourth display unit 205, and may further include a generating unit 206. Each unit is described in detail as follows.
A first display unit 201 for displaying a first interface; the first interface comprises a target object, wherein the target object comprises a first target point, and the first target point is positioned at a first position of the first interface;
A first determining unit 202, configured to receive and respond to a first drag operation for the first target point on the target object, determine a first area where the first position is located, and determine that the first target point is located at a second position of the first interface;
A second display unit 203, configured to display a target control at a third position of the first interface if it is detected that the second position is in the first area and the duration of the first drag operation is greater than a first time threshold;
A third display unit 204, configured to receive and respond to a second drag operation for the first target point, determine that the first target point is located at a fourth position of the first interface, and display a target frame at the third position; the fourth position is in a second area where the third position is located; the target frame comprises N controls, wherein N is an integer greater than or equal to 1;
And a fourth display unit 205, configured to receive and respond to a clicking operation for one of the N controls, and display a second interface.
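Purely as a hypothetical illustration, the division of units above can be pictured as the following Kotlin interface; every method name and signature here is an assumption made for readability, not an identifier from this embodiment.

```kotlin
// Illustrative-only view of the units of the control display device 20.
interface ControlDisplayDevice {
    fun displayFirstInterface()                           // first display unit 201
    fun onFirstDragOperation(x: Float, y: Float)          // first determining unit 202
    fun showTargetControlAtThirdPosition()                // second display unit 203
    fun onSecondDragOperation(x: Float, y: Float)         // third display unit 204: shows the target frame
    fun onControlClicked(indexInFrame: Int)               // fourth display unit 205: displays the second interface
    fun generateMultiDeviceSharingControl(x: Int, y: Int) // optional generating unit 206
}
```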
In a possible implementation manner, the first electronic device comprises a first display screen and M second display screens, and the first interface and the second interface are interfaces displayed on the first display screen; the N controls comprise M switching screen display controls which are in one-to-one correspondence with the M second display screens, and M is an integer greater than or equal to 1; the fourth display unit 205 is specifically configured to:
Receiving and responding to clicking operation for an ith switching screen display control in the M switching screen display controls, and displaying a second interface, wherein the target object is not included in the second interface; the target object is included in a third interface displayed on an ith second display screen corresponding to the ith switching screen display control; i is an integer greater than or equal to 1 and less than or equal to M.
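A hypothetical sketch of this dispatch follows: the i-th switching screen display control maps one-to-one to the i-th second display screen. The class name, the screen-identifier type, and the callback are all assumed for illustration.

```kotlin
// Hypothetical sketch of dispatching a click on the i-th switching screen
// display control (1-based i, as in the text) to the i-th second display screen.
class ScreenSwitcher(private val secondScreenIds: List<Int>) {
    fun onSwitchControlClicked(i: Int, moveObjectToScreen: (screenId: Int) -> Unit) {
        require(i in 1..secondScreenIds.size) { "i must satisfy 1 <= i <= M" }
        // The third interface on the i-th second screen now shows the object;
        // the second interface on the first screen no longer contains it.
        moveObjectToScreen(secondScreenIds[i - 1])
    }
}
```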
In one possible implementation, the N controls include a full screen display control; the fourth display unit 205 is specifically configured to:
Receiving and responding to clicking operation for the full-screen display control, and displaying a second interface; the second interface comprises the target object, and the size of the target object in the second interface is larger than that of the target object in the first interface.
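The behavior can be sketched as follows; uniform scaling to fit the screen is an assumption made here for concreteness, since the embodiment only requires that the target object be displayed larger than in the first interface.

```kotlin
// Illustrative sketch of the full-screen behavior: the target object is shown
// larger in the second interface than in the first.
data class Size(val width: Int, val height: Int)

fun enlargeForFullScreen(objectSize: Size, screenSize: Size): Size {
    val scale = minOf(
        screenSize.width.toFloat() / objectSize.width,
        screenSize.height.toFloat() / objectSize.height,
    )
    // Keep the aspect ratio while filling as much of the screen as possible.
    return Size((objectSize.width * scale).toInt(), (objectSize.height * scale).toInt())
}
```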
In a possible implementation manner, the N controls include K sharing controls corresponding to the K second electronic devices one by one; the fourth display unit 205 is specifically configured to:
receiving and responding to clicking operation for a j-th sharing control in the K sharing controls, sending the target object to a j-th second electronic device corresponding to the j-th sharing control, and displaying a second interface; wherein the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
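A hypothetical sketch of the one-to-one mapping between the K sharing controls and the K second electronic devices follows; the device-identifier type and the transport callback are assumptions.

```kotlin
// Hypothetical sketch: clicking the j-th sharing control sends the target
// object to the j-th second electronic device; the display does not change.
class SharingDispatcher(private val deviceIds: List<String>) {
    fun onSharingControlClicked(
        j: Int, // 1 <= j <= K
        targetObject: ByteArray,
        sendToDevice: (deviceId: String, payload: ByteArray) -> Unit,
    ) {
        require(j in 1..deviceIds.size) { "j must satisfy 1 <= j <= K" }
        sendToDevice(deviceIds[j - 1], targetObject)
        // The second interface equals the first interface: nothing else changes.
    }
}
```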
In one possible embodiment, the apparatus further comprises:
The generating unit 206 is configured to receive and respond to a third drag operation for an xth sharing control in the K sharing controls, form a snapshot corresponding to the xth sharing control, and, when detecting that the snapshot corresponding to the xth sharing control is located in a third area where a yth sharing control is located, generate a multi-device sharing control corresponding to the xth second electronic device and the yth second electronic device; the N controls further comprise the multi-device sharing control; x and y are integers greater than or equal to 1 and less than or equal to K.
In a possible implementation manner, the fourth display unit 205 is specifically configured to:
Receiving and responding to clicking operation for the multi-device sharing control, sending the target object to the xth second electronic device and the yth second electronic device corresponding to the multi-device sharing control respectively, and displaying a second interface; wherein the second interface is the same as the first interface.
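As a hypothetical sketch, generating the multi-device sharing control amounts to merging the device lists of the x-th and y-th sharing controls; the data types and the merge policy below are illustrative assumptions.

```kotlin
// Hypothetical sketch of merging the x-th sharing control onto the y-th one
// (1-based indices) into a multi-device sharing control.
data class SharingControl(val deviceIds: List<String>)

fun mergeSharingControls(controls: MutableList<SharingControl>, x: Int, y: Int): SharingControl {
    require(x in 1..controls.size && y in 1..controls.size && x != y)
    // Clicking the merged control later sends the object to every listed device.
    val multiDevice = SharingControl(controls[x - 1].deviceIds + controls[y - 1].deviceIds)
    controls.add(multiDevice) // the N controls now also include this control
    return multiDevice
}
```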
In a possible implementation manner, the third display unit 204 is specifically configured to:
receiving and responding to a second drag operation aiming at the first target point, gradually reducing the size of the target object and gradually increasing the size of the target control; determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; when it is detected that the first target point is located in a second area where the third position is located, changing the color of the target control to a target color; the target color is used for prompting the user that the first target point is located in the second area where the third position is located, so that the user can complete the second drag operation.
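The progressive visual feedback can be sketched as an interpolation over the drag progress; the scale factors and ARGB color values below are arbitrary illustrative choices, not values from this embodiment.

```kotlin
// Illustrative sketch: as the second drag proceeds, the object shrinks, the
// target control grows, and the control switches to a "release now" color
// once the drag point is over it.
data class DragVisualState(
    val objectScale: Float,
    val controlScale: Float,
    val controlColor: Long, // ARGB
)

fun dragVisualState(progress: Float, overTargetControl: Boolean): DragVisualState {
    val t = progress.coerceIn(0f, 1f) // 0 = drag start, 1 = at the target control
    return DragVisualState(
        objectScale = 1f - 0.5f * t,  // gradually reduce the target object
        controlScale = 1f + 0.5f * t, // gradually enlarge the target control
        controlColor = if (overTargetControl) 0xFF1E88E5 else 0xFF9E9E9E,
    )
}
```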
In one possible embodiment, the direction in which the first position points to the third position is different from the direction in which the first position points to the second position.
In one possible implementation, the target object is any one of a window, a file, a business card, and a link.
It should be noted that, for the implementation of each unit in this embodiment of the present application, reference may also be made to the related descriptions of the embodiments shown in fig. 3 to 17d, which are not repeated here.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; the division of units described above is merely a division of logical functions, and there may be other manners of division in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed between components may be an indirect coupling or communication connection via some interfaces, devices, or units, and may be electrical or in other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like, and in particular may be a processor in a computer device) to perform all or part of the steps of the above-mentioned methods according to the embodiments of the present application. The aforementioned storage medium may include various media capable of storing program code, such as a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, and such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (17)

1. A control display method, characterized in that the method is applied to a first electronic device and comprises the following steps:
Displaying a first interface; the first interface comprises a target object, wherein the target object comprises a first target point, and the first target point is positioned at a first position of the first interface;
Receiving and responding to a first drag operation aiming at the first target point on the target object, determining a first area where the first position is located, and determining a second position where the first target point is located on the first interface after the first drag operation;
If the second position is detected to be in the first area and the duration of the first drag operation is longer than a first time threshold, displaying a target control on a third position of the first interface, wherein the direction of the first position pointing to the third position is different from the direction of the first position pointing to the second position;
Receiving and responding to a second drag operation aiming at the first target point, determining that the first target point is positioned at a fourth position of the first interface, and displaying a target frame at the third position;
the fourth position is in a second area where the third position is located, the target frame comprises N controls, and N is an integer greater than or equal to 1;
and receiving and responding to clicking operation for one of the N controls, and displaying a second interface.
2. The method of claim 1, wherein the first electronic device comprises a first display screen and M second display screens, the first interface and the second interface being interfaces displayed on the first display screen; the N controls comprise M switching screen display controls which are in one-to-one correspondence with the M second display screens, and M is an integer greater than or equal to 1; the receiving and responding to the clicking operation for one of the N controls, and displaying a second interface specifically comprises:
Receiving and responding to clicking operation for an ith switching screen display control in the M switching screen display controls, and displaying a second interface, wherein the target object is not included in the second interface; the target object is included in a third interface displayed on an ith second display screen corresponding to the ith switching screen display control; i is an integer greater than or equal to 1 and less than or equal to M.
3. The method of claim 1, wherein the N controls comprise full screen display controls; the receiving and responding to the clicking operation for one of the N controls, and displaying a second interface specifically comprises:
Receiving and responding to clicking operation for the full-screen display control, and displaying a second interface; the second interface comprises the target object, and the size of the target object in the second interface is larger than that of the target object in the first interface.
4. The method of claim 1, wherein the N controls include K sharing controls in one-to-one correspondence with the K second electronic devices; the receiving and responding to the clicking operation for one of the N controls, and displaying a second interface specifically comprises:
receiving and responding to clicking operation for a j-th sharing control in the K sharing controls, sending the target object to a j-th second electronic device corresponding to the j-th sharing control, and displaying a second interface; wherein the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
5. The method according to claim 4, wherein the method further comprises:
Receiving and responding to a third dragging operation aiming at an xth sharing control in the K sharing controls, forming a snapshot corresponding to the xth sharing control, and, when it is detected that the snapshot corresponding to the xth sharing control is located in a third area where a yth sharing control is located, generating a multi-device sharing control corresponding to the xth second electronic device and the yth second electronic device; the N controls further comprise the multi-device sharing control; x and y are integers greater than or equal to 1 and less than or equal to K.
6. The method of claim 5, wherein the receiving and responding to the clicking operation for one of the N controls displays a second interface, specifically comprising:
Receiving and responding to clicking operation for the multi-device sharing control, sending the target object to the xth second electronic device and the yth second electronic device corresponding to the multi-device sharing control respectively, and displaying a second interface; wherein the second interface is the same as the first interface.
7. The method according to any one of claims 1-6, wherein the receiving and responding to the second drag operation for the first target point determines that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position, specifically includes:
receiving and responding to a second drag operation aiming at the first target point, gradually reducing the size of the target object and gradually increasing the size of the target control; determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; when it is detected that the first target point is located in a second area where the third position is located, changing the color of the target control to a target color; the target color is used for prompting the user that the first target point is located in the second area where the third position is located, so that the user can complete the second drag operation.
8. The method of any one of claims 1-6, wherein the target object is any one of a window, a file, a business card, and a link.
9. An electronic device is characterized in that the electronic device is a first electronic device and comprises a first display screen, a memory and one or more processors; the first display screen, the memory, and the one or more processors are coupled, the memory is used for storing computer program codes, the computer program codes comprise computer instructions, and the one or more processors call the computer instructions to cause the electronic device to execute:
Displaying a first interface; the first interface comprises a target object, wherein the target object comprises a first target point, and the first target point is positioned at a first position of the first interface;
Receiving and responding to a first drag operation aiming at the first target point on the target object, determining a first area where the first position is located, and determining a second position where the first target point is located on the first interface after the first drag operation;
If the second position is detected to be in the first area and the duration of the first drag operation is longer than a first time threshold, displaying a target control on a third position of the first interface, wherein the direction of the first position pointing to the third position is different from the direction of the first position pointing to the second position;
Receiving and responding to a second drag operation aiming at the first target point, determining that the first target point is positioned at a fourth position of the first interface, and displaying a target frame at the third position;
the fourth position is in a second area where the third position is located, the target frame comprises N controls, and N is an integer greater than or equal to 1;
and receiving and responding to clicking operation for one of the N controls, and displaying a second interface.
10. The electronic device of claim 9, further comprising M second display screens, wherein the first display screen, the M second display screens, the memory, and the one or more processors are coupled; the first interface and the second interface are interfaces displayed on the first display screen; the N controls comprise M switching screen display controls which are in one-to-one correspondence with the M second display screens, and M is an integer greater than or equal to 1; the one or more processors are also configured to invoke the computer instructions to cause the electronic device to perform:
Receiving and responding to clicking operation for an ith switching screen display control in the M switching screen display controls, and displaying a second interface, wherein the target object is not included in the second interface; the target object is included in a third interface displayed on an ith second display screen corresponding to the ith switching screen display control; i is an integer greater than or equal to 1 and less than or equal to M.
11. The electronic device of claim 9, wherein the N controls comprise full screen display controls; the one or more processors are also configured to invoke the computer instructions to cause the electronic device to perform:
Receiving and responding to clicking operation for the full-screen display control, and displaying a second interface; the second interface comprises the target object, and the size of the target object in the second interface is larger than that of the target object in the first interface.
12. The electronic device of claim 9, wherein the N controls include K sharing controls that are in one-to-one correspondence with the K second electronic devices; the one or more processors are also configured to invoke the computer instructions to cause the electronic device to perform:
receiving and responding to clicking operation for a j-th sharing control in the K sharing controls, sending the target object to a j-th second electronic device corresponding to the j-th sharing control, and displaying a second interface; wherein the second interface is the same as the first interface; j is an integer greater than or equal to 1 and less than or equal to K.
13. The electronic device of claim 12, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
Receiving and responding to a third dragging operation aiming at an xth sharing control in the K sharing controls, forming a snapshot corresponding to the xth sharing control, and, when it is detected that the snapshot corresponding to the xth sharing control is located in a third area where a yth sharing control is located, generating a multi-device sharing control corresponding to the xth second electronic device and the yth second electronic device; the N controls further comprise the multi-device sharing control; x and y are integers greater than or equal to 1 and less than or equal to K.
14. The electronic device of claim 13, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
Receiving and responding to clicking operation for the multi-device sharing control, sending the target object to the xth second electronic device and the yth second electronic device corresponding to the multi-device sharing control respectively, and displaying a second interface; wherein the second interface is the same as the first interface.
15. The electronic device of any one of claims 9-14, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
receiving and responding to a second drag operation aiming at the first target point, gradually reducing the size of the target object and gradually increasing the size of the target control; determining that the first target point is located at a fourth position of the first interface, and displaying a target frame at the third position; when it is detected that the first target point is located in a second area where the third position is located, changing the color of the target control to a target color; the target color is used for prompting the user that the first target point is located in the second area where the third position is located, so that the user can complete the second drag operation.
16. The electronic device of any of claims 9-14, wherein the target object is any of a window, a file, a business card, and a link.
17. A computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-8.
CN202110372085.6A 2021-04-07 2021-04-07 Control display method and related equipment Active CN115185440B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110372085.6A CN115185440B (en) 2021-04-07 2021-04-07 Control display method and related equipment
PCT/CN2022/083215 WO2022213831A1 (en) 2021-04-07 2022-03-26 Control display method and related device

Publications (2)

Publication Number Publication Date
CN115185440A CN115185440A (en) 2022-10-14
CN115185440B (en) 2024-05-10

Family

ID=83512323

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110372085.6A Active CN115185440B (en) 2021-04-07 2021-04-07 Control display method and related equipment

Country Status (2)

Country Link
CN (1) CN115185440B (en)
WO (1) WO2022213831A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116680019A (en) * 2022-10-26 2023-09-01 荣耀终端有限公司 Screen icon moving method, electronic equipment, storage medium and program product

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101452365A (en) * 2007-12-06 2009-06-10 Lg电子株式会社 Terminal and method of controlling the same
CN103324404A (en) * 2012-03-20 2013-09-25 宇龙计算机通信科技(深圳)有限公司 Method for moving icon and communication terminal
CN105653178A (en) * 2015-05-28 2016-06-08 宇龙计算机通信科技(深圳)有限公司 Information sharing method and apparatus
CN108228053A (en) * 2017-12-29 2018-06-29 努比亚技术有限公司 A kind of information sharing method, intelligent terminal and storage medium
CN109684110A (en) * 2018-12-28 2019-04-26 北京小米移动软件有限公司 Multimedia resource sharing method, device and storage medium
CN109725789A (en) * 2018-12-27 2019-05-07 维沃移动通信有限公司 A kind of application icon archiving method and terminal device
WO2020006669A1 (en) * 2018-07-02 2020-01-09 华为技术有限公司 Icon switching method, method for displaying gui, and electronic device
CN111443842A (en) * 2020-03-26 2020-07-24 维沃移动通信有限公司 Method for controlling electronic equipment and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101615102A (en) * 2008-06-26 2009-12-30 鸿富锦精密工业(深圳)有限公司 Input method based on touch-screen
KR20160143135A (en) * 2015-06-04 2016-12-14 엘지전자 주식회사 Mobile terminal and method for controlling the same
JP6114792B2 (en) * 2015-09-16 2017-04-12 Kddi株式会社 User interface device capable of scroll control according to contact degree, image scrolling method, and program

Also Published As

Publication number Publication date
CN115185440A (en) 2022-10-14
WO2022213831A1 (en) 2022-10-13

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant