WO2021031843A1 - Object position adjustment method and electronic device - Google Patents
Object position adjustment method and electronic device
- Publication number
- WO2021031843A1 (PCT/CN2020/106795)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input
- mark
- objects
- preset
- user
- Prior art date
Classifications
- H04M1/72469—User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/0485—Scrolling or panning
- G06F3/0486—Drag-and-drop
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/7243—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
Definitions
- the embodiments of the present disclosure relate to the field of communication technologies, and in particular, to an object position adjustment method and an electronic device.
- the icon management methods in the related art make it difficult for users to manage multiple icons at the same time; organizing icons across desktops is especially hard to operate, and exchanging the positions of multiple desktop icons is cumbersome and time-consuming.
- the embodiments of the present disclosure provide an object position adjustment method and electronic device to solve the problem in the related art that adjusting the position of an object (for example, an application icon) on a display interface is complicated and time-consuming.
- embodiments of the present disclosure provide a method for adjusting the position of an object, including:
- the second object is marked by a second mark.
- the embodiments of the present disclosure also provide an electronic device, including:
- the first receiving module is used to receive the first input of the user;
- the marking module is used to respond to the first input and mark the first object through the first mark;
- the second receiving module is used to receive the second input of the user;
- the exchange module is used to exchange the positions of the first object and the second object in response to the second input.
- an embodiment of the present disclosure also provides an electronic device, including: a memory, a processor, and a computer program stored in the memory and capable of running on the processor.
- when the computer program is executed by the processor, the steps of the above object position adjustment method are implemented.
- an embodiment of the present disclosure also provides an electronic device, including:
- a touch screen, wherein the touch screen includes a touch-sensitive surface and a display screen;
- one or more processors;
- one or more memories;
- one or more sensors;
- embodiments of the present disclosure also provide a computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the object position adjustment method described above are implemented.
- the embodiments of the present disclosure also provide a computer non-transitory storage medium, which stores a computer program; when the computer program is executed by a computing device, the steps of the above object position adjustment method are implemented.
- the embodiments of the present disclosure also provide a computer program product, which when the computer program product runs on a computer, causes the computer to execute the steps of the above object position adjustment method.
- FIG. 1 shows a schematic flowchart of an object position adjustment method according to an embodiment of the present disclosure
- FIG. 2 shows a schematic diagram of a user's operation state entering the mode of adjusting the position of an application icon
- Figure 3 shows one of the schematic diagrams of the user operation state of marking application icons
- Figure 4 shows the second schematic diagram of the user operation state of marking application icons
- FIG. 5 shows a schematic diagram of the user operation state of adjusting the number of application icons
- FIG. 6 shows one of the schematic diagrams of the user operation state of moving an application icon across screens
- FIG. 7 shows the second schematic diagram of the user operation state of moving an application icon across screens
- FIG. 8 shows a schematic diagram of the user operation state of canceling the previous operation
- FIG. 9 shows a schematic diagram of a user operation state after exiting the mode of adjusting the position of an application icon
- FIG. 10 shows one of the schematic diagrams of modules of the electronic device of the embodiment of the present disclosure
- FIG. 11 shows the second schematic diagram of the module of the electronic device of the embodiment of the present disclosure
- FIG. 12 shows the third schematic diagram of the module of the electronic device of the embodiment of the present disclosure
- FIG. 13 shows the fourth module schematic diagram of the electronic device of the embodiment of the present disclosure
- FIG. 14 shows the fifth schematic diagram of the electronic device of the embodiment of the present disclosure.
- FIG. 15 shows the sixth schematic diagram of the electronic device of the embodiment of the present disclosure.
- FIG. 16 shows a schematic diagram of a module of a third marking unit of an electronic device according to an embodiment of the present disclosure
- FIG. 17 shows the seventh module diagram of the electronic device of the embodiment of the present disclosure.
- FIG. 19 shows the ninth schematic diagram of the electronic device of the embodiment of the present disclosure.
- FIG. 20 shows a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present disclosure.
- an embodiment of the present disclosure provides an object position adjustment method, including:
- Step 101: Receive a user's first input;
- the first input may be a click input, a long-press input, a sliding input, etc.; that is, the first input is an input or operation performed by the user to mark the first object on the interface of the electronic device, including a click operation, a long-press operation, a sliding operation, a zoom operation, etc.
- Step 102: In response to the first input, mark the first object through the first mark;
- the electronic device needs to mark the first object to distinguish the objects to be moved from the objects that do not need to be moved; at the same time, in order to easily distinguish the objects when moving, it is necessary to determine the mark number of each object (that is, the specific content of the first mark). The mark numbers are distinguished by different numbers; for example, mark numbers are assigned to the first objects in ascending order.
- Step 103: Receive a second input from the user;
- the second input may be a click input, a long-press input, a sliding input, etc.; that is, the second input is an input or operation performed by the user on the interface of the electronic device to exchange the positions of the first object and the second object,
- including a click operation, a long-press operation, a sliding operation, a zoom operation, etc.
- Step 104: In response to the second input, exchange the positions of the first object and the second object;
- the second object is marked by a second mark.
- the above method can realize the position exchange of different objects on the electronic device interface.
- the object can be an application icon of an application, a folder icon, or a picture displayed on the electronic device interface.
- the cross-desktop movement of application icons is mainly implemented.
- a predetermined operation, which can be a sliding operation, a click operation, or a pressing operation
- the desktop operation reduces the complexity of users' operations.
- the embodiment of the present disclosure can directly perform the operation of adjusting the position of the application icon in the normal mode of the terminal (the normal mode means the mode in which the user can open applications on the desktop). It should be further explained that, in this case, the operation used in the embodiment of the present disclosure should be different from operations in the related art to avoid false triggering; optionally, the electronic device can also enter an adjust-application-icon-position mode before performing the embodiment of the present disclosure
- in this mode, only the position of the application icon can be adjusted, which is different from the normal mode mentioned above
- the user can perform a sliding operation on the desktop to enter the mode of adjusting the position of the application icon, as shown in Figure 2.
- the user slides upwards from the bottom of the screen with three fingers (the arrow direction in Figure 2 indicates the direction of the finger slide) to enter the mode of adjusting the position of the application icon.
- the application icons from left to right are: icon 21 corresponding to application A, icon 22 corresponding to application B, icon 23 corresponding to application C, and icon 24 corresponding to application D.
- the specific implementation of step 102 of the embodiment of the present disclosure is described as follows.
- Manner 1: The number of first objects is N, N is an integer greater than or equal to 1, the first input includes N first sub-inputs, and each first sub-input acts on one first object
- the specific implementation of step 102 is:
- i is a positive integer, and i ≤ N.
- depending on the first input feature, the method of determining the i-th first mark of the i-th first object also differs; when the first input feature is the input order of the i-th first sub-input,
- the specific method for determining the i-th first mark of the i-th first object is:
- the i-th first mark of the i-th first object is determined according to the input order of the i-th first sub-input.
- a specific finger can be designated in advance to mark the first object.
- For example, the index finger is used to mark the first objects:
- when the index finger operates on an application icon once, the application icon is marked as 1;
- when the index finger operates on another application icon once,
- that application icon is marked as 2, and so on. In order to let the user clearly know the specific mark of each application icon, a corresponding mark number is displayed below each application icon.
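The order-based marking described above can be sketched as follows (a minimal illustration with hypothetical names; the disclosure does not prescribe any particular implementation):

```python
def mark_by_input_order(icons, tapped):
    """Assign ascending mark numbers to icons in the order they are tapped.

    `icons` is the set of icon names on the desktop; `tapped` is the
    sequence of icons the user operated on with the marking finger.
    Returns a dict mapping each tapped icon to its mark number.
    """
    marks = {}
    for icon in tapped:
        if icon in icons and icon not in marks:
            marks[icon] = len(marks) + 1  # numbers assigned from small to large
    return marks

# e.g. tapping icon B first and icon D second marks B as 1 and D as 2
print(mark_by_input_order({"A", "B", "C", "D"}, ["B", "D"]))
```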
- when the first input feature is the fingerprint information of the i-th first sub-input,
- the specific method for determining the i-th first mark of the i-th first object is:
- the mark associated with the fingerprint information of the i-th first sub-input is determined as the i-th first mark of the i-th first object.
- a plurality of different fingerprint information is preset, and different fingerprint information corresponds to different first identifiers, and different fingerprint information is used to mark objects.
- the fingerprint information of the index finger corresponds to the mark 1
- the fingerprint information of the middle finger corresponds to the mark 2.
- the terminal obtains that the fingerprint information of the middle finger corresponds to the mark 2.
- the application icon is marked as 2. In order to let users clearly know the marking order, a corresponding mark number is displayed below each application icon.
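The fingerprint-based marking, where each enrolled finger is pre-associated with a fixed mark number, could be sketched as follows (the finger names and mapping are illustrative assumptions, not part of the disclosure):

```python
# Hypothetical preset association: each fingerprint corresponds to one mark,
# so the finger used determines the mark number, not the tap order.
FINGER_MARKS = {"index": 1, "middle": 2, "thumb": 3}

def mark_by_fingerprint(taps):
    """`taps` is a list of (icon, finger) pairs; returns {icon: mark}."""
    return {icon: FINGER_MARKS[finger] for icon, finger in taps}

# touching an icon with the middle finger marks it 2, with the index finger 1
print(mark_by_fingerprint([("A", "middle"), ("B", "index")]))
```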
- in order to facilitate identification of the first objects that the user has marked, a marker frame needs to be displayed on the first object.
- the marker frame is a preset shape, and the preset shape can be a circle, a square, a diamond, a triangle, etc.;
- the preset shape may also be an irregular shape defined by the user.
- the marking frame is used to clearly indicate the marked first object for the user, so that the user can distinguish the marked first object.
- Manner 2: The number of first objects is M, and M is an integer greater than or equal to 1;
- the specific implementation of step 102 is:
- M first objects are marked by M first marks.
- this situation means that the user performs the first input once and can mark at least one object at the same time.
- the number of objects that can be marked can be preset by the user and stored in the electronic device.
- the first input is a touch input of the user in a target area on the target interface;
- the target area does not include the first object and the second object;
- the second input feature includes M touch points. Specifically, according to the second input feature, an implementation manner of marking the M first objects with the M first marks is as follows:
- the preset objects include: objects of preset types or objects of preset positions.
- the M first objects are objects of the same type; that is, in this case, the electronic device marks objects of the same type. For example, if the type of the first object is online shopping, the electronic device starts from the first application icon of that type on the target interface and sequentially marks M application icons of the online-shopping type.
- the M first objects are objects at adjacent arrangement positions or objects at a preset arrangement interval; that is, in this case, the electronic device marks adjacent first objects or first objects arranged according to a preset rule (for example, one application icon is marked every other application icon). For example, the user uses M fingers to perform the first input, and the electronic device, starting from the first application icon on the target interface, marks one application icon every two application icons until M application icons are marked.
- the electronic device determines that the mark number of each marked first object corresponds to its sorting order on the desktop; that is, according to the sorting order from front to back, the marked first objects are numbered sequentially from front to back. Specifically, in order to facilitate the user to view the mark numbers, the mark number corresponding to the first object is displayed on one side of each marked first object.
- the mark numbers are distinguished by different numbers. For example, the mark number of the first object ranked first is 1, the mark number of the first object ranked second is 2, and so on.
- the electronic device marks the top three application icons on the desktop.
- the marked numbers of the marked application icons on the desktop of the electronic device are 1, 2, 3 in order, and the corresponding application icons are in order: icon 21 corresponding to application A, icon 22 corresponding to application B , Application C corresponding icon 23.
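The preset-position marking (adjacent icons, or one icon every preset interval) could be sketched as follows; the function name and `interval` parameter are illustrative assumptions:

```python
def mark_preset_positions(icons, m, interval=0):
    """Mark m icons starting from the first icon on the desktop, taking
    every (interval + 1)-th icon in order. interval=0 marks adjacent
    icons; interval=2 marks one icon every two icons. Mark numbers
    follow the front-to-back sorting order.
    """
    step = interval + 1
    chosen = icons[::step][:m]
    return {icon: n for n, icon in enumerate(chosen, start=1)}

# adjacent marking: the top three icons get mark numbers 1, 2, 3
print(mark_preset_positions(["A", "B", "C", "D"], 3))
# interval marking: one icon every two icons
print(mark_preset_positions(["A", "B", "C", "D", "E", "F", "G"], 3, interval=2))
```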
- the operation mode of marking multiple objects at the same time avoids the individual marking of objects one by one, reduces the interaction time, and meets the needs of users for using electronic devices in more scenarios.
- the mark numbers of some first objects may not meet the adjustment requirements during subsequent adjustment of the objects. Therefore, after the first objects are marked, the user can also exchange the marks of different objects. The specific implementation manner is: receiving a third input performed by the user on a first target object and a second target object among the N first objects;
- the first mark of the first target object is exchanged with the first mark of the second target object.
- the third input may be a click input, a long-press input, a sliding input, etc. Specifically, an optional implementation of receiving the third input performed by the user on the first target object and the second target object among the N first objects is:
- the mark numbers of the two application icons are interchanged, thereby achieving modification of the application icon marks.
- the left image in FIG. 5 shows the state before the mark numbers of the application icons are adjusted.
- the mark number of icon 23 corresponding to application C is 1, and the mark number of icon 22 corresponding to application B is 2; the user presses down icon 23 corresponding to application C and slides it to icon 22 corresponding to application B in the direction of the arrow in the figure.
- the mark numbers of icon 23 corresponding to application C and icon 22 corresponding to application B are then exchanged,
- and the desktop state after the adjustment is complete is as shown in the right diagram of FIG. 5.
- the mark number of icon 23 corresponding to application C is 2, and the mark number of icon 22 corresponding to application B is 1.
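The exchange of mark numbers triggered by the third input can be illustrated with a small sketch (hypothetical names; not the disclosure's implementation):

```python
def swap_marks(marks, a, b):
    """Exchange the mark numbers of icons a and b (the third input)."""
    marks = dict(marks)  # leave the caller's mapping untouched
    marks[a], marks[b] = marks[b], marks[a]
    return marks

# FIG. 5 example: icon C was marked 1 and icon B marked 2; after sliding
# C onto B, their mark numbers are interchanged.
print(swap_marks({"C": 1, "B": 2}, "C", "B"))
```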
- the specific implementation of step 104 is described as follows.
- Manner 1: Mark the second object and exchange the positions at the same time; the specific implementation of step 104 is:
- the second object is marked by a second mark, and when the second mark matches the first mark, the positions of the first object and the second object are exchanged.
- marking the second object and exchanging the positions of the objects are performed at the same time, which can quickly realize the object exchange process.
- the second mark is used to mark the second object;
- the implementation manner is similar to that of marking the first object through the first mark, and will not be repeated here.
- the electronic device performs mark matching. For example, when the first mark is 1, the electronic device exchanges the positions of the object whose first mark is 1 and the object whose second mark is 1.
- for example, the user marks icon A with the middle finger, icon B with the index finger, and icon C with the thumb;
- the mark numbers corresponding to the middle finger, index finger, and thumb are 1, 2, and 3 respectively.
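The mark matching of step 104 can be sketched as follows: every first object is swapped with the second object carrying the same mark number (all names are illustrative assumptions):

```python
def exchange_by_marks(positions, first_marks, second_marks):
    """Swap the positions of every first/second object pair whose
    mark numbers match (step 104, illustrative sketch).

    positions maps object -> position; first_marks / second_marks map
    marked objects to their mark numbers.
    """
    positions = dict(positions)
    second_by_mark = {mark: obj for obj, mark in second_marks.items()}
    for first_obj, mark in first_marks.items():
        second_obj = second_by_mark.get(mark)
        if second_obj is not None:  # only matching marks trigger an exchange
            positions[first_obj], positions[second_obj] = (
                positions[second_obj], positions[first_obj])
    return positions

# first object A (mark 1) trades places with second object X (mark 1);
# B is unmarked and stays put
print(exchange_by_marks({"A": 0, "B": 1, "X": 5}, {"A": 1}, {"X": 1}))
```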
- Manner 2: Mark the second object first, and then exchange the objects according to the user's input;
- the specific implementation of step 104 is:
- marking the second object through the second mark is similar to the way of marking the first object through the first mark, specifically:
- Manner 1: The number of second objects is H, H is an integer greater than or equal to 1, the second input includes H second sub-inputs, and each second sub-input acts on one second object;
- j is a positive integer, and j ≤ H.
- depending on the fourth input feature, the method of determining the j-th second mark of the j-th second object also differs; the fourth input feature may be the input order or the fingerprint information of the j-th second sub-input.
- when the fourth input feature is the input order of the j-th second sub-input, the specific method for determining the j-th second mark of the j-th second object is:
- the j-th second mark of the j-th second object is determined according to the input order of the j-th second sub-input;
- when the fourth input feature is the fingerprint information of the j-th second sub-input, the method for determining the j-th second mark of the j-th second object is specifically:
- the mark associated with the fingerprint information of the j-th second sub-input is determined as the j-th second mark of the j-th second object.
- a plurality of different fingerprint information is preset, and different fingerprint information corresponds to different second identifiers, and different fingerprint information is used to mark objects.
- the fingerprint information of the index finger corresponds to the mark 1
- the fingerprint information of the middle finger corresponds to the mark 2.
- the terminal obtains that the fingerprint information of the middle finger corresponds to the mark 2.
- the application icon is marked as 2. In order to let users clearly know the marking order, a corresponding mark number is displayed below each application icon.
- in order to facilitate identification of the second objects that the user has marked, a marker frame needs to be displayed on the second object.
- the marker frame is a preset shape, and the preset shape can be a circle, a square, a diamond, a triangle, etc.;
- the preset shape may also be an irregular shape defined by the user.
- the marking frame is used to clearly indicate the marked second object for the user, so that the user can distinguish the marked second object.
- K second objects are marked by K second marks.
- this situation means that the user performs a second input once and can mark at least one object at the same time.
- the number of objects that can be marked can be preset by the user and stored in the electronic device.
- the second input is a touch input of the user in a target area on the target interface; the target area does not include the first object and the second object, and the fourth input feature includes K touch points. Specifically, according to the fourth input feature, the implementation of marking K second objects through K second marks is as follows:
- the preset objects include: objects of preset types or objects of preset positions.
- the K second objects are objects of the same type; that is, in this case, the electronic device marks objects of the same type. For example, if the type of the second object is online shopping, the electronic device starts from the first application icon of that type on the target interface and sequentially marks K application icons of the online-shopping type.
- the K second objects are objects at adjacent arrangement positions or objects at a preset arrangement interval; that is, in this case, the electronic device marks adjacent second objects or second objects arranged according to a preset rule (for example, one application icon is marked every other application icon).
- for example, the user uses K fingers to perform the second input, and the electronic device, starting from the first application icon on the target interface, marks K application icons at a time;
- or the electronic device starts from the first application icon on the target interface and marks one application icon every two application icons until K application icons are marked.
- the operation mode of marking multiple objects at the same time avoids the individual marking of objects one by one, reduces the interaction time, and meets the needs of users for using electronic devices in more scenarios.
- the specific implementation method is: receiving a preset input performed by the user on a third target object and a fourth target object among the second objects.
- the preset input may be a click input, a long-press input, a sliding input, etc. Specifically, an optional implementation of receiving the preset input performed by the user on the third target object and the fourth target object among the second objects is:
- the first object is an object on the first interface
- and the second object is an object on the second interface; in order to make the marked objects clear to the user when the first object is exchanged with the second object, before step 104 the method also includes:
- the marked first object is displayed.
- This means that when the first object is exchanged with the second object, the first object to be exchanged is first displayed in a blank position of the interface where the second object is located, so as to remind the user of the objects marked on the first interface;
- this allows the user to clearly know the marked objects on the first interface, which can assist the user to accurately exchange objects.
- one implementation manner of displaying the marked first object is: acquiring a fifth input feature of a fifth input; if the fifth input feature is the same as the input feature used when marking the first object, the marked first object is displayed.
- that is, the input feature of the fifth input should be the same as that used when marking the first object; for example, if the first object was marked with the fingerprint information of a specific finger, the same finger should also be used to perform the fifth input.
- When first objects are marked with different fingerprint information, whichever finger is used to perform the fifth input, the marked first object corresponding to that finger's fingerprint information is displayed.
- another implementation of displaying the marked first object is: acquiring the fingerprint information of the fifth input and the input mode of the fifth input; when the fingerprint information of the fifth input is preset second fingerprint information and the input mode of the fifth input is a preset input mode, displaying the marked first object.
- in this implementation, the fifth input used to display the first object is not related to the input feature used to mark the first object.
- the fingerprint information of the fifth input can be specific fingerprint information; the user can perform the fifth input once to display one first object, or perform the fifth input once to display multiple first objects. It should be noted that this method does not require the user to remember the input feature used when marking the first object, nor to record the correspondence between fingerprint information and that input feature, which can increase the display speed of the first object and thereby shorten the time needed for object exchange.
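The fingerprint-keyed display behavior described above can be sketched as follows. This is an illustrative model only; the names (`PRESET_DISPLAY_FINGERPRINT`, `objects_to_display`) and the data structures are hypothetical and not taken from the disclosure.

```python
# A preset "display" fingerprint reveals marked first objects on the
# second interface: one object per input (in mark-number order), or all
# marked objects at once.

PRESET_DISPLAY_FINGERPRINT = "fp-display"  # assumed preset value

def objects_to_display(fingerprint, marked, shown_count, show_all=False):
    """Return the marked first objects revealed by one fifth input.

    marked: dict mark number -> object name.
    shown_count: how many objects previous inputs already revealed.
    """
    if fingerprint != PRESET_DISPLAY_FINGERPRINT:
        return []  # wrong fingerprint: nothing is displayed
    ordered = [marked[n] for n in sorted(marked)]
    return ordered if show_all else ordered[shown_count:shown_count + 1]

marked = {1: "A", 2: "B", 3: "C"}
first = objects_to_display("fp-display", marked, shown_count=0)
# → ["A"]; a later input with shown_count=1 would reveal ["B"]
everything = objects_to_display("fp-display", marked, 0, show_all=True)
# → ["A", "B", "C"]
```

Note the design choice this models: the user never needs to remember which finger marked which object, only the single preset fingerprint.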
- the left image in Figure 6 represents the first screen desktop of the electronic device
- the right image represents the second screen desktop of the electronic device.
- the user performs sliding operations in the direction of the arrow at a blank position on the second-screen desktop: when the first sliding operation is performed, the icon 23 corresponding to application C moves to a blank position on the second-screen desktop; when the second sliding operation is performed, the icon 22 corresponding to application B moves to a blank position on the second-screen desktop. The application icons moved to the second-screen desktop also carry their mark numbers, and the user can adjust the mark numbers of the application icons on the second-screen desktop.
- the positions of the marked application icons operated by the user on the second-screen desktop and the marked application icons on the first-screen desktop are interchanged. It should be noted that the positions of the application icons are exchanged according to the mark numbers; that is, after the second input is performed, application icons with the same mark number on different desktops exchange positions.
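The mark-number matching rule described above can be sketched as a small routine. The data model (dicts of positions and mark numbers) is assumed for illustration and is not part of the disclosure.

```python
# Icons that share the same mark number on two different desktops
# exchange positions when the second input is performed.

def swap_by_mark(first_desktop, second_desktop, marks_first, marks_second):
    """Exchange icons whose mark numbers match across two desktops.

    first_desktop / second_desktop: dict position -> icon name.
    marks_first / marks_second: dict mark number -> position.
    """
    for number, pos1 in marks_first.items():
        pos2 = marks_second.get(number)
        if pos2 is None:
            continue  # no matching mark on the second desktop
        first_desktop[pos1], second_desktop[pos2] = (
            second_desktop[pos2],
            first_desktop[pos1],
        )

first = {0: "A", 1: "B", 2: "C"}
second = {0: "E", 1: "F", 2: "G"}
swap_by_mark(first, second, {1: 0, 2: 1}, {1: 0, 2: 1})
# icons A<->E and B<->F change places; C and G are untouched
```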
- the left image in Figure 7 represents the first screen desktop of the electronic device
- the right image represents the second screen desktop of the electronic device.
- the user performs a sliding operation in the direction of the arrow on the icon 27 corresponding to application G on the second-screen desktop. Because the icon 27 corresponding to application G has mark number 1, the icon 27 corresponding to application G and the icon 23 corresponding to application C, which has mark number 1 on the first-screen desktop, exchange positions.
- Example 1: the same finger is used to perform multiple operations on both the first-screen desktop and the second-screen desktop to mark the icons one by one; the second objects are marked first, and then the objects are exchanged according to the user's input, without displaying the marked first objects on the second-screen desktop.
- on the first-screen desktop, the user marks icon A, icon B, and icon C with one finger, and the corresponding mark numbers are 1, 2, and 3; on the second-screen desktop, the user marks icon E, icon F, and icon G with the same finger, and the corresponding mark numbers are 1, 2, and 3 respectively; when the user presses and holds the marked icon E on the second-screen desktop and performs a sliding operation, icon E and icon A on the first-screen desktop exchange positions. After the exchange, icon E is displayed at the position of icon A, and icon A is displayed at the position of icon E.
- Example 2: the same finger is used to perform multiple operations on both the first-screen desktop and the second-screen desktop to mark the icons one by one; the second objects are marked first, and then the objects are exchanged according to the user's input, with the marked first objects displayed on the second-screen desktop during the exchange.
- on the first-screen desktop, the user marks icon A, icon B, and icon C with one finger, and the corresponding mark numbers are 1, 2, and 3; on the second-screen desktop, the user marks icon E, icon F, and icon G with the same finger, and the corresponding mark numbers are 1, 2, and 3 respectively; when the user slides a finger at a blank position of the second-screen desktop, an icon from the first-screen desktop is displayed at the blank position; for example, following the order of the mark numbers on the first-screen desktop, each slide of the finger displays one icon. When the user presses the marked icon E on the second-screen desktop and performs a sliding operation, icon E and icon A on the first-screen desktop exchange positions; after the exchange, icon E is displayed at the position of icon A, and icon A is displayed at the position of icon E.
- Example 3: different fingers are used to perform operations on the first-screen desktop and the second-screen desktop to mark the icons one by one; the second objects are marked first, and then the objects are exchanged according to the user's input, without displaying the marked first objects on the second-screen desktop during the swap.
- on the first-screen desktop, the user marks icon A with the middle finger, icon B with the index finger, and icon C with the thumb.
- the mark numbers corresponding to the middle finger, index finger, and thumb are 1, 2, and 3, respectively.
- the mark numbers corresponding to icon A, icon B, and icon C are therefore also 1, 2, and 3 respectively;
- on the second-screen desktop, the user marks icon E with the middle finger, icon F with the index finger, and icon G with the thumb, with the same mark numbers for the middle finger, index finger, and thumb.
- the mark numbers corresponding to icon E, icon F, and icon G are thus also 1, 2, and 3 respectively; when the user presses and holds the marked icon F with the index finger on the second-screen desktop and performs a sliding operation, icon F and icon B on the first-screen desktop exchange positions. After the exchange, icon F is displayed at the position of icon B, and icon B is displayed at the position of icon F.
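The finger-to-mark association in Example 3 can be modeled as a fixed mapping. This is a sketch under stated assumptions: the `FINGER_MARKS` table and function names are illustrative, not from the disclosure.

```python
# Each finger's fingerprint is associated with a fixed mark number, so
# the same finger produces the same mark on both desktops, and matching
# mark numbers identify the swap partners.

FINGER_MARKS = {"middle": 1, "index": 2, "thumb": 3}  # assumed association

def mark_with_finger(marks, finger, icon):
    """Record that `icon` received the mark number tied to `finger`."""
    marks[FINGER_MARKS[finger]] = icon

first_marks, second_marks = {}, {}
mark_with_finger(first_marks, "middle", "A")
mark_with_finger(first_marks, "index", "B")
mark_with_finger(second_marks, "index", "F")

# Pressing icon F with the index finger pairs it with icon B,
# because both carry mark number 2.
partner = first_marks[FINGER_MARKS["index"]]
```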
- Example 4: different fingers are used to perform operations on the first-screen desktop and the second-screen desktop to mark the icons one by one; the second objects are marked first, and then the objects are exchanged according to the user's input, with the marked first objects displayed on the second-screen desktop during the swap.
- on the first-screen desktop, the user marks icon A with the middle finger, icon B with the index finger, and icon C with the thumb.
- the mark numbers corresponding to the middle finger, index finger, and thumb are 1, 2, and 3, respectively.
- the mark numbers corresponding to icon A, icon B, and icon C are therefore also 1, 2, and 3 respectively;
- on the second-screen desktop, the user marks icon E with the middle finger, icon F with the index finger, and icon G with the thumb, with the same mark numbers for the middle finger, index finger, and thumb.
- the mark numbers corresponding to icon E, icon F, and icon G are thus also 1, 2, and 3 respectively; when the user slides with the middle finger at a blank position of the second-screen desktop, icon A of the first-screen desktop is displayed at the blank position, and sliding with the index finger displays icon B of the first-screen desktop at a blank position.
- Example 5: multiple fingers are used to perform one operation on each of the first-screen desktop and the second-screen desktop to mark multiple icons at the same time; the second objects are marked first, and then the objects are exchanged according to the user's input, without displaying the marked first objects on the second-screen desktop.
- on the first-screen desktop, the user slides with three fingers to mark icon A, icon B, and icon C at the same time; the mark numbers of these three icons are determined by their arrangement order on the desktop, namely 1, 2, and 3;
- on the second-screen desktop, the user slides with three fingers to mark icon E, icon F, and icon G at the same time; the mark numbers of these three icons are likewise determined by their arrangement order on the desktop, namely 1, 2, and 3;
- Example 6: multiple fingers are used to perform one operation on each of the first-screen desktop and the second-screen desktop to mark multiple icons at the same time; the second objects are marked first, and then the objects are exchanged according to the user's input, with the marked first objects displayed on the second-screen desktop during the exchange.
- on the first-screen desktop, the user slides with three fingers to mark icon A, icon B, and icon C at the same time; the mark numbers of these three icons are determined by their arrangement order on the desktop, namely 1, 2, and 3;
- on the second-screen desktop, the user slides with three fingers to mark icon E, icon F, and icon G at the same time; the mark numbers of these three icons are likewise determined by their arrangement order on the desktop, namely 1, 2, and 3;
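The simultaneous-marking rule in Examples 5 and 6 assigns mark numbers by the icons' arrangement order rather than by touch order. A minimal sketch, assuming a list that encodes desktop arrangement order (names are illustrative):

```python
# A multi-finger slide selects several icons at once; mark numbers
# 1, 2, 3, ... follow the icons' arrangement order on the desktop.

def mark_in_order(selected_icons, desktop_order):
    """Assign mark numbers to `selected_icons` following `desktop_order`."""
    ordered = sorted(selected_icons, key=desktop_order.index)
    return {icon: number for number, icon in enumerate(ordered, start=1)}

desktop = ["A", "B", "C", "D"]  # arrangement order on the desktop
marks = mark_in_order({"C", "A", "B"}, desktop)
# → {"A": 1, "B": 2, "C": 3}, regardless of the order the fingers landed
```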
- the user can also cancel the last operation performed on the application icons.
- for example, the user has performed an operation swapping the mark numbers of the icon 22 corresponding to application B and the icon 23 corresponding to application C, but no longer wants this exchange. As shown in Figure 8, the user performs a two-finger downward sliding operation in the direction of the arrow at a blank position on the desktop, and the electronic device cancels the last mark-number adjustment operation.
- after the cancellation, the mark number of the icon 22 corresponding to application B is again 2, and the mark number of the icon 23 corresponding to application C is again 1.
- when the user no longer needs to adjust the positions of application icons, the user needs to exit the mode for adjusting application icon positions.
- the user performs a predetermined operation on the desktop of the electronic device to exit the mode for adjusting application icon positions; for example, the user can perform a sliding operation on the desktop to exit the mode.
- as shown in Figure 9, the user slides three fingers along the arrow direction in the figure to the bottom of the screen (the arrow direction in Figure 9 indicates the sliding direction of the fingers) to exit the mode for adjusting application icon positions.
- sliding operations performed with different numbers of fingers can be used to distinguish different operation intentions. For example, when the user performs a sliding operation with three fingers, the electronic device enters or exits the mode for adjusting application icon positions; when the user performs a sliding operation with two fingers, the electronic device undoes the last operation on the application icons.
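The finger-count dispatch described above can be sketched as follows; the state dictionary and function names are assumptions made for illustration, not part of the disclosure.

```python
# A three-finger slide enters/exits the adjustment mode; a two-finger
# slide undoes the last mark-number adjustment (per the example above).

def handle_slide(finger_count, state):
    if finger_count == 3:
        state["adjust_mode"] = not state["adjust_mode"]  # enter or exit
    elif finger_count == 2 and state["undo_stack"]:
        state["undo_stack"].pop()()  # revert the last adjustment
    return state

state = {"adjust_mode": False, "undo_stack": []}
handle_slide(3, state)  # three-finger slide: enter the adjustment mode

marks = {"B": 1, "C": 2}  # mark numbers after an unwanted swap
state["undo_stack"].append(lambda: marks.update({"B": 2, "C": 1}))
handle_slide(2, state)  # two-finger slide: B is back to 2, C back to 1
```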
- the embodiments of the present disclosure do not require the user to manually drag desktop icons one by one, and enable rapid adjustment of application icons across desktops, which is convenient and time-saving and improves the user experience.
- an embodiment of the present disclosure also provides an electronic device, including:
- the first receiving module 1001 is configured to receive the first input of the user
- the marking module 1002 is used to respond to the first input and mark the first object through the first mark;
- the second receiving module 1003 is configured to receive the second input of the user
- the exchange module 1004 is configured to exchange the positions of the first object and the second object in response to the second input.
- the number of the first objects is N, N is an integer greater than or equal to 1, the first input includes N first sub-inputs, and each first sub-input acts on one first object;
- the marking module 1002 includes:
- the first acquiring unit 10021 is configured to acquire the first input feature of the i-th first sub-input
- the first determining unit 10022 is configured to determine the i-th first marker of the i-th first object according to the first input feature
- the first marking unit 10023 is configured to mark the i-th first object through the i-th first marking
- i is a positive integer, and i ⁇ N.
- the first input feature is the input sequence and fingerprint information of the i-th first sub-input;
- the first determining unit 10022 is configured to: when the fingerprint information of the i-th first sub-input is preset first fingerprint information, determine the i-th first mark of the i-th first object according to the input sequence of the i-th first sub-input.
- the first input feature is the fingerprint information of the i-th first sub-input, wherein each time the fingerprint information of the first sub-input is different;
- the first determining unit 10022 is configured to:
- the mark associated with the fingerprint information of the i-th first sub-input is determined as the i-th first mark of the i-th first object.
- the number of the first objects is M, and M is an integer greater than or equal to 1;
- the marking module 1002 includes:
- the second acquiring unit 10024 is configured to acquire the second input feature of the first input
- the second marking unit 10025 is configured to mark M first objects through M first marks according to the second input feature.
- the first input is a touch input of the user in a target area on a target interface, the target area does not include the first object or the second object, and the second input feature includes M touch points.
- the second marking unit 10025 is configured to: starting from a preset object on the target interface, sequentially mark M first objects;
- the preset object includes an object of a preset type or an object at a preset position; when the preset object is an object of a preset type, the M first objects are objects of the same type; when the preset object is an object at a preset position, the M first objects are objects at adjacent arrangement positions or objects at preset arrangement intervals.
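The preset-object marking rule above admits a short sketch. The list-of-tuples data model and the function name are hypothetical, introduced only to make the two cases (same type vs. adjacent positions) concrete.

```python
# Marking M first objects starting from a preset object: either the next
# M objects of the same type, or the M objects at adjacent positions.

def mark_from_preset(objects, preset_index, m, by_type=False):
    """Return the M objects marked starting from the preset object.

    objects: list of (name, type) tuples in desktop arrangement order.
    """
    if by_type:
        wanted = objects[preset_index][1]
        same = [o for o in objects[preset_index:] if o[1] == wanted]
        return same[:m]  # same-type case
    return objects[preset_index:preset_index + m]  # adjacent positions

apps = [("A", "game"), ("B", "tool"), ("C", "game"), ("D", "game")]
adjacent = mark_from_preset(apps, 1, 2)                  # B and C
same_type = mark_from_preset(apps, 0, 2, by_type=True)   # A and C
```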
- optionally, the electronic device further includes:
- the third receiving module 1005 is configured to receive the third input performed by the user on the first target object and the second target object among the N first objects;
- the execution module 1006 is configured to exchange the first identifier of the first target object with the first identifier of the second target object in the first object in response to the third input.
- the third receiving module 1005 is configured to: receive a third input in which the user moves the first target object to a position at least partially overlapping the second target object.
- the switching module 1004 includes:
- the third acquiring unit 10041 is configured to acquire the third input feature of the second input
- the execution unit 10042 is configured to mark a second object with a second mark according to the third input feature, and exchange the positions of the first object and the second object when the second mark matches the first mark.
- the switching module 1004 includes:
- the fourth acquiring unit 10043 is configured to acquire the fourth input feature of the second input
- the third marking unit 10044 is configured to mark the second object through the second marking according to the fourth input feature
- the first receiving unit 10045 is configured to receive the fourth input of the user
- the exchange unit 10046 is configured to respond to the fourth input and exchange the positions of the first object and the second object when the first identifier matches the second identifier.
- the number of the second objects is H, H is an integer greater than or equal to 1, the second input includes H second sub-inputs, and each second sub-input acts on a second object;
- the third marking unit 10044 includes:
- the obtaining subunit 100441 is configured to obtain the fourth input feature of the j-th second sub-input
- the determining subunit 100442 is configured to determine the jth second label of the jth second object according to the fourth input feature
- j is a positive integer, and j ⁇ H.
- the fourth input feature is the input sequence and fingerprint information of the j-th second sub-input;
- the determining subunit 100442 is configured to: when the fingerprint information of the j-th second sub-input is preset second fingerprint information, determine the j-th second mark of the j-th second object according to the input sequence of the j-th second sub-input.
- the fourth input feature is the fingerprint information of the j-th second sub-input, wherein each time the fingerprint information of the second sub-input is different;
- the determining subunit 100442 is configured to:
- the mark associated with the fingerprint information of the j-th second sub-input is determined as the j-th second mark of the j-th second object.
- the number of the second objects is K, and K is an integer greater than or equal to 1;
- the third marking unit 10044 is configured to: mark K second objects with K second marks according to the fourth input feature.
- the second input is a touch input of the user in a target area on the target interface
- the target area does not include the first object and the second object
- the fourth input feature includes K touch points.
- when marking K second objects with K second marks according to the fourth input feature, the third marking unit 10044 is configured to: starting from a preset object on the target interface, sequentially mark K second objects;
- the preset object includes an object of a preset type or an object at a preset position; when the preset object is an object of a preset type, the K second objects are objects of the same type; when the preset object is an object at a preset position, the K second objects are objects at adjacent arrangement positions or objects at preset arrangement intervals.
- the first object is an object on a first interface
- the second object is an object on a second interface
- the electronic device further includes:
- the fourth receiving module 1007 is configured to receive the fifth input of the user in the blank area of the second interface
- the display module 1008 is configured to display the marked first object in response to the fifth input.
- the display module 1008 includes:
- the fifth acquiring unit 10081 is configured to acquire the fifth input feature of the fifth input
- the first display unit 10082 is configured to display the marked first object when the fifth input feature is the same as the input feature of the first input.
- the display module 1008 includes:
- the sixth obtaining unit 10083 is configured to obtain the fingerprint information of the fifth input and the input mode of the fifth input;
- the second display unit 10084 is configured to display the marked first object when the fingerprint information of the fifth input is the preset second fingerprint information and the input mode of the fifth input is the preset input mode.
- the electronic device provided by the embodiment of the present disclosure can implement each process implemented by the electronic device in the method embodiment of FIG. 1, and to avoid repetition, details are not described herein again.
- the electronic device of the embodiment of the present disclosure receives the user's first input; in response to the first input, marks the first object with the first mark; receives the user's second input; and in response to the second input, exchanges the positions of the first object and the second object. In this way, the user does not need to manually drag objects on the desktop one by one, and object positions can be quickly adjusted; in particular, the user does not need to manually drag desktop icons one by one, and application icons can be quickly adjusted across desktops, which is convenient and time-saving.
- FIG. 20 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of the present disclosure.
- the electronic device 200 includes, but is not limited to: a radio frequency unit 2010, a network module 2020, an audio output unit 2030, an input unit 2040, a sensor 2050, a display unit 2060, a user input unit 2070, an interface unit 2080, a memory 2090, a processor 2011, a power supply 2012, and other components.
- those skilled in the art can understand that the electronic device may include more or fewer components than shown in the figure, combine certain components, or arrange the components differently.
- electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, in-vehicle electronic devices, wearable devices, and pedometers.
- the processor 2011 is configured to receive a user's first input through the user input unit 2070; in response to the first input, mark the first object through a first mark; receive the user's second input through the user input unit 2070; respond The second input, exchange the positions of the first object and the second object;
- the second object is marked by a second mark.
- the electronic device of the embodiment of the present disclosure receives the user's first input; in response to the first input, marks the first object with the first mark; receives the user's second input; and in response to the second input, exchanges the positions of the first object and the second object. In this way, the user does not need to manually drag objects on the desktop one by one, and object positions can be quickly adjusted; in particular, the user does not need to manually drag desktop icons one by one, and application icons can be quickly adjusted across desktops, which is convenient and time-saving.
- the radio frequency unit 2010 can be used for receiving and sending signals in the process of sending and receiving information or during a call; specifically, downlink data from a base station is received and then processed by the processor 2011, and uplink data is sent to the base station.
- the radio frequency unit 2010 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
- the radio frequency unit 2010 can also communicate with the network and other devices through a wireless communication system.
- the electronic device provides the user with wireless broadband Internet access through the network module 2020, for example helping the user send and receive e-mails, browse web pages, and access streaming media.
- the audio output unit 2030 may convert the audio data received by the radio frequency unit 2010 or the network module 2020 or stored in the memory 2090 into audio signals and output them as sounds. Moreover, the audio output unit 2030 may also provide audio output related to a specific function performed by the electronic device 200 (for example, call signal reception sound, message reception sound, etc.).
- the audio output unit 2030 includes a speaker, a buzzer, and a receiver.
- the input unit 2040 is used to receive audio or video signals.
- the input unit 2040 may include a graphics processing unit (GPU) 2041 and a microphone 2042; the graphics processor 2041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in the video capture mode or the image capture mode.
- the processed image frame can be displayed on the display unit 2060.
- the image frames processed by the graphics processor 2041 may be stored in the memory 2090 (or other storage medium) or sent via the radio frequency unit 2010 or the network module 2020.
- the microphone 2042 can receive sound, and can process such sound into audio data.
- in the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 2010 and output.
- the electronic device 200 further includes at least one sensor 2050, such as a light sensor, a motion sensor, and other sensors.
- the light sensor includes an ambient light sensor and a proximity sensor.
- the ambient light sensor can adjust the brightness of the display panel 2061 according to the brightness of the ambient light.
- the proximity sensor can turn off the display panel 2061 and/or the backlight when the electronic device 200 is moved close to the ear.
- the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes) and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the electronic device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and vibration-recognition-related functions (such as pedometer and tapping); the sensor 2050 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which will not be repeated here.
- the display unit 2060 is used to display information input by the user or information provided to the user.
- the display unit 2060 may include a display panel 2061, and the display panel 2061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
- the user input unit 2070 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the electronic device.
- the user input unit 2070 includes a touch panel 2071 and other input devices 2072.
- the touch panel 2071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 2071 with a finger, a stylus, or any other suitable object or accessory).
- the touch panel 2071 may include two parts: a touch detection device and a touch controller.
- the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends it to the processor 2011, and receives and executes commands sent by the processor 2011.
- the touch panel 2071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
- the user input unit 2070 may also include other input devices 2072.
- other input devices 2072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
- the touch panel 2071 can be overlaid on the display panel 2061.
- when the touch panel 2071 detects a touch operation on or near it, the touch operation is transmitted to the processor 2011 to determine the type of the touch event, after which the processor 2011 provides a corresponding visual output on the display panel 2061 according to the type of the touch event.
- although the touch panel 2071 and the display panel 2061 are used as two independent components to implement the input and output functions of the electronic device, in some embodiments the touch panel 2071 and the display panel 2061 can be integrated to implement the input and output functions of the electronic device; this is not specifically limited here.
- the interface unit 2080 is an interface for connecting an external device and the electronic device 200.
- the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
- the interface unit 2080 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the electronic device 200, or can be used to transmit data between the electronic device 200 and an external device.
- the memory 2090 can be used to store software programs and various data.
- the memory 2090 may mainly include a program storage area and a data storage area.
- the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book), and the like.
- the memory 2090 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
- the processor 2011 is the control center of the electronic device. It connects the various parts of the entire electronic device by using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 2090 and calling data stored in the memory 2090, thereby monitoring the electronic device as a whole.
- the processor 2011 may include one or more processing units; optionally, the processor 2011 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 2011.
- the electronic device 200 may also include a power supply 2012 (such as a battery) for supplying power to various components.
- the power supply 2012 may be logically connected to the processor 2011 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
- in addition, the electronic device 200 includes some functional modules that are not shown, which will not be repeated here.
- an embodiment of the present disclosure further provides an electronic device, including a processor 2011, a memory 2090, and a computer program stored in the memory 2090 and executable on the processor 2011; when the computer program is executed by the processor 2011, each process of the foregoing object position adjustment method embodiment is implemented, and the same technical effect can be achieved; to avoid repetition, details are not described here again.
- the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, each process of the object position adjustment method embodiment is implemented, and the same technical effect can be achieved; to avoid repetition, details are not described here again.
- the computer-readable storage medium is, for example, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disk.
- the embodiments of the present disclosure also provide an electronic device, including:
- a touch screen wherein the touch screen includes a touch-sensitive surface and a display screen;
- one or more processors;
- one or more memories;
- one or more sensors;
- Embodiments of the present disclosure also provide a computer non-transitory storage medium, the computer non-transitory storage medium stores a computer program, and when the computer program is executed by a computing device, the steps of the aforementioned object position adjustment method are implemented.
- the embodiments of the present disclosure also provide a computer program product, which when the computer program product runs on a computer, causes the computer to execute the steps of the above object position adjustment method.
Claims (28)
- An object position adjustment method, comprising: receiving a first input from a user; in response to the first input, marking a first object with a first mark; receiving a second input from the user; and in response to the second input, exchanging the positions of the first object and a second object; wherein the second object is marked with a second mark.
- The object position adjustment method according to claim 1, wherein the number of first objects is N, N being an integer greater than or equal to 1, the first input comprises N first sub-inputs, and each first sub-input acts on one first object; the marking a first object with a first mark in response to the first input comprises: acquiring a first input feature of the i-th first sub-input; determining an i-th first mark for the i-th first object according to the first input feature; and marking the i-th first object with the i-th first mark; wherein i is a positive integer and i ≤ N.
- The object position adjustment method according to claim 2, wherein the first input feature is the input order and fingerprint information of the i-th first sub-input; the determining an i-th first mark for the i-th first object according to the first input feature comprises: in a case that the fingerprint information of the i-th first sub-input is preset first fingerprint information, determining the i-th first mark of the i-th first object according to the input order of the i-th first sub-input.
- The object position adjustment method according to claim 2, wherein the first input feature is the fingerprint information of the i-th first sub-input, the fingerprint information of each first sub-input being different; the determining an i-th first mark for the i-th first object according to the first input feature comprises: determining the mark associated with the fingerprint information of the i-th first sub-input as the i-th first mark of the i-th first object.
- The object position adjustment method according to claim 1, wherein the number of first objects is M, M being an integer greater than or equal to 1; the marking a first object with a first mark in response to the first input comprises: acquiring a second input feature of the first input; and marking M first objects with M first marks according to the second input feature.
- The object position adjustment method according to claim 5, wherein the first input is a touch input by the user in a target region of a target interface, the target region does not include the first objects or the second object, and the second input feature includes M touch points; the marking M first objects with M first marks according to the second input feature comprises: sequentially marking the M first objects starting from a preset object on the target interface; wherein the preset object includes an object of a preset type or an object at a preset position; in a case that the preset object is an object of a preset type, the M first objects are objects of the same type; and in a case that the preset object is an object at a preset position, the M first objects are objects at adjacent arrangement positions or objects at a preset arrangement interval.
- The object position adjustment method according to claim 2, further comprising, after the marking a first object with a first mark in response to the first input: receiving a third input performed by the user on a first target object and a second target object among the N first objects; and in response to the third input, exchanging the first mark of the first target object with the first mark of the second target object among the first objects.
- The object position adjustment method according to claim 7, wherein the receiving a third input performed by the user on a first target object and a second target object among the N first objects comprises: receiving a third input in which the user moves the first target object to a position at least partially overlapping the second target object.
- The object position adjustment method according to claim 1, wherein the exchanging the positions of the first object and the second object in response to the second input comprises: acquiring a third input feature of the second input; marking a second object with a second mark according to the third input feature; and, in a case that the second mark matches the first mark, exchanging the positions of the first object and the second object.
- The object position adjustment method according to claim 1, wherein the exchanging the positions of the first object and the second object in response to the second input comprises: acquiring a fourth input feature of the second input; marking a second object with a second mark according to the fourth input feature; receiving a fourth input from the user; and, in response to the fourth input, exchanging the positions of the first object and the second object in a case that the first mark and the second mark match.
- The object position adjustment method according to claim 10, wherein the number of second objects is H, H being an integer greater than or equal to 1, the second input comprises H second sub-inputs, and each second sub-input acts on one second object; the marking a second object with a second mark according to the fourth input feature comprises: acquiring a fourth input feature of the j-th second sub-input; determining a j-th second mark for the j-th second object according to the fourth input feature; and marking the j-th second object with the j-th second mark; wherein j is a positive integer and j ≤ H.
- The object position adjustment method according to claim 11, wherein the fourth input feature is the input order and fingerprint information of the j-th second sub-input; the determining a j-th second mark for the j-th second object according to the fourth input feature comprises: in a case that the fingerprint information of the j-th second sub-input is preset second fingerprint information, determining the j-th second mark of the j-th second object according to the input order of the j-th second sub-input.
- The object position adjustment method according to claim 11, wherein the fourth input feature is the fingerprint information of the j-th second sub-input, the fingerprint information of each second sub-input being different; the determining a j-th second mark for the j-th second object according to the fourth input feature comprises: determining the mark associated with the fingerprint information of the j-th second sub-input as the j-th second mark of the j-th second object.
- The object position adjustment method according to claim 10, wherein the number of second objects is K, K being an integer greater than or equal to 1; the marking a second object with a second mark according to the fourth input feature comprises: marking K second objects with K second marks according to the fourth input feature.
- The object position adjustment method according to claim 14, wherein the second input is a touch input by the user in a target region of a target interface, the target region does not include the first object or the second objects, and the fourth input feature includes K touch points; the marking K second objects with K second marks according to the fourth input feature comprises: sequentially marking the K second objects starting from a preset object on the target interface; wherein the preset object includes an object of a preset type or an object at a preset position; in a case that the preset object is an object of a preset type, the K second objects are objects of the same type; and in a case that the preset object is an object at a preset position, the K second objects are objects at adjacent arrangement positions or objects at a preset arrangement interval.
- The object position adjustment method according to claim 1, wherein the first object is an object on a first interface and the second object is an object on a second interface; before the exchanging the positions of the first object and the second object in response to the second input, the method further comprises: receiving a fifth input from the user in a blank region of the second interface; and in response to the fifth input, displaying the marked first object.
- The object position adjustment method according to claim 16, wherein the displaying the marked first object in response to the fifth input comprises: acquiring a fifth input feature of the fifth input; and in a case that the fifth input feature is the same as the input feature of the first input, displaying the marked first object.
- The object position adjustment method according to claim 16, wherein the displaying the marked first object in response to the fifth input comprises: acquiring fingerprint information of the fifth input and an input manner of the fifth input; and in a case that the fingerprint information of the fifth input is preset second fingerprint information and the input manner of the fifth input is a preset input manner, displaying the marked first object.
- An electronic device, comprising: a first receiving module configured to receive a first input from a user; a marking module configured to mark a first object with a first mark in response to the first input; a second receiving module configured to receive a second input from the user; and an exchanging module configured to exchange the positions of the first object and a second object in response to the second input.
- The electronic device according to claim 19, wherein the number of first objects is N, N being an integer greater than or equal to 1, the first input comprises N first sub-inputs, and each first sub-input acts on one first object; the marking module comprises: a first acquiring unit configured to acquire a first input feature of the i-th first sub-input; a first determining unit configured to determine an i-th first mark for the i-th first object according to the first input feature; and a first marking unit configured to mark the i-th first object with the i-th first mark; wherein i is a positive integer and i ≤ N.
- The electronic device according to claim 20, wherein the first input feature is the input order and fingerprint information of the i-th first sub-input; the first determining unit is configured to: in a case that the fingerprint information of the i-th first sub-input is preset first fingerprint information, determine the i-th first mark of the i-th first object according to the input order of the i-th first sub-input.
- The electronic device according to claim 20, wherein the first input feature is the fingerprint information of the i-th first sub-input, the fingerprint information of each first sub-input being different; the first determining unit is configured to determine the mark associated with the fingerprint information of the i-th first sub-input as the i-th first mark of the i-th first object.
- The electronic device according to claim 19, wherein the number of first objects is M, M being an integer greater than or equal to 1; the marking module comprises: a second acquiring unit configured to acquire a second input feature of the first input; and a second marking unit configured to mark M first objects with M first marks according to the second input feature.
- The electronic device according to claim 23, wherein the first input is a touch input by the user in a target region of a target interface, the target region does not include the first objects or the second object, and the second input feature includes M touch points; the second marking unit is configured to sequentially mark the M first objects starting from a preset object on the target interface; wherein the preset object includes an object of a preset type or an object at a preset position; in a case that the preset object is an object of a preset type, the M first objects are objects of the same type; and in a case that the preset object is an object at a preset position, the M first objects are objects at adjacent arrangement positions or objects at a preset arrangement interval.
- The electronic device according to claim 20, further comprising: a third receiving module configured to receive, after the marking module marks the first object with the first mark in response to the first input, a third input performed by the user on a first target object and a second target object among the N first objects; and an executing module configured to exchange, in response to the third input, the first mark of the first target object with the first mark of the second target object among the first objects.
- The electronic device according to claim 19, wherein the exchanging module comprises: a third acquiring unit configured to acquire a third input feature of the second input; and an executing unit configured to mark a second object with a second mark according to the third input feature and, in a case that the second mark matches the first mark, exchange the positions of the first object and the second object.
- The electronic device according to claim 19, wherein the exchanging module comprises: a fourth acquiring unit configured to acquire a fourth input feature of the second input; a third marking unit configured to mark a second object with a second mark according to the fourth input feature; a first receiving unit configured to receive a fourth input from the user; and an exchanging unit configured to exchange, in response to the fourth input, the positions of the first object and the second object in a case that the first mark and the second mark match.
- The electronic device according to claim 19, wherein the first object is an object on a first interface and the second object is an object on a second interface; the electronic device further comprises: a fourth receiving module configured to receive, before the exchanging module exchanges the positions of the first object and the second object in response to the second input, a fifth input from the user in a blank region of the second interface; and a display module configured to display the marked first object in response to the fifth input.
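The mark-and-swap flow recited in claims 1 and 9 (mark a first object with a first mark, mark a second object with a second mark, and exchange the two objects' positions when the marks match) can be sketched as follows. This is an illustrative model only, not the patented implementation: the `ObjectGrid` class and its method names are hypothetical, and a real device would drive these calls from touch and fingerprint events.

```python
# Illustrative model of the claimed mark-and-swap flow (claims 1 and 9).
# All names here are hypothetical and introduced for demonstration.

class ObjectGrid:
    def __init__(self, objects):
        # Positions are modeled as list indices; the objects might be
        # app icons, files, or any other movable interface objects.
        self.objects = list(objects)
        self.first_marks = {}   # first object -> its first mark
        self.second_marks = {}  # second object -> its second mark

    def mark_first(self, obj, mark):
        """First input: mark a first object with a first mark."""
        self.first_marks[obj] = mark

    def mark_second_and_swap(self, obj, mark):
        """Second input: mark a second object with a second mark; when
        that mark matches a first mark, exchange the two positions."""
        self.second_marks[obj] = mark
        for first_obj, first_mark in self.first_marks.items():
            if first_mark == mark and first_obj is not obj:
                i = self.objects.index(first_obj)
                j = self.objects.index(obj)
                # Exchange positions, as recited in claim 1.
                self.objects[i], self.objects[j] = self.objects[j], self.objects[i]
                return True
        return False  # no matching first mark: positions unchanged


grid = ObjectGrid(["A", "B", "C", "D"])
grid.mark_first("A", mark=1)       # first input marks object A
grid.mark_second_and_swap("C", 1)  # matching second mark swaps A and C
print(grid.objects)                # ['C', 'B', 'A', 'D']
```

The matching step mirrors claim 9: the swap is performed only "in a case that the second mark matches the first mark"; a non-matching second mark leaves the layout untouched.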
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20853703.5A EP4016975A4 (en) | 2019-08-16 | 2020-08-04 | OBJECT POSITION ADJUSTMENT METHOD AND ELECTRONIC DEVICE |
KR1020227008786A KR102641922B1 (ko) | 2019-08-16 | 2020-08-04 | Object position adjustment method and electronic device |
JP2022510182A JP7331245B2 (ja) | 2019-08-16 | 2020-08-04 | Object position adjustment method and electronic device |
US17/673,671 US20220171522A1 (en) | 2019-08-16 | 2022-02-16 | Object position adjustment method and electronic device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910757105.4 | 2019-08-16 | ||
CN201910757105.4A CN110536006B (zh) | 2019-08-16 | Object position adjustment method and electronic device |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/673,671 Continuation US20220171522A1 (en) | 2019-08-16 | 2022-02-16 | Object position adjustment method and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021031843A1 true WO2021031843A1 (zh) | 2021-02-25 |
Family
ID=68663347
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/106795 WO2021031843A1 (zh) | 2020-08-04 | Object position adjustment method and electronic device |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220171522A1 (zh) |
EP (1) | EP4016975A4 (zh) |
JP (1) | JP7331245B2 (zh) |
KR (1) | KR102641922B1 (zh) |
CN (1) | CN110536006B (zh) |
WO (1) | WO2021031843A1 (zh) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110536006B (zh) * | 2019-08-16 | 2021-03-02 | 维沃移动通信有限公司 | Object position adjustment method and electronic device |
CN112698762B (zh) * | 2020-12-31 | 2023-04-18 | 维沃移动通信(杭州)有限公司 | Icon display method and device, and electronic device |
CN113311969A (zh) * | 2021-05-27 | 2021-08-27 | 维沃移动通信有限公司 | Icon position adjustment method and device, electronic device and readable storage medium |
CN113274731B (zh) * | 2021-06-01 | 2023-05-16 | 腾讯科技(深圳)有限公司 | Virtual character selection order adjustment method and device, equipment and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103019547A (zh) * | 2012-12-24 | 2013-04-03 | 广东欧珀移动通信有限公司 | Method and system for adjusting application position on a mobile terminal |
WO2015131630A1 (zh) * | 2014-09-17 | 2015-09-11 | 中兴通讯股份有限公司 | Desktop icon replacement method and device |
CN105094527A (zh) * | 2015-06-26 | 2015-11-25 | 小米科技有限责任公司 | Icon exchange method and device |
CN107967086A (zh) * | 2017-10-31 | 2018-04-27 | 维沃移动通信有限公司 | Icon arrangement method and device for a mobile terminal, and mobile terminal |
CN109814772A (zh) * | 2018-12-26 | 2019-05-28 | 维沃移动通信有限公司 | Application icon moving method and terminal device |
CN110536006A (zh) * | 2019-08-16 | 2019-12-03 | 维沃移动通信有限公司 | Object position adjustment method and electronic device |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8619038B2 (en) * | 2007-09-04 | 2013-12-31 | Apple Inc. | Editing interface |
JP5114181B2 (ja) * | 2007-12-18 | 2013-01-09 | 株式会社Ecs | Method for editing a product information selection screen in electronic commerce, electronic commerce system, and editing program for a product information selection screen in electronic commerce |
KR100923755B1 (ko) * | 2009-07-06 | 2009-10-27 | 라오넥스(주) | Multi-touch character input method |
EP2482175A1 (en) * | 2009-09-23 | 2012-08-01 | Dingnan Han | Method and interface for man-machine interaction |
CN102033642B (zh) * | 2009-09-29 | 2012-10-10 | 联想(北京)有限公司 | Gesture recognition method and electronic device |
CN202110524U (zh) * | 2011-06-14 | 2012-01-11 | 上海博泰悦臻电子设备制造有限公司 | Terminal device and icon position exchange apparatus thereof |
CN202133989U (zh) * | 2011-06-14 | 2012-02-01 | 上海博泰悦臻电子设备制造有限公司 | Terminal device and icon position exchange apparatus thereof |
TWI571790B (zh) * | 2011-11-10 | 2017-02-21 | 財團法人資訊工業策進會 | Method and electronic device for changing icon coordinate values according to a sensing signal |
US20140160049A1 (en) * | 2012-12-10 | 2014-06-12 | Samsung Electronics Co., Ltd. | Clipboard function control method and apparatus of electronic device |
US10056006B1 (en) * | 2013-03-14 | 2018-08-21 | Allstate Insurance Company | Pre-license development tool |
US9104309B2 (en) * | 2013-04-25 | 2015-08-11 | Htc Corporation | Pattern swapping method and multi-touch device thereof |
CN104252301A (zh) * | 2013-06-26 | 2014-12-31 | 富泰华工业(深圳)有限公司 | System, method and electronic device for optimizing one-handed operation |
KR20150051278A (ko) * | 2013-11-01 | 2015-05-12 | 삼성전자주식회사 | Object moving method and electronic device implementing the same |
US10521079B2 (en) * | 2014-04-03 | 2019-12-31 | Clarion Co., Ltd. | Vehicle-mounted information device |
JP6377938B2 (ja) | 2014-04-03 | 2018-08-22 | 株式会社 ディー・エヌ・エー | Server and method |
JP6500406B2 (ja) | 2014-12-01 | 2019-04-17 | セイコーエプソン株式会社 | Input/output control device and input/output control program |
CN104731501B (zh) * | 2015-03-25 | 2016-03-23 | 努比亚技术有限公司 | Icon control method and mobile terminal |
JP6532372B2 (ja) | 2015-10-06 | 2019-06-19 | キヤノン株式会社 | Display control device, control method therefor, and program |
CN105426042A (zh) * | 2015-11-05 | 2016-03-23 | 小米科技有限责任公司 | Icon position exchange method and device |
CN105446598A (zh) * | 2015-12-09 | 2016-03-30 | 上海斐讯数据通信技术有限公司 | Icon position switching method and system, and electronic device |
CN106339165B (zh) * | 2016-08-18 | 2019-06-11 | 广州视睿电子科技有限公司 | Object position adjustment method and device |
CN107870705B (zh) * | 2016-09-28 | 2021-12-28 | 珠海金山办公软件有限公司 | Method and device for changing icon position in an application menu |
CN109074171B (zh) * | 2017-05-16 | 2021-03-30 | 华为技术有限公司 | Input method and electronic device |
CN107656677B (zh) * | 2017-06-16 | 2020-01-03 | 平安科技(深圳)有限公司 | Method for adjusting application icon position, storage medium and mobile terminal |
CN109885222B (zh) * | 2019-02-13 | 2021-03-02 | Oppo广东移动通信有限公司 | Icon processing method and apparatus, electronic device and computer-readable medium |
- 2019
  - 2019-08-16 CN CN201910757105.4A patent/CN110536006B/zh active Active
- 2020
  - 2020-08-04 KR KR1020227008786A patent/KR102641922B1/ko active IP Right Grant
  - 2020-08-04 EP EP20853703.5A patent/EP4016975A4/en active Pending
  - 2020-08-04 WO PCT/CN2020/106795 patent/WO2021031843A1/zh unknown
  - 2020-08-04 JP JP2022510182A patent/JP7331245B2/ja active Active
- 2022
  - 2022-02-16 US US17/673,671 patent/US20220171522A1/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103019547A (zh) * | 2012-12-24 | 2013-04-03 | 广东欧珀移动通信有限公司 | Method and system for adjusting application position on a mobile terminal |
WO2015131630A1 (zh) * | 2014-09-17 | 2015-09-11 | 中兴通讯股份有限公司 | Desktop icon replacement method and device |
CN105094527A (zh) * | 2015-06-26 | 2015-11-25 | 小米科技有限责任公司 | Icon exchange method and device |
CN107967086A (zh) * | 2017-10-31 | 2018-04-27 | 维沃移动通信有限公司 | Icon arrangement method and device for a mobile terminal, and mobile terminal |
CN109814772A (zh) * | 2018-12-26 | 2019-05-28 | 维沃移动通信有限公司 | Application icon moving method and terminal device |
CN110536006A (zh) * | 2019-08-16 | 2019-12-03 | 维沃移动通信有限公司 | Object position adjustment method and electronic device |
Also Published As
Publication number | Publication date |
---|---|
EP4016975A1 (en) | 2022-06-22 |
JP2022545202A (ja) | 2022-10-26 |
CN110536006A (zh) | 2019-12-03 |
EP4016975A4 (en) | 2022-10-19 |
KR20220047624A (ko) | 2022-04-18 |
KR102641922B1 (ko) | 2024-02-27 |
CN110536006B (zh) | 2021-03-02 |
JP7331245B2 (ja) | 2023-08-22 |
US20220171522A1 (en) | 2022-06-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021115373A1 (zh) | Application icon position adjustment method and electronic device | |
WO2021031843A1 (zh) | Object position adjustment method and electronic device | |
WO2020258929A1 (zh) | Folder interface switching method and terminal device | |
WO2020011077A1 (zh) | Notification message display method and terminal device | |
WO2021088720A1 (zh) | Information sending method and electronic device | |
WO2020134744A1 (zh) | Icon moving method and mobile terminal | |
CN107943390B (zh) | Text copying method and mobile terminal | |
WO2020156169A1 (zh) | Display control method and terminal device | |
CN111338530B (zh) | Application icon control method and electronic device | |
WO2020238497A1 (zh) | Icon moving method and terminal device | |
WO2020259024A1 (zh) | Icon classification method, mobile terminal and computer-readable storage medium | |
EP3731077A1 (en) | Method for editing text, and mobile device | |
WO2020151512A1 (zh) | Image storage method and terminal device | |
WO2019223492A1 (zh) | Information display method, mobile terminal and computer-readable storage medium | |
WO2021036553A1 (zh) | Icon display method and electronic device | |
WO2020199783A1 (zh) | Interface display method and terminal device | |
CN111610904B (zh) | Icon arrangement method, electronic device and storage medium | |
WO2020038166A1 (zh) | Desktop application operation method and terminal | |
WO2020001604A1 (zh) | Display method and terminal device | |
WO2019184947A1 (zh) | Image viewing method and mobile terminal | |
WO2020215969A1 (zh) | Content input method and terminal device | |
WO2021104232A1 (zh) | Display method and electronic device | |
WO2021082772A1 (zh) | Screenshot method and electronic device | |
WO2020220893A1 (zh) | Screenshot method and mobile terminal | |
WO2020125405A1 (zh) | Terminal device control method and terminal device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 20853703; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2022510182; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 20227008786; Country of ref document: KR; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 2020853703; Country of ref document: EP; Effective date: 20220316 |