CN112148172B - Operation control method and device


Info

Publication number
CN112148172B
CN112148172B (application CN202011051864.8A)
Authority
CN
China
Prior art keywords
input
target
touch
area
touch area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011051864.8A
Other languages
Chinese (zh)
Other versions
CN112148172A (en)
Inventor
李亮 (Li Liang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202011051864.8A priority Critical patent/CN112148172B/en
Publication of CN112148172A publication Critical patent/CN112148172A/en
Application granted granted Critical
Publication of CN112148172B publication Critical patent/CN112148172B/en

Classifications

    All classifications fall under G (Physics) > G06 (Computing; Calculating or Counting) > G06F (Electric Digital Data Processing) > G06F 3/00 (Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements) > G06F 3/01 (Input arrangements or combined input and output arrangements for interaction between user and computer) > G06F 3/048 (Interaction techniques based on graphical user interfaces [GUI]):
    • G06F 3/0481: based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: G06F 3/0481 using icons
    • G06F 3/04842: selection of displayed objects or displayed text elements (under G06F 3/0484, interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range)
    • G06F 3/0488: using a touch-screen or digitiser, e.g. input of commands through traced gestures (under G06F 3/0487, interaction techniques using specific features provided by the input device)

Abstract

The application discloses an operation control method and device, belonging to the field of communication technology, which can solve the problem that one-hand operation of an electronic device is inconvenient. The method comprises the following steps: receiving a first input of a touch body on a target touch area; in response to the first input, displaying a selection box on a first object, wherein the selection box is used for selecting an object on which a target operation is to be performed; and in a case that the touch body leaves the target touch area, controlling a second object to perform the target operation, wherein the second object is the object selected by the selection box when the touch body leaves the target touch area. The target operation comprises any one of the following: displaying the second object in an input box of the interface where the first object is located, and displaying an interface corresponding to the second object. The target touch area is any one of the following: an area of the electronic device's screen that a finger touches when the user performs one-hand operation; a preset operation touch pad; a preset floating area.

Description

Operation control method and device
Technical Field
The application belongs to the technical field of communication, and particularly relates to an operation control method and device.
Background
With the development of communication technology, screens of electronic devices are becoming larger and larger to better meet the demands of users.
However, when a user operates the electronic device with one hand, the large screen may make it difficult for the user's finger to reach the side, top, or bottom areas of the screen. For example, when a user edits information with the left hand and the input mode of the electronic device is the full-keyboard input method mode, the user's finger may not reach the rightmost keys of the full keyboard, so the user may need to switch hands or use both hands, which makes one-hand operation of the electronic device inconvenient.
Disclosure of Invention
The embodiments of the present application aim to provide an operation control method and device, which can solve the problem that one-hand operation of an electronic device is inconvenient.
In order to solve the technical problem, the present application is implemented as follows:
In a first aspect, an embodiment of the present application provides an operation control method, the method comprising: receiving a first input of a touch body on a target touch area; in response to the first input, displaying a selection box on a first object, wherein the selection box is used for selecting an object on which a target operation is to be performed; and in a case that the touch body leaves the target touch area, controlling a second object to perform the target operation, wherein the second object is the object selected by the selection box when the touch body leaves the target touch area. The target operation comprises any one of the following: displaying the second object in an input box of the interface where the first object is located, and displaying an interface corresponding to the second object. The target touch area is any one of the following: an area of the electronic device's screen that a finger touches when the user performs one-hand operation; a preset operation touch pad; a preset floating area.
In a second aspect, an embodiment of the present application provides an operation control apparatus, comprising a receiving module, a display module, and a processing module. The receiving module is configured to receive a first input of a touch body on a target touch area. The display module is configured to display, in response to the first input received by the receiving module, a selection box on a first object, wherein the selection box is used for selecting an object on which a target operation is to be performed. The processing module is configured to control a second object to perform the target operation in a case that the touch body leaves the target touch area, wherein the second object is the object selected by the selection box when the touch body leaves the target touch area. The target operation comprises any one of the following: displaying the second object in an input box of the interface where the first object is located, and displaying an interface corresponding to the second object. The target touch area is any one of the following: an area of the electronic device's screen that a finger touches when the user performs one-hand operation; a preset operation touch pad; a preset floating area.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method as in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method as in the first aspect described above.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method as in the first aspect.
In the embodiments of the present application, a first input of a touch body on a target touch area can be received; in response to the first input, a selection box is displayed on a first object, wherein the selection box is used for selecting an object on which a target operation is to be performed; and in a case that the touch body leaves the target touch area, a second object is controlled to perform the target operation, wherein the second object is the object selected by the selection box when the touch body leaves the target touch area. The target operation comprises any one of the following: displaying the second object in an input box of the interface where the first object is located, and displaying an interface corresponding to the second object. The target touch area is any one of the following: an area of the electronic device's screen that a finger touches when the user performs one-hand operation; a preset operation touch pad; a preset floating area. With this scheme, when a user wants to trigger, with one hand, objects displayed in the side, top, or bottom areas of the screen of the electronic device, the user can trigger display of a selection box on one of those objects through an operation of a touch body (for example, the user's finger) on the target touch area (for example, a preset floating area). Once the touch body has triggered the selection box to select the object the user actually wants, lifting the touch body off the target touch area triggers display of that object in an input box, or display of the interface corresponding to that object. Therefore, objects displayed in areas of the screen that the user cannot reach with one hand, such as the side, top, or bottom areas, can be triggered directly through operations on the target touch area, without switching hands or using both hands, which improves the convenience of one-hand operation.
Drawings
Fig. 1 is a schematic diagram of an operation control method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a target touch area according to an embodiment of the present application;
fig. 3 is one of the schematic diagrams of controlling movement of a selection box according to an embodiment of the present application;
fig. 4 is a second schematic diagram of controlling movement of a selection box according to an embodiment of the present application;
fig. 5 is a second schematic diagram of an operation control method according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an operation control device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 8 is a hardware schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar objects, and are not necessarily used to describe a particular order or sequence. It should be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of the present application can be implemented in orders other than those illustrated or described herein. In addition, "and/or" in the description and claims means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and succeeding objects.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or more advantageous than other embodiments or designs. Rather, the words "exemplary" and "for example" are intended to present relevant concepts in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise specified, "a plurality" means two or more, for example, a plurality of elements means two or more elements, and the like.
The embodiments of the present application provide an operation control method and device, which can receive a first input of a touch body on a target touch area; in response to the first input, display a selection box on a first object, wherein the selection box is used for selecting an object on which a target operation is to be performed; and in a case that the touch body leaves the target touch area, control a second object to perform the target operation, wherein the second object is the object selected by the selection box when the touch body leaves the target touch area. The target operation comprises any one of the following: displaying the second object in an input box of the interface where the first object is located, and displaying an interface corresponding to the second object. The target touch area is any one of the following: an area of the electronic device's screen that a finger touches when the user performs one-hand operation; a preset operation touch pad; a preset floating area. With this scheme, when a user wants to trigger, with one hand, objects displayed in the side, top, or bottom areas of the screen of the electronic device, the user can trigger display of a selection box on one of those objects through an operation of a touch body (for example, the user's finger) on the target touch area (for example, a preset floating area). Once the touch body has triggered the selection box to select the object the user actually wants, lifting the touch body off the target touch area triggers display of that object in an input box, or display of the interface corresponding to that object. Therefore, objects displayed in areas of the screen that the user cannot reach with one hand, such as the side, top, or bottom areas, can be triggered directly through operations on the target touch area, without switching hands or using both hands, which improves the convenience of one-hand operation.
The operation control method, the operation control device, and the electronic device provided in the embodiments of the present application are described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Specifically, in order to more clearly illustrate the operation control method provided in the embodiment of the present application, the following provides two possible application scenarios of the operation control method:
(1) Applied to a scenario in which a user edits information in an input box in the full-keyboard input method mode. For example, when a user edits information in a chat interface, the keyboard of the mobile phone is in the full-keyboard input mode. If the user wants to edit information with the left hand, the user can trigger the mobile phone to display a selection box through an input of a left-hand finger on a target touch area (such as a middle area) of the mobile phone screen, and the selection box is displayed on the rightmost letter on the screen. Then, the user can trigger the selection box to move to the desired letter through operations on the target touch area. Finally, when the user's finger leaves the target touch area, the mobile phone inputs that letter in the input box. In this way, the user can edit the message with one hand without switching hands, which improves the convenience of one-hand input.
(2) Applied to an interface comprising a plurality of function icons of an application program, to trigger display of the interface corresponding to one of the function icons. For example, when a user holds the mobile phone with the right hand, a personal center interface of a shopping application is displayed on the screen, and the function icons "my order", "to be paid", and "to be evaluated" of the personal center interface are displayed at the top of the screen. If the user wants to view the history order interface corresponding to the "my order" function icon, the user can trigger display of a selection box through an input of a right-hand finger on a target touch area (such as a middle area) of the screen, and the selection box is displayed on the "to be paid" function icon at the top of the screen. Then, the user can trigger the selection box to move to the "my order" function icon through operations on the target touch area. Finally, when the user's finger leaves the target touch area, the mobile phone displays the history order interface corresponding to the "my order" function icon, so that the user can view its information. In this way, function icons displayed in areas that one hand cannot reach can be triggered without switching hands, which improves the convenience of one-hand operation.
As shown in fig. 1, an embodiment of the present application provides an operation control method including the following steps S101 to S103.
S101, an operation control device receives a first input of a touch body on a target touch area.
It should be noted that, in the embodiments of the present application, the touch body may be a stylus, a user's finger, or another object that can be used for touch control.
Optionally, the target touch area may be any one of the following: an area of the electronic device's screen that a finger touches when the user performs one-hand operation; a preset operation touch pad; a preset floating area.
For example, suppose the target touch area is an area of the electronic device's screen that a finger touches during one-hand operation. As shown in fig. 2 (a), the area A1 of the screen can be touched by a finger when the user operates with one hand, that is, the area A1 is the target touch area.
For example, suppose the target touch area is a preset floating area. When a user operates the screen of the electronic device with one hand, the user may trigger the electronic device to display the preset floating area. As shown in fig. 2 (b), the electronic device displays the preset floating area A2, that is, the preset floating area A2 is the target touch area.
For example, suppose the target touch area is a preset operation touch pad. When a user operates the screen of the electronic device with one hand, the user may trigger the electronic device to enable the touch function of a retractable pad, that is, the electronic device extends the retractable pad from its body. As shown in fig. 2 (c), the electronic device extends the retractable pad A3, and the entire touch area of the retractable pad A3 is the target touch area.
The display form of the target touch area is not specifically limited in the embodiments of the present application; the display form may include at least one of the following: display size, display shape, display position, and the like. For example, the target touch area is a floating area of a preset size, and the floating area may be circular. It can be understood that the display size may be a preset size or a user-defined size.
Optionally, in the embodiments of the present application, the first input may be a touch input on the target touch area, where the touch input may include at least one of a press input and a slide input. For example, the touch input is a heavy-press input of the user's finger on the target touch area.
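To make the first-input handling concrete, the following is a minimal Kotlin sketch of how a press or slide input inside the target touch area might be classified. It is an illustration only, not the patented implementation; the types (TouchPoint, Area, FirstInput) and the thresholds are hypothetical stand-ins for a real touch framework.

```kotlin
import kotlin.math.hypot

// Hypothetical stand-ins for a real touch framework's event types.
data class TouchPoint(val x: Float, val y: Float, val pressure: Float)
data class Area(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: TouchPoint) = p.x in left..right && p.y in top..bottom
}

enum class FirstInput { PRESS, SLIDE, NONE }

// Classifies a gesture that starts inside the target touch area.
fun classifyFirstInput(
    targetArea: Area,
    down: TouchPoint,              // where the touch body landed
    current: TouchPoint,           // where the touch body is now
    pressThreshold: Float = 0.8f,  // assumed normalized pressure threshold
    slideThreshold: Float = 24f    // assumed movement threshold in pixels
): FirstInput {
    if (!targetArea.contains(down)) return FirstInput.NONE
    val moved = hypot(current.x - down.x, current.y - down.y)
    return when {
        moved >= slideThreshold -> FirstInput.SLIDE          // finger travelled: slide input
        down.pressure >= pressThreshold -> FirstInput.PRESS  // heavy-press input
        else -> FirstInput.NONE
    }
}
```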
In one possible implementation, the electronic device provided in the embodiments of the present application has an operation control function, which is used to enable the operation control method provided in the embodiments of the present application. Before S101, the operation control method may further include: receiving an input from the user; and in response to that input, turning on, by the operation control device, the operation control function of the electronic device. The operation control method of S101 to S103 can then be performed.
In another possible implementation, the electronic device provided in the embodiments of the present application is provided with a target operation mode, which is used to enable the operation control method provided in the embodiments of the present application. Before S101, the operation control method may further include: displaying a floating control; receiving an input on the floating control; and in response to that input, switching, by the operation control device, the electronic device from the normal operation mode to the target operation mode, where the normal operation mode is an operation mode, different from the target operation mode, that responds to normal functions. The operation control method of S101 to S103 can then be performed.
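As an illustration of the floating-control mode switch just described, the following sketch toggles between the two operation modes. OperationMode, FloatingControl, and the callback name are assumptions for illustration, not names from the patent.

```kotlin
enum class OperationMode { NORMAL, TARGET }

// Hypothetical controller backing the displayed floating control.
class FloatingControl(private val onModeChanged: (OperationMode) -> Unit) {
    var mode: OperationMode = OperationMode.NORMAL
        private set

    // Invoked when the user's input on the floating control is received.
    fun onInput() {
        mode = if (mode == OperationMode.NORMAL) OperationMode.TARGET else OperationMode.NORMAL
        onModeChanged(mode)  // e.g., S101 to S103 are only performed in TARGET mode
    }
}
```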
It should be noted that the above two ways of enabling the operation control method are only two possible cases provided by the embodiments of the present application. The electronic device may also distinguish the normal operation mode from the target operation mode by distinguishing the first input from normal inputs. The specific implementation is determined according to the actual situation and is not limited in the embodiments of the present application.
S102, in response to the first input, the operation control device displays a selection box on a first object.
The selection box is used for selecting an object on which the target operation is to be performed.
Optionally, the first object may include at least one of the following: a character, an emoticon image, an icon, a thumbnail, and the like. A character may be a text character, a letter, or a symbol. For example, the first object is an English letter. The specific case is determined according to the actual situation, which is not limited in the embodiments of the present application.
Optionally, the display form of the selection box is not specifically limited in the embodiments of the present application; the display form may include at least one of the following: display size, display shape, and the like. The display size may be a preset size or a user-defined size; the display shape may be rectangular, square, oval, or another possible shape. It can be understood that, in one case, the user can adjust the display size of the selection box at any time according to the object to be selected; alternatively, the electronic device may automatically adjust the display size of the selection box according to a change of the object selected by the selection box.
Optionally, in the embodiments of the present application, the first object may be determined according to the input direction of the first input. If the input direction of the first input is horizontal, the first object is the object displayed at the leftmost or rightmost end, on the screen of the electronic device, of the horizontal direction corresponding to the first input; if the input direction of the first input is vertical, the first object is the object displayed at the topmost or bottommost end of the vertical direction corresponding to the first input.
Illustratively, as shown in fig. 3 (a), the mobile phone displays a chat session interface, which is in the full-keyboard input method mode. When the user operates the mobile phone with the right hand, the user's right-hand finger may press the screen in the area 01 (i.e., the target touch area) and slide to the right in the direction indicated by the arrow 02 (i.e., the first input). After receiving the first input, the mobile phone, in response to the first input and according to its input direction, displays a selection box 03 on the leftmost displayed letter "Q" (i.e., the first object) in the horizontal direction corresponding to the first input.
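The direction-to-object mapping described above can be sketched as follows. This is a hedged illustration under the assumption that on-screen objects are available as a flat list with coordinates; Direction and ScreenObject are hypothetical types, not the patent's data model.

```kotlin
enum class Direction { LEFT, RIGHT, UP, DOWN }

data class ScreenObject(val id: String, val x: Float, val y: Float)

// Picks the object on which the selection box is first displayed,
// according to the input direction of the first input.
fun pickFirstObject(objects: List<ScreenObject>, dir: Direction): ScreenObject? = when (dir) {
    Direction.RIGHT -> objects.minByOrNull { it.x }  // slide right: leftmost object (e.g., "Q")
    Direction.LEFT  -> objects.maxByOrNull { it.x }  // slide left: rightmost object
    Direction.DOWN  -> objects.minByOrNull { it.y }  // slide down: topmost object
    Direction.UP    -> objects.maxByOrNull { it.y }  // slide up: bottommost object
}
```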
Optionally, in addition to determining the first object according to the input direction of the first input as in the foregoing embodiment, the embodiments of the present application may also provide another manner: the display position of the first object is a preset display position, and in response to the first input, the selection box is displayed at the preset display position where the first object is located.
For example, in a case where multiple rows of objects are displayed on the screen, a press of the user's finger on the target touch area triggers display of a selection box on an object in the first row, or a downward slide of the user's finger on the target touch area triggers display of a selection box on an object in the first row.
S103, in a case that the touch body leaves the target touch area, the operation control device controls a second object to perform the target operation.
The second object is the object selected by the selection box when the touch body leaves the target touch area. The target operation may include any one of the following: displaying the second object in an input box of the interface where the first object is located, and displaying an interface corresponding to the second object.
Optionally, the second object may include at least one of the following: a character, an emoticon image, an icon, a thumbnail, and the like. A character may be a text character, a letter, or a symbol. For example, the second object is a "smiley face" emoticon image (the smiley-face emoticon is represented here by text). The specific case is determined according to the actual situation, which is not limited in the embodiments of the present application.
It can be understood that, in a case where the second object is an icon, the icon may be an application icon on the desktop, or a function icon in an application program interface. For example, the second object is a music application icon on the desktop; as another example, the second object is a "two-dimensional" category icon displayed in the classification interface of a reading application.
It should be noted that, in the embodiment of the present application, the second object may be the same as or different from the first object. It will be appreciated that where the second object is different from the first object, the second object is a different object of the same type as the first object.
Illustratively, if the first object is a character, the second object is a character; if the first object is an expression image, the second object is an expression image; if the first object is an icon, the second object is an icon; and if the first object is a picture, the second object is a picture.
Optionally, S103 may specifically be: in a case that the second object is selected by the selection box and the touch body leaves the target touch area, controlling the second object to perform the target operation.
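A minimal sketch of this leave-then-execute condition is given below; the generic controller and the callback names are assumptions for illustration only.

```kotlin
// Fires the target operation only when an object is selected by the
// selection box AND the touch body has left the target touch area.
class SelectionController<T>(private val performTargetOperation: (T) -> Unit) {
    var selected: T? = null  // updated as the selection box moves

    fun onTouchBodyLeftTargetArea() {
        selected?.let(performTargetOperation)
    }
}
```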
It should be noted that, in the embodiment of the present application, the target operation performed on different second objects may be the same or different.
Illustratively, in a case where the second object is a character on an input method keyboard, the target operation is: displaying the character in the input box. In a case where the second object is an emoticon image, the target operation is: displaying the emoticon image in the input box. In a case where the second object is an application icon on the desktop of the mobile phone, the target operation is: displaying the interface corresponding to the application icon, that is, displaying an interface of the application program indicated by the icon. In a case where the second object is a thumbnail in an album, the target operation is: displaying the interface corresponding to the thumbnail, that is, displaying a display interface comprising the picture corresponding to the thumbnail.
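The per-object dispatch just described might look like the following sketch. The sealed hierarchy and the println placeholders are illustrative assumptions; a real device would render the corresponding interface instead of printing.

```kotlin
sealed class SecondObject
data class Character(val value: Char) : SecondObject()
data class EmoticonImage(val name: String) : SecondObject()
data class AppIcon(val appName: String) : SecondObject()
data class Thumbnail(val pictureId: Long) : SecondObject()

// Chooses the target operation according to the kind of second object.
fun performTargetOperation(obj: SecondObject) = when (obj) {
    is Character     -> println("display '${obj.value}' in the input box")
    is EmoticonImage -> println("display emoticon '${obj.name}' in the input box")
    is AppIcon       -> println("display the interface of application '${obj.appName}'")
    is Thumbnail     -> println("display the picture corresponding to thumbnail ${obj.pictureId}")
}
```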
For example, suppose the operation control device is a mobile phone. As shown in fig. 3 (a), the mobile phone displays a chat session interface, which is in the full-keyboard input method mode. When the user operates the mobile phone with the right hand and wants to edit information in the input box, the user's right-hand finger may press the screen in the area 01 (i.e., the target touch area) and slide rightward in the direction indicated by the arrow 02 (i.e., the first input). After receiving the first input, the mobile phone, in response to the first input, displays a selection box 03 on the leftmost displayed letter "Q" (i.e., the first object). If the user wants to enter the letter "Q" in the input box, the user's finger may leave the area 01 at this point. Thereafter, as shown in fig. 3 (b), the mobile phone displays the letter "Q" in the input box. In this example, the second object is the same as the first object.
For example, suppose the operation control device is a mobile phone. As shown in fig. 4 (a), the mobile phone displays a service application interface including a "contact application icon", a "mail application icon", a "voice recording application icon", a "scanning application icon", and a "train ticket application icon". When the user operates the mobile phone with the right hand and wants to order a train ticket, the user's right-hand finger may press the screen in the area 04 (i.e., the target touch area) and slide down in the direction indicated by the arrow 05 (i.e., the first input). After receiving the first input, the mobile phone, in response to the first input, displays a selection box 06 on the topmost "contact application icon" (i.e., the first object). Then, as shown in fig. 4 (b), the user's right-hand finger may continue to slide down in the direction indicated by the arrow 05 in the area 04, triggering the mobile phone to move the selection box 06 to the "train ticket application icon" (i.e., the second object). When the selection box 06 has moved to the "train ticket application icon", the user's finger may leave the area 04. As shown in fig. 4 (c), when the user's finger leaves the area 04, the mobile phone displays the train ticket purchase interface corresponding to the "train ticket application icon".
The embodiments of the present application provide an operation control method. When a user wants to trigger, with one hand, objects displayed in the side, top, or bottom areas of the screen of the electronic device, the user can trigger display of a selection box on one of those objects through an operation of a touch body (for example, the user's finger) on the target touch area (for example, a preset floating area). Once the touch body has triggered the selection box to select the object the user actually wants, lifting the touch body off the target touch area triggers display of that object in an input box, or display of the interface corresponding to that object. Therefore, objects displayed in areas of the screen that the user cannot reach with one hand, such as the side, top, or bottom areas, can be triggered directly through operations on the target touch area, without switching hands or using both hands, which improves the convenience of one-hand operation.
Optionally, after displaying the selection box on the first object, the user may trigger the selection box to select another object. For example, referring to fig. 1, as shown in fig. 5, after S102 and before S103, the operation control method provided in the embodiment of the present application may further include S104 and S105 described below.
S104, the operation control device receives a second input of the touch body on the target touch area.
Optionally, in the embodiments of the present application, the second input may be a touch input on the target touch area. For example, the touch input may be a slide input of the user's finger on the target touch area. In addition, the second input may further include an input parameter, namely the input direction of the second input.
S105, in response to the second input, the operation control device controls the selection box to move to the second object.
Alternatively, S105 may be specifically implemented by S105A described below.
S105A, in response to the second input, controlling the selection box to move to the second object in a target direction.
Wherein the target direction is determined according to an input direction of the second input.
For example, if the input direction of the second input is leftward, the target direction is leftward, that is, the selection box moves leftward to the second object; if the input direction is rightward, the target direction is rightward, that is, the selection box moves rightward to the second object; if the input direction is upward, the target direction is upward, that is, the selection box moves upward to the second object; if the input direction is downward, the target direction is downward, that is, the selection box moves downward to the second object.
It should be noted that the selection box can move from the first object to the second object in at least one of the leftward, rightward, upward, or downward directions. This is not limited in the embodiments of the present application and is determined according to the actual situation.
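The direction-driven movement can be sketched as index arithmetic over display positions laid out row-major in a grid; the grid layout is an assumption for illustration, since the patent does not prescribe one.

```kotlin
enum class MoveDirection { LEFT, RIGHT, UP, DOWN }

// Moves the selection box one display position in the target direction,
// clamped to the valid range of positions.
fun moveOnce(currentIndex: Int, dir: MoveDirection, rowWidth: Int, total: Int): Int {
    val delta = when (dir) {
        MoveDirection.LEFT  -> -1
        MoveDirection.RIGHT -> +1
        MoveDirection.UP    -> -rowWidth  // one row up in a row-major grid
        MoveDirection.DOWN  -> +rowWidth  // one row down
    }
    return (currentIndex + delta).coerceIn(0, total - 1)
}
```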
Optionally, in a case where a plurality of objects including the first object and the second object are displayed on the screen: if the second input includes one sub-input, the selection box moves once, in response to the second input, to reach the second object, that is, the display position of the second object is adjacent to that of the first object; if the second input includes a plurality of sub-inputs, the selection box moves a plurality of times, in response to the second input, to reach the second object, that is, the display positions of the two objects are not adjacent. It can be understood that, in response to one sub-input, the selection box moves one position.
Optionally, the second input includes M sub-inputs, where one sub-input is used to trigger the selection box to move once, and M is a positive integer. "Controlling the selection box to move to the second object" may specifically include: controlling the selection box to move M times until the selection box moves to the second object, where the selection box moves one display position at a time, and one display position is used to display one object.
Optionally, a display position is the position occupied by a displayed object. For example, the first object is displayed in one display position, and the second object is displayed in another display position.
It can be understood that, when the second input includes M sub-inputs, since one sub-input triggers the selection box to move once and the selection box moves one display position at a time, the plurality of sub-inputs ensures that the selection box lands accurately on the object the user wants to select. This prevents a misoperation from making the selection box skip past the intended object, which would otherwise force the user to operate the target touch area repeatedly to bring the selection box back to the object the user wants to select.
Illustratively, reference is again made to fig. 4. As shown in fig. 4 (a), the mobile phone displays a selection box 06 on the topmost "contact application icon" (i.e., the first object) on the screen. If the user wants the selection box to move to the "train ticket application icon", the user's right-hand finger may continue to slide down in the direction indicated by the arrow 05 (i.e., the second input) in the area 04. After receiving the second input, the mobile phone, in response to the second input, moves the selection box 06 to the "train ticket application icon" (i.e., the second object), as shown in fig. 4 (b).
It should be noted that the above embodiment takes as an example the case where the input parameter of the second input includes only the input direction, so that when the second input includes M sub-inputs, one sub-input triggers the selection box to move once, by one display position. The embodiments of the present application may also provide another implementation: the input parameters of the second input may further include an input step length. In a case that the second input includes M sub-inputs and the input step length of a sub-input is one preset step length, the sub-input triggers the selection box to move one display position; in a case that the input step length of a sub-input is N preset step lengths, the sub-input triggers the selection box to move N display positions, where N is a positive integer greater than 1.
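A sketch of the step-length variant described in the preceding paragraph follows: each sub-input moves the selection box a number of display positions proportional to how many preset step lengths its input step length spans. The rounding rule is an assumption for illustration.

```kotlin
// Total display positions moved by M sub-inputs, where a sub-input whose
// input step length equals N preset step lengths moves N display positions.
fun totalDisplayPositionsMoved(subInputStepLengths: List<Float>, presetStepLength: Float): Int =
    subInputStepLengths.sumOf { stepLength ->
        (stepLength / presetStepLength).toInt().coerceAtLeast(1)  // at least one position per sub-input
    }
```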
According to the operation control method provided in the embodiments of the present application, in a case that one input on the target touch area triggers display of the selection box on the first object, another input on the target touch area can trigger the selection box to move to the second object. The user can thus autonomously control the selection box to be displayed on different objects according to actual needs; in other words, any area of the screen of the electronic device can be operated freely through inputs on the target touch area.
It should be noted that, in the operation control method provided in the embodiments of the present application, the execution subject may be an operation control device, or a control module in the operation control device for executing the operation control method. In the embodiments of the present application, an operation control device executing the operation control method is taken as an example to describe the operation control device provided in the embodiments of the present application.
As shown in fig. 6, an operation control apparatus 200 according to an embodiment of the present application may include a receiving module 201, a display module 202, and a processing module 203. The receiving module 201 may be configured to receive a first input of a touch body on a target touch area. The display module 202 may be configured to display, in response to the first input received by the receiving module 201, a selection box on a first object, where the selection box is used to select an object on which a target operation is to be performed. The processing module 203 may be configured to control a second object to perform the target operation in a case that the touch body leaves the target touch area, where the second object is the object selected by the selection box when the touch body leaves the target touch area. The target operation comprises any one of the following: displaying the second object in an input box of the interface where the first object is located, and displaying an interface corresponding to the second object. The target touch area is any one of the following: an area of the electronic device's screen that a finger touches when the user performs one-hand operation; a preset operation touch pad; a preset floating area.
Optionally, the receiving module 201 may be further configured to receive a second input of the touch body on the target touch area before the processing module 203 controls the second object to perform the target operation. The processing module 203 may be further configured to control the selection box to move to the second object in response to the second input received by the receiving module 201.
Optionally, the processing module 203 may be specifically configured to control, in response to the second input received by the receiving module 201, the selection box to move to the second object in a target direction, where the target direction is determined according to the input direction of the second input.
Optionally, the second input includes M sub-inputs, where one sub-input is used to trigger the selection box to move once, and M is a positive integer. The processing module 203 may be specifically configured to control the selection box to move M times until the selection box moves to the second object, where the selection box moves one display position at a time, and one display position is used to display one object.
Optionally, the first object is determined according to the input direction of the first input. If the input direction of the first input is horizontal, the first object is the object displayed at the leftmost or rightmost end, on the screen of the electronic device, of the horizontal direction corresponding to the first input; if the input direction of the first input is vertical, the first object is the object displayed at the topmost or bottommost end of the vertical direction corresponding to the first input.
The embodiments of the present application provide an operation control device. When a user wants to trigger, with one hand, objects displayed in the side, top, or bottom areas of the screen of the electronic device, the user can trigger display of a selection box on one of those objects through an operation of a touch body (for example, the user's finger) on the target touch area (for example, a preset floating area). Once the touch body has triggered the selection box to select the object the user actually wants, lifting the touch body off the target touch area triggers display of that object in an input box, or display of the interface corresponding to that object. Therefore, objects displayed in areas of the screen that the user cannot reach with one hand, such as the side, top, or bottom areas, can be triggered directly through operations on the target touch area, without switching hands or using both hands, which improves the convenience of one-hand operation.
The operation control device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiment of the present application is not particularly limited.
The operation control device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The operation control device provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to fig. 5, and is not described here again to avoid repetition.
Optionally, as shown in fig. 7, an electronic device 300 is further provided in this embodiment of the present application, and includes a processor 301, a memory 302, and a program or an instruction stored in the memory 302 and capable of being executed on the processor 301, where the program or the instruction is executed by the processor 301 to implement each process of the operation control method in the embodiment, and can achieve the same technical effect, and no further description is provided here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 8 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 400 includes, but is not limited to: radio unit 401, network module 402, audio output unit 403, input unit 404, sensor 405, display unit 406, user input unit 407, interface unit 408, memory 409, and processor 410.
Those skilled in the art will appreciate that the electronic device 400 may further comprise a power supply (e.g., a battery) 411 for supplying power to various components, and the power supply may be logically connected to the processor 410 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 8 does not constitute a limitation to the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
The user input unit 407 may be configured to receive a first input of a touch body on a target touch area. The display unit 406 may be configured to display, in response to the first input received by the user input unit 407, a selection box on a first object, where the selection box is used to select an object on which a target operation is to be performed. The processor 410 may be configured to control a second object to perform the target operation in a case that the touch body leaves the target touch area, where the second object is the object selected by the selection box when the touch body leaves the target touch area. The target operation comprises any one of the following: displaying the second object in an input box of the interface where the first object is located, and displaying an interface corresponding to the second object. The target touch area is any one of the following: an area of the electronic device's screen that a finger touches when the user performs one-hand operation; a preset operation touch pad; a preset floating area.
Optionally, the user input unit 407 may be further configured to receive a second input of the touch body on the target touch area before the processor 410 controls the second object to perform the target operation. The processor 410 may be further configured to control the selection box to move to the second object in response to the second input received by the user input unit 407.
Optionally, the processor 410 may be specifically configured to control the selection box to move to the second object according to a target direction in response to a second input received by the user input unit 407, where the target direction is determined according to an input direction of the second input.
Optionally, the second input includes M sub-inputs, where one sub-input is used to trigger the selection box to move once, and M is a positive integer. The processor 410 may be specifically configured to control the selection box to move M times until the selection box moves to the second object, where the selection box moves one display position at a time, and one display position is used to display one object.
The embodiments of the present application provide an electronic device. When a user wants to trigger, with one hand, objects displayed in the side, top, or bottom areas of the screen of the electronic device, the user can trigger display of a selection box on one of those objects through an operation of a touch body (for example, the user's finger) on the target touch area (for example, a preset floating area). Once the touch body has triggered the selection box to select the object the user actually wants, lifting the touch body off the target touch area triggers display of that object in an input box, or display of the interface corresponding to that object. Therefore, objects displayed in areas of the screen that the user cannot reach with one hand, such as the side, top, or bottom areas, can be triggered directly through operations on the target touch area, without switching hands or using both hands, which improves the convenience of one-hand operation.
It should be understood that, in the embodiment of the present application, the input unit 404 may include a Graphics Processing Unit (GPU) 4041 and a microphone 4042, and the graphics processor 4041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 406 may include a display panel 4061, and the display panel 4061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 407 includes a touch panel 4071 and other input devices 4072. A touch panel 4071, also referred to as a touch screen. The touch panel 4071 may include two parts, a touch detection device and a touch controller. Other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 409 may be used to store software programs as well as various data including, but not limited to, application programs and an operating system. The processor 410 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 410.
The embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the program or the instruction implements the processes of the foregoing operation control method embodiment, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device in the above embodiments. The readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the operation control method embodiment, and the same technical effect can be achieved.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may also include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware alone, although in many cases the former is the better implementation. Based on this understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods of the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An operation control method, characterized in that the method comprises:
receiving a first input of a touch body on a target touch area;
displaying a selection box on a first object in response to the first input, wherein the selection box is used for selecting an object on which a target operation is to be performed, and the first object is determined according to the input direction of the first input;
in a case that the touch body leaves the target touch area, controlling a second object to execute the target operation, wherein the second object is the object selected by the selection box at the time the touch body leaves the target touch area;
wherein the target operation comprises any one of the following: displaying the second object in an input box in the interface where the first object is located; and displaying an interface corresponding to the second object;
wherein the target touch area is any one of the following: an area touched by a finger when a user performs a one-handed operation on a screen of the electronic device; a preset operation touchpad; or a preset floating area.
2. The method of claim 1, wherein, before the controlling the second object to execute the target operation, the method further comprises:
receiving a second input of the touch body on the target touch area;
controlling the selection box to move to the second object in response to the second input.
3. The method of claim 2, wherein the controlling the selection box to move to the second object in response to the second input comprises:
controlling, in response to the second input, the selection box to move to the second object in a target direction, wherein the target direction is determined according to the input direction of the second input.
4. The method according to claim 2 or 3, wherein the second input comprises M sub-inputs, each sub-input being used to trigger the selection box to move once, M being a positive integer;
the controlling the selection box to move to the second object comprises:
controlling the selection box to move M times until the selection box moves to the second object;
wherein the selection box moves one display position at a time, and one display position is used for displaying one object.
5. The method of claim 1, wherein the first object is determined according to the input direction of the first input, wherein:
the input direction of the first input is a horizontal direction, and the first object is the object displayed at the leftmost or rightmost end, in the horizontal direction corresponding to the first input, on the screen of the electronic device; or,
the input direction of the first input is a vertical direction, and the first object is the object displayed at the topmost or bottommost end, in the vertical direction corresponding to the first input, on the screen of the electronic device.
6. An operation control apparatus, characterized in that the apparatus comprises a receiving module, a display module, and a processing module;
the receiving module is configured to receive a first input of a touch body on a target touch area;
the display module is configured to display, in response to the first input received by the receiving module, a selection box on a first object, wherein the selection box is used for selecting an object on which a target operation is to be performed, and the first object is determined according to the input direction of the first input;
the processing module is configured to control a second object to execute the target operation in a case that the touch body leaves the target touch area, wherein the second object is the object selected by the selection box at the time the touch body leaves the target touch area;
wherein the target operation comprises any one of the following: displaying the second object in an input box in the interface where the first object is located; and displaying an interface corresponding to the second object;
the target touch area is any one of the following: an area touched by a finger when a user performs a one-handed operation on a screen of the electronic device; a preset operation touchpad; or a preset floating area.
7. The apparatus according to claim 6, wherein the receiving module is further configured to receive a second input of the touch body on the target touch area before the processing module controls the second object to execute the target operation;
the processing module is further configured to control the selection box to move to the second object in response to the second input received by the receiving module.
8. The apparatus according to claim 7, wherein the processing module is specifically configured to control, in response to the second input received by the receiving module, the selection box to move to the second object in a target direction, wherein the target direction is determined according to the input direction of the second input.
9. The apparatus according to claim 7 or 8, wherein the second input comprises M sub-inputs, each sub-input being used to trigger the selection box to move once, M being a positive integer;
the processing module is specifically configured to control the selection box to move M times until the selection box moves to the second object;
wherein the selection box moves one display position at a time, and one display position is used for displaying one object.
10. The apparatus of claim 6, wherein the first object is determined according to the input direction of the first input, wherein:
the input direction of the first input is a horizontal direction, and the first object is the object displayed at the leftmost or rightmost end, in the horizontal direction corresponding to the first input, on the screen of the electronic device; or,
the input direction of the first input is a vertical direction, and the first object is the object displayed at the topmost or bottommost end, in the vertical direction corresponding to the first input, on the screen of the electronic device.
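For readers unfamiliar with claim language, the following is a minimal Kotlin sketch of the lifecycle in claims 1 and 6: a first input on the target touch area displays the selection box on a first object, and the target operation is executed on the selected object the moment the touch body leaves the area. All identifiers (SelectionSession, TargetOperation, execute, and so on) are hypothetical names introduced for illustration only, not names from the disclosure.

// Hypothetical sketch of claims 1 and 6; not the patent's implementation.
enum class TargetOperation { SHOW_IN_INPUT_BOX, OPEN_INTERFACE }

class SelectionSession(
    private val objects: List<String>,                       // objects displayed on screen
    private val execute: (TargetOperation, String) -> Unit   // performs the target operation
) {
    var selected: Int = -1      // index framed by the selection box; -1 = box hidden
        private set

    // First input on the target touch area: display the selection box on a
    // first object (the index is chosen elsewhere from the input direction).
    fun onFirstInput(firstObjectIndex: Int) {
        selected = firstObjectIndex
    }

    // Touch body leaves the target touch area: the currently framed object
    // becomes the second object and the target operation runs on it.
    fun onTouchBodyLeft(op: TargetOperation) {
        if (selected in objects.indices) execute(op, objects[selected])
        selected = -1
    }
}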
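Claims 2 to 4 (and 7 to 9 on the apparatus side) describe a second input made of M sub-inputs, each moving the selection box exactly one display position in a direction derived from that sub-input. A minimal sketch, again with hypothetical names, where +1 and -1 stand for the two possible input directions of a sub-input:

// Hypothetical sketch of claims 4 and 9; clamps the box to valid positions.
fun moveSelectionBox(current: Int, subInputs: List<Int>, objectCount: Int): Int {
    var position = current
    for (step in subInputs) {                        // M = subInputs.size moves
        position = (position + step).coerceIn(0, objectCount - 1)
    }
    return position
}

// Usage: three sub-inputs toward the right move the box three display positions.
// val second = moveSelectionBox(current = 0, subInputs = listOf(1, 1, 1), objectCount = 9)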
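Claims 5 and 10 pin down how the first object is chosen: a horizontal first input selects the object at the leftmost or rightmost end of the corresponding row, and a vertical first input selects the object at the topmost or bottommost end of the corresponding column. A minimal sketch under the assumption that on-screen objects can be modeled by their coordinates; the ScreenObject type and parameter names are inventions of this example:

// Hypothetical sketch of claims 5 and 10.
data class ScreenObject(val id: String, val x: Int, val y: Int)

// horizontal = true: the first input moved along the horizontal axis.
// towardMin = true: pick the leftmost/topmost end; false: rightmost/bottommost.
fun firstObject(objects: List<ScreenObject>, horizontal: Boolean, towardMin: Boolean): ScreenObject {
    require(objects.isNotEmpty()) { "no objects displayed" }
    return if (horizontal) {
        if (towardMin) objects.minByOrNull { it.x }!! else objects.maxByOrNull { it.x }!!
    } else {
        if (towardMin) objects.minByOrNull { it.y }!! else objects.maxByOrNull { it.y }!!
    }
}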
CN202011051864.8A 2020-09-29 2020-09-29 Operation control method and device Active CN112148172B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011051864.8A CN112148172B (en) 2020-09-29 2020-09-29 Operation control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011051864.8A CN112148172B (en) 2020-09-29 2020-09-29 Operation control method and device

Publications (2)

Publication Number Publication Date
CN112148172A CN112148172A (en) 2020-12-29
CN112148172B true CN112148172B (en) 2022-11-11

Family ID: 73895293

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011051864.8A Active CN112148172B (en) 2020-09-29 2020-09-29 Operation control method and device

Country Status (1)

Country Link
CN (1) CN112148172B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112764626A (en) * 2021-01-27 2021-05-07 维沃移动通信有限公司 Touch response method and device and electronic equipment

Citations (1)

Publication number Priority date Publication date Assignee Title
CN110147197A (en) * 2019-04-08 2019-08-20 努比亚技术有限公司 A kind of operation recognition methods, device and computer readable storage medium

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
CN102478994A (en) * 2010-11-29 2012-05-30 国际商业机器公司 Method and device for operating device with interactive screen and mobile device
CN102830917A (en) * 2012-08-02 2012-12-19 上海华勤通讯技术有限公司 Mobile terminal and touch control establishing method thereof
CN102937873B (en) * 2012-10-12 2015-09-16 天津三星通信技术研究有限公司 The method and apparatus of input through keyboard is carried out in portable terminal
CN103412725B (en) * 2013-08-27 2016-07-06 广州市动景计算机科技有限公司 A kind of touch operation method and device
CN103902158B (en) * 2014-03-18 2017-08-25 深圳市艾优尼科技有限公司 One kind management application program image target method and terminal
CN107092410A (en) * 2016-02-24 2017-08-25 口碑控股有限公司 Interface alternation method, equipment and the intelligent terminal of a kind of touch-screen
CN110069178B (en) * 2019-03-14 2021-04-02 维沃移动通信有限公司 Interface control method and terminal equipment
CN110471587A (en) * 2019-07-17 2019-11-19 深圳传音控股股份有限公司 Exchange method, interactive device, terminal and computer readable storage medium

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN110147197A (en) * 2019-04-08 2019-08-20 努比亚技术有限公司 A kind of operation recognition methods, device and computer readable storage medium

Also Published As

Publication number Publication date
CN112148172A (en) 2020-12-29

Similar Documents

Publication Publication Date Title
US10209877B2 (en) Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor
CN106775313B (en) Split screen operation control method and mobile terminal
US10228835B2 (en) Method for displaying information, and terminal equipment
EP3037948B1 (en) Portable electronic device and method of controlling display of selectable elements
US20140123049A1 (en) Keyboard with gesture-redundant keys removed
JP2010079441A (en) Mobile terminal, software keyboard display method, and software keyboard display program
CN112099707A (en) Display method and device and electronic equipment
CN112433693B (en) Split screen display method and device and electronic equipment
WO2023045837A1 (en) Desktop editing method and electronic device
CN112558831A (en) Desktop sorting method and device and electronic equipment
CN112817376A (en) Information display method and device, electronic equipment and storage medium
CN112764613A (en) Icon arranging method and device, electronic equipment and readable storage medium
CN112148172B (en) Operation control method and device
CN113783995A (en) Display control method, display control device, electronic apparatus, and medium
CN112269501A (en) Icon moving method and device and electronic equipment
CN111638828A (en) Interface display method and device
EP2741194A1 (en) Scroll jump interface for touchscreen input/output device
WO2023078348A1 (en) Application display method and apparatus, and electronic device
CN103502921A (en) Text indicator method and electronic device
CN112162689B (en) Input method and device and electronic equipment
CN113515216A (en) Application program switching method and device and electronic equipment
CN114356113A (en) Input method and input device
CN113778277A (en) Application icon management method and device and electronic equipment
CN113885748A (en) Object switching method and device, electronic equipment and readable storage medium
CN113436297A (en) Picture processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant