CN107291235B - Control method and device - Google Patents
- Publication number
- CN107291235B CN107291235B CN201710480689.6A CN201710480689A CN107291235B CN 107291235 B CN107291235 B CN 107291235B CN 201710480689 A CN201710480689 A CN 201710480689A CN 107291235 B CN107291235 B CN 107291235B
- Authority
- CN
- China
- Prior art keywords
- gesture operation
- target
- preset
- user
- gesture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/46—Multiprogramming arrangements
- G06F9/48—Program initiating; Program switching, e.g. by interrupt
- G06F9/4806—Task transfer initiation or dispatching
- G06F9/4843—Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses a control method and a control device. In one embodiment, the method comprises: in response to detecting a gesture operation by a user in any one of at least one preset area of a display screen, determining the preset area where the gesture operation is located as a target area, and matching the gesture operation with a target gesture operation associated with the target area and/or a default gesture operation for configuring the target area; and executing the operation corresponding to the result of matching the gesture operation with the target gesture operation and/or the default gesture operation. This embodiment improves the flexibility of control.
Description
Technical Field
The application relates to the field of computers, in particular to terminal devices, and more particularly to a control method and device.
Background
With the development of computer technology, digital products are used ever more widely in all areas of life and work. Terminal devices (such as mobile phones) have become indispensable tools: they are used not only for daily communication, but can also run a variety of applications (such as music-playing applications and reading applications).

As the functions and applications of mobile phones multiply, a user who needs a terminal device to run a particular function usually has to perform several operations (for example, clicking and page turning), which makes control inflexible.
Disclosure of Invention
It is an object of embodiments of the present application to provide an improved … method and apparatus to address the problems mentioned in the background above.
In a first aspect, an embodiment of the present application provides a control method for a terminal device, the method comprising: in response to detecting a gesture operation of a user in any one of at least one preset area in the display screen, determining the preset area where the gesture operation is located as a target area, and matching the gesture operation with a target gesture operation associated with the target area and/or a default gesture operation for configuring the target area; and executing the operation corresponding to the result of matching the gesture operation with the target gesture operation and/or the default gesture operation.
In some embodiments, executing the operation corresponding to the matching result comprises: in response to determining that the gesture operation matches the target gesture operation, performing the target control operation that matches both the target area and the target gesture operation, wherein the target control operation includes at least one of: switching applications, playing or pausing music, switching the currently played music, locking the screen, setting system options, calling a contact, and starting an application.
In some embodiments, executing the operation corresponding to the matching result comprises: presenting a configuration interface for configuring the target area in response to determining that the gesture operation matches the default gesture operation, wherein the configuration interface displays a frame of the target area and a scroll bar for setting the transparency of the target area.
In some embodiments, the method further comprises: in response to sequentially detecting a selection operation and a sliding operation of the user on the frame, adjusting the size of the target area based on the position selected by the selection operation and the stop position of the sliding operation.
In some embodiments, the method further comprises: in response to detecting a sliding operation of the user on the target area enclosed by the frame, adjusting the position of the target area in the display screen so that the target area moves from the start position of the sliding operation to the stop position of the sliding operation.
In some embodiments, the method further comprises: in response to sequentially detecting a selection operation and a sliding operation of the user on the scroll bar, setting the transparency of the target area based on the stop position of the sliding operation.
In some embodiments, a gesture operation configuration key is also displayed in the configuration interface; and the method further comprises: presenting a gesture operation selection interface in response to detecting a click operation of the user on the gesture operation configuration key, so that the user selects a preset gesture operation from a plurality of preset gesture operations presented on the gesture operation selection interface, where the plurality of preset gesture operations include a sliding operation, a click operation and a long-press operation; in response to determining that the user has successfully selected a preset gesture operation, determining the selected preset gesture operation as the target gesture operation associated with the target area so as to update the original target gesture operation, and presenting a control operation selection interface so that the user selects a preset control operation from a plurality of preset control operations; and in response to determining that the user has successfully selected a preset control operation, determining the selected preset control operation as the target control operation matching both the target area and the target gesture operation, so as to update the original target control operation.
In a second aspect, an embodiment of the present application provides a control apparatus for a terminal device, the apparatus comprising: a matching unit, configured to, in response to detecting a gesture operation of a user in any one of at least one preset area in the display screen, determine the preset area where the gesture operation is located as a target area, and match the gesture operation with a target gesture operation associated with the target area and/or a default gesture operation for configuring the target area; and an execution unit, configured to execute the operation corresponding to the result of matching the gesture operation with the target gesture operation and/or the default gesture operation.
In some embodiments, the execution unit is further configured to: in response to determining that the gesture operation matches the target gesture operation, performing a target control operation that simultaneously matches the target region and the target gesture operation, wherein the target control operation includes at least one of: switching applications, controlling music play or pause, switching currently played music, locking a screen, setting system options, calling contacts, starting applications.
In some embodiments, the execution unit is further configured to: present a configuration interface for configuring the target area in response to determining that the gesture operation matches the default gesture operation, wherein the configuration interface displays a frame of the target area and a scroll bar for setting the transparency of the target area.
In some embodiments, the apparatus further comprises: an adjusting unit, configured to, in response to sequentially detecting a selection operation and a sliding operation of the user on the frame, adjust the size of the target area based on the position selected by the selection operation and the stop position of the sliding operation.
In some embodiments, the apparatus further comprises: a moving unit, configured to, in response to detecting a sliding operation of the user on the target area enclosed by the frame, adjust the position of the target area in the display screen so that the target area moves from the start position of the sliding operation to the stop position of the sliding operation.
In some embodiments, the apparatus further comprises: a setting unit, configured to, in response to sequentially detecting a selection operation and a sliding operation of the user on the scroll bar, set the transparency of the target area based on the stop position of the sliding operation.
In some embodiments, a gesture operation configuration key is also displayed in the configuration interface; and the apparatus further comprises: a first presentation unit, configured to present a gesture operation selection interface in response to detecting a click operation of the user on the gesture operation configuration key, so that the user selects a preset gesture operation from a plurality of preset gesture operations presented on the gesture operation selection interface, where the plurality of preset gesture operations include a sliding operation, a click operation and a long-press operation; a second presentation unit, configured to, in response to determining that the user has successfully selected a preset gesture operation, determine the selected preset gesture operation as the target gesture operation associated with the target area so as to update the original target gesture operation, and present a control operation selection interface so that the user selects a preset control operation from a plurality of preset control operations; and a determining unit, configured to, in response to determining that the user has successfully selected a preset control operation, determine the selected preset control operation as the target control operation matching both the target area and the target gesture operation, so as to update the original target control operation.
In a third aspect, an embodiment of the present application provides a terminal device, including: one or more processors; a storage device for storing one or more programs which, when executed by one or more processors, cause the one or more processors to implement a method as in any one of the embodiments of the control method described above.
According to the control method and device provided by the embodiments of the application, in response to detecting a gesture operation of a user in any one of at least one preset area in the display screen, the preset area where the gesture operation is located is determined as a target area, the gesture operation is matched with the target gesture operation associated with the target area and the default gesture operation for configuring the target area, and the operation corresponding to the matching result is then executed. This enables multitask operation by the user and improves the flexibility of control.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow chart of one embodiment of a control method according to the present application;
FIG. 3 is a schematic diagram of an application scenario of a control method according to the present application;
FIG. 4 is a schematic diagram of yet another application scenario of a control method according to the present application;
FIG. 5 is a schematic diagram of yet another application scenario of a control method according to the present application;
FIG. 6 is a schematic diagram of yet another application scenario of a control method according to the present application;
FIG. 7 is a schematic diagram of a configuration interface according to the present application;
FIG. 8 is a flow chart of yet another embodiment of a control method according to the present application;
FIG. 9 is a schematic block diagram of one embodiment of a control device according to the present application;
FIG. 10 is a schematic structural diagram of a computer system suitable for implementing a terminal device according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 shows an exemplary system architecture 100 to which the control method or control apparatus of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have various communication client applications installed thereon, such as a reading application, a music application, a web browser application, a shopping application, a search application, an instant messaging tool, a mailbox client, social platform software, and the like.
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, e-book readers, and the like.
The server 105 may be a server that provides various services, such as an application management server that provides support for applications installed on the terminal devices 101, 102, 103. The application management server may provide services such as updating of applications.
It should be noted that the control method provided in the embodiment of the present application is generally executed by the terminal devices 101, 102, and 103, and accordingly, the control device is generally disposed in the terminal devices 101, 102, and 103. It should be noted that the control method provided by the embodiment of the present application does not depend on the network 104 and the server 105, and thus, the network 104 and the server 105 may not exist in fig. 1.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a control method according to the present application is shown. The control method comprises the following steps:
Step 201, in response to detecting a gesture operation of a user in any preset area of at least one preset area in the display screen, determining the preset area where the gesture operation is located as a target area, and matching the gesture operation with a target gesture operation associated with the target area and/or a default gesture operation for configuring the target area.

In this embodiment, in response to detecting a gesture operation of a user in any one of at least one preset area in the display screen of the electronic device (for example, the terminal devices 101, 102, and 103 shown in fig. 1), the electronic device may determine the preset area where the gesture operation is located as a target area, and match the gesture operation with a target gesture operation associated with the target area and/or a default gesture operation for configuring the target area. A preset area may be any area of the display screen set in advance by the user, such as the center of the screen, the upper left corner, or the lower right corner. A gesture operation may be any operation performed by the user on the display screen, such as a click, a slide, or a long press. It should be noted that the target gesture operation and the default gesture operation may be any gesture operations set by the user, such as a leftward slide, a rightward slide, an upward slide, a downward slide, a long press, or a click, and that the target gesture operation and the default gesture operation are different.
Step 202, executing the operation corresponding to the matching result, based on the result of matching the gesture operation with the target gesture operation and/or the default gesture operation.
In this embodiment, the electronic device may execute the operation corresponding to the matching result. Specifically, in response to the matching result indicating that the gesture operation matches the target gesture operation or the default gesture operation, the electronic device may execute the operation, preset by the user or provided by the electronic device by default, that corresponds to the matched gesture operation; in response to the matching result indicating that the gesture operation matches neither the target gesture operation nor the default gesture operation, the electronic device may continue to monitor the user's gesture operations and perform the matching for the next gesture operation.
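The hit-test-then-match logic of steps 201-202 can be sketched as follows. This is a minimal illustration, not the patented implementation: the region geometry, gesture names, and action names are all assumptions introduced for the example.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """A user-defined preset area with its gesture bindings (names are illustrative)."""
    x: int
    y: int
    w: int
    h: int
    target_gesture: str = "swipe_right"   # gesture the user associated with this area
    target_action: str = "switch_app"     # control operation bound to that gesture
    default_gesture: str = "long_press"   # reserved gesture that opens configuration

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def dispatch(regions, px, py, gesture):
    """Return the operation for a gesture at (px, py), or None to keep listening."""
    for region in regions:
        if region.contains(px, py):          # the region that was hit becomes the target area
            if gesture == region.target_gesture:
                return region.target_action  # matched the target gesture
            if gesture == region.default_gesture:
                return "open_config"         # matched the default (configuration) gesture
            return None                      # no match: continue monitoring
    return None                              # gesture fell outside every preset area
```

Note that the default gesture is checked alongside the user-bound gesture, mirroring the "and/or" matching in step 201: a single touch event can match at most one of the two, since the two gestures are required to differ.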
In some optional implementations of the embodiment, in response to determining that the gesture operation matches the target gesture operation, the electronic device may perform a target control operation that simultaneously matches the target area and the target gesture operation. Wherein the target control operation may include at least one of: switching applications, controlling music play or pause, switching currently played music, locking a screen, setting system options, calling contacts, starting applications. It should be noted that, the target control operation that is simultaneously matched with the target area and the target gesture operation may be preset by the user.
As an example, as shown in fig. 3, the user sets a certain area 301 at the bottom of the screen in advance and binds leftward and rightward slides in this area 301 to application switching (for example, a rightward slide switches to the previous application in the application list, and a leftward slide switches to the next application in the application list). When the gesture operation of the user in the area 301 is a rightward slide, the electronic device may present the opened interface of the previous application.
As another example, as shown in fig. 4, the user sets a certain area 401 on the left side of the screen in advance and sets a downward slide in this area 401 to lock the screen. When the gesture operation of the user in the area 401 is a downward slide, the electronic device may lock the screen.
As still another example, as shown in fig. 5, the user sets a certain area 501 at the upper right of the screen in advance, and sets an upward slide in this area 501 to switch to the previous music and a downward slide to switch to the next music. When the gesture operation of the user in the area 501 is an upward slide, the electronic device may switch to the previous music; when it is a downward slide, the electronic device may switch to the next music.
As still another example, as shown in fig. 6, the user sets a certain area 601 at the upper right of the screen in advance, and sets an upward slide in this area 601 to switch to the previous music and a downward slide to switch to the next music. In addition, the user sets a certain area 602 at the upper left of the screen in advance, and sets an upward slide in this area 602 to call the telephone number of a pre-designated contact. When the gesture operation of the user in the area 601 is an upward slide, the electronic device may switch to the previous music; when it is a downward slide, the electronic device may switch to the next music; and when the gesture operation of the user in the area 602 is an upward slide, the electronic device may call the contact.
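The scenarios of figs. 3-6 amount to a lookup table keyed by (area, gesture). A hypothetical binding table in that spirit (the region and action names are invented for illustration; the patent does not prescribe a storage format):

```python
# Hypothetical binding table mirroring the examples of figs. 3-6:
# each (region, gesture) pair maps to exactly one control operation.
BINDINGS = {
    ("bottom_strip", "swipe_right"): "previous_app",   # fig. 3, area 301
    ("bottom_strip", "swipe_left"):  "next_app",
    ("left_edge",    "swipe_down"):  "lock_screen",    # fig. 4, area 401
    ("top_right",    "swipe_up"):    "previous_track", # figs. 5-6, areas 501/601
    ("top_right",    "swipe_down"):  "next_track",
    ("top_left",     "swipe_up"):    "call_contact",   # fig. 6, area 602
}

def lookup(region_name: str, gesture: str):
    """Return the bound control operation, or None if the pair is unbound."""
    return BINDINGS.get((region_name, gesture))
```

Because the key is the pair, the same gesture (e.g. an upward slide) can trigger different operations in different areas, which is what lets areas 601 and 602 coexist in fig. 6.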
In some optional implementations of this embodiment, in response to determining that the gesture operation matches the default gesture operation (e.g., a long-press operation), the electronic device may present a configuration interface for configuring the target area, where the configuration interface may display a frame of the target area and a scroll bar for setting the transparency of the target area. Fig. 7 is a schematic diagram of the configuration interface: it displays a frame 701 of the target area and a scroll bar 702 for setting the transparency of the target area.
In the method provided by this embodiment of the application, in response to detecting a gesture operation of the user in any one of at least one preset area in the display screen, the preset area where the gesture operation is located is determined as a target area, the gesture operation is matched with the target gesture operation associated with the target area and/or the default gesture operation for configuring the target area, and the operation corresponding to the matching result is then executed. This enables multitask operation by the user and improves the flexibility of control.
With further reference to fig. 8, a flow 800 of yet another embodiment of a control method is illustrated. The process 800 of the control method includes the following steps:
Step 801, in response to detecting a gesture operation of a user in any preset area of at least one preset area in the display screen, determining the preset area where the gesture operation is located as a target area, and matching the gesture operation with a target gesture operation associated with the target area and/or a default gesture operation for configuring the target area.

In this embodiment, in response to detecting a gesture operation of a user in any one of at least one preset area in the display screen of the electronic device (for example, the terminal devices 101, 102, and 103 shown in fig. 1), the electronic device may determine the preset area where the gesture operation is located as a target area, and match the gesture operation with a target gesture operation associated with the target area and/or a default gesture operation for configuring the target area.
Step 802, in response to determining that the gesture operation matches the default gesture operation, presenting a configuration interface for configuring the target area, wherein the configuration interface displays a frame of the target area and a scroll bar for setting the transparency of the target area.
In this embodiment, in response to determining that the gesture operation matches the default gesture operation (e.g., a long-press operation), the electronic device may present a configuration interface for configuring the target area, where the configuration interface may display a frame of the target area and a scroll bar for setting the transparency of the target area. Fig. 7 is a schematic diagram of the configuration interface: it displays a frame 701 of the target area and a scroll bar 702 for setting the transparency of the target area.
It should be noted that the operations of the steps 801-802 are substantially the same as the operations of the steps 201-202, and are not described herein again.
Step 803, in response to sequentially detecting a selection operation and a sliding operation of the user on the frame, adjusting the size of the target area based on the position selected by the selection operation and the stop position of the sliding operation.
In this embodiment, in response to sequentially detecting the user's selection operation and sliding operation on the frame, the electronic device may adjust the size of the target area based on the position selected by the selection operation and the stop position of the sliding operation. For example, the target area may be reduced when the stop position of the sliding operation is inside the frame, and enlarged when the stop position is outside the frame.
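The shrink-inside/grow-outside rule of step 803 can be sketched like this. The fixed step size and the minimum-size floor are assumptions added for the example; the patent only specifies the direction of the adjustment.

```python
def resize_region(x, y, w, h, stop_x, stop_y, step=20):
    """Shrink the target area when the drag stops inside the frame,
    enlarge it when the drag stops outside (step size is an assumption)."""
    inside = x <= stop_x < x + w and y <= stop_y < y + h
    delta = -step if inside else step
    new_w = max(step, w + delta)   # keep the area from collapsing to zero
    new_h = max(step, h + delta)
    return new_w, new_h
```

A production implementation would more likely scale the area toward or away from the drag stop position rather than use a fixed step, but the sign of the change follows the rule stated above.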
Step 804, in response to detecting a sliding operation of the user on the target area enclosed by the frame, adjusting the position of the target area in the display screen so that the target area moves from the start position of the sliding operation to the stop position of the sliding operation.
In this embodiment, in response to detecting the user's sliding operation on the target area enclosed by the frame, the electronic device may adjust the position of the target area in the display screen so that the target area moves from the start position of the sliding operation to the stop position of the sliding operation.
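Step 804 is a simple translation by the drag vector; a minimal sketch (coordinate representation is an assumption):

```python
def move_region(x, y, start, stop):
    """Translate the target area's origin by the drag vector from start to stop."""
    dx = stop[0] - start[0]
    dy = stop[1] - start[1]
    return x + dx, y + dy
```

Using the delta between start and stop (rather than snapping the origin to the stop point) keeps the point the user grabbed under the finger throughout the drag.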
Step 805, in response to sequentially detecting a selection operation and a sliding operation of the user on the scroll bar, setting the transparency of the target area based on the stop position of the sliding operation.
In this embodiment, in response to sequentially detecting the user's selection operation and sliding operation on the scroll bar, the electronic device may set the transparency of the target area based on the stop position of the sliding operation. In practice, the transparency can be adjusted from 0% to 100%; when it is adjusted to 0%, the target area is hidden.
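Mapping the scroll-bar stop position to a percentage can be sketched as follows. The linear mapping and the track-length parameter are assumptions; the hidden-at-0% behavior follows the embodiment as described above.

```python
def set_transparency(scrollbar_pos, track_length):
    """Map the scroll-bar stop position to a 0-100% transparency value.
    Per the embodiment, the target area is hidden when the value is 0%."""
    pct = max(0, min(100, round(100 * scrollbar_pos / track_length)))
    hidden = (pct == 0)
    return pct, hidden
```

Clamping to the [0, 100] range guards against touch coordinates reported slightly past either end of the scroll-bar track.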
Step 806, in response to detecting a click operation of the user on a gesture operation configuration key, presenting a gesture operation selection interface.

In this embodiment, a gesture operation configuration key may also be displayed in the configuration interface. In response to detecting the user's click operation on the gesture operation configuration key, the electronic device may present a gesture operation selection interface, so that the user selects a preset gesture operation from a plurality of preset gesture operations presented on the interface, where the preset gesture operations may include a sliding operation, a click operation, and a long-press operation.
Step 807, in response to determining that the user has successfully selected a preset gesture operation, determining the selected preset gesture operation as the target gesture operation associated with the target area so as to update the original target gesture operation, and presenting a control operation selection interface so that the user selects a preset control operation from a plurality of preset control operations.
In this embodiment, in response to determining that the user has successfully selected a preset gesture operation, the electronic device may determine the selected preset gesture operation as the target gesture operation associated with the target area, so as to update the original target gesture operation, and present a control operation selection interface, so that the user selects a preset control operation from a plurality of preset control operations. Here, the control operation selection interface may include a plurality of preset control operation options.
Step 808, in response to determining that the user has successfully selected a preset control operation, determining the selected preset control operation as the target control operation matching both the target area and the target gesture operation, so as to update the original target control operation.
In this embodiment, in response to determining that the user has successfully selected a preset control operation, the electronic device may determine the selected preset control operation as the target control operation that matches both the target area and the target gesture operation, so as to update the original target control operation.
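Steps 806-808 boil down to replacing the gesture and/or the control operation bound to the target area. A sketch under the assumption that a region's bindings are stored as a plain mapping (the key names are invented for illustration):

```python
def reconfigure(region, new_gesture=None, new_action=None):
    """Steps 807-808 sketched: replace the region's target gesture and/or
    the control operation bound to it; unspecified fields are left unchanged."""
    if new_gesture is not None:
        region["target_gesture"] = new_gesture   # update the original target gesture
    if new_action is not None:
        region["target_action"] = new_action     # update the original control operation
    return region
```

Because the old binding is overwritten rather than appended to, the target gesture operation remains unique per area, which is what keeps the dispatch of step 202 unambiguous.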
As can be seen from fig. 8, compared with the embodiment corresponding to fig. 2, the flow 800 of the control method in this embodiment highlights the steps of configuring the target area. The scheme described in this embodiment therefore allows the user to change and reconfigure the preset areas at any time, which not only enables multitask operation by the user but also further improves the flexibility of control.
With further reference to fig. 9, as an implementation of the methods shown in the above-mentioned figures, the present application provides an embodiment of a control apparatus, which corresponds to the embodiment of the method shown in fig. 2, and which may be specifically applied in a terminal device.
As shown in fig. 9, the control device 900 according to the present embodiment includes: a matching unit 901, configured to, in response to detecting a gesture operation of a user in any one of at least one preset region in a display screen, determine the preset region where the gesture operation is located as a target region, and match the gesture operation with a target gesture operation associated with the target region and/or a default gesture operation for configuring the target region; an executing unit 902, configured to execute, based on a matching result between the gesture operation and the target gesture operation and/or the default gesture operation, an operation corresponding to the matching result.
In this embodiment, in response to detecting a gesture operation of a user in any preset area of at least one preset area in a display screen of the electronic device, the matching unit 901 may determine the preset area where the gesture operation is located as a target area, and match the gesture operation with a target gesture operation associated with the target area and/or a default gesture operation for configuring the target area.
In this embodiment, the executing unit 902 may execute an operation corresponding to a matching result based on a matching result between the gesture operation and the target gesture operation and/or the default gesture operation.
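The region lookup and matching performed by units 901 and 902 can be sketched in Python as follows. This is an illustrative sketch only: the region rectangles, gesture names, and control-operation names are hypothetical assumptions, not taken from the application.

```python
# Minimal sketch of the matching/execution dispatch. All concrete values
# (regions, gesture names, control names) are illustrative assumptions.

DEFAULT_GESTURE = "double_tap"  # assumed default gesture that opens the configuration interface

# Each preset region: a bounding box plus its user-configured mapping.
PRESET_REGIONS = [
    {"rect": (0, 0, 200, 200),
     "target_gesture": "swipe_left",
     "target_control": "switch_application"},
    {"rect": (200, 0, 400, 200),
     "target_gesture": "long_press",
     "target_control": "lock_screen"},
]

def contains(rect, point):
    """Return True if the point lies inside the rectangle."""
    x0, y0, x1, y1 = rect
    x, y = point
    return x0 <= x < x1 and y0 <= y < y1

def dispatch(gesture, point):
    """Return the operation triggered by a gesture at a screen point, or None."""
    for region in PRESET_REGIONS:              # determine the target region
        if contains(region["rect"], point):
            if gesture == region["target_gesture"]:
                return region["target_control"]          # matched the target gesture
            if gesture == DEFAULT_GESTURE:
                return "open_configuration_interface"    # matched the default gesture
            return None                        # matched neither
    return None                                # gesture fell outside all preset regions
```

For example, a left swipe at (50, 50) would resolve to the first region's configured control operation, while the assumed default gesture anywhere in a preset region would open the configuration interface.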
In some optional implementations of the embodiment, the executing unit 902 may be further configured to, in response to determining that the gesture operation matches the target gesture operation, execute a target control operation that simultaneously matches the target area and the target gesture operation, where the target control operation includes at least one of: switching applications, controlling music play or pause, switching currently played music, locking a screen, setting system options, calling contacts, starting applications.
In some optional implementations of the embodiment, the executing unit 902 may be further configured to, in response to determining that the gesture operation matches the default gesture operation, present a configuration interface for configuring the target area, where the configuration interface displays a border of the target area and a scroll bar for setting a transparency of the target area.
In some optional implementations of this embodiment, the control device 900 may further include an adjusting unit (not shown in the figure). The adjusting unit may be configured to, in response to sequentially detecting a selection operation and a sliding operation performed by the user on the border, adjust the size of the target area based on the position selected by the selection operation and the stop position of the sliding operation.
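A minimal sketch of this border-based resizing, assuming rectangular regions and a hypothetical edge-selection tolerance (the application does not specify how the selected edge is identified):

```python
# Illustrative sketch: the user selects a point on the region's border, then
# slides; the selected edge follows the slide's stop position. The tolerance
# for deciding which edge was selected is an assumption.

def resize_area(rect, selected_point, stop_point, tolerance=10):
    """Move whichever edge(s) of `rect` the selection touched to the stop position."""
    x0, y0, x1, y1 = rect
    sx, sy = selected_point
    tx, ty = stop_point
    if abs(sx - x0) <= tolerance:      # left edge selected
        x0 = tx
    elif abs(sx - x1) <= tolerance:    # right edge selected
        x1 = tx
    if abs(sy - y0) <= tolerance:      # top edge selected
        y0 = ty
    elif abs(sy - y1) <= tolerance:    # bottom edge selected
        y1 = ty
    # Normalize so the rectangle stays well-formed even if edges crossed.
    return (min(x0, x1), min(y0, y1), max(x0, x1), max(y0, y1))
```

For example, selecting the right border of a (0, 0, 100, 100) region and sliding to x = 150 widens the region to (0, 0, 150, 100).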
In some optional implementations of this embodiment, the control device 900 may further include a moving unit (not shown in the figure). The moving unit may be configured to, in response to detecting a sliding operation performed by the user on the target area enclosed by the border, adjust the position of the target area in the display screen so that the target area moves from the start position of the sliding operation to the stop position of the sliding operation.
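The moving unit's behavior can be sketched as a translation of the region rectangle by the slide's displacement (function and variable names are illustrative assumptions):

```python
# Illustrative sketch: translate the target area by the displacement between
# the slide's start and stop positions.

def move_area(rect, start_point, stop_point):
    """Shift the target area by the slide's displacement vector."""
    dx = stop_point[0] - start_point[0]
    dy = stop_point[1] - start_point[1]
    x0, y0, x1, y1 = rect
    return (x0 + dx, y0 + dy, x1 + dx, y1 + dy)
```

A production version would additionally clamp the result to the display screen's bounds, which the sketch omits.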
In some optional implementations of this embodiment, the control device 900 may further include a setting unit (not shown in the figure). The setting unit may be configured to, in response to sequentially detecting a selection operation and a sliding operation performed by the user on the scroll bar, set the transparency of the target area based on the stop position of the sliding operation.
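A sketch of deriving a transparency value from the scroll bar's stop position. The linear mapping and the clamping to [0, 1] are assumptions, since the application does not specify how the stop position maps to a transparency level:

```python
# Illustrative sketch: map the stop position on a vertical scroll bar to a
# transparency value in [0, 1]. The linear mapping is an assumption.

def transparency_from_scrollbar(stop_y, bar_top, bar_bottom):
    """Return the transparency implied by the slide's stop position on the bar."""
    span = bar_bottom - bar_top
    fraction = (stop_y - bar_top) / span
    return max(0.0, min(1.0, fraction))   # clamp to the bar's extent
```

Sliding to the midpoint of the bar would yield 0.5; stop positions beyond either end of the bar clamp to fully opaque or fully transparent.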
In some optional implementations of this embodiment, a gesture operation configuration key is further displayed in the configuration interface. The control device 900 may further include a first presenting unit, a second presenting unit, and a determining unit (not shown in the figure). The first presenting unit may be configured to present a gesture operation selection interface in response to detecting a click operation of the user on the gesture operation configuration key, so that the user selects a preset gesture operation from a plurality of preset gesture operations presented on the gesture operation selection interface, where the plurality of preset gesture operations include a sliding operation, a click operation, and a long-press operation. The second presenting unit may be configured to, in response to determining that the user has successfully selected a preset gesture operation, determine the preset gesture operation selected by the user as the target gesture operation associated with the target area so as to update the original target gesture operation, and present a control operation selection interface so that the user selects a preset control operation from a plurality of preset control operations. The determining unit may be configured to, in response to determining that the user has successfully selected a preset control operation, determine the preset control operation selected by the user as the target control operation that simultaneously matches the target area and the target gesture operation, so as to update the original target control operation.
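The two-stage configuration flow (the user picks a preset gesture, then a preset control operation, each replacing the region's previous setting) can be sketched as follows; the preset lists and all names are hypothetical assumptions:

```python
# Illustrative sketch of updating a region's configuration. The available
# preset gestures follow the embodiment (slide, click, long press); the
# preset control operations listed are assumed examples.

PRESET_GESTURES = ["slide", "click", "long_press"]
PRESET_CONTROLS = ["switch_application", "lock_screen", "start_application"]

def configure_region(region, chosen_gesture, chosen_control):
    """Replace a region's target gesture and matched control operation."""
    if chosen_gesture not in PRESET_GESTURES:
        raise ValueError("not a preset gesture operation")
    if chosen_control not in PRESET_CONTROLS:
        raise ValueError("not a preset control operation")
    region["target_gesture"] = chosen_gesture   # updates the original target gesture
    region["target_control"] = chosen_control   # updates the original target control
    return region
```

The two assignments mirror the second presenting unit and the determining unit: the old mapping for the region is overwritten, so the same gesture in the same region subsequently triggers the newly chosen control operation.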
In the apparatus provided by the above embodiment of the present application, the matching unit 901, in response to detecting a gesture operation of a user in any preset area of at least one preset area in a display screen, determines the preset area where the gesture operation is located as a target area and matches the gesture operation with the target gesture operation associated with the target area and/or the default gesture operation for configuring the target area; the execution unit 902 then executes, based on the matching result, the operation corresponding to that result. Multitask operation by the user can thereby be implemented, and the flexibility of control is improved.
Referring now to FIG. 10, shown is a block diagram of a computer system 1000 suitable for use in implementing a terminal device of an embodiment of the present application. The terminal device shown in fig. 10 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 10, the computer system 1000 includes a central processing unit (CPU) 1001, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 1002 or a program loaded from a storage section 1008 into a random access memory (RAM) 1003. The RAM 1003 also stores various programs and data necessary for the operation of the system 1000. The CPU 1001, the ROM 1002, and the RAM 1003 are connected to one another via a bus 1004. An input/output (I/O) interface 1005 is also connected to the bus 1004.
The following components are connected to the I/O interface 1005: an input section 1006 including a touch screen, a touch panel, and the like; an output section 1007 including a display such as a liquid crystal display (LCD) and a speaker; a storage section 1008 including a hard disk and the like; and a communication section 1009 including a network interface card such as a LAN card or a modem. The communication section 1009 performs communication processing via a network such as the Internet. A drive 1010 is also connected to the I/O interface 1005 as necessary. A removable medium 1011, such as a semiconductor memory, is mounted on the drive 1010 as necessary, so that a computer program read out therefrom is installed into the storage section 1008 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication part 1009 and/or installed from the removable medium 1011. The above-described functions defined in the method of the present application are executed when the computer program is executed by the Central Processing Unit (CPU) 1001. It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. 
In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware. The described units may also be provided in a processor, which may be described as: a processor including a matching unit and an execution unit. The names of these units do not, in some cases, constitute a limitation on the units themselves; for example, the matching unit may also be described as "a unit that matches a gesture operation".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: in response to detecting a gesture operation of a user in any preset area of at least one preset area in a display screen, determine the preset area where the gesture operation is located as a target area, and match the gesture operation with a target gesture operation associated with the target area and/or a default gesture operation for configuring the target area; and execute, based on the matching result of the gesture operation with the target gesture operation and/or the default gesture operation, the operation corresponding to the matching result.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.
Claims (16)
1. A control method for a terminal device, the method comprising:
in response to detecting a gesture operation of a user in any preset area of at least one preset area in a display screen, determining the preset area where the gesture operation is located as a target area, and matching the gesture operation with a target gesture operation associated with the target area and/or a default gesture operation for configuring the target area, wherein the preset area is an area in the display screen preset by the user, the target gesture operation is a gesture operation set by the user, and the target gesture operation is different from the default gesture operation;
based on the matching result of the gesture operation and the target gesture operation and/or the default gesture operation, executing an operation corresponding to the matching result, including: in response to determining that the gesture operation matches the target gesture operation, performing a target control operation that simultaneously matches the target region and the target gesture operation, wherein the target control operation that simultaneously matches the target region and the target gesture operation is preset by a user;
the executing the operation corresponding to the matching result based on the matching result of the gesture operation and the target gesture operation and/or the default gesture operation further comprises: in response to determining that the gesture operation matches the default gesture operation, presenting a configuration interface for configuring the target area.
2. The control method of claim 1, wherein the target control operation comprises at least one of: switching applications, controlling music play or pause, switching currently played music, locking a screen, setting system options, calling contacts, starting applications.
3. The control method according to claim 1 or 2, wherein the configuration interface displays a border of the target area and a scroll bar for setting the transparency of the target area.
4. The control method according to claim 3, characterized in that the method further comprises:
in response to sequentially detecting a selection operation and a sliding operation performed by the user on the border, adjusting the size of the target area based on the position selected by the selection operation and the stop position of the sliding operation.
5. The control method according to claim 3, characterized in that the method further comprises:
in response to detecting a sliding operation performed by the user on the target area enclosed by the border, adjusting the position of the target area in the display screen so that the target area moves from the start position of the sliding operation to the stop position of the sliding operation.
6. The control method according to claim 3, characterized in that the method further comprises:
in response to sequentially detecting a selection operation and a sliding operation performed by the user on the scroll bar, setting the transparency of the target area based on the stop position of the sliding operation.
7. The control method according to claim 3, wherein a gesture operation configuration key is further displayed in the configuration interface; and
the method further comprises the following steps:
presenting a gesture operation selection interface in response to detecting a click operation of the user on the gesture operation configuration key, so that the user selects a preset gesture operation from a plurality of preset gesture operations presented on the gesture operation selection interface, wherein the plurality of preset gesture operations comprise a sliding operation, a click operation, and a long-press operation;
in response to determining that the user has successfully selected a preset gesture operation, determining the preset gesture operation selected by the user as the target gesture operation associated with the target area to update the original target gesture operation, and presenting a control operation selection interface so that the user selects a preset control operation from a plurality of preset control operations;
and in response to determining that the user has successfully selected a preset control operation, determining the preset control operation selected by the user as the target control operation that simultaneously matches the target area and the target gesture operation, so as to update the original target control operation.
8. A control apparatus for a terminal device, the apparatus comprising:
a matching unit, configured to, in response to detecting a gesture operation of a user in any preset area of at least one preset area in a display screen, determine the preset area where the gesture operation is located as a target area, and match the gesture operation with a target gesture operation associated with the target area and/or a default gesture operation for configuring the target area, wherein the preset area is an area in the display screen preset by the user, the target gesture operation is a gesture operation set by the user, and the target gesture operation is different from the default gesture operation;
an execution unit, configured to execute, based on a matching result of the gesture operation with the target gesture operation and/or the default gesture operation, an operation corresponding to the matching result;
wherein the execution unit is further configured to: in response to determining that the gesture operation matches the target gesture operation, performing a target control operation that simultaneously matches the target region and the target gesture operation, wherein the target control operation that simultaneously matches the target region and the target gesture operation is preset by a user;
the execution unit is further configured to: in response to determining that the gesture operation matches the default gesture operation, presenting a configuration interface for configuring the target area.
9. The control apparatus of claim 8, wherein the target control operation comprises at least one of: switching applications, controlling music play or pause, switching currently played music, locking a screen, setting system options, calling contacts, starting applications.
10. The control device of claim 8 or 9, wherein the configuration interface displays a border of the target area and a scroll bar for setting a transparency of the target area.
11. The control device of claim 10, further comprising:
an adjusting unit, configured to, in response to sequentially detecting a selection operation and a sliding operation performed by the user on the border, adjust the size of the target area based on the position selected by the selection operation and the stop position of the sliding operation.
12. The control device of claim 10, further comprising:
a moving unit, configured to, in response to detecting a sliding operation performed by the user on the target area enclosed by the border, adjust the position of the target area in the display screen so that the target area moves from the start position of the sliding operation to the stop position of the sliding operation.
13. The control device of claim 10, further comprising:
a setting unit, configured to, in response to sequentially detecting a selection operation and a sliding operation performed by the user on the scroll bar, set the transparency of the target area based on the stop position of the sliding operation.
14. The control device of claim 10, wherein a gesture operation configuration key is further displayed in the configuration interface; and
the device further comprises:
a first presenting unit, configured to present a gesture operation selection interface in response to detecting a click operation of the user on the gesture operation configuration key, so that the user selects a preset gesture operation from a plurality of preset gesture operations presented on the gesture operation selection interface, wherein the plurality of preset gesture operations comprise a sliding operation, a click operation, and a long-press operation;
a second presenting unit, configured to, in response to determining that the user has successfully selected a preset gesture operation, determine the preset gesture operation selected by the user as the target gesture operation associated with the target area to update the original target gesture operation, and present a control operation selection interface so that the user selects a preset control operation from a plurality of preset control operations;
and a determining unit, configured to, in response to determining that the user has successfully selected a preset control operation, determine the preset control operation selected by the user as the target control operation that simultaneously matches the target area and the target gesture operation, so as to update the original target control operation.
15. A terminal device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-7.
16. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710480689.6A CN107291235B (en) | 2017-06-22 | 2017-06-22 | Control method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107291235A CN107291235A (en) | 2017-10-24 |
CN107291235B true CN107291235B (en) | 2021-09-21 |
Family
ID=60097809
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710480689.6A Active CN107291235B (en) | 2017-06-22 | 2017-06-22 | Control method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107291235B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019090577A1 (en) * | 2017-11-09 | 2019-05-16 | 深圳传音通讯有限公司 | Screen control method and screen control system for smart terminal |
CN114625288B (en) * | 2020-12-11 | 2024-08-27 | Oppo广东移动通信有限公司 | Interface processing method, device, electronic equipment and computer readable storage medium |
CN112612361A (en) * | 2020-12-17 | 2021-04-06 | 深圳康佳电子科技有限公司 | Equipment control method, device, system, terminal equipment and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103914222A (en) * | 2013-01-07 | 2014-07-09 | Lg电子株式会社 | Image display device and controlling method thereof |
CN104391642A (en) * | 2014-10-23 | 2015-03-04 | 上海闻泰电子科技有限公司 | Service information providing method and system and application execution method and system |
CN104699238A (en) * | 2013-12-10 | 2015-06-10 | 现代自动车株式会社 | System and method for gesture recognition of vehicle |
CN105830351A (en) * | 2014-06-23 | 2016-08-03 | Lg电子株式会社 | Mobile terminal and method of controlling the same |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060160090A1 (en) * | 2003-04-11 | 2006-07-20 | Macina Robert A | Composition splice variants and methods relating to cancer specific genes and proteins |
KR101345755B1 (en) * | 2007-09-11 | 2013-12-27 | 삼성전자주식회사 | Apparatus and method for controlling operation in a mobile terminal |
CN102117166A (en) * | 2009-12-31 | 2011-07-06 | 联想(北京)有限公司 | Electronic equipment, method for realizing prearranged operation instructions, and handset |
US9542097B2 (en) * | 2010-01-13 | 2017-01-10 | Lenovo (Singapore) Pte. Ltd. | Virtual touchpad for a touch device |
CN102750099B (en) * | 2012-06-20 | 2016-01-13 | 宇龙计算机通信科技(深圳)有限公司 | Terminal and terminal control method |
CN103677603B (en) * | 2012-09-17 | 2017-11-07 | 北京三星通信技术研究有限公司 | The method and apparatus of touch-sensitive area operation |
KR102015347B1 (en) * | 2013-01-07 | 2019-08-28 | 삼성전자 주식회사 | Method and apparatus for providing mouse function using touch device |
KR102131828B1 (en) * | 2013-12-03 | 2020-07-09 | 엘지전자 주식회사 | Terminal and method for controlling the same |
CN106254596A (en) * | 2016-09-29 | 2016-12-21 | 努比亚技术有限公司 | A kind of kneading identification system based on proximity transducer and mobile terminal |
- 2017-06-22 CN CN201710480689.6A patent/CN107291235B/en active Active
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||