CN108984093B - Touch operation method and device, storage medium and electronic equipment


Info

Publication number
CN108984093B
Authority
CN
China
Prior art keywords
touch
area
state
display screen
electronic device
Prior art date
Legal status
Active
Application number
CN201810688999.1A
Other languages
Chinese (zh)
Other versions
CN108984093A (en)
Inventor
宁梦琪
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810688999.1A
Publication of CN108984093A
Application granted
Publication of CN108984093B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning

Abstract

The application discloses a touch operation method and device, a storage medium and electronic equipment. The method is applied to electronic equipment; the electronic equipment comprises a side frame, the side frame comprises a plurality of sequentially adjacent areas, and each area is provided with a touch sensor. The touch operation method comprises the following steps: acquiring touch information of a user through the touch sensor; determining a target area, wherein the target area is the area where the touch sensor acquiring the touch information is located; acquiring an instruction corresponding to the touch information and the target area; and executing the operation indicated by the instruction. The application can improve the operation efficiency of the electronic equipment.

Description

Touch operation method and device, storage medium and electronic equipment
Technical Field
The application belongs to the technical field of terminals, and particularly relates to a touch operation method, a touch operation device, a storage medium and electronic equipment.
Background
With the rapid development of electronic technology, electronic devices such as smart phones have more and more functions. For example, a user may select a network connection mode, a standby mode, etc. of the electronic device, and the user may also use various applications on the electronic device to implement different functions.
During use of the electronic device, the user often needs to issue various instructions to it, and typically has to select the desired instruction from a system menu of the electronic device or a setting menu of an application installed on the electronic device. This way of selecting instructions is cumbersome and inefficient.
Disclosure of Invention
The embodiment of the application provides a touch operation method, a touch operation device, a storage medium and electronic equipment, which can improve the operation efficiency of the electronic equipment.
The embodiment of the application provides a touch operation method, which is applied to electronic equipment, wherein the electronic equipment comprises a side frame, the side frame comprises a plurality of areas which are adjacent in sequence, each area is provided with a touch sensor, and the touch operation method comprises the following steps:
acquiring touch information of a user through the touch sensor;
determining a target area, wherein the target area is an area where a touch sensor acquiring the touch information is located;
acquiring an instruction corresponding to the touch information and the target area;
and executing the operation indicated by the instruction.
An embodiment of the application provides a touch operation device, which is applied to electronic equipment. The electronic equipment comprises a side frame, the side frame comprises a plurality of sequentially adjacent areas, and each area is provided with a touch sensor. The touch operation device comprises:
the first acquisition module is used for acquiring touch information of a user through the touch sensor;
the determining module is used for determining a target area, wherein the target area is an area where a touch sensor for acquiring the touch information is located;
the second acquisition module is used for acquiring the instruction corresponding to the touch information and the target area;
and the execution module is used for executing the operation indicated by the instruction.
The embodiment of the present application provides a storage medium, on which a computer program is stored, and when the computer program is executed on a computer, the computer is enabled to execute the steps in the touch operation method provided by the embodiment of the present application.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor is configured to execute the steps in the touch operation method provided in the embodiment of the present application by calling the computer program stored in the memory.
In this embodiment, the electronic device partitions the side frame in advance, and touch sensors are disposed in the different areas, so that each area becomes a touch area. The electronic device can acquire the user's touch information on these touch areas. That is, the electronic device may acquire the user's touch information for the different touch areas on the side frame, acquire a corresponding instruction, and perform the operation indicated by the instruction. Because the operation to be completed no longer needs to be performed through a system menu or through the setting menu of an application installed on the electronic device, this embodiment improves the operation efficiency of the electronic device.
Drawings
The technical solution and the advantages of the present invention will be apparent from the following detailed description of the embodiments of the present invention with reference to the accompanying drawings.
Fig. 1 is a first flowchart illustrating a touch operation method according to an embodiment of the present disclosure.
Fig. 2 is a second flowchart of the touch operation method according to the embodiment of the present disclosure.
Fig. 3 is a third flowchart illustrating a touch operation method according to an embodiment of the present disclosure.
Fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Fig. 5 to 6 are scene schematic diagrams of a touch operation method according to an embodiment of the present disclosure.
Fig. 7 to 8 are schematic views of another scene of the touch operation method according to the embodiment of the present application.
Fig. 9 is a fourth flowchart illustrating a touch operation method according to an embodiment of the present disclosure.
Fig. 10 to 12 are schematic views of another scene of the touch operation method according to the embodiment of the present application.
Fig. 13 is a fifth flowchart illustrating a touch operation method according to an embodiment of the present disclosure.
Fig. 14 is a schematic structural diagram of a touch operation device according to an embodiment of the present disclosure.
Fig. 15 is another schematic structural diagram of a touch operation device according to an embodiment of the present disclosure.
Fig. 16 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Fig. 17 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Referring now to the drawings, in which like numerals represent like elements, the principles of the present invention are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the invention and should not be taken as limiting the invention with regard to other embodiments that are not detailed herein.
The terms "first," "second," "third," and the like in the description and in the claims of the present application and in the above-described drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the objects so described are interchangeable under appropriate circumstances. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, or apparatus, electronic device, system comprising a list of steps is not necessarily limited to those steps or modules or units explicitly listed, may include steps or modules or units not explicitly listed, and may include other steps or modules or units inherent to such process, method, apparatus, electronic device, or system.
Referring to fig. 1, fig. 1 is a first flowchart illustrating a touch operation method according to an embodiment of the present disclosure. The touch operation method can be applied to electronic equipment. The electronic device may be a device such as a smartphone or tablet computer. The electronic device may include a side bezel, which may include a plurality of sequentially adjacent regions, each region having a touch sensor disposed therein.
The flow of the touch operation method may include:
in 101, touch information of a user is acquired through a touch sensor.
With the rapid development of electronic technology, electronic devices such as smart phones have more and more functions. For example, a user may select a network connection mode, a standby mode, etc. of the electronic device, and the user may also use various applications on the electronic device to implement different functions. During the use of the electronic device by the user, various instructions are often required to be issued to the electronic device. And the user typically needs to select the desired instruction from a system menu of the electronic device or a setup menu of an application installed on the electronic device.
For example, so that the user can configure the electronic device to meet personalized requirements, the system menu of the electronic device is usually provided with a plurality of control instructions, such as the standby mode, the network connection mode, notification bar settings, reminder mode settings, storage space management, application permission settings, and the like. Similarly, so that the user can configure an application installed on the electronic device, the setting menu of the application typically provides various control instructions, such as new message reminders, font size, a do-not-disturb mode, and so on.
However, the operation of the electronic device by selecting the control command included in the system menu or the setting menu of the application is cumbersome and inefficient.
In 101 of the embodiment of the present application, for example, the electronic device may first acquire touch information of a user through a touch sensor disposed on a side frame.
It can be understood that each area of the side frame of the electronic device is a touch area because each area is provided with a touch sensor. When a user touches a touch area of the side frame, the touch sensor corresponding to the touch area can acquire touch information of the user.
In some embodiments, the touch operation represented by the touch information may be a single click, a double click, a long press, a swipe, a drag, a double press, or the like. It is to be understood that the present embodiment is not limited thereto.
In 102, a target area is determined, where the touch sensor acquiring the touch information is located.
For example, after acquiring the touch information of the user, the electronic device may determine a target area where the touch sensor acquiring the touch information is located. It will be appreciated that the target area may be one or more of a plurality of areas on the side frame.
In other words, in 102 the electronic device needs to determine in which area on the side frame the user's touch operation occurred.
At 103, an instruction corresponding to the touch information and the target area is acquired.
At 104, the operation indicated by the instruction is performed.
For example, after the target area is determined, the electronic device may acquire an instruction corresponding to the touch information of the user and the target area, and perform an operation indicated by the instruction.
It can be understood that, in this embodiment, the electronic device divides the side frame in advance, and touch sensors are disposed in the different areas, so that each area becomes a touch area. The electronic device can acquire the user's touch information on these touch areas. That is, the electronic device may acquire the user's touch information for the different touch areas on the side frame, acquire a corresponding instruction, and perform the operation indicated by the instruction. Because the operation to be completed no longer needs to be performed through a system menu or through the setting menu of an application installed on the electronic device, this embodiment improves the operation efficiency of the electronic device.
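To make the flow above concrete, the following minimal Kotlin sketch shows one possible way to wire the touch information, the target area, and an instruction lookup together. The type names (FrameArea, TouchInfo, SideFrameTouchHandler), the gesture strings, and the map-based instruction table are illustrative assumptions and are not prescribed by the patent.

```kotlin
// A minimal sketch (not from the patent) of the flow in 101-104, assuming
// hypothetical names: FrameArea, TouchInfo, and a map-based instruction table.
enum class FrameArea { FIRST, SECOND, THIRD }

data class TouchInfo(val area: FrameArea, val gesture: String)

class SideFrameTouchHandler(
    // Instruction table keyed by (target area, gesture); values are the operations to execute.
    private val instructionTable: Map<Pair<FrameArea, String>, () -> Unit>
) {
    // 101/102: the sensor callback already identifies the target area that produced the touch.
    fun onTouch(info: TouchInfo) {
        // 103: acquire the instruction corresponding to the touch information and the target area.
        val instruction = instructionTable[info.area to info.gesture] ?: return
        // 104: execute the operation indicated by the instruction.
        instruction()
    }
}

fun main() {
    // Usage: double-clicking the third area might, for instance, trigger a screenshot.
    val handler = SideFrameTouchHandler(
        mapOf((FrameArea.THIRD to "double_click") to { println("take screenshot") })
    )
    handler.onTouch(TouchInfo(FrameArea.THIRD, "double_click"))
}
```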
Referring to fig. 2, fig. 2 is a second flowchart illustrating a touch operation method according to an embodiment of the present disclosure. The touch operation method can be applied to electronic equipment. The electronic device may include a side bezel including a plurality of regions, each region having a touch sensor disposed therein.
The flow of the touch operation method may include:
in 201, the electronic device acquires touch information of a user through a touch sensor.
For example, the electronic device may first acquire touch information of the user through a touch sensor disposed on the side frame.
At 202, the electronic device determines a target area, which is an area where the touch sensor acquiring the touch information is located.
For example, after acquiring the touch information of the user, the electronic device may determine a target area where the touch sensor acquiring the touch information is located. It will be appreciated that the target area is one or more of a plurality of areas on the side frame.
At 203, the electronic device determines a display screen state, where the display screen state includes a screen-off state and a screen-on state, the screen-on state includes a locked state and an unlocked state, and the unlocked state includes an application interface state and a desktop state.
For example, after determining the target area, the electronic device may determine a display screen status thereof. The display screen state may include a screen-off state and a screen-on state. The bright screen state in turn comprises a locked state of the screen and an unlocked state of the screen. The unlocked state of the screen, in turn, includes an application interface state and a desktop state.
It should be noted that the application interface state refers to a state in which the display screen currently displays an operation interface of the application program, and the desktop state refers to a state in which the display screen currently displays a system desktop of the electronic device.
At 204, the electronic device obtains instructions corresponding to the touch information, the target area, and the display screen state.
In 205, the electronic device performs the operation indicated by the instruction.
For example, after determining the current display screen state of the display screen, the electronic device may obtain an instruction corresponding to the touch information, the target area, and the display screen state, and execute an operation indicated by the instruction.
It can be understood that, in this embodiment, operations on the electronic device can be carried out according to the different touch operations that the user triggers on different touch areas of the side frame and according to the display screen state. This not only improves the efficiency of operating the electronic device, but also associates the touch operation with the display screen state; the embodiment therefore increases the variety of possible touch operations and makes their implementation more flexible.
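As a rough illustration of how the display screen state can be folded into the lookup of 203 and 204, the sketch below models the states as a small Kotlin hierarchy and keys the instruction table by (gesture, area, state). All names (ScreenState, InstructionKey, resolveInstruction) are assumptions for illustration only.

```kotlin
// A sketch of the display-screen states distinguished in 203, plus a lookup keyed by
// (touch gesture, target area, screen state). All type and value names are illustrative.
sealed interface ScreenState {
    object ScreenOff : ScreenState
    object Locked : ScreenState        // screen on, still locked
    object Desktop : ScreenState       // screen on, unlocked, system desktop shown
    object AppInterface : ScreenState  // screen on, unlocked, an application interface shown
}

// 204: the instruction now depends on the gesture, the area index (1..3) and the screen state.
typealias InstructionKey = Triple<String, Int, ScreenState>

fun resolveInstruction(
    table: Map<InstructionKey, () -> Unit>,
    gesture: String,
    area: Int,
    state: ScreenState
): (() -> Unit)? = table[Triple(gesture, area, state)]
```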
Referring to fig. 3, fig. 3 is a third flowchart illustrating a touch operation method according to an embodiment of the present disclosure. The touch operation method can be applied to electronic equipment. As shown in fig. 4, the electronic device 100 may include a side frame 10, which includes a first region 11, a second region 12, and a third region 13 that are adjacent to each other in sequence from top to bottom. Touch sensors are provided in each of the first region 11, the second region 12, and the third region 13, so that each of them becomes a touch region. It should be understood that dividing the side frame into three regions here is only an example and is not a limitation of this embodiment; in other embodiments, the side frame may be divided into more or fewer regions.
The flow of the touch operation method may include:
in 301, the electronic device acquires touch information of a user through a touch sensor.
In 302, the electronic device determines a target area, where the touch sensor acquiring the touch information is located.
In 303, the electronic device determines a display screen status.
For example, 301, 302, and 303 may include:
the user triggers a touch operation on a side frame of the electronic equipment. At this time, the electronic device may acquire the touch information of the user through the touch sensor of the side frame of the electronic device.
Then, the electronic device may determine the target area where the touch sensor that acquired the user's touch information is located. The target region may be one or more of the first region, the second region, and the third region.
And the electronic equipment can also determine the display screen state of the display screen at the moment. The display screen state may include a screen-off state and a screen-on state. The bright screen state in turn comprises a locked state of the screen and an unlocked state of the screen. The unlocked state of the screen, in turn, includes an application interface state and a desktop state.
In 304, when the touch operation represented by the touch information is pressing, the pressing duration reaches a preset duration, and the display screen state is the unlocked state, the electronic device determines a target application associated with the target area.
In 305, the electronic device obtains an instruction for instructing to open a target application and allowing a user to access all data of the target application.
For example, 304 and 305 may include:
the touch operation represented by the touch information of the user is a pressing operation, and the pressing time length of the pressing operation reaches a preset time length, namely the touch operation is a long-pressing operation. At this time, if the display screen state of the electronic device is the unlocked state in the bright screen, the electronic device may determine the application associated with the target area, i.e., the target application.
Thereafter, the electronic device may obtain an instruction indicating to open the target application and allow the user to access all data of the target application. Then, 308 is entered.
For example, the electronic device presets that the application associated with the first area is camera application A, the application associated with the second area is instant messaging application B, and the application associated with the third area is browser application C.
Then, when the display screen is in the unlocked state with the screen on, if the user presses the first area for a long time, the electronic device may open camera application A. At this time, the electronic device allows the user to access all data in camera application A; for example, the electronic device allows the user to view all photos in the camera album.
For example, suppose the camera album currently contains photos H, I, J, K, and L. When the display screen is in the unlocked state, the user presses the first area for a long time, as shown in fig. 5, and the electronic device may open camera application A. After opening camera application A, the user takes a photo M. At this point the electronic device may allow the user to view all of the photos H, I, J, K, L, and M. For example, as shown in fig. 6, when the user clicks the "album" button, the electronic device may present all of the photos in the album to the user for viewing.
For another example, when the display screen is in the unlocked state with the screen on, if the user presses the second area for a long time, the electronic device may open instant messaging application B. At this time, the electronic device allows the user to access all data in instant messaging application B; for example, the electronic device allows the user to view all messages in instant messaging application B.
In 306, when the touch operation represented by the touch information is pressing, the pressing duration reaches a preset duration, and the display screen state is the screen-off state or the locked state, the electronic device determines a target application associated with the target area.
In 307, the electronic device obtains an instruction for instructing to open the target application and denying the user access to data generated by the target application before the target application is opened this time.
For example, 306 and 307 may include:
the touch operation represented by the touch information of the user is a pressing operation, and the pressing time length of the pressing operation reaches a preset time length, namely the touch operation represented by the touch information is a long-pressing operation. At this time, if the display screen state of the electronic device is a screen-saving state or a locked state, the electronic device may determine an application associated with the target area, that is, a target application.
The electronic device may then obtain an instruction, where the instruction is used to instruct the target application to be opened, but deny the user access to the data generated by the target application before the target application is opened. Then, 308 is entered.
For example, the electronic device presets that the application associated with the first area is camera application A, the application associated with the second area is instant messaging application B, and the application associated with the third area is browser application C.
Then, when the display screen is in the screen-off state, or in the locked state with the screen on, if the user presses the first area for a long time, the electronic device may open camera application A; however, at this time the electronic device only allows the user to view the pictures taken after the camera application is opened this time, and denies the user access to the other pictures in the camera album.
For example, suppose the camera album currently contains photos H, I, J, K, and L. When the display screen is in the screen-off state, the user presses the first area for a long time, as shown in fig. 7, and the electronic device may open camera application A. After opening camera application A, the user takes a photo N. At this point the electronic device only allows the user to view photo N and not photos H, I, J, K, and L. For example, as shown in fig. 8, when the user clicks the "album" button, the electronic device may present only photo N in the album to the user for viewing.
It should be noted that, in an embodiment, if the user presses a certain area of the side frame for a long time when the electronic device is in the screen-off state or the screen-locked state, the electronic device may open the target application corresponding to that area while denying the user access to data generated by the target application before it is opened this time, and the electronic device does not perform an unlocking operation on the screen in this process. For example, when the display screen is in the screen-off state, the user presses the first area for a long time, and the electronic device may open camera application A. After opening camera application A, the user takes a photo N. At this point, the electronic device only allows the user to view photo N, not the other photos, and if the user exits the camera application, the electronic device is still in the screen-locked state.
At 308, the electronic device performs the operation indicated by the instruction.
For example, after acquiring the instruction, the electronic device may perform the operation indicated by the instruction.
In one embodiment, the electronic device may allow the user to customize which application is opened when each area of the side frame is long-pressed.
It can be understood that, in this embodiment, the electronic device may open different applications corresponding to different areas according to a long-press operation of a user on different areas of the side frame, and the electronic device may give different data access permissions to the user according to a state of the display screen when the long-press operation is triggered. Therefore, the embodiment can not only improve the operation efficiency of the electronic device, but also improve the data security of the electronic device.
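A minimal sketch of this long-press behaviour follows: the same area always maps to the same target application, while the screen state at the moment of the press decides whether full data access or only session-scoped access is granted. The names (DataAccess, OpenAppInstruction, onLongPress) and the area-to-application mapping are illustrative assumptions, not part of the patent.

```kotlin
// A sketch of the long-press behaviour in 304-307: the same target area opens the same
// associated application, but the data-access permission depends on the screen state at
// the moment of the long press. Names and mappings are illustrative only.
enum class DataAccess { FULL, SESSION_ONLY }

data class OpenAppInstruction(val packageName: String, val access: DataAccess)

fun onLongPress(
    area: Int,
    screenUnlocked: Boolean,
    appForArea: Map<Int, String>          // e.g. 1 -> camera, 2 -> messaging, 3 -> browser
): OpenAppInstruction? {
    val pkg = appForArea[area] ?: return null
    return if (screenUnlocked) {
        // 304/305: unlocked screen -> open the app and allow access to all of its data.
        OpenAppInstruction(pkg, DataAccess.FULL)
    } else {
        // 306/307: screen off or locked -> open the app, but only data produced in this
        // session (e.g. the photo just taken) may be viewed; the screen stays locked.
        OpenAppInstruction(pkg, DataAccess.SESSION_ONLY)
    }
}
```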
Referring to fig. 9, fig. 9 is a fourth flowchart illustrating a touch operation method according to an embodiment of the present disclosure. The touch operation method can be applied to electronic equipment. As shown in fig. 4, the electronic device 100 may include a side frame 10, which includes a first region 11, a second region 12, and a third region 13 that are adjacent to each other in sequence from top to bottom. Touch sensors are provided in each of the first, second, and third regions 11, 12, and 13 so that each of the first, second, and third regions 11, 12, and 13 becomes a touch region.
The flow of the touch operation method may include:
in 401, the electronic device acquires touch information of a user through a touch sensor.
In 402, the electronic device determines a target area, where the touch sensor acquiring the touch information is located.
In 403, the electronic device determines a display screen status.
For example, 401, 402, and 403 may include:
the user triggers a touch operation on a side frame of the electronic equipment. At this time, the electronic device may acquire the touch information of the user through the touch sensor of the side frame of the electronic device.
Then, the electronic device may determine the target area where the touch sensor that acquired the user's touch information is located. The target region may be one or more of the first region, the second region, and the third region.
And the electronic equipment can also determine the display screen state of the display screen at the moment. The display screen state may include a screen-off state and a screen-on state. The bright screen state in turn comprises a locked state of the screen and an unlocked state of the screen. The unlocked state of the screen, in turn, includes an application interface state and a desktop state.
In 404, when the display screen state is the application interface state and the touch operation represented by the touch information is a single click or a combination of the single click and sliding, the electronic device acquires the identifier of the current application.
At 405, the electronic device obtains an instruction corresponding to the identifier of the current application and the target area.
At 406, the electronic device performs the operation indicated by the instruction.
For example, 404, 405, 406 may include:
the touch operation represented by the touch information of the user is clicking or a combination of clicking and sliding, and if the display screen state of the display screen is an application interface state at the moment, namely the display screen currently displays an application running interface, the electronic device can acquire the identifier of the current application. Then, the electronic device may obtain an instruction corresponding to the identifier and the target area of the current application, and execute an operation indicated by the instruction. In other words, if the user triggers a single click or a combination of a single click and a sliding action on the side frame, and the display screen currently displays an application running interface, the electronic device may execute a certain shortcut operation of the current application, and the specific type of shortcut operation is associated with the area corresponding to the touch operation of the user.
For example, a user opens a camera application, and at this time, a display screen of the electronic device displays a running interface of the camera application, such as a browsing interface of a camera album or a setting interface of camera shooting parameters (i.e., the display screen state is an application interface state). If the user clicks the first area, the electronic device may acquire a jump-to-photograph interface instruction associated with the camera application and the first area, and enter a photograph interface. If the user clicks the second area, the electronic device may obtain a filter selection interface opening instruction associated with the camera application and the second area, and open the filter selection interface. If the user clicks the third area, the electronic device may acquire a switch mode entering instruction associated with the camera application and the third area, and enter a switch mode, at which point the user may control the camera to switch between photographing and recording video.
For another example, the user opens a music application, and at this time the display screen of the electronic device displays an operation interface of the music application, such as a song playing interface (i.e., the display screen state is the application interface state). If the user clicks on the first area, the electronic device performs the operation of playing the previous song, as shown in fig. 10. If the user clicks the second area, the electronic device may perform the operation of pausing playback, as shown in fig. 11. If the user clicks on the third area, the electronic device may perform the operation of playing the next song, as shown in fig. 12.
In another embodiment, if the user has opened the music application, the electronic device may also perform the operation of playing the previous song if the user clicks the first area when the display screen is in the screen-off state. If the user clicks the second area, the electronic device may perform the operation of pausing playback. If the user clicks the third area, the electronic device may perform the operation of playing the next song.
For another example, the user opens a video application, and at this time the display screen of the electronic device displays a running interface of the video application, such as a video playing interface (i.e., the display screen state is the application interface state). If the user clicks the first area and performs a sliding operation in the first area, the electronic device may perform a volume adjustment operation. For example, if the user clicks on the first region and slides up in the first region, the electronic device may turn up the current volume; if the user clicks the first region and slides down in the first region, the electronic device may turn down the current volume. If the user clicks the second area, the electronic device may perform the operation of pausing playback. If the user clicks the third area, the electronic device may perform a brightness adjustment operation. For example, if the user clicks on the third region and slides up in the third region, the electronic device may turn up the current display screen brightness; if the user clicks the third region and slides down in the third region, the electronic device may turn down the current display screen brightness.
For another example, the user opens an application such as contacts, an alarm clock, a memo, or a calendar, and the display screen of the electronic device displays the running interface of the current application (i.e., the display screen state is the application interface state). If the user clicks the second area, the electronic device performs the operation of creating a new item; for example, the electronic device may create a contact, an alarm, a memo, a date-related item, or the like. If the user clicks the first area, the electronic device may perform an editing operation, such as editing the information of the currently selected contact or editing the currently selected memo item. If the user clicks the third area, the electronic device may perform the operation of switching tab pages. For example, suppose the current application is the memo, and the memo includes three memo items, i.e., memo item 1, memo item 2, and memo item 3, which exist in the form of different tab pages. If memo item 1 is currently selected and the user clicks the third area, the electronic device may switch from memo item 1 to memo item 2.
For another example, the user opens the sound recording or short message application, and at this time the display screen of the electronic device displays the running interface of that application (i.e., the display screen state is the application interface state). If the user clicks the second area, the electronic device can perform the operation of creating a new item; for example, the electronic device creates a new short message. If the user clicks the third area, the electronic device may perform an editing operation; for example, the electronic device edits the currently selected short message.
In another example, the user opens the album application, and the display screen of the electronic device is in the application interface state. If the user clicks the second area, the electronic device may switch the album to a year/month view. If the user is currently viewing a photograph and clicks the second area, the electronic device may perform a photo editing operation.
As another example, if the user is currently using an input method application and clicks the second area, the electronic device may perform an operation of switching between different voices or emoticons.
For another example, the user opens an electronic book application, and at this time the display screen of the electronic device displays an electronic book reading interface (i.e., the display screen state is the application interface state). If the user clicks the first area and then slides up or down, the electronic device performs the operation of adjusting the font size. If the user clicks the second area, the electronic device may perform a page turning operation. If the user clicks the third area and then slides up or down, the electronic device can perform the operation of adjusting the page brightness.
It can be understood that, in this embodiment, when the display screen of the electronic device currently displays an application running interface, if the user clicks an area on the side frame, the electronic device may execute the application shortcut function corresponding to that area, such as pausing playback or playing the previous or next item, thereby improving the operation efficiency of the electronic device.
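The per-application shortcut lookup of 404 and 405 can be sketched as a table keyed by (current application identifier, target area), as below. The application identifiers and action strings are invented examples; the patent does not define any specific identifiers.

```kotlin
// A sketch of 404/405: in the application interface state, the instruction is looked up by
// (identifier of the current application, target area). The identifiers and action names
// below are invented examples, not values defined by the patent.
val appShortcuts: Map<Pair<String, Int>, String> = mapOf(
    ("com.example.camera" to 1) to "jump_to_capture_interface",
    ("com.example.camera" to 2) to "open_filter_picker",
    ("com.example.camera" to 3) to "toggle_photo_video_mode",
    ("com.example.music" to 1) to "play_previous_song",
    ("com.example.music" to 2) to "pause_playback",
    ("com.example.music" to 3) to "play_next_song"
)

fun shortcutFor(currentAppId: String, area: Int): String? = appShortcuts[currentAppId to area]
```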
Referring to fig. 13, fig. 13 is a fifth flowchart illustrating a touch operation method according to an embodiment of the present disclosure. The touch operation method can be applied to electronic equipment. As shown in fig. 4, the electronic device 100 may include a side frame 10, which includes a first region 11, a second region 12, and a third region 13 that are adjacent to each other in sequence from top to bottom. Touch sensors are provided in each of the first, second, and third regions 11, 12, and 13 so that each of the first, second, and third regions 11, 12, and 13 becomes a touch region.
The flow of the touch operation method may include:
in 501, the electronic device acquires touch information of a user through a touch sensor.
In 502, the electronic device determines a target area, which is an area where the touch sensor acquiring the touch information is located.
At 503, the electronic device determines a display screen status.
For example, 501, 502, and 503 may include:
the user triggers a touch operation on a side frame of the electronic equipment. At this time, the electronic device may acquire the touch information of the user through the touch sensor of the side frame of the electronic device.
Then, the electronic device may determine the target area where the touch sensor that acquired the user's touch information is located. The target region may be one or more of the first region, the second region, and the third region.
And the electronic equipment can also determine the display screen state of the display screen at the moment. The display screen state may include a screen-off state and a screen-on state. The bright screen state in turn comprises a locked state of the screen and an unlocked state of the screen. The unlocked state of the screen, in turn, includes an application interface state and a desktop state.
At 504, the electronic device obtains an instruction corresponding to the touch information, the target area, and a display screen state, where the display screen state is an unlocked state, and the touch operation represented by the touch information is a double click.
For example, the electronic device may obtain an instruction corresponding to the touch information, the target area, and the display screen state, where the display screen state is the unlocked state and the touch operation represented by the touch information is a double click. That is, when the user triggers a double-click operation on the side frame and the display screen is in the unlocked state with the screen on, the electronic device can acquire the instruction corresponding to the target area of the double-click operation, regardless of whether the display screen shows the desktop or an application running interface. The flow then proceeds to 508, where the operation indicated by the instruction is executed.
For example, the display screen of the electronic device is currently in the unlocked state with the screen on. If the user double-clicks the first area, the electronic device may perform an operation of cleaning up background processes. If the user double-clicks the second area, the electronic device may perform an operation of starting the voice assistant. If the user double-clicks the third area, the electronic device may execute the screenshot function.
In 505, when the display screen state is the unlock state and the target area is the second area, the electronic device acquires an instruction for instructing to open the control panel if the touch operation indicated by the touch information is a slide-up, and acquires an instruction for instructing to open the notification panel if the touch operation indicated by the touch information is a slide-down.
For example, when the display screen is in the unlocked state with the screen on, that is, regardless of whether the display screen shows the desktop or an application running interface, if the user performs an up-sliding operation in the second area, the electronic device may obtain an instruction for instructing to open the control panel. The control panel includes various control functions of the electronic device, such as call-up mode settings, application permissions, and the like. If the user performs a down-sliding operation in the second area, the electronic device may acquire an instruction for instructing to open the notification panel. The notification panel includes notification information such as unread messages received by each application of the electronic device.
Then, the process proceeds to 508, where the electronic device performs the operation indicated by the instruction, for example, turning on a control panel or turning on a notification panel.
At 506, when the display screen state is the unlocked state and the target area is the first area, if the touch action indicated by the touch information is sliding, the electronic device acquires an instruction for instructing to perform volume adjustment.
For example, when the display screen is in the unlocked state with the screen on, that is, regardless of whether the display screen shows the desktop or an application running interface, if the user performs a sliding operation in the first area, the electronic device may obtain an instruction for instructing volume adjustment. The process then proceeds to 508, where the electronic device performs the operation indicated by the instruction: for example, sliding up in the first area turns the volume up, and sliding down in the first area turns the volume down.
In 507, when the display screen state is the unlocked state and the target area is the third area, if the touch operation indicated by the touch information is sliding, the electronic device obtains an instruction for instructing brightness adjustment.
For example, when the display screen is in the unlocked state with the screen on, that is, regardless of whether the display screen shows the desktop or an application running interface, if the user performs a sliding operation in the third area, the electronic device may obtain an instruction for instructing brightness adjustment. The process then proceeds to 508, where the electronic device performs the operation indicated by the instruction: for example, sliding up in the third area turns the brightness up, and sliding down in the third area turns the brightness down.
At 508, the electronic device performs the operation indicated by the instruction.
For example, after acquiring the instruction, the electronic device may perform the operation indicated by the instruction.
It can be understood that, in the embodiments of the application, the user can trigger different touch information through different areas on the side frame of the electronic device, so that a number of quick touch operations are realized without occupying on-screen areas or physical keys. This embodiment therefore not only improves the operation efficiency of the electronic device, but also avoids adding to the burden on the screen or the physical keys.
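The system-wide gestures of 504 to 507 can be summarised in a single dispatch function, sketched below. The gesture strings and action names are illustrative assumptions.

```kotlin
// A sketch of the system-wide gestures in 504-507 (screen unlocked, desktop or any app).
// The gesture strings and action names are illustrative assumptions.
fun globalGestureAction(area: Int, gesture: String): String? = when {
    gesture == "double_click" && area == 1 -> "clean_up_background_processes"
    gesture == "double_click" && area == 2 -> "start_voice_assistant"
    gesture == "double_click" && area == 3 -> "take_screenshot"
    gesture == "slide_up" && area == 2 -> "open_control_panel"
    gesture == "slide_down" && area == 2 -> "open_notification_panel"
    gesture.startsWith("slide") && area == 1 -> "adjust_volume"      // up = louder, down = quieter
    gesture.startsWith("slide") && area == 3 -> "adjust_brightness"  // up = brighter, down = dimmer
    else -> null
}
```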
In an implementation, this embodiment may further include the following steps:
the method comprises the steps that the electronic equipment obtains a plurality of area values in advance, and a preset area threshold value is determined according to the area values, wherein the area values are areas of mutual contact between a touch area and a user finger when a user clicks the touch area corresponding to the touch sensor each time;
when the acquired touch information indicates that a user triggers a click action on the touch area, if the area corresponding to the click action reaches a preset area threshold, the electronic equipment determines that the click action is a single click.
For example, the electronic device may guide the user in advance to click a touch area on the side frame, and obtain the contact area between the touch area and the user's finger for each click, thereby obtaining a plurality of area values. The electronic device may then determine the preset area threshold from the plurality of area values.
For example, the electronic device guides the user to make 5 clicks, the area values of the 5 clicks being 8.1, 8.3, 8.2, 8.0, respectively. Then, the electronic device may calculate an average of the 5 area values and determine the average as a preset area threshold. For example, if the average of these 5 area values is 8.18, then the preset area threshold may be 8.18.
After the preset area threshold is obtained, when a user triggers a click action on the touch area of the side frame, the electronic device determines the click action as a click only when detecting that the area corresponding to the click action reaches the preset area threshold.
Similarly, when the acquired touch information indicates that the user triggers a click action in the touch area twice in succession, the electronic device determines the two consecutive click actions to be a double-click operation only when the area corresponding to each of the two click actions reaches the preset area threshold.
It can be understood that this can avoid erroneous touch operations caused by accidental clicks by the user.
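A minimal sketch of this calibration follows, assuming the preset area threshold is simply the average of the guided sample taps (as in the example above); the class name and the unit of the values are assumptions.

```kotlin
// A sketch of the click calibration described above: the preset area threshold is derived
// from several guided sample taps (here simply their average), and a later tap counts as a
// click only if its contact area reaches that threshold.
class ClickClassifier(calibrationAreas: List<Double>) {
    private val presetAreaThreshold = calibrationAreas.average()

    fun isClick(contactArea: Double): Boolean = contactArea >= presetAreaThreshold
}

// Usage with sample contact areas such as those in the text (8.1, 8.3, 8.2, 8.0, ...).
val clickClassifier = ClickClassifier(listOf(8.1, 8.3, 8.2, 8.0))
```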
In an implementation, this embodiment may further include the following steps:
the electronic device acquires a plurality of track length values in advance and determines a preset length threshold according to the plurality of track length values, where each track length value is the length of the track each time the user performs a sliding operation in the touch area corresponding to the touch sensor;
when the acquired touch information indicates that a user triggers one touch operation on the touch area, if the touch track corresponding to the touch operation reaches the preset length threshold, the electronic equipment determines that the touch operation is a sliding operation.
For example, the electronic device may guide the user to perform a sliding operation in a touch area on the side frame in advance, and obtain a length value of a track of each sliding operation of the user, so as to obtain a plurality of track length values. The electronic device may then determine a preset length threshold based on the plurality of track length values.
For example, the electronic device guides the user to perform 5 sliding operations, and the track lengths of the 5 sliding operations are 10.5, 10.6, 10.7, 10.8, and 10.9, respectively. The electronic device may then calculate the average of these 5 track length values and determine this average as the preset length threshold. For example, if the average of the 5 track length values is 10.7, then the preset length threshold may be 10.7.
After the preset length threshold is obtained, when a user triggers a touch action on the touch area of the side frame, the electronic device determines the touch action as a sliding operation only when detecting that the touch track corresponding to the touch action reaches the preset length threshold.
It can be understood that this can avoid erroneous touch operations caused by accidental sliding by the user.
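The slide calibration can be sketched in the same way, with the preset length threshold taken as the average of the guided swipe track lengths; again, the class name is an assumption.

```kotlin
// The companion sketch for the slide calibration: the preset length threshold is the average
// of several guided swipe track lengths, and a touch counts as a slide only when its track
// reaches that threshold.
class SlideClassifier(calibrationTrackLengths: List<Double>) {
    private val presetLengthThreshold = calibrationTrackLengths.average()

    fun isSlide(trackLength: Double): Boolean = trackLength >= presetLengthThreshold
}

// With the sample track lengths 10.5, 10.6, 10.7, 10.8 and 10.9 the threshold is their average, 10.7.
val slideClassifier = SlideClassifier(listOf(10.5, 10.6, 10.7, 10.8, 10.9))
```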
In some embodiments, the electronic device may allow the user to customize the instructions corresponding to some of the touch operations, or to disable the instructions corresponding to some of the touch operations.
In some embodiments, in addition to the single-click, double-click, slide, long-press, and other touch actions provided in this embodiment, the user may also customize other touch actions, such as a triple click, a double press, or a drag, as well as the touch operations that these touch actions are to trigger.
Referring to fig. 14, fig. 14 is a schematic structural diagram of a touch operation device according to an embodiment of the present disclosure. The touch operation device is applied to electronic equipment; the electronic equipment comprises a side frame, the side frame comprises a plurality of sequentially adjacent areas, and each area is provided with a touch sensor.
The touch operation device 600 may include: a first obtaining module 601, a determining module 602, a second obtaining module 603, and an executing module 604.
The first obtaining module 601 is configured to obtain touch information of a user through the touch sensor.
A determining module 602, configured to determine a target area, where the touch sensor that acquires the touch information is located.
A second obtaining module 603, configured to obtain an instruction corresponding to the touch information and the target area.
And the execution module 604 is used for executing the operation indicated by the instruction.
In one embodiment, the determining module 602 may be further configured to: determine the display screen state of the electronic device, where the display screen state includes a screen-off state and a screen-on state, the screen-on state includes a locked state and an unlocked state, and the unlocked state includes an application interface state and a desktop state.
Then, the second obtaining module 603 may be configured to obtain an instruction corresponding to the touch information, the target area, and the display screen state.
In one embodiment, the second obtaining module 603 may be configured to:
when the display screen state is the unlocked state, the touch operation represented by the touch information is pressing, and the pressing duration reaches a preset duration, determining a target application associated with the target area;
and acquiring an instruction, wherein the instruction is used for indicating to start the target application and allowing a user to access all data of the target application.
In one embodiment, the second obtaining module 603 may be configured to:
when the display screen state is the screen-off state or the locked state, the touch operation represented by the touch information is pressing, and the pressing duration reaches a preset duration, determining a target application associated with the target area;
and acquiring an instruction, wherein the instruction is used for indicating to open the target application and denying the user access to data generated by the target application before the target application is opened this time.
In one embodiment, the second obtaining module 603 may be configured to:
when the display screen state is an application interface state and the touch operation represented by the touch information is clicking or a combination of clicking and sliding, acquiring the identifier of the current application;
and acquiring the instruction corresponding to the identifier of the current application and the target area.
In one embodiment, the second obtaining module 603 may be configured to:
and acquiring an instruction corresponding to the touch information, the target area and the display screen state, wherein the display screen state is an unlocking state, and the touch operation represented by the touch information is double-click.
In one embodiment, the side frame comprises a first area, a second area and a third area which are adjacent in sequence from top to bottom; the second obtaining module 603 may be configured to:
when the display screen is in an unlocked state and the target area is the second area, if the touch operation represented by the touch information is upward sliding, acquiring an instruction for indicating to start a control panel;
when the display screen state is an unlocked state and the target area is the second area, if the touch operation represented by the touch information is downward sliding, acquiring an instruction for indicating to start a notification panel;
when the display screen state is an unlocked state and the target area is the first area, if the touch operation represented by the touch information is sliding, acquiring an instruction for indicating volume adjustment;
and when the display screen state is an unlocked state and the target area is the third area, if the touch operation represented by the touch information is sliding, acquiring an instruction for indicating brightness adjustment.
Referring to fig. 15, fig. 15 is another schematic structural diagram of a touch operation device according to an embodiment of the present disclosure. In an embodiment, the touch operation device 600 may further include: a setup module 605.
The setting module 605 is configured to obtain a plurality of area values in advance and determine a preset area threshold according to the plurality of area values, where each area value is the contact area between the user's finger and the touch area corresponding to the touch sensor each time the user clicks that touch area.
Then, the first obtaining module 601 may be configured to: when the acquired touch information indicates that a user triggers a click action on the touch area, if the area corresponding to the click action reaches a preset area threshold value, the click action is determined to be a single click.
In one embodiment, the first obtaining module 601 may be configured to: when the acquired touch information indicates that the user triggers the click action on the touch area twice in succession, if the area corresponding to each click action reaches the preset area threshold, determine the two click actions to be a double click.
In another embodiment, the setting module 605 may be configured to:
the method comprises the steps of obtaining a plurality of track length values in advance, and determining a preset length threshold according to the plurality of track length values, wherein the track length values are length values of tracks when a user performs sliding operation in a touch area corresponding to a touch sensor every time.
Then, the first obtaining module 601 may be configured to: when the acquired touch information indicates that a user triggers one touch operation on the touch area, if the touch track corresponding to the touch operation reaches the preset length threshold, determining that the touch operation is a sliding operation.
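Similarly, the track-length test could look like the sketch below, which measures a Euclidean path length over sampled touch points and compares it with a threshold derived from earlier samples; the Point type, the use of the mean sample length as the preset threshold, and the function names are assumptions for the example only.

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

// Total Euclidean length of a sampled touch track.
fun pathLength(track: List<Point>): Float =
    track.zipWithNext { a, b -> hypot(b.x - a.x, b.y - a.y) }.sum()

// Derive the preset length threshold from previously recorded slide lengths (here: their mean).
fun presetLengthThreshold(sampleLengths: List<Float>): Float =
    if (sampleLengths.isEmpty()) 0f else sampleLengths.sum() / sampleLengths.size

// A touch is treated as a sliding operation when its track length reaches the threshold.
fun isSlide(track: List<Point>, threshold: Float): Boolean =
    pathLength(track) >= threshold

fun main() {
    val threshold = presetLengthThreshold(listOf(40f, 60f, 50f))       // 50
    val track = listOf(Point(0f, 0f), Point(0f, 30f), Point(0f, 65f))  // length 65
    println(isSlide(track, threshold))                                 // true
}
```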
The embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed on a computer, the computer is caused to execute the steps in the touch operation method provided in this embodiment.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor is configured to execute the steps in the touch operation method provided in this embodiment by calling the computer program stored in the memory.
For example, the electronic device may be a mobile terminal such as a tablet computer or a smart phone. Referring to fig. 16, fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
The electronic device 700 may include components such as a side frame 701, a memory 702, a processor 703, and the like. Those skilled in the art will appreciate that the structure shown in fig. 16 does not constitute a limitation of the electronic device, which may include more or fewer components than those shown, combine some components, or arrange the components differently.
The side frame 701 may include a plurality of regions adjacent in sequence, each region having a touch sensor disposed therein.
The memory 702 may be used to store applications and data. The memory 702 stores applications containing executable code. The application programs may constitute various functional modules. The processor 703 executes various functional applications and data processing by running an application program stored in the memory 702.
The processor 703 is the control center of the electronic device; it connects various parts of the entire electronic device by using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing the application program stored in the memory 702 and calling data stored in the memory 702, thereby monitoring the electronic device as a whole.
In this embodiment, the processor 703 in the electronic device loads the executable code corresponding to the process of one or more application programs into the memory 702 according to the following instructions, and the processor 703 runs the application programs stored in the memory 702, thereby implementing the steps:
acquiring touch information of a user through the touch sensor; determining a target area, wherein the target area is an area where a touch sensor acquiring the touch information is located; acquiring an instruction corresponding to the touch information and the target area; and executing the operation indicated by the instruction.
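The four steps just listed can be pictured as a small pipeline. The Kotlin sketch below is a rough, non-authoritative model of that flow; the TouchInfo type, the instruction table, and all identifiers are hypothetical stand-ins for whatever the device firmware actually exposes.

```kotlin
// Step 1: touch information as reported by a side-frame touch sensor (hypothetical shape).
data class TouchInfo(val sensorId: Int, val gesture: String)

// Step 2: the target area is the area whose sensor produced the touch information.
fun determineTargetArea(info: TouchInfo): Int = info.sensorId

// Step 3: look up the instruction corresponding to the touch information and the target area.
fun lookUpInstruction(info: TouchInfo, area: Int, table: Map<Pair<Int, String>, String>): String? =
    table[area to info.gesture]

// Step 4: execute the operation indicated by the instruction (printed here as a placeholder).
fun execute(instruction: String) = println("executing: $instruction")

fun main() {
    val table = mapOf((1 to "SLIDE_UP") to "OPEN_CONTROL_PANEL")
    val info = TouchInfo(sensorId = 1, gesture = "SLIDE_UP")
    val area = determineTargetArea(info)
    lookUpInstruction(info, area, table)?.let(::execute)  // executing: OPEN_CONTROL_PANEL
}
```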
Referring to fig. 17, an electronic device 800 may include a side frame 801, a memory 802, a processor 803, an input unit 804, an output unit 805, and the like.
The side frame 801 may include a plurality of regions adjacent in sequence, each region having a touch sensor disposed therein.
The memory 802 may be used to store applications and data. The memory 802 stores applications containing executable code. The application programs may constitute various functional modules. The processor 803 executes various functional applications and data processing by running the application programs stored in the memory 802.
The processor 803 is the control center of the electronic device; it connects various parts of the entire electronic device by using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing the application program stored in the memory 802 and calling data stored in the memory 802, thereby monitoring the electronic device as a whole.
The input unit 804 may be used to receive input numbers, character information, or user characteristic information such as a fingerprint, and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The output unit 805 may be used to display information input by or provided to a user and various graphical user interfaces of the electronic device, which may be made up of graphics, text, icons, video, and any combination thereof. The output unit may include a display panel.
In this embodiment, the processor 803 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 802 according to the following instructions, and the processor 803 runs the application programs stored in the memory 802, thereby implementing the steps:
acquiring touch information of a user through the touch sensor; determining a target area, wherein the target area is an area where a touch sensor acquiring the touch information is located; acquiring an instruction corresponding to the touch information and the target area; and executing the operation indicated by the instruction.
In one embodiment, the processor 803 may further perform: determining the display screen state of the electronic device, wherein the display screen state comprises a screen-off state and a screen-on state, the screen-on state comprises a locked state and an unlocked state, and the unlocked state comprises an application interface state and a desktop state.
Then, when the processor 803 executes the step of obtaining the instruction corresponding to the touch information and the target area, it may execute: acquiring an instruction corresponding to the touch information, the target area, and the display screen state.
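One way to picture the display-screen-state hierarchy described above is a nested sealed class, as in the Kotlin sketch below; the layout and names are an illustration of the text, not a structure this document defines.

```kotlin
sealed class ScreenState {
    object ScreenOff : ScreenState()                      // screen-off state
    sealed class ScreenOn : ScreenState() {
        object Locked : ScreenOn()                        // locked state
        sealed class Unlocked : ScreenOn() {              // unlocked state
            object AppInterface : Unlocked()              // an application interface is displayed
            object Desktop : Unlocked()                   // the desktop is displayed
        }
    }
}

// Instruction lookups can then branch on the broad category first.
fun isUnlocked(state: ScreenState): Boolean = state is ScreenState.ScreenOn.Unlocked

fun main() {
    println(isUnlocked(ScreenState.ScreenOn.Unlocked.Desktop))  // true
    println(isUnlocked(ScreenState.ScreenOn.Locked))            // false
}
```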
In one embodiment, when the processor 803 executes the step of obtaining the instruction corresponding to the touch information, the target area and the display screen state, it may execute: when the display screen state is an unlocked state, the touch operation represented by the touch information is pressing, and the pressing time length reaches a preset time length, determining a target application associated with the target area; and acquiring an instruction, wherein the instruction is used for instructing to start the target application and to allow the user to access all data of the target application.
In one embodiment, when the processor 803 executes the step of obtaining the instruction corresponding to the touch information, the target area and the display screen state, it may execute: when the display screen state is a screen-off state or a locked state, the touch operation represented by the touch information is pressing, and the pressing time length reaches a preset time length, determining a target application associated with the target area; and acquiring an instruction, wherein the instruction is used for instructing to start the target application and to deny the user access to data generated before the target application is started.
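The two long-press embodiments above differ only in whether access to previously generated data is granted. The sketch below captures that distinction with an allowExistingData flag; the preset duration value, the area-to-application mapping, and every identifier are assumptions made purely for illustration.

```kotlin
const val PRESS_THRESHOLD_MS = 800L  // assumed preset duration; the text does not give a value

enum class Screen { OFF, LOCKED, UNLOCKED }

data class AppLaunch(val packageName: String, val allowExistingData: Boolean)

// Hypothetical mapping from side-frame areas to their associated target applications.
fun targetAppFor(area: Int): String? =
    mapOf(1 to "com.example.camera", 2 to "com.example.notes")[area]

fun onLongPress(area: Int, screen: Screen, pressDurationMs: Long): AppLaunch? {
    if (pressDurationMs < PRESS_THRESHOLD_MS) return null   // press too short: no instruction
    val app = targetAppFor(area) ?: return null             // area has no associated application
    return when (screen) {
        Screen.UNLOCKED -> AppLaunch(app, allowExistingData = true)             // full access
        Screen.OFF, Screen.LOCKED -> AppLaunch(app, allowExistingData = false)  // restricted access
    }
}

fun main() {
    println(onLongPress(area = 1, screen = Screen.LOCKED, pressDurationMs = 900L))
    // AppLaunch(packageName=com.example.camera, allowExistingData=false)
}
```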
In one embodiment, when the processor 803 executes the step of obtaining the instruction corresponding to the touch information, the target area and the display screen state, it may execute: when the display screen state is an application interface state and the touch operation represented by the touch information is clicking or a combination of clicking and sliding, acquiring the identifier of the current application; and acquiring the instruction corresponding to the identifier of the current application and the target area.
In one embodiment, when the processor 803 executes the step of obtaining the instruction corresponding to the touch information, the target area and the display screen state, it may execute: acquiring an instruction corresponding to the touch information, the target area and the display screen state, wherein the display screen state is an unlocked state, and the touch operation represented by the touch information is a double click.
In one embodiment, the side frame comprises a first area, a second area and a third area which are adjacent in sequence from top to bottom; when the processor 803 executes the step of obtaining the instruction corresponding to the touch information, the target area, and the display screen state, it may execute: when the display screen is in an unlocked state and the target area is the second area, if the touch operation represented by the touch information is upward sliding, acquiring an instruction for indicating to start a control panel; when the display screen state is an unlocked state and the target area is the second area, if the touch operation represented by the touch information is downward sliding, acquiring an instruction for indicating to start a notification panel; when the display screen state is an unlocked state and the target area is the first area, if the touch operation represented by the touch information is sliding, acquiring an instruction for indicating volume adjustment; and when the display screen state is an unlocked state and the target area is the third area, if the touch operation represented by the touch information is sliding, acquiring an instruction for indicating brightness adjustment.
In one embodiment, the processor 803 may further perform: obtaining a plurality of area values in advance, and determining a preset area threshold according to the area values, wherein the area values are areas of mutual contact between the touch areas and fingers of a user when the user clicks the touch areas corresponding to the touch sensors each time; when the acquired touch information indicates that a user triggers a click action on the touch area, if the area corresponding to the click action reaches a preset area threshold value, the click action is determined to be a single click.
In one embodiment, the processor 803 may further perform: when the acquired touch information indicates that the user triggers the click action on the touch area twice continuously, if the area corresponding to each click action reaches a preset area threshold, the click action is determined to be double click.
In one embodiment, the processor 803 may further perform: obtaining a plurality of track length values in advance, and determining a preset length threshold according to the plurality of track length values, wherein the track length values are length values of tracks when a user performs sliding operation in a touch area corresponding to the touch sensor each time; when the acquired touch information indicates that a user triggers one touch operation on the touch area, if the touch track corresponding to the touch operation reaches the preset length threshold, determining that the touch operation is a sliding operation.
In the above embodiments, the descriptions of the respective embodiments each have their own emphasis; for parts that are not described in detail in a given embodiment, reference may be made to the detailed description of the touch operation method above, which is not repeated here.
The touch operation device provided in the embodiment of the present application and the touch operation method in the above embodiments belong to the same concept, and any method provided in the embodiment of the touch operation method may be run on the touch operation device, and a specific implementation process thereof is described in the embodiment of the touch operation method in detail, and is not described herein again.
It should be noted that, for the touch operation method described in the embodiments of the present application, those skilled in the art will understand that all or part of the process of implementing the touch operation method may be completed by a computer program controlling the relevant hardware. The computer program may be stored in a computer-readable storage medium, such as a memory, and executed by at least one processor, and the execution process may include the processes of the embodiments of the touch operation method. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
For the touch operation device in the embodiments of the present application, the functional modules may be integrated into one processing chip, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as a stand-alone product, it may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk.
The touch operation method, touch operation device, storage medium, and electronic device provided by the embodiments of the present application are described in detail above. Specific examples are used herein to explain the principles and implementations of the present invention, and the description of the above embodiments is only intended to help in understanding the method and its core idea. Meanwhile, those skilled in the art may, according to the idea of the present invention, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (11)

1. A touch operation method, applied to an electronic device, wherein the electronic device comprises a side frame, the side frame comprises a plurality of areas which are adjacent in sequence, a touch sensor is disposed in each area, and the touch operation method comprises the following steps:
acquiring touch information of a user through the touch sensor;
determining a target area, wherein the target area is an area where a touch sensor acquiring the touch information is located;
determining a display screen state of the electronic equipment, wherein the display screen state comprises a screen-off state and a screen-on state, the screen-on state comprises a locked state and an unlocked state, and the unlocked state comprises an application interface state and a desktop state;
acquiring an instruction corresponding to the touch information, the target area and the display screen state, wherein when the display screen state is a screen-off state or a locked state, and the touch operation represented by the touch information is pressing and the pressing duration reaches a preset duration, determining a target application associated with the target area, wherein different areas of the side frame of the electronic device correspond to different target applications; acquiring an instruction, wherein the instruction is used for instructing to start the target application and to deny the user access to data generated before the target application is started;
and executing the operation indicated by the instruction.
2. The touch operation method according to claim 1, wherein the step of obtaining the instruction corresponding to the touch information, the target area, and the display screen state includes:
when the display screen state is an unlocked state, the touch operation represented by the touch information is pressing, and the pressing time length reaches a preset time length, determining a target application associated with the target area;
and acquiring an instruction, wherein the instruction is used for indicating to start the target application and allowing a user to access all data of the target application.
3. The touch operation method according to claim 1, wherein the step of obtaining the instruction corresponding to the touch information, the target area, and the display screen state includes:
when the display screen state is an application interface state and the touch operation represented by the touch information is clicking or a combination of clicking and sliding, acquiring the identifier of the current application;
and acquiring the instruction corresponding to the identifier of the current application and the target area.
4. The touch operation method according to claim 1, wherein the step of obtaining the instruction corresponding to the touch information, the target area, and the display screen state includes:
and acquiring an instruction corresponding to the touch information, the target area and the display screen state, wherein the display screen state is an unlocked state, and the touch operation represented by the touch information is a double click.
5. The touch operation method according to claim 1, wherein the side frame comprises a first area, a second area and a third area which are adjacent in sequence from top to bottom;
the step of obtaining the instruction corresponding to the touch information, the target area and the display screen state comprises the following steps:
when the display screen is in an unlocked state and the target area is the second area, if the touch operation represented by the touch information is upward sliding, acquiring an instruction for indicating to start a control panel;
when the display screen state is an unlocked state and the target area is the second area, if the touch operation represented by the touch information is downward sliding, acquiring an instruction for indicating to start a notification panel;
when the display screen state is an unlocked state and the target area is the first area, if the touch operation represented by the touch information is sliding, acquiring an instruction for indicating volume adjustment;
and when the display screen state is an unlocked state and the target area is the third area, if the touch operation represented by the touch information is sliding, acquiring an instruction for indicating brightness adjustment.
6. The touch operation method according to claim 1, further comprising:
obtaining a plurality of area values in advance, and determining a preset area threshold according to the area values, wherein the area values are areas of mutual contact between the touch areas and fingers of a user when the user clicks the touch areas corresponding to the touch sensors each time;
when the acquired touch information indicates that a user triggers a click action on the touch area, if the area corresponding to the click action reaches a preset area threshold value, the click action is determined to be a single click.
7. The touch operation method according to claim 6, further comprising:
when the acquired touch information indicates that the user triggers the click action on the touch area twice continuously, if the area corresponding to each click action reaches a preset area threshold, the click action is determined to be double click.
8. The touch operation method according to claim 1, further comprising:
obtaining a plurality of track length values in advance, and determining a preset length threshold according to the plurality of track length values, wherein the track length values are length values of tracks when a user performs sliding operation in a touch area corresponding to the touch sensor each time;
when the acquired touch information indicates that a user triggers one touch operation on the touch area, if the touch track corresponding to the touch operation reaches the preset length threshold, determining that the touch operation is a sliding operation.
9. A touch operation device, applied to an electronic device, wherein the electronic device comprises a side frame, the side frame comprises a plurality of areas which are adjacent in sequence, a touch sensor is disposed in each area, and the touch operation device comprises:
the first obtaining module is used for acquiring touch information of a user through the touch sensor;
the determining module is used for determining a target area, wherein the target area is an area where a touch sensor for acquiring the touch information is located;
the second obtaining module is used for determining the display screen state of the electronic device, wherein the display screen state comprises a screen-off state and a screen-on state, the screen-on state comprises a locked state and an unlocked state, and the unlocked state comprises an application interface state and a desktop state; the second obtaining module is further configured to obtain an instruction corresponding to the touch information, the target area and the display screen state, wherein when the display screen state is a screen-off state or a locked state, and the touch operation represented by the touch information is pressing and a pressing duration reaches a preset duration, the second obtaining module is further configured to determine a target application associated with the target area, and different areas of the side frame of the electronic device correspond to different target applications; the second obtaining module is further configured to obtain an instruction, wherein the instruction is used for instructing to start the target application and to deny the user access to data generated before the target application is started;
and the execution module is used for executing the operation indicated by the instruction.
10. A storage medium having stored thereon a computer program, characterized in that the computer program, when executed on a computer, causes the computer to execute the method according to any of claims 1 to 8.
11. An electronic device comprising a memory and a processor, wherein the processor is configured to perform the method of any one of claims 1 to 8 by invoking a computer program stored in the memory.
CN201810688999.1A 2018-06-28 2018-06-28 Touch operation method and device, storage medium and electronic equipment Active CN108984093B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810688999.1A CN108984093B (en) 2018-06-28 2018-06-28 Touch operation method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810688999.1A CN108984093B (en) 2018-06-28 2018-06-28 Touch operation method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN108984093A CN108984093A (en) 2018-12-11
CN108984093B true CN108984093B (en) 2020-12-22

Family

ID=64539417

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810688999.1A Active CN108984093B (en) 2018-06-28 2018-06-28 Touch operation method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN108984093B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111756876A (en) * 2019-03-27 2020-10-09 北京小米移动软件有限公司 Touch detection device and method and terminal
CN110134239B (en) * 2019-05-13 2022-05-17 Oppo广东移动通信有限公司 Information transmission method, storage medium, and electronic device
CN110297593A (en) * 2019-06-28 2019-10-01 Oppo广东移动通信有限公司 The processing method of electronic equipment and notice
CN110399082B (en) * 2019-07-05 2022-03-25 北京达佳互联信息技术有限公司 Terminal attribute control method and device, electronic equipment and medium
CN112558803A (en) * 2019-09-26 2021-03-26 北京钛方科技有限责任公司 Vehicle-mounted touch device, control method and automobile
CN111638809A (en) * 2020-05-22 2020-09-08 讯飞幻境(北京)科技有限公司 Method, device, equipment and medium for acquiring touch information
CN111666024B (en) * 2020-05-28 2022-04-12 维沃移动通信(杭州)有限公司 Screen recording method and device and electronic equipment
CN111857496A (en) * 2020-06-30 2020-10-30 维沃移动通信有限公司 Operation execution method and device and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103955337A (en) * 2014-05-06 2014-07-30 北京金山安全软件有限公司 Method and system for opening application program in mobile terminal
CN104156073A (en) * 2014-08-29 2014-11-19 深圳市中兴移动通信有限公司 Mobile terminal and operation method thereof
CN104636039A (en) * 2015-01-30 2015-05-20 深圳市中兴移动通信有限公司 Application control method and device based on borderless terminal
CN106527852A (en) * 2016-10-31 2017-03-22 努比亚技术有限公司 Control device and method of terminal application bar
CN107037971A (en) * 2017-03-27 2017-08-11 努比亚技术有限公司 Application management device, mobile terminal and method
CN108089795A (en) * 2018-01-22 2018-05-29 广东欧珀移动通信有限公司 touch operation method, device, storage medium and electronic equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014104727A1 (en) * 2012-12-26 2014-07-03 전자부품연구원 Method for providing user interface using multi-point touch and apparatus for same
CN104679429A (en) * 2015-02-12 2015-06-03 深圳市中兴移动通信有限公司 Accidental-touch-prevention method and device
CN104991711A (en) * 2015-06-10 2015-10-21 努比亚技术有限公司 Method, apparatus and terminal for processing information
CN105511784B (en) * 2015-12-02 2019-05-21 北京新美互通科技有限公司 A kind of data inputting method based on pressure detecting, device and mobile terminal
CN108040145A (en) * 2017-12-14 2018-05-15 广东欧珀移动通信有限公司 Electronic device, method of controlling operation thereof and Related product

Also Published As

Publication number Publication date
CN108984093A (en) 2018-12-11

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant