CN108062199B - Touch information processing method and device, storage medium and terminal - Google Patents

Touch information processing method and device, storage medium and terminal

Info

Publication number
CN108062199B
CN108062199B
Authority
CN
China
Prior art keywords
touch
touch area
operation information
touch operation
combination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711353203.9A
Other languages
Chinese (zh)
Other versions
CN108062199A (en)
Inventor
曾鸿坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201711353203.9A priority Critical patent/CN108062199B/en
Publication of CN108062199A publication Critical patent/CN108062199A/en
Application granted granted Critical
Publication of CN108062199B publication Critical patent/CN108062199B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a touch information processing method, which comprises the following steps: when the terminal is in a screen locking state, acquiring a group of touch operation information on a display screen; analyzing the group of touch operation information to obtain the touch area indicated by each piece of touch operation information in the group, so as to form a touch area combination; judging whether the touch area combination is a preset touch area combination; and when the touch area combination is judged to be the preset touch area combination, acquiring the current time and playing it by voice. By acquiring a group of touch operation information on the display screen while the terminal is locked, analyzing the touch area indicated by each piece of touch operation information to form a touch area combination, and playing the current time by voice when that combination matches the preset touch area combination, the method spares the user the cumbersome steps otherwise needed to check the time and does not require lighting the display screen, thereby saving the electric quantity of the terminal.

Description

Touch information processing method and device, storage medium and terminal
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method and an apparatus for processing touch information, a storage medium, and a terminal.
Background
With the rapid development and popularization of terminal technologies, terminals such as mobile phones have become increasingly powerful, and full touch-screen phones have gradually replaced phones with physical keyboards. To prevent accidental touches and to save power, a mobile phone enters a screen-locked state when it is not in use, and must be unlocked before it can be used again.
However, when the mobile phone is in the screen-locked state, the user has to press a key on the phone to light up the screen in order to check the current time. This process is cumbersome, and lighting the screen wastes the phone's battery power.
Disclosure of Invention
The embodiments of the application provide a touch information processing method and device, a storage medium and a terminal, which can extend the battery endurance of the terminal.
In a first aspect, an embodiment of the present application provides a method for processing touch information, which is applied to a terminal, where the terminal includes a display screen, a touch area of the display screen includes a first touch area and a second touch area, and the method includes:
when the terminal is in a screen locking state, acquiring a group of touch operation information on the display screen;
analyzing the group of touch operation information to obtain a touch area correspondingly indicated by each piece of touch operation information in the group of touch operation information so as to form a touch area combination;
judging whether the touch area combination is a preset touch area combination or not;
and when the touch area combination is judged to be the preset touch area combination, acquiring the current time and carrying out voice playing.
In a second aspect, an embodiment of the present application provides a device for processing touch information, including:
the acquisition module is used for acquiring a group of touch operation information on the display screen when the terminal is in a screen locking state;
the analysis module is used for analyzing the group of touch operation information to obtain a touch area indicated by each touch operation information in the group of touch operation information so as to form a touch area combination;
the judging module is used for judging whether the touch area combination is a preset touch area combination or not;
and the playing module is used for acquiring the current time and playing the voice when the touch area combination is judged to be the preset touch area combination.
In a third aspect, an embodiment of the present application provides a storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of any one of the touch information processing methods provided in the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a terminal, including:
a memory storing executable program code;
a processor coupled with the memory;
the processor calls the executable program code stored in the memory to execute any touch information processing method provided by the embodiment of the application.
According to the touch information processing method and device, the storage medium and the terminal, when the terminal is in the screen-locked state, a group of touch operation information on the display screen is acquired, the touch area indicated by each piece of touch operation information in the group is analyzed to form a touch area combination, and when the touch area combination is the preset touch area combination, the current time is played by voice. This spares the user the cumbersome steps otherwise needed to check the time and does not require lighting the display screen, thereby saving the electric quantity of the terminal.
Drawings
The technical solution and other advantages of the present application will become apparent from the detailed description of the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic view of a terminal touch scene according to an embodiment of the present disclosure.
Fig. 2 is a flowchart illustrating a touch information processing method according to an embodiment of the present disclosure.
Fig. 3 is another flow chart illustrating a touch information processing method according to an embodiment of the present disclosure.
Fig. 4 is a schematic two-dimensional coordinate diagram of a terminal display screen provided in an embodiment of the present application.
Fig. 5 is a block diagram of a touch information processing apparatus according to an embodiment of the present disclosure.
Fig. 6 is a schematic block diagram of another apparatus for processing touch information according to an embodiment of the present disclosure.
Fig. 7 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
Referring to the drawings, wherein like reference numbers refer to like elements, the principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
The term "module" as used herein may be considered a software object executing on the computing system. The various components, modules, engines, and services described herein may be viewed as objects implemented on the computing system. The apparatus and method described herein are preferably implemented in software, but may also be implemented in hardware, and are within the scope of the present application.
Referring to fig. 1, fig. 1 is a schematic view illustrating a terminal touch scenario according to an embodiment of the present disclosure. The terminal 100 includes a display 101; in some embodiments, the display 101 may be a full screen or an irregularly shaped screen. The display 101 includes a first touch area 1011 and a second touch area 1012. The size and position of the first touch area 1011 can be set according to the user's needs. In one embodiment, the first touch area 1011 may be the display area used for displaying the time; its size may be delimited by a range of coordinate values, and its shape may be rectangular or circular. The remaining portion of the display 101, excluding the first touch area 1011, is the second touch area 1012. When the terminal 100 is in the screen-locked state, a set of touch operation information on the display screen 101 is obtained: as shown in step 1 of fig. 1, the user first touches the second touch area 1012 with a finger and then touches the first touch area 1011. The terminal thereby obtains a set of touch operation information on the display 101. The set is analyzed to obtain the touch areas indicated by the two touches, namely the second touch area 1012 followed by the first touch area 1011, which together form a touch area combination. The terminal then judges whether this combination (second touch area 1012, first touch area 1011) is the preset touch area combination, and when it is, the current time is obtained and played by voice.
Detailed descriptions are provided below.
In the present embodiment, the description is given from the perspective of a touch information processing apparatus, which may be integrated in a terminal such as a mobile phone, a tablet computer, or a Personal Digital Assistant (PDA).
Referring to fig. 2, fig. 2 is a schematic flow chart illustrating a touch information processing method according to an embodiment of the present disclosure. Specifically, the method comprises the following steps:
in step S101, when the terminal is in the lock screen state, a set of touch operation information on the display screen is acquired.
The screen-locked state may be a state in which the terminal 100 is on standby with its screen off. In this state, if the user wants to check the current time, the user has to press a key on the terminal 100 to light up its display screen.
Further, when the terminal 100 is in the screen-locked state, the touch operations occurring on the display screen are continuously detected to generate a set of touch operation information. It should be noted that only touch operation events that occur consecutively are treated as one set of touch operation information; that is, the time interval between successive touch operation events must be within a preset time interval. For example, if the preset time interval is 2 seconds, a timer is started when a touch operation event occurs; if a new touch operation event is detected within 2 seconds, the events are regarded as consecutive and belong to the same set of touch operation information. If no new touch operation event is detected within 2 seconds, the touch operations are not regarded as consecutive and do not form a set of touch operation information.
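The grouping rule above can be pictured with a minimal sketch, assuming a 2-second interval and representing each touch event only by its timestamp, coordinates and pressure; the type and function names here are illustrative, not taken from the patent:

```kotlin
// Sketch only: groups raw touch events into "sets" of touch operation information.
// An event starts a new set when more than MAX_GAP_MS elapsed since the previous one.
const val MAX_GAP_MS = 2_000L  // illustrative preset time interval of 2 seconds

data class TouchEvent(val timestampMs: Long, val x: Float, val y: Float, val pressure: Float)

fun groupIntoSets(events: List<TouchEvent>): List<List<TouchEvent>> {
    val sets = mutableListOf<MutableList<TouchEvent>>()
    for (event in events.sortedBy { it.timestampMs }) {
        val current = sets.lastOrNull()
        if (current == null || event.timestampMs - current.last().timestampMs > MAX_GAP_MS) {
            sets.add(mutableListOf(event))   // gap too long: start a new set
        } else {
            current.add(event)               // consecutive: same set of touch operation information
        }
    }
    return sets
}
```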
In step S102, a set of touch operation information is analyzed to obtain a touch area indicated by each touch operation information in the set of touch operation information, so as to form a touch area combination.
It should be noted that each piece of touch operation information includes the touch coordinate value recorded when the touch occurs; that is, in the screen-locked state, when the user touches the terminal with a finger, the terminal records the coordinate value of that touch.
Specifically, when the touch coordinate value in the touch operation information falls within the coordinate value range of the first touch region 1011, it is determined that the touch region indicated by the touch operation information is the first touch region 1011. When the touch coordinate value in the touch operation information falls within the coordinate value range of the second touch region 1012, it is determined that the touch region indicated by the touch operation information is the second touch region 1012.
Based on this, the touch area corresponding to each piece of touch operation information in the set is obtained to form a touch area combination, for example the combination consisting of the second touch area 1012 followed by the first touch area 1011.
In step S103, it is determined whether the touch area combination is a preset touch area combination.
Setting a preset touch area combination prevents the time from being played mistakenly when the user accidentally touches the display screen 101 in certain situations. The preset touch area combination can be freely set by the user.
Based on this, whether the touch area combination is the preset touch area combination is determined as follows: each touch area in the touch area combination is compared, one by one and in order, with the corresponding touch area in the preset touch area combination. When all of them match, the touch area combination is determined to be the preset touch area combination; otherwise it is not.
Further, when it is determined that the touch area combination is the preset touch area combination, step S104 is executed. When the touch area combination is judged not to be the preset touch area combination, the touches are regarded as an accidental operation by the user and are simply ignored.
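A minimal sketch of the one-by-one comparison in step S103 follows; the enum and function names are illustrative assumptions, not defined by the patent:

```kotlin
enum class TouchArea { FIRST, SECOND }

// Step S103: compare the combinations position by position; any mismatch
// (including a different length) means this is not the preset combination.
fun isPresetCombination(combination: List<TouchArea>, preset: List<TouchArea>): Boolean {
    if (combination.size != preset.size) return false
    for (i in combination.indices) {
        if (combination[i] != preset[i]) return false
    }
    return true
}
```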
In step S104, the current time is acquired, and voice playing is performed.
When each touch area in the touch area combination completely matches the corresponding touch area in the preset touch area combination, the group of touch operations is regarded as intentional; the current time on the terminal 100 is then obtained and played by voice.
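One way step S104 might look in code is sketched below; `speak` is only a placeholder for whatever text-to-speech engine the terminal provides, and the time format is an assumption, since the patent does not specify either:

```kotlin
import java.time.LocalTime
import java.time.format.DateTimeFormatter

// Placeholder for the platform text-to-speech call; prints instead of speaking.
fun speak(text: String) { println("TTS: $text") }

fun announceCurrentTime() {
    val now = LocalTime.now()
    val spoken = now.format(DateTimeFormatter.ofPattern("h:mm a"))  // e.g. "3:27 PM"
    speak("The current time is $spoken")
}
```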
In an embodiment, the set of touch operations may also be a set of sliding operations. The terminal 100 records the sliding start point, the sliding trajectory and the sliding end point, and forms the corresponding touch area combination by analyzing the touch areas in which the start point and end point are located. When the touch area combination generated by the sliding operation is the preset touch area combination, the current time is acquired and played by voice.
As can be seen from the above, in the touch information processing method provided in this embodiment, when the terminal is in the screen-locked state, a set of touch operation information on the display screen is obtained, the touch area indicated by each piece of touch operation information in the set is analyzed to form a touch area combination, and when the touch area combination is the preset touch area combination, the current time is played by voice. This spares the user the cumbersome steps otherwise needed to check the time and does not require lighting the display screen, thereby saving the electric quantity of the terminal.
The method described in the above embodiments is further illustrated in detail by way of example.
Referring to fig. 3, fig. 3 is another flow chart illustrating a touch information processing method according to an embodiment of the present disclosure.
Specifically, the method comprises the following steps:
in step S201, when the terminal is in the lock screen state, a set of touch operation information on the display screen is acquired.
It should be noted that the touch operation information may include a touch coordinate value and a touch pressure value. As shown in fig. 4, the display screen 101 is associated with a two-dimensional X-Y coordinate system, and every touch point on the display screen has a coordinate in this system. For example, the four vertices X1, X2, X3 and X4 of the first touch area 1011 have the coordinates (-4, 3), (-4, 6), (4, 6) and (4, 3), respectively. Since the first touch area 1011 is rectangular, its extent is determined by these four vertices. The coordinate values outside the range of the first touch area 1011 constitute the second touch area 1012. Therefore, when the user touches the screen with a finger, the touch coordinate value of the touch operation can be obtained from the X-Y coordinate system. The touch pressure value can be obtained through the pressure sensor on the display screen 101: the harder the user presses, the greater the corresponding touch pressure value; the lighter the press, the smaller the value.
When the terminal 100 is in the screen-locked state, the touch operations occurring on the display screen are continuously detected to generate a set of touch operation information, where each piece of touch operation information includes a corresponding touch coordinate value and touch pressure value. As before, only touch operation events that occur consecutively are treated as one set of touch operation information; that is, the time interval between successive touch operation events must be within the preset time interval.
In step S202, each touch operation information in a set of touch operation information is obtained.
In step S203, each touch operation information is sequentially analyzed to obtain a touch coordinate value corresponding to the touch operation information.
After each piece of touch operation information in the set has been analyzed in turn to obtain its touch coordinate value, whether the touch region it indicates is the first touch region 1011 or the second touch region 1012 can be determined from that coordinate value.
In step S204, when the touch coordinate value falls within the first touch region, it is determined that the touch region indicated by the touch operation information is the first touch region.
For example, if the touch coordinate value of a piece of touch operation information is (0, 4), it is compared against the two-dimensional coordinate range of the first touch region 1011 shown in fig. 4. Since (0, 4) falls within that range, the touch region indicated by this touch operation information is determined to be the first touch region 1011; that is, the user's touch operation occurred in the first touch region 1011.
In one embodiment, the position and size of the first touch area 1011 may coincide with the area of the display screen 101 in which the time and date are displayed. Correspondingly, all of the display screen 101 outside the first touch area 1011 is the second touch area 1012.
In step S205, when the touch coordinate value falls within the second touch region, it is determined that the touch region indicated by the touch operation information is the second touch region.
Similarly, if the touch coordinate value of a piece of touch operation information is (5, 1), it is compared against the coordinate range of the first touch region 1011 shown in fig. 4. Since (5, 1) does not fall within the first touch region 1011, it falls within the second touch region 1012, and the touch region indicated by this touch operation information is determined to be the second touch region 1012; that is, the user's touch operation occurred in the second touch region 1012.
In step S206, the touch areas indicated by the touch operation information are combined to form a touch area combination.
The touch areas indicated by the individual pieces of touch operation information are combined. For example, if there are 3 pieces of touch operation information, where the first indicates the first touch area 1011, the second indicates the second touch area 1012, and the third also indicates the second touch area 1012, the resulting touch area combination is (first touch area 1011, second touch area 1012, second touch area 1012).
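Steps S203 to S206 can be sketched as follows. The rectangle bounds come from the Fig. 4 example above; the third coordinate and all names are invented purely for illustration:

```kotlin
enum class TouchArea { FIRST, SECOND }

// First touch area: rectangle with vertices (-4, 3), (-4, 6), (4, 6), (4, 3), as in Fig. 4.
fun areaOf(x: Float, y: Float): TouchArea =
    if (x in -4f..4f && y in 3f..6f) TouchArea.FIRST else TouchArea.SECOND

fun main() {
    // (0, 4) and (5, 1) are the coordinates used in steps S204/S205 above;
    // the third point is assumed here only to yield a three-touch example.
    val touches = listOf(0f to 4f, 5f to 1f, 6f to 2f)
    val combination = touches.map { (x, y) -> areaOf(x, y) }   // step S206
    println(combination)  // [FIRST, SECOND, SECOND]
}
```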
In step S207, a touch pressure value corresponding to each touch operation information in a set of touch operation information is obtained.
The touch pressure value is generated when the user touches the display screen with a finger: the harder the touch, the greater the resulting touch pressure value; the lighter the touch, the smaller the value.
In step S208, it is determined whether the touch pressure value is greater than a predetermined threshold.
The preset threshold is a pressure threshold chosen as a reasonable compromise based on human ergonomics. When the touch pressure value is judged to be greater than the preset threshold, step S209 is executed; when it is not greater than the preset threshold, step S210 is executed.
In step S209, the touch operation indicated by the touch operation information is determined to be a heavy-pressure touch operation.
When the touch pressure value is judged to be greater than the preset threshold, the user's touch was applied with considerable force, and the corresponding touch operation is accordingly determined to be a heavy-pressure touch operation.
In step S210, the touch operation indicated by the touch operation information is determined to be a light-pressure touch operation.
When the touch pressure value is judged to be not greater than the preset threshold, the user's touch was applied with little force, and the corresponding touch operation is accordingly determined to be a light-pressure touch operation.
In step S211, the light touch operation and the heavy touch operation indicated by each touch operation information are combined to form a touch operation combination.
The light-pressure and heavy-pressure touch operations indicated by the individual pieces of touch operation information are combined. For example, if there are 3 pieces of touch operation information, where the first indicates a heavy-pressure touch operation and the second and third each indicate a light-pressure touch operation, the resulting touch operation combination is (heavy-pressure touch operation, light-pressure touch operation, light-pressure touch operation).
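A sketch of the pressure classification in steps S207 to S211 follows. The threshold value and the sample pressure readings are arbitrary placeholders, since the patent does not give concrete numbers:

```kotlin
enum class PressLevel { LIGHT, HEAVY }

const val PRESSURE_THRESHOLD = 0.6f  // placeholder: the patent does not specify a value

// Steps S208–S210: classify a touch by comparing its pressure value with the threshold.
fun levelOf(pressure: Float): PressLevel =
    if (pressure > PRESSURE_THRESHOLD) PressLevel.HEAVY else PressLevel.LIGHT

fun main() {
    // Illustrative pressure readings producing the combination in the example above.
    val pressures = listOf(0.9f, 0.3f, 0.4f)
    val operationCombination = pressures.map(::levelOf)   // step S211
    println(operationCombination)  // [HEAVY, LIGHT, LIGHT]
}
```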
In step S212, the touch area combinations are compared with the preset touch area combinations one by one to obtain a touch area matching result.
Each touch area in the touch area combination is compared, one by one, with the corresponding touch area in the preset touch area combination. When every touch area in the touch area combination is consistent with the corresponding preset touch area, the touch area matching result is a pass; otherwise the touch area matching result is a fail.
For example, suppose the touch area combination is (first touch area 1011, second touch area 1012, second touch area 1012). When the preset touch area combination is also (first touch area 1011, second touch area 1012, second touch area 1012), the matching passes. When the preset touch area combination contains more or fewer touch areas than the touch area combination, or when its touch areas do not correspond one-to-one with those of the touch area combination, the matching fails.
In step S213, the touch operation combinations are compared with the preset touch operation combinations one by one to obtain a touch operation matching result.
Each touch operation (light-pressure or heavy-pressure) in the touch operation combination is compared, one by one, with the corresponding touch operation in the preset touch operation combination. When every touch operation in the touch operation combination is consistent with the corresponding preset touch operation, the touch operation matching result is a pass; otherwise the touch operation matching result is a fail.
For example, suppose the touch operation combination is (heavy-pressure touch operation, light-pressure touch operation, light-pressure touch operation). When the preset touch operation combination is also (heavy-pressure touch operation, light-pressure touch operation, light-pressure touch operation), the matching passes. When the preset touch operation combination contains more or fewer touch operations than the touch operation combination, or when its touch operations do not correspond one-to-one with those of the touch operation combination, the matching fails.
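Steps S212 to S214 can then be sketched as two element-wise comparisons, both of which must pass. The generic helper and the enums are assumptions introduced only for this sketch:

```kotlin
enum class TouchArea { FIRST, SECOND }
enum class PressLevel { LIGHT, HEAVY }

// One-by-one comparison: same length and every position equal.
fun <T> matchesExactly(actual: List<T>, preset: List<T>): Boolean =
    actual.size == preset.size && actual.zip(preset).all { (a, p) -> a == p }

// Step S214: the time is announced only when both matching results are passes.
fun shouldAnnounceTime(
    areas: List<TouchArea>, presetAreas: List<TouchArea>,
    presses: List<PressLevel>, presetPresses: List<PressLevel>
): Boolean = matchesExactly(areas, presetAreas) && matchesExactly(presses, presetPresses)
```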
In step S214, when the touch area matching result and the touch operation matching result are matched, it is determined that the touch area combination is the preset touch area combination.
When both the touch area matching result and the touch operation matching result are passes, the set of touch operations was entered correctly, and step S215 is executed.
In step S215, the current time is acquired, and voice playback is performed.
When both the touch area matching result and the touch operation matching result are passes, the user wants to know the current time; the current time is therefore acquired and played by voice.
Further, because the touch area combination and the touch operation combination are verified together, the user must touch the different touch areas in the preset order and with the preset force before the voice time-playing function is triggered. This better avoids situations in which an accidental operation on the display screen 101 triggers voice playback and inconveniences the user.
In an embodiment, the playback volume may be determined by detecting the noise level of the current environment. Specifically, when voice playback is required, a sensor measures the noise level of the environment in which the terminal 100 is currently located: the higher the noise level, the louder the playback; the lower the noise level, the quieter the playback.
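A sketch of this noise-adaptive volume is given below. The dB range, the linear mapping and the minimum volume are all assumptions; the patent only states that louder environments get louder playback:

```kotlin
// Maps an ambient noise reading (assumed to be in dB) to a relative playback volume in [0.2, 1.0].
fun playbackVolumeFor(noiseDb: Float, quietDb: Float = 30f, loudDb: Float = 90f): Float {
    val t = ((noiseDb - quietDb) / (loudDb - quietDb)).coerceIn(0f, 1f)
    return 0.2f + 0.8f * t   // keep a minimum audible volume even in very quiet rooms
}
```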
In an embodiment, after obtaining the current time and performing the voice playing, the method further includes:
(1) Detecting whether a touch event occurs on the first touch area within a preset time.
The preset time may be the duration of the voice playback. While the voice is still being played, whether a touch event occurs on the first touch region 1011 is detected; when a touch event is detected, step (2) is executed. When no touch event is detected, the voice playback simply continues to the end.
(2) When a touch event on the first touch area is detected, acquiring a calendar event corresponding to the current time and displaying the calendar event on the display screen.
When a touch event on the first touch area is detected, a calendar event corresponding to the current time, for example a memo set by the user, is retrieved and displayed on the display screen 101, which offers the user additional convenience.
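One way this follow-up behaviour might be sketched is shown below; the calendar lookup and display calls are hypothetical placeholders, since the patent does not name the underlying APIs:

```kotlin
import java.time.LocalDate

data class CalendarEvent(val date: LocalDate, val text: String)

// Hypothetical hooks: a lookup over the user's memos and a routine that renders them on screen.
fun lookupEvents(date: LocalDate): List<CalendarEvent> = emptyList()          // placeholder
fun showOnDisplay(events: List<CalendarEvent>) { events.forEach(::println) }  // placeholder

// Invoked when a touch on the first touch area is detected while the time is being spoken.
fun onFirstAreaTouchedDuringPlayback() {
    showOnDisplay(lookupEvents(LocalDate.now()))
}
```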
In step S216, when the touch area matching result and/or the touch operation matching result are not matched, it is determined that the touch area combination is not the preset touch area combination.
When neither the touch area matching result nor the touch operation matching result is a pass, or when only one of the two is a pass, the touch area combination is determined not to be the preset touch area combination, and the time is not played by voice.
As can be seen from the above, in the touch information processing method provided in this embodiment, when the terminal is in the screen-locked state, a set of touch operation information on the display screen is obtained, the touch area indicated by each piece of touch operation information in the set is analyzed to form a touch area combination, and the light-pressure or heavy-pressure touch operation indicated by the touch pressure value of each piece of touch operation information is obtained to form a touch operation combination. Only when both the touch area combination and the touch operation combination match their respective presets is the current time played by voice, which further guards against accidental triggering while still allowing the user to check the time without lighting the display screen, thereby saving the electric quantity of the terminal.
In order to better implement the touch information processing method provided in the embodiments of the present application, an apparatus based on that method is also provided. The terms used have the same meanings as in the touch information processing method described above, and for specific implementation details reference may be made to the description in the method embodiments.
Referring to fig. 5, fig. 5 is a schematic block diagram of a touch information processing apparatus according to an embodiment of the present disclosure. Specifically, the touch information processing device 300 includes: an acquisition module 31, an analysis module 32, a judgment module 33, and a playback module 34.
The obtaining module 31 is configured to obtain a set of touch operation information on the display screen when the terminal is in the screen lock state.
Further, when the terminal is in the screen-locked state, the obtaining module 31 continuously detects the touch operations occurring on the display screen to generate a set of touch operation information. It should be noted that only touch operation events that occur consecutively are treated as one set of touch operation information; that is, the time interval between successive touch operation events must be within a preset time interval. For example, if the preset time interval is 2 seconds, a timer is started when a touch operation event occurs; if a new touch operation event is detected within 2 seconds, the events are regarded as consecutive and belong to the same set of touch operation information. If no new touch operation event is detected within 2 seconds, the touch operations are not regarded as consecutive and do not form a set of touch operation information.
The analysis module 32 is configured to analyze the group of touch operation information to obtain a touch area indicated by each piece of touch operation information in the group of touch operation information, so as to form a touch area combination.
The analysis module 32 analyzes the touch coordinate value in each piece of touch operation information of the group to determine whether the touch area it indicates is the first touch area or the second touch area. Specifically, when the touch coordinate value falls within the coordinate value range of the first touch area, the touch area indicated by that touch operation information is determined to be the first touch area; when it falls within the coordinate value range of the second touch area, the indicated touch area is determined to be the second touch area.
On this basis, the touch area indicated by each piece of touch operation information in the group is obtained to form the touch area combination.
The determining module 33 is configured to determine whether the touch area combination is a preset touch area combination.
The determining module 33 determines whether the touch area combination is the preset touch area combination. Specifically, each touch area in the touch area combination is compared, one by one and in order, with the corresponding touch area in the preset touch area combination; when all of them match, the touch area combination is determined to be the preset touch area combination, and otherwise it is not.
The playing module 34 is configured to obtain the current time and perform voice playing when it is determined that the touch area combination is the preset touch area combination.
When the determining module 33 determines that every touch area in the touch area combination matches the corresponding touch area in the preset touch area combination, the group of touch operations is regarded as intentional, and the playing module 34 obtains the current time of the terminal and plays it by voice.
Referring to fig. 6, fig. 6 is a schematic diagram of another module of the apparatus for processing touch information according to the embodiment of the present disclosure, where the apparatus 300 for processing touch information further includes:
the analysis module 32 may further include an obtaining sub-module 321, an analysis sub-module 322, a first determining sub-module 323, a second determining sub-module 324, and a combining sub-module 325.
Specifically, the obtaining sub-module 321 is configured to obtain each piece of touch operation information in the group of touch operation information. The analysis sub-module 322 is configured to sequentially analyze each piece of touch operation information to obtain its corresponding touch coordinate value. The first determining sub-module 323 is configured to determine that the touch area indicated by the touch operation information is the first touch area when the touch coordinate value falls within the first touch area. The second determining sub-module 324 is configured to determine that the touch area indicated by the touch operation information is the second touch area when the touch coordinate value falls within the second touch area. The combining sub-module 325 is configured to combine the touch areas indicated by each piece of touch operation information to form a touch area combination.
The detecting module 35 is configured to detect whether a touch event occurs in the first touch area within a preset time;
and the display module 36 is configured to, when it is detected that a touch event occurs on the first touch area, acquire a calendar event corresponding to the current time, and display the calendar event on the display screen.
As can be seen from the above, in the touch information processing apparatus provided in this embodiment, when the terminal is in the screen-locked state, a group of touch operation information on the display screen is acquired, the touch area indicated by each piece of touch operation information in the group is analyzed to form a touch area combination, and when the touch area combination is the preset touch area combination, the current time is played by voice. This spares the user the cumbersome steps otherwise needed to check the time and does not require lighting the display screen, thereby saving the electric quantity of the terminal.
Embodiments of the present application also provide a terminal, as shown in fig. 7, the terminal 400 may include a memory 401 having one or more computer-readable storage media, a sensor 402, an input unit 403, a display 404, and a processor 405 having one or more processing cores. Those skilled in the art will appreciate that the terminal structure shown in fig. 7 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The memory 401 may be used to store applications and data. The memory 401 stores applications containing executable code. The application programs may constitute various functional modules. The processor 405 executes various functional applications and data processing by running the application programs stored in the memory 401. Further, the memory 401 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. Accordingly, the memory 401 may also include a memory controller to provide the processor 405 and the input unit 403 with access to the memory 401.
The terminal may also include at least one sensor 402, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel according to the brightness of ambient light, and a proximity sensor that may turn off the display panel and/or the backlight when the terminal is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the mobile phone is stationary, and can be used for applications of recognizing the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured in the terminal, detailed description is omitted here.
The input unit 403 may be used to receive input numbers, character information, or user characteristic information, such as a fingerprint, and generate a keyboard, mouse, joystick, optical, or trackball signal input related to user setting and function control. In particular, in a particular embodiment, the input unit 403 may include a touch-sensitive surface as well as other input devices. The touch-sensitive surface, also referred to as a touch display screen or a touch pad, may collect touch operations by a user (e.g., operations by a user on or near the touch-sensitive surface using a finger, a stylus, or any other suitable object or attachment) thereon or nearby, and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 405, and can receive and execute commands sent by the processor 405. In addition, touch sensitive surfaces may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. The input unit 403 may include other input devices in addition to the touch-sensitive surface. In particular, other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a fingerprint recognition module, a trackball, a mouse, a joystick, and the like.
The display screen 404 may be used to display information entered by or provided to the user as well as various graphical user interfaces of the terminal, which may be composed of graphics, text, icons, video, and any combination thereof. The display screen 404 may include a display panel. Alternatively, the Display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch-sensitive surface may overlay the display panel, and when a touch operation is detected on or near the touch-sensitive surface, the touch operation is transmitted to the processor 405 to determine the type of touch event, and then the processor 405 provides a corresponding visual output on the display panel according to the type of touch event. Although in FIG. 7 the touch-sensitive surface and the display panel are two separate components to implement input and output functions, in some embodiments the touch-sensitive surface may be integrated with the display panel to implement input and output functions.
The processor 405 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by running or executing an application program stored in the memory 401 and calling data stored in the memory 401, thereby performing overall monitoring of the terminal. Optionally, processor 405 may include one or more processing cores; preferably, the processor 405 may integrate an application processor and a modem processor, wherein the application processor mainly processes an operating system, a user interface, an application program, and the like.
Although not shown in fig. 7, the terminal may further include a camera, a bluetooth module, a power supply, and the like, which are not described in detail herein.
Specifically, in this embodiment, the processor 405 in the terminal loads the executable code corresponding to the process of one or more application programs into the memory 401 according to the following instructions, and the processor 405 runs the application program stored in the memory 401, thereby implementing various functions:
when the processor 405 detects that the terminal is in the screen locking state, a set of touch operation information on the display screen is acquired.
The processor 405 analyzes the set of touch operation information to obtain a touch area indicated by each piece of touch operation information in the set of touch operation information, so as to form a touch area combination.
The processor 405 determines whether the touch area combination is a predetermined touch area combination.
When the processor 405 determines that the touch area combination is the preset touch area combination, the current time is acquired, and the voice is played.
When the processor 405 analyzes the set of touch operation information to obtain the touch area indicated by each piece of touch operation information and form a touch area combination, the method may include: acquiring each piece of touch operation information in the group; sequentially analyzing each piece of touch operation information to obtain its corresponding touch coordinate value; when the touch coordinate value falls within the first touch area, determining that the touch area indicated by the touch operation information is the first touch area; when the touch coordinate value falls within the second touch area, determining that the touch area indicated by the touch operation information is the second touch area; and combining the touch areas indicated by each piece of touch operation information to form a touch area combination.
Before the processor 405 determines whether the touch area combination is the preset touch area combination, the method may further include: acquiring a touch pressure value corresponding to each touch operation information in the group of touch operation information; judging whether the touch pressure value is larger than a preset threshold value or not; when the touch pressure value is larger than a preset threshold value, determining that the touch operation information corresponding indication is a heavy-pressure touch operation; when the touch pressure value is smaller than a preset threshold value, determining that the touch operation information corresponding indication is a light-pressure touch operation; and combining the light touch operation and the heavy touch operation which are correspondingly indicated by each piece of touch operation information to form a touch operation combination.
When the processor 405 determines whether the touch area combination is the preset touch area combination, the method may include: comparing the touch area combination with preset touch area combinations one by one to obtain a touch area matching result; comparing the touch operation combination with preset touch operation combinations one by one to obtain a touch operation matching result; when the touch area matching result and the touch operation matching result are matched, determining that the touch area combination is a preset touch area combination; and when the touch area matching result and/or the touch operation matching result are not matched, determining that the touch area combination is not a preset touch area combination.
After the processor 405 performs acquiring the current time and performs voice playing, the method may further include: detecting whether a touch event occurs on the first touch area within a preset time; and when the touch event on the first touch area is detected, acquiring a calendar event corresponding to the current time, and displaying the calendar event on the display screen.
Since the terminal can execute any of the touch information processing methods provided in the embodiments of the present application, it can achieve the beneficial effects of any of those methods, which are detailed in the foregoing embodiments and are not repeated here.
In a specific implementation, the above units may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and the specific implementation of the above units may refer to the foregoing method embodiments, which are not described herein again.
In the foregoing embodiments, the descriptions of the embodiments have their respective emphases; for parts that are not described in detail in a particular embodiment, reference may be made to the detailed description of the touch information processing method above, which is not repeated here.
The touch information processing method, apparatus, storage medium and terminal (such as a mobile phone, a tablet computer or a Personal Digital Assistant (PDA)) provided in the embodiments of the present application belong to the same concept, and any method provided in the method embodiments may be run on the touch information processing apparatus.
It should be noted that, for the touch information processing method of the present application, a person of ordinary skill in the art will understand that all or part of the process of implementing the method can be completed by a computer program controlling the relevant hardware. The computer program may be stored in a computer-readable storage medium, such as the memory of a terminal, and executed by at least one processor in the terminal; its execution may include the processes of the embodiments of the touch information processing method. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
In the touch information processing apparatus of the embodiments of the present application, the functional modules may be integrated into one processing chip, may exist separately as physical units, or two or more of them may be integrated into one module. The integrated module may be implemented in hardware or as a software functional module. If implemented as a software functional module and sold or used as a stand-alone product, the integrated module may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk or an optical disk.
The foregoing describes in detail the touch information processing method, apparatus, storage medium and terminal provided in the embodiments of the present application. Specific examples have been used herein to explain the principles and implementations of the present application, and the descriptions of the above embodiments are intended only to help understand the method and its core ideas. Meanwhile, a person skilled in the art may, following the ideas of the present application, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (8)

1. A touch information processing method, applied to a terminal, wherein the terminal comprises a display screen and the touch area of the display screen comprises a first touch area and a second touch area, characterized in that the method comprises the following steps:
when the terminal is in a screen locking state, acquiring a group of touch operation information on the display screen;
analyzing the group of touch operation information to obtain a touch area correspondingly indicated by each piece of touch operation information in the group of touch operation information so as to form a touch area combination;
judging whether the touch area combination is a preset touch area combination or not;
when the touch area combination is judged to be the preset touch area combination, acquiring the current time and carrying out voice playing;
detecting whether a touch event occurs on the first touch area within a preset time;
and when the touch event on the first touch area is detected, acquiring a calendar event corresponding to the current time, and displaying the calendar event on the display screen.
2. The method of claim 1, wherein the analyzing the set of touch operation information to obtain touch areas indicated by each piece of touch operation information in the set of touch operation information to form a touch area combination comprises:
acquiring each touch operation information in the group of touch operation information;
sequentially analyzing each piece of touch operation information to obtain a touch coordinate value corresponding to the touch operation information;
when the touch coordinate value falls into the first touch area, determining that the touch area indicated by the touch operation information is the first touch area;
when the touch coordinate value falls into the second touch area, determining that the touch area indicated by the touch operation information is the second touch area;
and combining the touch areas indicated by each piece of touch operation information to form a touch area combination.
3. The method for processing touch information according to claim 2, wherein the touch operation information further includes a touch pressure value;
before judging whether the touch area combination is the preset touch area combination, the method further comprises the following steps:
acquiring a touch pressure value corresponding to each touch operation information in the group of touch operation information;
determining whether the touch pressure value is greater than a preset threshold;
when the touch pressure value is greater than the preset threshold, determining that the touch operation information indicates a heavy-pressure touch operation;
when the touch pressure value is smaller than the preset threshold, determining that the touch operation information indicates a light-pressure touch operation;
and combining the light-pressure and heavy-pressure touch operations indicated by the pieces of touch operation information to form a touch operation combination.
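The pressure classification of claim 3 can be illustrated as follows. The threshold of 0.5 is an assumed value, and a pressure exactly equal to the threshold is treated as light in this sketch, a case the claim leaves open.

```kotlin
enum class PressureKind { LIGHT, HEAVY }

data class TouchOperation(val x: Float, val y: Float, val pressure: Float)

// Assumed preset threshold for telling heavy presses from light ones.
const val PRESSURE_THRESHOLD = 0.5f

// A pressure value above the threshold indicates a heavy-pressure touch operation,
// otherwise a light-pressure touch operation.
fun classifyPressure(op: TouchOperation): PressureKind =
    if (op.pressure > PRESSURE_THRESHOLD) PressureKind.HEAVY else PressureKind.LIGHT

// Combine the per-touch classifications into a touch operation combination.
fun buildOperationCombination(ops: List<TouchOperation>): List<PressureKind> =
    ops.map(::classifyPressure)
```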
4. The touch information processing method of claim 3, wherein determining whether the touch area combination is a preset touch area combination comprises:
comparing the touch area combination with a preset touch area combination one by one to obtain a touch area matching result;
comparing the touch operation combination with preset touch operation combinations one by one to obtain a touch operation matching result;
when both the touch area matching result and the touch operation matching result indicate a match, determining that the touch area combination is the preset touch area combination;
and when the touch area matching result and/or the touch operation matching result indicates a mismatch, determining that the touch area combination is not the preset touch area combination.
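The two-sided check of claim 4 then reduces to comparing both combinations with their presets element by element, as in the sketch below; the function and type names are again assumptions for illustration.

```kotlin
enum class TouchArea { FIRST, SECOND }
enum class PressureKind { LIGHT, HEAVY }

// The gesture is accepted only when both the touch area combination and the
// touch operation (pressure) combination match their presets one by one;
// a mismatch in either is enough to reject it.
fun matchesPreset(
    areas: List<TouchArea>,
    presetAreas: List<TouchArea>,
    pressures: List<PressureKind>,
    presetPressures: List<PressureKind>
): Boolean {
    val areaMatch = areas.size == presetAreas.size &&
        areas.zip(presetAreas).all { (a, b) -> a == b }
    val operationMatch = pressures.size == presetPressures.size &&
        pressures.zip(presetPressures).all { (a, b) -> a == b }
    return areaMatch && operationMatch
}
```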
5. A touch information processing apparatus, applied to a terminal, wherein the terminal comprises a display screen and a touch area of the display screen comprises a first touch area and a second touch area, the apparatus comprising:
an acquisition module, configured to acquire a group of touch operation information on the display screen when the terminal is in a screen-locked state;
an analysis module, configured to analyze the group of touch operation information to obtain the touch area indicated by each piece of touch operation information in the group, so as to form a touch area combination;
a judging module, configured to determine whether the touch area combination is a preset touch area combination;
a playing module, configured to acquire the current time and play it back by voice when the touch area combination is determined to be the preset touch area combination;
a detection module, configured to detect whether a touch event occurs on the first touch area within a preset time;
and a display module, configured to acquire a calendar event corresponding to the current time and display the calendar event on the display screen when the touch event on the first touch area is detected.
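For the apparatus of claim 5, one way to picture the claimed module decomposition is as a set of interfaces, one per module; the signatures below are assumptions chosen only to mirror the claim wording, and an implementation would wire them together in the same order as the method of claim 1.

```kotlin
import java.time.LocalDateTime

interface AcquisitionModule { fun acquireLockScreenTouches(): List<Pair<Float, Float>> }
interface AnalysisModule    { fun toAreaCombination(touches: List<Pair<Float, Float>>): List<String> }
interface JudgingModule     { fun isPresetCombination(combination: List<String>): Boolean }
interface PlayingModule     { fun speakCurrentTime(now: LocalDateTime) }
interface DetectionModule   { fun firstAreaTouchedWithin(timeoutMs: Long): Boolean }
interface DisplayModule     { fun showCalendarEvents(at: LocalDateTime) }
```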
6. The touch information processing apparatus of claim 5, wherein the analysis module comprises:
an acquisition submodule, configured to acquire each piece of touch operation information in the group of touch operation information;
an analysis submodule, configured to analyze each piece of touch operation information in turn to obtain the touch coordinate value corresponding to the touch operation information;
a first determining submodule, configured to determine that the touch area indicated by the touch operation information is the first touch area when the touch coordinate value falls into the first touch area;
a second determining submodule, configured to determine that the touch area indicated by the touch operation information is the second touch area when the touch coordinate value falls into the second touch area;
and a combining submodule, configured to combine the touch areas indicated by the pieces of touch operation information to form a touch area combination.
7. A storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the steps of the method for processing touch information according to any one of claims 1 to 4.
8. A terminal, comprising:
a memory storing executable program code;
a processor coupled with the memory;
the processor calls the executable program code stored in the memory to execute the touch information processing method according to any one of claims 1 to 4.
CN201711353203.9A 2017-12-15 2017-12-15 Touch information processing method and device, storage medium and terminal Active CN108062199B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711353203.9A CN108062199B (en) 2017-12-15 2017-12-15 Touch information processing method and device, storage medium and terminal

Publications (2)

Publication Number Publication Date
CN108062199A (en) 2018-05-22
CN108062199B (en) 2020-05-12

Family

ID=62139149

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711353203.9A Active CN108062199B (en) 2017-12-15 2017-12-15 Touch information processing method and device, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN108062199B (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9213482B2 (en) * 2011-11-11 2015-12-15 Elan Microelectronics Corporation Touch control device and method
US9933884B2 (en) * 2015-07-29 2018-04-03 Stmicroelectronics Asia Pacific Pte Ltd Correcting coordinate jitter in touch screen displays due to forceful touches

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106575170A (en) * 2014-07-07 2017-04-19 Samsung Electronics Co., Ltd. Method of performing a touch action in a touch sensitive device
CN105872228A (en) * 2016-03-31 2016-08-17 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Alarm clock reminding processing method and device, and terminal
CN106775413A (en) * 2016-12-27 2017-05-31 Nubia Technology Co., Ltd. A control method and terminal
CN106951161A (en) * 2017-03-29 2017-07-14 Lenovo (Beijing) Co., Ltd. An information processing method and electronic device
CN107193455A (en) * 2017-04-27 2017-09-22 Nubia Technology Co., Ltd. An information processing method and mobile terminal

Also Published As

Publication number Publication date
CN108062199A (en) 2018-05-22

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Wusha Haibin Road, Chang'an Town, Dongguan, Guangdong Province 523860

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., Ltd.

Address before: No. 18 Wusha Haibin Road, Chang'an Town, Dongguan, Guangdong Province 523860

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., Ltd.

GR01 Patent grant