WO2021057654A1 - Screen touch management method, smart terminal, device and readable storage medium - Google Patents
Screen touch management method, smart terminal, device and readable storage medium
- Publication number
- WO2021057654A1 (PCT/CN2020/116484)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- maximum
- direction coordinate
- coordinate difference
- determined
- Prior art date
Links
- 238000007726 management method Methods 0.000 title claims abstract description 42
- 238000005070 sampling Methods 0.000 claims abstract description 111
- 230000000903 blocking effect Effects 0.000 claims description 11
- 238000013459 approach Methods 0.000 claims description 6
- 230000004913 activation Effects 0.000 claims description 4
- 230000004044 response Effects 0.000 claims 4
- 230000003213 activating effect Effects 0.000 abstract 1
- 238000000034 method Methods 0.000 description 10
- 238000004891 communication Methods 0.000 description 4
- 238000010586 diagram Methods 0.000 description 4
- 238000005516 engineering process Methods 0.000 description 4
- 230000008569 process Effects 0.000 description 4
- 230000001133 acceleration Effects 0.000 description 2
- 230000009471 action Effects 0.000 description 2
- 230000004888 barrier function Effects 0.000 description 2
- 238000003491 array Methods 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 230000001815 facial effect Effects 0.000 description 1
- 230000005057 finger movement Effects 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 230000005484 gravity Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000009527 percussion Methods 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/13—Sensors therefor
- G06V40/1318—Sensors therefor using electro-optical elements or layers, e.g. electroluminescent sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Definitions
- the present disclosure relates to, but is not limited to, the technical field of smart terminals.
- the present disclosure provides a screen touch management method.
- the screen touch management method includes: when the front camera is not turned on and the proximity sensor detects that an object approaches the front camera, starting the front camera and determining whether the approaching object is a user's finger; when it is determined that the approaching object is a user's finger, activating the tracking mode, and obtaining the motion trajectory of the user's finger based on the front camera and the tracking mode; obtaining the four-dimensional coordinates of sampling points on the motion trajectory based on a preset time interval, the four-dimensional coordinates including a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate and a trajectory time; and determining the touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and performing a touch operation based on the touch type.
- the present disclosure also provides a screen touch management device, which includes: an activation module configured to start the front camera and determine whether an approaching object is a user's finger when the front camera is not turned on and the proximity sensor detects an object approaching the front camera; an acquisition module configured to activate the tracking mode when it is determined that the approaching object is a user's finger, and to obtain the motion trajectory of the user's finger based on the front camera and the tracking mode; a sampling module configured to obtain the four-dimensional coordinates of sampling points on the motion trajectory based on a preset time interval, the four-dimensional coordinates including a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate and a trajectory time; and a determining module configured to determine the touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and to perform a touch operation based on the touch type.
- the present disclosure also provides an intelligent terminal.
- the intelligent terminal includes a memory, a processor, and a screen touch management program that is stored in the memory and can run on the processor; when the screen touch management program is executed by the processor, any one of the screen touch management methods described herein is implemented.
- the present disclosure also provides a readable storage medium having a screen touch management program stored thereon; when the screen touch management program is executed by a processor, any one of the screen touch management methods described herein is implemented.
- FIG. 1 is a schematic structural diagram of a smart terminal in a hardware operating environment involved in an embodiment of the present disclosure
- FIG. 2 is a schematic flowchart of a screen touch management method according to an embodiment of the disclosure
- FIG. 3 is a schematic flowchart of a screen touch management method according to an embodiment of the disclosure.
- FIG. 4 is a schematic flowchart of a screen touch management method according to an embodiment of the disclosure.
- FIG. 5 is a schematic flowchart of a screen touch management method according to an embodiment of the present disclosure.
- FIG. 6 is a schematic diagram of functional modules of a screen touch management device according to an embodiment of the disclosure.
- Fig. 1 is a schematic structural diagram of a smart terminal in a hardware operating environment involved in a solution of an embodiment of the present disclosure.
- the smart terminal may include: a processor 1001, such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002.
- the communication bus 1002 is used to implement connection and communication between these components.
- the user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard).
- the user interface 1003 may also include a standard wired interface and a wireless interface.
- the network interface 1004 may include a standard wired interface and a wireless interface (such as a WI-FI interface).
- the memory 1005 may be a high-speed RAM memory or a non-volatile memory, such as a magnetic disk memory.
- the memory 1005 may also be a storage device independent of the aforementioned processor 1001.
- the smart terminal may also include a camera, an RF (Radio Frequency) circuit, a sensor, an audio circuit, a WiFi module, and so on.
- the sensors include, for example, light sensors, motion sensors, and other sensors.
- the light sensor may include an ambient light sensor and a proximity sensor.
- the ambient light sensor can adjust the brightness of the display screen according to the brightness of the ambient light.
- the proximity sensor can turn off the display screen and/or backlight when the mobile terminal is moved to the ear.
- the posture sensor can detect the magnitude of acceleration in various directions (usually along three axes), and can detect the magnitude and direction of gravity when stationary.
- the smart terminal can use the posture sensor for applications that recognize the posture of the mobile terminal (such as switching between horizontal and vertical screens, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tap detection); in addition, the smart terminal may also be equipped with other sensors such as a gyroscope, barometer, hygrometer, thermometer, and infrared sensor, which will not be described in detail here.
- the structure of the smart terminal shown in FIG. 1 does not constitute a limitation on the terminal, which may include more or fewer components than shown in the figure, combine certain components, or use a different component arrangement.
- the memory 1005 which is a computer storage medium, may include an operating system, a network communication module, a user interface module, and a screen touch management program.
- the network interface 1004 is mainly used to connect to a back-end server and communicate with the back-end server;
- the user interface 1003 is mainly used to connect to a client (user side) to communicate with the client;
- the processor 1001 can be used to call the screen touch management program stored in the memory 1005.
- the smart terminal includes: a memory 1005, a processor 1001, and a screen touch management program that is stored on the memory 1005 and can run on the processor 1001.
- when the processor 1001 calls the screen touch management program stored in the memory 1005, the screen touch management method provided by each embodiment of the present disclosure is executed.
- FIG. 2 is a schematic flowchart of a screen touch management method according to an embodiment of the present disclosure.
- the embodiments of the present disclosure provide an embodiment of a screen touch management method. It should be noted that although a logical sequence is shown in the flowchart, in some cases the steps shown or described may be executed in an order different from that given here.
- the screen touch management method includes the following steps S100 to S400.
- step S100 when the front camera is not turned on and the proximity sensor detects that an object approaches the front camera, the front camera is activated, and it is determined whether the approaching object is a user's finger.
- the current "full screens" of smart terminals are in fact quasi-full screens; true full-screen mobile phones will appear in the future and are an inevitable trend.
- Full-screen mobile phones have overcome technical barriers such as under-screen fingerprint recognition and sound emission through the screen.
- the technical barrier that still needs to be resolved is the under-screen camera.
- Under-screen cameras need to solve two main problems: first, display at the position of the under-screen camera opening on the screen; second, touch control at the position of the under-screen camera opening.
- the technical solution of the present disclosure solves the second problem: touch control at the position of the under-screen camera opening.
- the smart terminal is provided with a proximity sensor at the camera opening position to detect whether an object is close to the under-screen camera, so as to trigger the activation of the under-screen camera.
- the proximity sensor is a device capable of perceiving the proximity of an object. It uses the sensitive characteristic of a displacement sensor to recognize the approach of an object and outputs a corresponding switch signal; for this reason, a proximity sensor is usually called a proximity switch. It is a general term for sensors that replace contact detection methods such as limit switches, need not touch the detected object, and can detect the movement and presence of an object and convert them into an electrical signal.
- the smart terminal is also equipped with a tracking mode, which is used to capture the movement trajectory of the user's finger according to the video image taken by the camera.
- when the front camera of the smart terminal is not turned on and the proximity sensor detects that an object is close to the front camera, the front camera is activated to capture an image of the approaching object, and it is determined whether the approaching object is the user's finger according to a preset image recognition algorithm. It should be noted that if the front camera of the smart terminal is already in use for taking photos, facial recognition, video recording, video calls, and so on, and the proximity sensor detects an object approaching the front camera, the terminal will not respond to the corresponding touch event at the under-screen front camera opening position.
- step S200 when it is determined that the approaching object is a user's finger, the tracking mode is activated, and the movement track of the user's finger is acquired based on the front camera and the tracking mode.
- when the proximity sensor detects that an object is approaching the front camera, and it is determined from the front camera image and the preset image recognition algorithm that the approaching object is the user's finger, the tracking mode of the smart terminal is started, and the movement trajectory of the user's finger is then captured from the video images taken by the camera.
- Motion trajectory refers to the spatial characteristics of the action composed of the route taken by a certain part of the body from the start position to the end.
- the motion trajectory is represented by the motion trajectory direction, the motion trajectory form and the motion amplitude.
- the movement trajectory of the user's finger refers to the spatial characteristics of the action composed of the route that the finger travels from the start position to the end in the shooting area after the front camera is turned on.
- the direction of the movement trajectory of the finger is constantly changing, and the trajectory form is a curve.
- moving target tracking means finding the moving target of interest in real time in each image of a sequence of images, including motion parameters such as position, speed, and acceleration.
- the tracking mode of the smart terminal uses an existing tracking algorithm to recognize the user's finger in the video images taken by the front camera; the route the finger travels is the finger's movement trajectory.
- step S300 four-dimensional coordinates of sampling points on the motion track are acquired based on a preset time interval, where the four-dimensional coordinates include horizontal X-direction coordinates, vertical Y-direction coordinates, near-far Z-direction coordinates, and track time;
- the trajectory of the finger can be represented in a four-dimensional coordinate system whose axes are the horizontal X-axis, the vertical Y-axis, the near-far Z-axis, and the time T-axis.
- the origin of the coordinate system can be set according to the actual situation.
- the lower left corner of the opening position corresponding to the front camera under the screen is the origin.
- horizontal to the right is the positive coordinate direction of the X-axis
- vertical upward is the positive coordinate direction of the Y-axis.
- the direction away from the screen is the positive direction of the Z axis
- the time T axis is the real time.
- the trajectory of the finger is a curve expressed in a four-dimensional coordinate system. Therefore, the trajectory can be sampled according to a preset time interval to obtain multiple sampling points.
- Each sampling point is represented by four-dimensional coordinates.
- the four-dimensional coordinates include the horizontal X-direction coordinate, the vertical Y-direction coordinate, the near-far Z-direction coordinate, and the trajectory time; these sampling points are used to determine the touch type corresponding to the finger operation, for example, whether the current finger operation is a left slide, a down slide, and so on.
- the preset time interval is determined according to the actual situation; it determines the number of sampling points, and at least two sampling points must be guaranteed.
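The sampling step above can be sketched as follows; `frames` is a hypothetical list of (t, x, y, z) tuples produced by the tracking mode from successive camera images, and the function name and interval value are illustrative assumptions, not part of the disclosure:

```python
def sample_trajectory(frames, interval):
    """Keep one (x, y, z, t) sampling point per `interval` seconds.

    frames: list of (t, x, y, z) tuples from the tracking mode, assumed
    sorted by time; interval: the preset time interval in seconds.
    """
    points, next_t = [], 0.0
    for t, x, y, z in frames:
        if t >= next_t:
            points.append((x, y, z, t))  # four-dimensional coordinates
            next_t = t + interval
    return points
```

With a fine enough raw frame rate, this yields the required minimum of two sampling points for any trajectory longer than one interval.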
- step S400 a touch type corresponding to the motion track is determined based on the four-dimensional coordinates of each sampling point and a preset touch template, and a touch operation is performed based on the touch type.
- the touch type corresponding to the motion trajectory is further determined according to the four-dimensional coordinates of the sampling points and the preset touch template.
- Touch types include: left-slide touch, right-slide touch, up-slide touch, down-slide touch, single-click touch, double-click touch, and long-press touch; the preset touch template stores left-right sliding threshold data, up-down sliding threshold data, and click threshold data.
- the left-right sliding threshold data is a three-dimensional array including X-direction data, Y-direction data, and Z-direction data, and is used to determine whether a touch is a left or right sliding touch.
- similarly, the up-down sliding threshold data and the click threshold data are also three-dimensional arrays including X-direction data, Y-direction data, and Z-direction data, and are used to determine whether a touch is an up-down sliding touch or a click touch, respectively.
- step S400 includes the following steps S410 and S420.
- step S410 the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference are determined based on the four-dimensional coordinate values of the respective sampling points.
- the X-direction, Y-direction, and Z-direction coordinate differences between each pair of sampling points are calculated; the maximum X-direction coordinate difference is obtained from all X-direction coordinate differences, the maximum Y-direction coordinate difference from all Y-direction coordinate differences, and the maximum Z-direction coordinate difference from all Z-direction coordinate differences.
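This computation can be sketched as below (the function name is illustrative); because the largest pairwise difference along one axis equals the maximum minus the minimum along that axis, no explicit pairwise loop is needed:

```python
def max_coordinate_differences(points):
    """Return the maximum X-, Y-, and Z-direction coordinate differences.

    points: list of (x, y, z, t) sampling points.  The largest pairwise
    difference along each axis is simply max - min along that axis.
    """
    xs, ys, zs, _ts = zip(*points)
    return max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs)
```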
- step S420 the touch type corresponding to the motion track is determined based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and a preset touch template.
- the left-right sliding threshold data, the up-down sliding threshold data, and the click threshold data are stored in the preset touch template; the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference are matched against the preset touch template, and the touch type corresponding to the motion trajectory is determined according to the matching result.
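One way to sketch this matching step: pick the candidate touch family from the dominant coordinate axis, then check the three maxima against that family's threshold data. All threshold values and names below are illustrative assumptions; the disclosure does not give concrete figures:

```python
# Hypothetical touch template: the numeric thresholds are assumptions.
TOUCH_TEMPLATE = {
    "left-right": {"min_main": 2.0, "max_other": 1.0},
    "up-down":    {"min_main": 2.0, "max_other": 1.0},
    "click":      {"min_main": 1.0, "max_other": 0.5},
}

def match_template(dx, dy, dz, template=TOUCH_TEMPLATE):
    """Classify the maximum X/Y/Z coordinate differences against the
    preset touch template; return the matched family or None."""
    if dx > dy and dx > dz:
        family, main, others = "left-right", dx, (dy, dz)
    elif dy > dx and dy > dz:
        family, main, others = "up-down", dy, (dx, dz)
    else:
        family, main, others = "click", dz, (dx, dy)
    limits = template[family]
    if main >= limits["min_main"] and max(others) <= limits["max_other"]:
        return family
    return None  # no match: invalid touch, no operation is performed
```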
- when the front camera is not turned on and the proximity sensor detects that an object is approaching the front camera, the front camera is activated and it is determined whether the approaching object is the user's finger; the motion trajectory of the user's finger is then obtained based on the front camera and the tracking mode; the four-dimensional coordinates of the sampling points on the motion trajectory are obtained based on a preset time interval, the four-dimensional coordinates including the horizontal X-direction coordinate, the vertical Y-direction coordinate, the near-far Z-direction coordinate, and the trajectory time; finally, the touch type corresponding to the motion trajectory is determined based on the four-dimensional coordinates of each sampling point and a preset touch template, and a touch operation is performed based on the touch type.
- step S420 includes the following steps S421 and S422.
- step S421 when it is determined that the maximum X-direction coordinate difference is greater than the maximum Y-direction coordinate difference and greater than the maximum Z-direction coordinate difference, the left-right sliding threshold data is acquired.
- when the maximum X-direction coordinate difference is greater than the maximum Y-direction coordinate difference and greater than the maximum Z-direction coordinate difference, it can be preliminarily judged that the current touch is a left-slide touch or a right-slide touch, so the left-right sliding threshold data in the preset touch template is obtained.
- step S422 when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data, it is determined that the touch type is a left-slide touch or a right-slide touch.
- the smart terminal further determines whether the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data.
- if the above data match, the touch can be further determined to be a left-slide touch or a right-slide touch.
- if the above data do not match, the current motion trajectory corresponds to an invalid touch, and the smart terminal performs no operation.
- step S422 includes the following steps a to c.
- step a the first sampling point and the second sampling point corresponding to the maximum X-direction coordinate difference are obtained, and the X-direction coordinate of the first sampling point is greater than the X-direction coordinate of the second sampling point.
- when the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data, it is further determined whether the current sliding touch is to the left or to the right.
- the two sampling points corresponding to the maximum X-direction coordinate difference are obtained: the first sampling point and the second sampling point, where the X-direction coordinate of the first sampling point is defined to be greater than that of the second sampling point; the times corresponding to the two sampling points are then compared.
- step b when the trajectory time of the first sampling point is later than the trajectory time of the second sampling point, it is determined that the touch type is right sliding touch.
- for a right slide, the X coordinate of the sampling point with the earlier trajectory time is smaller than the X coordinate of the sampling point with the later trajectory time.
- therefore, when the trajectory time of the first sampling point is later than the trajectory time of the second sampling point, it is determined that the touch type corresponding to the current motion trajectory is a right-slide touch.
- step c when the trajectory time of the first sampling point is earlier than the trajectory time of the second sampling point, it is determined that the touch type is left swipe touch.
- for a left slide, the X coordinate of the sampling point with the earlier trajectory time is greater than the X coordinate of the sampling point with the later trajectory time, so when the trajectory time of the first sampling point is earlier than that of the second sampling point, the touch type is determined to be a left-slide touch.
- in the screen touch management method, when it is determined that the maximum X-direction coordinate difference is greater than the maximum Y-direction coordinate difference and greater than the maximum Z-direction coordinate difference, the left-right sliding threshold data is acquired; when the maximum X-direction, Y-direction, and Z-direction coordinate differences match the left-right sliding threshold data, the touch type is determined to be a left-slide touch or a right-slide touch, and it is then accurately determined which of the two it is. This enables the under-screen camera opening position to respond to screen touch operations.
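The direction test in steps a to c (and its up/down counterpart in the later steps) can be sketched generically; the function name is illustrative, and the sample with the larger dominant-axis coordinate plays the role of the "first" sampling point in the description above:

```python
def swipe_direction(points, axis):
    """Resolve the swipe direction from the trajectory times of the two
    extreme samples on the dominant axis.

    points: list of (x, y, z, t) tuples; axis: 0 for X (left/right)
    or 1 for Y (up/down).
    """
    larger = max(points, key=lambda p: p[axis])   # the "first" sample
    smaller = min(points, key=lambda p: p[axis])  # the "second" sample
    moved_positive = larger[3] > smaller[3]       # compare trajectory times
    if axis == 0:
        return "right-slide" if moved_positive else "left-slide"
    return "up-slide" if moved_positive else "down-slide"
```

For a right slide the larger X coordinate carries the later time, so `moved_positive` is true; for a left slide the times are reversed.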
- step S420 further includes steps S423 and S424.
- step S423 when it is determined that the maximum Y-direction coordinate difference is greater than the maximum X-direction coordinate difference and greater than the maximum Z-direction coordinate difference, the up-down sliding threshold data is acquired.
- step S424 when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data, it is determined that the touch type is an up-slide touch or a down-slide touch.
- the smart terminal further determines whether the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data.
- if the above data match, the touch can be further determined to be an up-slide touch or a down-slide touch.
- if the above data do not match, the current motion trajectory corresponds to an invalid touch, and the smart terminal performs no operation.
- step S424 includes the following steps d to f.
- step d the third sampling point and the fourth sampling point corresponding to the maximum Y-direction coordinate difference are acquired, and the Y-direction coordinate of the third sampling point is greater than the Y-direction coordinate of the fourth sampling point.
- when the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data, it is further determined whether the current sliding touch is upward or downward.
- the two sampling points corresponding to the maximum Y-direction coordinate difference are obtained: the third sampling point and the fourth sampling point, where the Y-direction coordinate of the third sampling point is defined to be greater than that of the fourth sampling point; the times corresponding to the two sampling points are then compared.
- step e when the trajectory time of the third sampling point is later than the trajectory time of the fourth sampling point, it is determined that the touch type is an up-slide touch.
- for an up slide, the Y coordinate of the sampling point with the earlier trajectory time is smaller than the Y coordinate of the sampling point with the later trajectory time.
- therefore, when the trajectory time of the third sampling point is later than the trajectory time of the fourth sampling point, it is determined that the touch type corresponding to the current motion trajectory is an up-slide touch.
- step f when the trajectory time of the third sampling point is earlier than the trajectory time of the fourth sampling point, it is determined that the touch type is a down-slide touch.
- for a down slide, the Y coordinate of the sampling point with the earlier trajectory time is greater than the Y coordinate of the sampling point with the later trajectory time, so when the trajectory time of the third sampling point is earlier than that of the fourth sampling point, it is determined that the touch type corresponding to the current motion trajectory is a down-slide touch.
- in the screen touch management method, when it is determined that the maximum Y-direction coordinate difference is greater than the maximum X-direction coordinate difference and greater than the maximum Z-direction coordinate difference, the up-down sliding threshold data is acquired; when the maximum X-direction, Y-direction, and Z-direction coordinate differences match the up-down sliding threshold data, the touch type is determined to be an up-slide touch or a down-slide touch, and it is then accurately determined which of the two it is. This enables the under-screen camera opening position to respond to screen touch operations.
- step S420 further includes the following steps S425 and S426.
- step S425 when it is determined that the maximum Z-direction coordinate difference is greater than the maximum X-direction coordinate difference, and the maximum Z-direction coordinate difference is greater than the maximum Y-direction coordinate difference, the click threshold data is acquired.
- when the finger approaches the screen position of the front camera opening from far to near and then leaves from near to far within a short time, for example 1 s, it is determined to be a single-click touch event.
- touching twice in a short period of time is judged to be a double-tap touch event.
- when the finger approaches the screen position of the front camera opening from far to near, stays for a period of time, and then leaves from near to far, it is judged to be a long-press touch event. Whether it is a single-click touch, a double-tap touch, or a long-press touch, the Z-direction coordinate of the motion trajectory corresponding to these click touch events changes the most, with only slight changes in the X and Y directions.
- step S426 when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data, it is determined that the touch type is a click touch.
- when the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data, the touch type is determined to be a click touch; when the above data does not match, the current motion track corresponds to an invalid touch, and the smart terminal does not perform any operation.
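The matching step against the threshold data can be sketched as follows. The patent does not specify the format of the threshold data, so the per-axis `(lo, hi)` bounds and the function name below are hypothetical assumptions for illustration only.

```python
def matches_template(dx, dy, dz, threshold_data):
    """Check the three maximum coordinate differences against threshold
    data from the preset touch template.

    threshold_data: hypothetical layout mapping each axis to (lo, hi)
    bounds, e.g. {"x": (0.0, 1.0), "y": (0.0, 1.0), "z": (2.0, 50.0)}
    for a click touch, whose Z-direction change dominates.
    """
    diffs = {"x": dx, "y": dy, "z": dz}
    return all(lo <= diffs[axis] <= hi
               for axis, (lo, hi) in threshold_data.items())
```

A trajectory whose differences fall outside every template's bounds would then be treated as an invalid touch, and no operation is performed.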
- step S426 includes the following steps g to j.
- step g the number of times the front camera is occluded and the occlusion duration of each occlusion are determined based on the motion trajectory.
- the click touch includes single-click touch, double-tap touch, and long-press touch. A single-click touch and a double-tap touch occlude the front camera for a short period of time, for example less than 1 s, while a long-press touch occludes the front camera for a long time, for example greater than or equal to 2 s. Therefore, it is necessary to determine, according to the motion trajectory, the number of times the front camera is occluded and the occlusion duration of each occlusion, so as to further determine whether the touch is a single-click touch, a double-tap touch, or a long-press touch.
- step h when it is determined that the occlusion duration satisfies the long-press condition, it is determined that the touch type is a long-press touch.
- when there is, among all occlusion durations, a long-press occlusion duration longer than the preset duration, it is determined that the occlusion duration satisfies the long-press condition, wherein the preset duration is determined according to actual conditions; for example, the preset duration is equal to 2 s.
- step i when it is determined that the occlusion duration does not satisfy the long-press condition and the number of occlusions is equal to the first preset value, it is determined that the touch type is a single-click touch.
- step j when it is determined that the occlusion duration does not satisfy the long-press condition, and the number of occlusions is greater than or equal to a second preset value, it is determined that the touch type is a double-tap touch, wherein the second preset value is greater than the first preset value.
- the touch type is further determined according to the number of occlusions.
- the second preset value is greater than the first preset value.
- the first preset value is equal to 1.
- the second preset value is equal to 2.
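Steps g through j can be sketched as a single classifier. The function name, the treatment of an empty occlusion list as an invalid touch, and the use of `>=` for the long-press comparison are assumptions; the 2 s duration and the preset values 1 and 2 follow the examples in the text.

```python
def classify_click(occlusion_durations, preset_duration=2.0,
                   first_preset=1, second_preset=2):
    """Classify a click-type touch from front-camera occlusion events.

    occlusion_durations: one duration (in seconds) per time the front
    camera was occluded during the motion trajectory.
    """
    # Step h: any single occlusion at least as long as the preset
    # duration satisfies the long-press condition.
    if any(d >= preset_duration for d in occlusion_durations):
        return "long-press"
    count = len(occlusion_durations)
    # Step j: check the larger count first, since second_preset > first_preset.
    if count >= second_preset:
        return "double-tap"
    # Step i: exactly one short occlusion is a single click.
    if count == first_preset:
        return "single-click"
    return "invalid"
```

The double-tap branch is tested before the single-click branch because the second preset value (2) is greater than the first (1); checking in the other order would never reach it.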
- FIG. 6 is a schematic diagram of functional modules of an embodiment of the screen touch management device of the present disclosure.
- the screen touch management device of the present disclosure includes: an activation module 10 configured to activate the front camera when the front camera is not turned on and the proximity sensor detects that an object approaches the front camera, and to determine whether the approaching object is a user's finger; an acquiring module 20 configured to activate the tracking mode when it is determined that the approaching object is a user's finger, and to obtain the motion trajectory of the user's finger based on the front camera and the tracking mode; a sampling module 30 configured to obtain four-dimensional coordinates of sampling points on the motion trajectory based on a preset time interval, the four-dimensional coordinates including a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate, and a track time; and a determining module 40 configured to determine the touch type corresponding to the motion track based on the four-dimensional coordinates of each sampling point and a preset touch template, and to perform a touch operation based on the touch type.
- the determining module 40 is further configured to determine the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference based on the four-dimensional coordinate values of the respective sampling points;
- and to determine the touch type corresponding to the motion track based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and a preset touch template.
- the determining module 40 is further configured to acquire the left-right sliding threshold data when it is determined that the maximum X-direction coordinate difference is greater than the maximum Y-direction coordinate difference and the maximum X-direction coordinate difference is greater than the maximum Z-direction coordinate difference, and to determine that the touch type is a left-swipe touch or a right-swipe touch when the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data.
- the determining module 40 is further configured to obtain the first sampling point and the second sampling point corresponding to the maximum X-direction coordinate difference, where the X-direction coordinate of the first sampling point is greater than the X-direction coordinate of the second sampling point; when the trajectory time of the first sampling point is later than the trajectory time of the second sampling point, it is determined that the touch type is a right-swipe touch; when the trajectory time of the first sampling point is earlier than the trajectory time of the second sampling point, it is determined that the touch type is a left-swipe touch.
- the determining module 40 is further configured to acquire the up-down sliding threshold data when it is determined that the maximum Y-direction coordinate difference is greater than the maximum X-direction coordinate difference and the maximum Y-direction coordinate difference is greater than the maximum Z-direction coordinate difference, and to determine that the touch type is an up-slide touch or a down-slide touch when the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data.
- the determining module 40 is further configured to: obtain a third sampling point and a fourth sampling point corresponding to the maximum Y-direction coordinate difference, where the Y-direction coordinate of the third sampling point is greater than the Y-direction coordinate of the fourth sampling point; when the trajectory time of the third sampling point is later than the trajectory time of the fourth sampling point, determine that the touch type is an up-slide touch; and when the trajectory time of the third sampling point is earlier than the trajectory time of the fourth sampling point, determine that the touch type is a down-slide touch.
- the determining module 40 is further configured to acquire the click threshold data when it is determined that the maximum Z-direction coordinate difference is greater than the maximum X-direction coordinate difference and the maximum Z-direction coordinate difference is greater than the maximum Y-direction coordinate difference, and to determine that the touch type is a click touch when the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data.
- the determining module 40 is further configured to determine, based on the motion trajectory, the number of times the front camera is occluded and the occlusion duration of each occlusion; when it is determined that the occlusion duration satisfies the long-press condition, determine that the touch type is a long-press touch; when it is determined that the occlusion duration does not satisfy the long-press condition and the number of occlusions is equal to the first preset value, determine that the touch type is a single-click touch; and when it is determined that the occlusion duration does not satisfy the long-press condition and the number of occlusions is greater than or equal to a second preset value, determine that the touch type is a double-tap touch, wherein the second preset value is greater than the first preset value.
- the determining module 40 is further configured to: when there is a long-press occlusion duration greater than a preset duration among all occlusion durations, determine that the occlusion duration satisfies the long-press condition.
- the embodiments of the present disclosure also provide a readable storage medium storing a screen touch management program; when the screen touch management program is executed by a processor, the screen touch management method in each of the above embodiments is realized.
- the present disclosure activates the front camera when the front camera is not turned on and the proximity sensor detects that an object approaches the front camera, and determines whether the approaching object is a user's finger. When it is determined that the approaching object is a user's finger, the tracking mode is activated, and the movement track of the user's finger is obtained based on the front camera and the tracking mode. The four-dimensional coordinates of sampling points on the movement track are then obtained based on a preset time interval, where the four-dimensional coordinates include a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate, and a trajectory time. Finally, the touch type corresponding to the motion trajectory is determined based on the four-dimensional coordinates of each sampling point and a preset touch template, and a touch operation is performed based on the touch type.
- the technical solution of the present disclosure, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product is stored in a readable storage medium as described above (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions for causing a system device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the method described in each embodiment of the present disclosure.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- User Interface Of Digital Computer (AREA)
Claims (12)
- A screen touch management method, applied to a smart terminal provided with a proximity sensor, the proximity sensor being installed within a preset range of a front camera of the smart terminal, the smart terminal being provided with a tracking mode, wherein the screen touch management method comprises: when the front camera is not turned on and it is detected, based on the proximity sensor, that an object approaches the front camera, activating the front camera and determining whether the approaching object is a user's finger; when it is determined that the approaching object is a user's finger, activating the tracking mode, and obtaining a motion trajectory of the user's finger based on the front camera and the tracking mode; obtaining four-dimensional coordinates of sampling points on the motion trajectory based on a preset time interval, wherein the four-dimensional coordinates include a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate, and a trajectory time; and determining a touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and performing a touch operation based on the touch type.
- The screen touch management method according to claim 1, wherein the step of determining the touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and the preset touch template comprises: determining a maximum X-direction coordinate difference, a maximum Y-direction coordinate difference, and a maximum Z-direction coordinate difference based on the four-dimensional coordinate values of the sampling points; and determining the touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and the preset touch template.
- The screen touch management method according to claim 2, wherein the touch type includes a left-swipe touch and a right-swipe touch, the preset touch template includes left-right sliding threshold data, and the step of determining the touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and the preset touch template comprises: acquiring the left-right sliding threshold data when it is determined that the maximum X-direction coordinate difference is greater than the maximum Y-direction coordinate difference and the maximum X-direction coordinate difference is greater than the maximum Z-direction coordinate difference; and determining that the touch type is a left-swipe touch or a right-swipe touch when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data.
- The screen touch management method according to claim 3, wherein the step of determining that the touch type is a left-swipe touch or a right-swipe touch when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data comprises: acquiring a first sampling point and a second sampling point corresponding to the maximum X-direction coordinate difference, wherein the X-direction coordinate of the first sampling point is greater than the X-direction coordinate of the second sampling point; in response to the trajectory time of the first sampling point being later than the trajectory time of the second sampling point, determining that the touch type is a right-swipe touch; and in response to the trajectory time of the first sampling point being earlier than the trajectory time of the second sampling point, determining that the touch type is a left-swipe touch.
- The screen touch management method according to claim 2, wherein the touch type includes an up-slide touch and a down-slide touch, the preset touch template includes up-down sliding threshold data, and the step of determining the touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and the preset touch template comprises: acquiring the up-down sliding threshold data when it is determined that the maximum Y-direction coordinate difference is greater than the maximum X-direction coordinate difference and the maximum Y-direction coordinate difference is greater than the maximum Z-direction coordinate difference; and determining that the touch type is an up-slide touch or a down-slide touch when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data.
- The screen touch management method according to claim 5, wherein the step of determining that the touch type is an up-slide touch or a down-slide touch when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data comprises: acquiring a third sampling point and a fourth sampling point corresponding to the maximum Y-direction coordinate difference, wherein the Y-direction coordinate of the third sampling point is greater than the Y-direction coordinate of the fourth sampling point; in response to the trajectory time of the third sampling point being later than the trajectory time of the fourth sampling point, determining that the touch type is an up-slide touch; and in response to the trajectory time of the third sampling point being earlier than the trajectory time of the fourth sampling point, determining that the touch type is a down-slide touch.
- The screen touch management method according to claim 2, wherein the touch type includes a click touch, the preset touch template includes click threshold data, and the step of determining the touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and the preset touch template comprises: acquiring the click threshold data when it is determined that the maximum Z-direction coordinate difference is greater than the maximum X-direction coordinate difference and the maximum Z-direction coordinate difference is greater than the maximum Y-direction coordinate difference; and determining that the touch type is a click touch when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data.
- The screen touch management method according to claim 7, wherein the click touch includes a single-click touch, a double-tap touch, and a long-press touch, and the step of determining that the touch type is a click touch when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data comprises: determining, based on the motion trajectory, the number of times the front camera is occluded and the occlusion duration of each occlusion; when it is determined that the occlusion duration satisfies a long-press condition, determining that the touch type is a long-press touch; when it is determined that the occlusion duration does not satisfy the long-press condition and the number of occlusions is equal to a first preset value, determining that the touch type is a single-click touch; and when it is determined that the occlusion duration does not satisfy the long-press condition and the number of occlusions is greater than or equal to a second preset value, determining that the touch type is a double-tap touch, the second preset value being greater than the first preset value.
- The screen touch management method according to claim 8, wherein when a long-press occlusion duration greater than a preset duration exists among all occlusion durations, it is determined that the occlusion duration satisfies the long-press condition.
- A screen touch management device, comprising: an activation module configured to activate the front camera when the front camera is not turned on and it is detected, based on the proximity sensor, that an object approaches the front camera, and to determine whether the approaching object is a user's finger; an acquiring module configured to activate the tracking mode when it is determined that the approaching object is a user's finger, and to obtain a motion trajectory of the user's finger based on the front camera and the tracking mode; a sampling module configured to obtain four-dimensional coordinates of sampling points on the motion trajectory based on a preset time interval, wherein the four-dimensional coordinates include a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate, and a trajectory time; and a determining module configured to determine a touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and to perform a touch operation based on the touch type.
- A smart terminal, comprising: a memory, a processor, and a screen touch management program stored in the memory and executable on the processor, wherein the screen touch management program, when executed by the processor, implements the screen touch management method according to any one of claims 1 to 9.
- A readable storage medium, wherein the readable storage medium stores a screen touch management program, and the screen touch management program, when executed by a processor, implements the screen touch management method according to any one of claims 1 to 9.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910900469.3 | 2019-09-23 | ||
CN201910900469.3A CN112540696A (zh) | 2019-09-23 | 2019-09-23 | 屏幕触控管理方法、智能终端、装置及可读存储介质 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021057654A1 true WO2021057654A1 (zh) | 2021-04-01 |
Family
ID=75013168
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/116484 WO2021057654A1 (zh) | 2019-09-23 | 2020-09-21 | 屏幕触控管理方法、智能终端、装置及可读存储介质 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112540696A (zh) |
WO (1) | WO2021057654A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115150550A (zh) * | 2022-06-20 | 2022-10-04 | 湖北星纪时代科技有限公司 | 用于终端的拍照处理方法、装置、电子设备和存储介质 |
CN115150551A (zh) * | 2022-06-20 | 2022-10-04 | 湖北星纪时代科技有限公司 | 用于终端的拍照处理方法、装置、电子设备和存储介质 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113538762B (zh) * | 2021-09-16 | 2021-12-14 | 深圳市海清视讯科技有限公司 | 门禁平板设备菜单控制方法、装置、系统、介质及产品 |
CN114020192B (zh) * | 2021-09-18 | 2024-04-02 | 特斯联科技集团有限公司 | 一种基于曲面电容实现非金属平面的互动方法和系统 |
CN114115673B (zh) * | 2021-11-25 | 2023-10-27 | 海信集团控股股份有限公司 | 车载屏幕的控制方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150286281A1 (en) * | 2014-04-04 | 2015-10-08 | Alibaba Group Holding Limited | Generating a screenshot |
CN106055143A (zh) * | 2016-05-20 | 2016-10-26 | 广州视睿电子科技有限公司 | 触摸点位置检测方法和系统 |
CN109298798A (zh) * | 2018-09-21 | 2019-02-01 | 歌尔科技有限公司 | 触控板的操作控制方法、设备以及智能终端 |
-
2019
- 2019-09-23 CN CN201910900469.3A patent/CN112540696A/zh active Pending
-
2020
- 2020-09-21 WO PCT/CN2020/116484 patent/WO2021057654A1/zh active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150286281A1 (en) * | 2014-04-04 | 2015-10-08 | Alibaba Group Holding Limited | Generating a screenshot |
CN106055143A (zh) * | 2016-05-20 | 2016-10-26 | 广州视睿电子科技有限公司 | 触摸点位置检测方法和系统 |
CN109298798A (zh) * | 2018-09-21 | 2019-02-01 | 歌尔科技有限公司 | 触控板的操作控制方法、设备以及智能终端 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115150550A (zh) * | 2022-06-20 | 2022-10-04 | 湖北星纪时代科技有限公司 | 用于终端的拍照处理方法、装置、电子设备和存储介质 |
CN115150551A (zh) * | 2022-06-20 | 2022-10-04 | 湖北星纪时代科技有限公司 | 用于终端的拍照处理方法、装置、电子设备和存储介质 |
Also Published As
Publication number | Publication date |
---|---|
CN112540696A (zh) | 2021-03-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021057654A1 (zh) | 屏幕触控管理方法、智能终端、装置及可读存储介质 | |
US10739854B2 (en) | Terminal and touch response method and device | |
WO2019033957A1 (zh) | 交互位置确定方法、系统、存储介质和智能终端 | |
US20140270413A1 (en) | Auxiliary device functionality augmented with fingerprint sensor | |
US9268407B1 (en) | Interface elements for managing gesture control | |
US20170083741A1 (en) | Method and device for generating instruction | |
US20140270414A1 (en) | Auxiliary functionality control and fingerprint authentication based on a same user input | |
WO2022110614A1 (zh) | 手势识别方法及装置、电子设备和存储介质 | |
CN108920202B (zh) | 应用预加载管理方法、装置、存储介质及智能终端 | |
WO2013000381A1 (zh) | 控制移动终端状态的方法及移动终端 | |
CN109558000B (zh) | 一种人机交互方法及电子设备 | |
WO2005119591A1 (ja) | 表示制御方法および装置、プログラム、並びに携帯機器 | |
WO2014105012A1 (en) | System and method for gesture based touchscreen control of displays | |
CN113253908B (zh) | 按键功能执行方法、装置、设备及存储介质 | |
CN110730298A (zh) | 一种显示控制方法及电子设备 | |
US12022190B2 (en) | Photographing method and electronic device | |
CN106325623A (zh) | 在触摸屏上监测触摸的方法、装置及终端设备 | |
JP6911870B2 (ja) | 表示制御装置、表示制御方法及びコンピュータプログラム | |
US11886643B2 (en) | Information processing apparatus and information processing method | |
CN112749590B (zh) | 目标检测方法、装置、计算机设备和计算机可读存储介质 | |
CN113867550A (zh) | 电子设备的姿态检测方法及装置、存储介质 | |
WO2021204101A1 (zh) | 显示方法及电子设备 | |
JP7413546B2 (ja) | 撮影方法及び電子機器 | |
US9350918B1 (en) | Gesture control for managing an image view display | |
CN106325622B (zh) | 自电容式压力触摸装置及终端设备 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20870068 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20870068 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27.02.2023) |