WO2021057654A1 - Screen touch management method, smart terminal, device and readable storage medium - Google Patents

Screen touch management method, smart terminal, device and readable storage medium

Publication number
WO2021057654A1
WO2021057654A1 · PCT/CN2020/116484 · CN2020116484W
Authority
WO
WIPO (PCT)
Prior art keywords
touch
maximum
direction coordinate
coordinate difference
determined
Prior art date
Application number
PCT/CN2020/116484
Other languages
English (en)
French (fr)
Inventor
Feng Kai (冯凯)
Original Assignee
ZTE Corporation (中兴通讯股份有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corporation (中兴通讯股份有限公司)
Publication of WO2021057654A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 - Control or interface arrangements specially adapted for digitisers
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/13 - Sensors therefor
    • G06V40/1318 - Sensors therefor using electro-optical elements or layers, e.g. electroluminescent sensing
    • G06V40/20 - Movements or behaviour, e.g. gesture recognition
    • G06V40/28 - Recognition of hand or arm movements, e.g. recognition of deaf sign language

Definitions

  • the present disclosure relates to, but is not limited to, the technical field of smart terminals.
  • the present disclosure provides a screen touch management method.
  • The screen touch management method includes: when the front camera is not turned on and the proximity sensor detects that an object approaches the front camera, starting the front camera and determining whether the approaching object is a user's finger; when it is determined that the approaching object is a user's finger, activating the tracking mode and obtaining the movement trajectory of the user's finger based on the front camera and the tracking mode; obtaining the four-dimensional coordinates of sampling points on the motion trajectory based on a preset time interval, the four-dimensional coordinates including a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate, and a trajectory time; and determining, based on the four-dimensional coordinates of each sampling point and a preset touch template, the touch type corresponding to the motion trajectory, and performing a touch operation based on the touch type.
  • The present disclosure also provides a screen touch management device. The screen touch management device includes: an activation module, configured to start the front camera when the front camera is not turned on and the proximity sensor detects an object approaching the front camera, and to determine whether the approaching object is a user's finger; an acquisition module, configured to activate the tracking mode when it is determined that the approaching object is a user's finger, and to obtain the movement trajectory of the user's finger based on the front camera and the tracking mode; a sampling module, configured to obtain the four-dimensional coordinates of sampling points on the movement trajectory based on a preset time interval, the four-dimensional coordinates including a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate, and a trajectory time; and a determining module, configured to determine the touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and to perform a touch operation based on the touch type.
  • the present disclosure also provides an intelligent terminal.
  • The intelligent terminal includes a memory, a processor, and a screen touch management program that is stored on the memory and can run on the processor; when the screen touch management program is executed by the processor, any one of the screen touch management methods described herein is implemented.
  • The present disclosure also provides a readable storage medium having a screen touch management program stored thereon; when the screen touch management program is executed by a processor, any one of the screen touch management methods described herein is implemented.
  • FIG. 1 is a schematic structural diagram of a smart terminal in a hardware operating environment involved in an embodiment of the present disclosure
  • FIG. 2 is a schematic flowchart of a screen touch management method according to an embodiment of the disclosure
  • FIG. 3 is a schematic flowchart of a screen touch management method according to an embodiment of the disclosure.
  • FIG. 4 is a schematic flowchart of a screen touch management method according to an embodiment of the disclosure.
  • FIG. 5 is a schematic flowchart of a screen touch management method according to an embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram of functional modules of a screen touch management device according to an embodiment of the disclosure.
  • Fig. 1 is a schematic structural diagram of a smart terminal in a hardware operating environment involved in a solution of an embodiment of the present disclosure.
  • the smart terminal may include: a processor 1001, such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002.
  • the communication bus 1002 is used to implement connection and communication between these components.
  • the user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard).
  • the user interface 1003 may also include a standard wired interface and a wireless interface.
  • the network interface 1004 may include a standard wired interface and a wireless interface (such as a WI-FI interface).
  • The memory 1005 may be a high-speed RAM or a non-volatile memory, such as a disk storage.
  • the memory 1005 may also be a storage device independent of the aforementioned processor 1001.
  • the smart terminal may also include a camera, an RF (Radio Frequency) circuit, a sensor, an audio circuit, a WiFi module, and so on.
  • The sensors include, for example, light sensors, motion sensors, and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display screen according to the brightness of the ambient light.
  • the proximity sensor can turn off the display screen and/or backlight when the mobile terminal is moved to the ear.
  • The posture sensor can detect the magnitude of acceleration in various directions (usually along three axes) and can detect the magnitude and direction of gravity when stationary.
  • It can be used for applications that recognize the posture of the mobile terminal (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tap detection). In addition, the smart terminal may be equipped with other sensors such as a gyroscope, barometer, hygrometer, thermometer, and infrared sensor, which are not described in detail here.
  • The structure of the smart terminal shown in FIG. 1 does not constitute a limitation on the terminal; the terminal may include more or fewer components than shown in the figure, combine certain components, or arrange the components differently.
  • the memory 1005 which is a computer storage medium, may include an operating system, a network communication module, a user interface module, and a screen touch management program.
  • the network interface 1004 is mainly used to connect to a back-end server and communicate with the back-end server;
  • the user interface 1003 is mainly used to connect to a client (user side) to communicate with the client;
  • the processor 1001 can be used to call the screen touch management program stored in the memory 1005.
  • The smart terminal includes a memory 1005, a processor 1001, and a screen touch management program stored on the memory 1005 and executable on the processor 1001.
  • When the processor 1001 calls the screen touch management program stored in the memory 1005, the screen touch management method provided by each embodiment of the present disclosure is executed.
  • FIG. 2 is a schematic flowchart of a screen touch management method according to an embodiment of the present disclosure.
  • The embodiments of the present disclosure provide an embodiment of a screen touch management method. It should be noted that, although a logical sequence is shown in the flowchart, in some cases the steps shown or described may be executed in an order different from the order here.
  • the screen touch management method includes the following steps S100 to S400.
  • In step S100, when the front camera is not turned on and the proximity sensor detects that an object approaches the front camera, the front camera is activated, and it is determined whether the approaching object is a user's finger.
  • Current "full screens" on smart terminals are all quasi-full screens; true full-screen mobile phones will appear in the future and are an inevitable trend.
  • Full-screen mobile phones have already overcome technical barriers such as under-screen fingerprint recognition and screen sound emission; the technical barrier that remains to be resolved is the under-screen camera.
  • Under-screen cameras need to solve two main problems: first, display at the position of the camera opening on the screen; second, touch at the position of the camera opening under the screen.
  • The technical solution of the present disclosure addresses the second problem: touch at the position of the camera opening under the screen.
  • the smart terminal is provided with a proximity sensor at the camera opening position to detect whether an object is close to the under-screen camera, so as to trigger the activation of the under-screen camera.
  • A proximity sensor is a device capable of perceiving the approach of an object. It uses the sensitive characteristic of a displacement sensor to recognize the approach of an object and outputs a corresponding switch signal, so a proximity sensor is usually called a proximity switch. It is a general term for sensors that replace contact detection methods such as limit switches and that do not need to touch the detected object; it can detect the movement and presence of an object and convert them into an electrical signal.
  • the smart terminal is also equipped with a tracking mode, which is used to capture the movement trajectory of the user's finger according to the video image taken by the camera.
  • When the front camera of the smart terminal is not turned on and the proximity sensor detects that an object is close to the front camera, the front camera is activated to capture an image of the approaching object, and whether the approaching object is the user's finger is determined according to a preset image recognition algorithm. It should be noted that if the front camera of the smart terminal is in use for photographing, facial recognition, video recording, video calls, and so on, then even if the proximity sensor detects an object approaching the front camera, the terminal does not respond to a touch event at the opening position of the under-screen front camera.
  • In step S200, when it is determined that the approaching object is a user's finger, the tracking mode is activated, and the movement trajectory of the user's finger is acquired based on the front camera and the tracking mode.
  • When the proximity sensor detects that an object is approaching the front camera and it is determined, according to the front camera of the smart terminal and the preset image recognition algorithm, that the approaching object is the user's finger, the tracking mode of the smart terminal is started, and the movement trajectory of the user's finger is then captured according to the video images taken by the camera.
  • A motion trajectory refers to the spatial characteristic of an action, composed of the route taken by a certain part of the body from the start position to the end position.
  • A motion trajectory is characterized by its direction, its form, and the motion amplitude.
  • the movement trajectory of the user's finger refers to the spatial characteristics of the action composed of the route that the finger travels from the start position to the end in the shooting area after the front camera is turned on.
  • The direction of the finger's movement trajectory changes constantly, and the trajectory form is a curve.
  • Moving-target tracking means finding the moving target of interest in real time in each image of an image sequence, including motion parameters such as position, velocity, and acceleration.
  • The tracking mode of the smart terminal uses an existing tracking algorithm to recognize the user's finger in the video images taken by the front camera; the route the finger travels is the finger's movement trajectory.
  • In step S300, the four-dimensional coordinates of sampling points on the motion trajectory are acquired based on a preset time interval, where the four-dimensional coordinates include the horizontal X-direction coordinate, the vertical Y-direction coordinate, the near-far Z-direction coordinate, and the trajectory time.
  • The trajectory of the finger can be represented in a four-dimensional coordinate system whose axes are the horizontal X axis, the vertical Y axis, the near-far Z axis, and the time T axis.
  • the origin of the coordinate system can be set according to the actual situation.
  • the lower left corner of the opening position corresponding to the front camera under the screen is the origin.
  • Horizontal to the right is the positive direction of the X axis.
  • Vertical upward is the positive direction of the Y axis.
  • The direction away from the screen is the positive direction of the Z axis.
  • The time T axis is real time.
  • the trajectory of the finger is a curve expressed in a four-dimensional coordinate system. Therefore, the trajectory can be sampled according to a preset time interval to obtain multiple sampling points.
  • Each sampling point is represented by four-dimensional coordinates.
  • The four-dimensional coordinates include the horizontal X-direction coordinate, the vertical Y-direction coordinate, the near-far Z-direction coordinate, and the trajectory time. These sampling points are used to determine the touch type corresponding to the finger operation, for example, to determine that the current finger operation is a left slide, a down slide, and so on.
  • The preset time interval is determined according to the actual situation; it determines the number of sampling points, and at least 2 sampling points must be guaranteed.
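The sampling step above can be sketched in Python. This is an illustrative interpretation rather than code from the patent; the `SamplePoint` type and `sample_trajectory` helper are hypothetical names:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SamplePoint:
    """One four-dimensional sampling point on the finger's motion
    trajectory: horizontal X, vertical Y, near-far Z, and trajectory
    time t (in seconds)."""
    x: float
    y: float
    z: float
    t: float

def sample_trajectory(trajectory: List[SamplePoint],
                      interval: float) -> List[SamplePoint]:
    """Pick points from the trajectory at least `interval` seconds
    apart. The first and last points are always kept, which guarantees
    the minimum of 2 sampling points required by the method."""
    if len(trajectory) < 2:
        raise ValueError("a trajectory needs at least 2 points")
    samples = [trajectory[0]]
    for point in trajectory[1:-1]:
        if point.t - samples[-1].t >= interval:
            samples.append(point)
    samples.append(trajectory[-1])
    return samples
```

Forcing the endpoints into the sample set is one simple way to honor the "at least 2 sampling points" requirement even when the preset interval is longer than the whole gesture.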
  • In step S400, the touch type corresponding to the motion trajectory is determined based on the four-dimensional coordinates of each sampling point and a preset touch template, and a touch operation is performed based on the touch type.
  • the touch type corresponding to the motion trajectory is further determined according to the four-dimensional coordinates of the sampling points and the preset touch template.
  • Touch types include left-slide touch, right-slide touch, up-slide touch, down-slide touch, single-click touch, double-click touch, and long-press touch. The preset touch template stores left-right sliding threshold data, up-down sliding threshold data, and click threshold data.
  • The left-right sliding threshold data is a three-dimensional array including X-direction data, Y-direction data, and Z-direction data, and is used to determine whether the touch is a left-right sliding touch.
  • Similarly, the up-down sliding threshold data and the click threshold data are also three-dimensional arrays including X-direction, Y-direction, and Z-direction data, which are used to determine whether the touch is an up-down sliding touch or a click touch, respectively.
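As a concrete illustration, the template's three-dimensional threshold arrays could be held as (X, Y, Z) bounds. The structure, the matching rule (enough travel on the dominant axis, little travel on the other two), and every numeric value below are assumptions for illustration, not values from the patent:

```python
# Each entry is three-dimensional threshold data. All numbers are
# made-up placeholders; real thresholds would be tuned per device.
TOUCH_TEMPLATE = {
    "left_right": {"x_min": 5.0, "y_max": 2.0, "z_max": 2.0},
    "up_down":    {"y_min": 5.0, "x_max": 2.0, "z_max": 2.0},
    "click":      {"z_min": 3.0, "x_max": 1.0, "y_max": 1.0},
}

def matches_left_right(dx: float, dy: float, dz: float,
                       template: dict = TOUCH_TEMPLATE) -> bool:
    """Match the maximum X/Y/Z coordinate differences against the
    left-right sliding threshold data: enough travel along X, little
    travel along Y and Z (one plausible reading of the matching rule)."""
    t = template["left_right"]
    return dx >= t["x_min"] and dy <= t["y_max"] and dz <= t["z_max"]
```

The up-down and click checks would mirror this function with the roles of the axes swapped.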
  • Step S400 includes the following steps S410 and S420.
  • In step S410, the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference are determined based on the four-dimensional coordinate values of the respective sampling points.
  • The X-direction, Y-direction, and Z-direction coordinate differences between every pair of sampling points are calculated; the maximum X-direction coordinate difference is taken from all X-direction coordinate differences, the maximum Y-direction coordinate difference from all Y-direction coordinate differences, and the maximum Z-direction coordinate difference from all Z-direction coordinate differences.
  • In step S420, the touch type corresponding to the motion trajectory is determined based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and a preset touch template.
  • The left-right sliding threshold data, the up-down sliding threshold data, and the click threshold data are stored in the preset touch template; the maximum X-direction, Y-direction, and Z-direction coordinate differences are matched against the preset touch template, and the touch type corresponding to the motion trajectory is determined according to the matching result.
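A minimal sketch of step S410 and the dominant-axis comparison of steps S421, S423 and S425; the (x, y, z, t) tuple layout is an assumption:

```python
def max_axis_differences(samples):
    """Step S410: maximum pairwise coordinate difference along X, Y
    and Z. `samples` is a list of (x, y, z, t) tuples; the maximum
    pairwise difference along one axis equals max - min on that axis,
    so no explicit pair-by-pair loop is needed."""
    xs, ys, zs, _ts = zip(*samples)
    return max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs)

def candidate_touch_class(dx, dy, dz):
    """Preliminary classification by dominant axis: the axis with the
    largest maximum coordinate difference selects which threshold data
    from the preset touch template to fetch."""
    if dx > dy and dx > dz:
        return "left-right"  # compare against left-right sliding thresholds
    if dy > dx and dy > dz:
        return "up-down"     # compare against up-down sliding thresholds
    return "click"           # compare against click thresholds
```

Computing max - min per axis is equivalent to taking the maximum over all pairwise differences, which keeps the step linear in the number of sampling points.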
  • In summary, when the front camera is not turned on and the proximity sensor detects an object approaching the front camera, the front camera is activated and it is determined whether the approaching object is the user's finger; the motion trajectory of the user's finger is then obtained based on the front camera and the tracking mode; the four-dimensional coordinates of sampling points on the motion trajectory, including the horizontal X-direction coordinate, vertical Y-direction coordinate, near-far Z-direction coordinate, and trajectory time, are obtained based on a preset time interval; and the touch type corresponding to the motion trajectory is determined based on the four-dimensional coordinates of each sampling point and a preset touch template, and a touch operation is performed based on that touch type.
  • Step S420 includes the following steps S421 and S422.
  • In step S421, when it is determined that the maximum X-direction coordinate difference is greater than the maximum Y-direction coordinate difference and greater than the maximum Z-direction coordinate difference, the left-right sliding threshold data is acquired.
  • When the maximum X-direction coordinate difference is greater than both the maximum Y-direction and maximum Z-direction coordinate differences, it can be preliminarily judged that the current touch is a left or right sliding touch, so the left-right sliding threshold data in the preset touch template is acquired.
  • In step S422, when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data, it is determined that the touch type is a left-swipe touch or a right-swipe touch.
  • The smart terminal further determines whether the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data.
  • If the data match, it can be further determined whether the touch is a left or a right sliding touch.
  • If the data do not match, the current motion trajectory corresponds to an invalid touch, and the smart terminal performs no operation.
  • Step S422 includes the following steps a to c.
  • In step a, the first sampling point and the second sampling point corresponding to the maximum X-direction coordinate difference are obtained; the X-direction coordinate of the first sampling point is greater than that of the second sampling point.
  • When the maximum X-direction, Y-direction, and Z-direction coordinate differences match the left-right sliding threshold data, it is further determined whether the current sliding touch is to the left or to the right.
  • To do so, the two sampling points corresponding to the maximum X-direction coordinate difference are obtained (the first and second sampling points, the first having the greater X-direction coordinate), and the trajectory times of the two sampling points are then compared.
  • In step b, when the trajectory time of the first sampling point is later than the trajectory time of the second sampling point, it is determined that the touch type is a right-swipe touch.
  • In a right swipe, the X coordinate of the sampling point with the earlier trajectory time is smaller than the X coordinate of the sampling point with the later trajectory time.
  • Therefore, when the trajectory time of the first sampling point is later than that of the second sampling point, the touch type corresponding to the current motion trajectory is determined to be a right-swipe touch.
  • In step c, when the trajectory time of the first sampling point is earlier than the trajectory time of the second sampling point, it is determined that the touch type is a left-swipe touch.
  • In a left swipe, the X coordinate of the sampling point with the earlier trajectory time is greater than that of the sampling point with the later trajectory time.
  • With this screen touch management method, when it is determined that the maximum X-direction coordinate difference is greater than both the maximum Y-direction and maximum Z-direction coordinate differences, the left-right sliding threshold data is acquired; when the maximum X-direction, Y-direction, and Z-direction coordinate differences match the left-right sliding threshold data, the touch type is determined to be a left or right sliding touch, and it is then accurately determined which of the two it is. In this way, the position of the camera opening under the screen can respond to screen touch operations.
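Steps a to c amount to comparing the trajectory times of the two sampling points that realise the maximum X-direction coordinate difference. A minimal sketch, assuming (x, y, z, t) tuples:

```python
def swipe_direction(samples):
    """Decide left vs right swipe (steps a-c). The sampling point with
    the larger X coordinate is the 'first' point; if its trajectory
    time is later than that of the point with the smaller X coordinate,
    the finger moved in the positive X direction (a right swipe),
    otherwise in the negative X direction (a left swipe)."""
    lo = min(samples, key=lambda p: p[0])  # smallest X coordinate
    hi = max(samples, key=lambda p: p[0])  # largest X coordinate
    return "right" if hi[3] > lo[3] else "left"
```

The up/down decision of steps d to f is the same comparison carried out on the Y coordinate instead of X.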
  • Step S420 further includes steps S423 and S424.
  • In step S423, when it is determined that the maximum Y-direction coordinate difference is greater than the maximum X-direction coordinate difference and greater than the maximum Z-direction coordinate difference, the up-down sliding threshold data is acquired.
  • In step S424, when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data, it is determined that the touch type is an up-slide touch or a down-slide touch.
  • The smart terminal further determines whether the maximum X-direction, Y-direction, and Z-direction coordinate differences match the up-down sliding threshold data.
  • If the data match, it can be further determined whether the touch is an up-slide or a down-slide touch.
  • If the data do not match, the current motion trajectory corresponds to an invalid touch, and the smart terminal performs no operation.
  • Step S424 includes the following steps d to f.
  • In step d, the third sampling point and the fourth sampling point corresponding to the maximum Y-direction coordinate difference are obtained; the Y-direction coordinate of the third sampling point is greater than that of the fourth sampling point.
  • When the maximum X-direction, Y-direction, and Z-direction coordinate differences match the up-down sliding threshold data, the two sampling points corresponding to the maximum Y-direction coordinate difference are obtained (the third and fourth sampling points, the third having the greater Y-direction coordinate), and the trajectory times of the two sampling points are then compared.
  • In step e, when the trajectory time of the third sampling point is later than the trajectory time of the fourth sampling point, it is determined that the touch type is an up-slide touch.
  • In an up slide, the Y coordinate of the sampling point with the earlier trajectory time is smaller than the Y coordinate of the sampling point with the later trajectory time.
  • Therefore, when the trajectory time of the third sampling point is later than that of the fourth sampling point, the touch type corresponding to the current motion trajectory is determined to be an up-slide touch.
  • In step f, when the trajectory time of the third sampling point is earlier than the trajectory time of the fourth sampling point, it is determined that the touch type is a down-slide touch.
  • In a down slide, the Y coordinate of the sampling point with the earlier trajectory time is greater than that of the sampling point with the later trajectory time, so when the trajectory time of the third sampling point is earlier than that of the fourth sampling point, the touch type corresponding to the current motion trajectory is determined to be a down-slide touch.
  • With this screen touch management method, when the maximum Y-direction coordinate difference is greater than both the maximum X-direction and maximum Z-direction coordinate differences, the up-down sliding threshold data is acquired; when the maximum X-direction, Y-direction, and Z-direction coordinate differences match the up-down sliding threshold data, the touch type is determined to be an up-slide or down-slide touch, and it is then accurately determined which of the two it is, so that the position of the camera opening under the screen can respond to screen touch operations.
  • Step S420 further includes the following steps S425 and S426.
  • In step S425, when it is determined that the maximum Z-direction coordinate difference is greater than the maximum X-direction coordinate difference and greater than the maximum Y-direction coordinate difference, the click threshold data is acquired.
  • If the finger approaches from far to near, touches the screen at the position of the front camera opening, and leaves again from near to far within a short time, such as 1 s, the event is determined to be a single-click touch event.
  • Two such touches within a short period of time are judged to be a double-tap touch event.
  • If the finger approaches from far to near, touches the screen at the position of the front camera opening, stays there for a period of time, and then leaves from near to far, the event is judged to be a long-press touch event. Whether for a single-click, double-click, or long-press touch, the Z-direction coordinate of the corresponding motion trajectory changes the most, while the X-direction and Y-direction coordinates change only slightly.
  • In step S426, when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data, it is determined that the touch type is a click touch.
  • The smart terminal further determines whether the maximum X-direction, Y-direction, and Z-direction coordinate differences match the click threshold data.
  • If the data do not match, the current motion trajectory corresponds to an invalid touch, and the smart terminal performs no operation.
  • step S426 includes the following steps g to j.
  • step g the number of occlusions for the front camera to be occluded and the occlusion duration each time the front camera is occluded are determined based on the motion trajectory.
  • the click touch includes one-click touch, double-tap touch, and long-press touch.
  • A single-click or double-click touch occludes the front camera for a short time, for example less than 1 s, while a long-press touch occludes the front camera for a long time, for example 2 s or more. Therefore, the number of occlusions of the front camera and the occlusion duration of each occlusion are determined from the motion trajectory, after which it is further determined whether the touch is a single-click, double-click, or long-press touch.
  • step h when it is determined that the blocking duration satisfies the long-press condition, it is determined that the touch type is the long-press touch.
  • When any of the occlusion durations exceeds a preset duration, the occlusion duration is determined to satisfy the long-press condition, where the preset duration is set according to the actual situation, for example 2 s.
  • step i when it is determined that the occlusion duration does not satisfy the long-press condition and the number of occlusions is equal to the first preset value, it is determined that the touch type is a single-click touch.
  • In step j, when it is determined that the occlusion duration does not meet the long-press condition, and the number of occlusions is greater than or equal to a second preset value, the touch type is determined to be double-tap touch, the second preset value being greater than the first preset value.
  • When the occlusion duration does not meet the long-press condition, the touch type is further determined according to the number of occlusions.
  • the first preset value is equal to 1
  • the second preset value is equal to 2.
  • FIG. 6 is a schematic diagram of functional modules of an embodiment of the screen touch management device of the present disclosure.
  • The screen touch management device of the present disclosure includes: an activation module 10, configured to activate the front camera when the front camera is not turned on and the proximity sensor detects an object approaching the front camera, and to determine whether the approaching object is a user's finger; an acquisition module 20, configured to activate the tracking mode when it is determined that the approaching object is a user's finger, and to acquire the motion trajectory of the user's finger based on the front camera and the tracking mode; a sampling module 30, configured to acquire four-dimensional coordinates of sampling points on the motion trajectory at a preset time interval, the four-dimensional coordinates including the horizontal X-direction coordinate, the vertical Y-direction coordinate, the near-far Z-direction coordinate, and the trajectory time; and a determining module 40, configured to determine the touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and to perform a touch operation based on the touch type.
  • The determining module 40 is further configured to determine the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference based on the four-dimensional coordinate values of the respective sampling points, and to determine the touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and a preset touch template.
  • The determining module 40 is further configured to acquire the left-right sliding threshold data when it is determined that the maximum X-direction coordinate difference is greater than the maximum Y-direction coordinate difference, and the maximum X-direction coordinate difference is greater than the maximum Z-direction coordinate difference.
  • The determining module 40 is further configured to obtain the first sampling point and the second sampling point corresponding to the maximum X-direction coordinate difference, the X-direction coordinate of the first sampling point being greater than that of the second sampling point; when the trajectory time of the first sampling point is later than the trajectory time of the second sampling point, the touch type is determined to be right-swipe touch; when the trajectory time of the first sampling point is earlier than the trajectory time of the second sampling point, the touch type is determined to be left-swipe touch.
  • The determining module 40 is further configured to acquire the up-down sliding threshold data when it is determined that the maximum Y-direction coordinate difference is greater than the maximum X-direction coordinate difference, and the maximum Y-direction coordinate difference is greater than the maximum Z-direction coordinate difference, and to determine that the touch type is slide-up touch or slide-down touch when the coordinate differences match that data.
  • The determining module 40 is further configured to obtain the third sampling point and the fourth sampling point corresponding to the maximum Y-direction coordinate difference, the Y-direction coordinate of the third sampling point being greater than that of the fourth sampling point; when the trajectory time of the third sampling point is later than the trajectory time of the fourth sampling point, the touch type is determined to be slide-up touch; when the trajectory time of the third sampling point is earlier than the trajectory time of the fourth sampling point, the touch type is determined to be slide-down touch.
  • The determining module 40 is further configured to acquire the click threshold data when it is determined that the maximum Z-direction coordinate difference is greater than the maximum X-direction coordinate difference, and the maximum Z-direction coordinate difference is greater than the maximum Y-direction coordinate difference, and to determine that the touch type is click touch when the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data.
  • the determining module 40 is further configured to determine the number of occlusions for the front camera to be blocked and the occlusion duration each time the front camera is occluded based on the motion trajectory; when determining that the occlusion duration satisfies the long-press condition When it is determined that the touch type is long-press touch; when it is determined that the blocking duration does not meet the long-press condition and the number of blocking times is equal to the first preset value, it is determined that the touch type is single-click touch; When it is determined that the blocking time does not meet the long-press condition, and the number of blocking times is greater than or equal to a second preset value, it is determined that the touch type is double-tap touch, and the second preset value is greater than the first preset value .
  • the determining module 40 is further configured to: when there is a long-press occlusion duration greater than a preset duration among all occlusion durations, determine that the occlusion duration satisfies the long-press condition.
  • The embodiments of the present disclosure also provide a readable storage medium that stores a screen touch management program; when executed by a processor, the screen touch management program implements the screen touch management methods of the above embodiments.
  • The present disclosure activates the front camera when the front camera is not turned on and the proximity sensor detects an object approaching it, and determines whether the approaching object is a user's finger; when the approaching object is determined to be a user's finger, the tracking mode is activated and the motion trajectory of the user's finger is acquired based on the front camera and the tracking mode; the four-dimensional coordinates of sampling points on the motion trajectory are then acquired at a preset time interval, the four-dimensional coordinates including the horizontal X-direction coordinate, the vertical Y-direction coordinate, the near-far Z-direction coordinate, and the trajectory time; finally, the touch type corresponding to the motion trajectory is determined, and a touch operation is performed based on the touch type.
  • The technical solution of the present disclosure, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a readable storage medium as described above (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to cause a system device (which may be a mobile phone, computer, server, air conditioner, network device, or the like) to execute the methods described in the embodiments of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a screen touch management method, comprising: when the front camera is not turned on and the proximity sensor detects an object approaching the front camera, activating the front camera and determining whether the approaching object is a user's finger; when it is determined that the approaching object is a user's finger, activating a tracking mode and acquiring the motion trajectory of the user's finger based on the front camera and the tracking mode; next, acquiring four-dimensional coordinates of sampling points on the motion trajectory at a preset time interval; and then determining the touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and performing a touch operation based on the touch type. The present disclosure also discloses a device, a smart terminal, and a readable storage medium.

Description

Screen touch management method, smart terminal, device, and readable storage medium — Technical Field
The present disclosure relates to, but is not limited to, the technical field of smart terminals.
Background
With the rapid development of smart terminals such as mobile phones and tablet computers, these devices are applied ever more widely, and users place increasingly high requirements on their screens.
As smart terminal screen technology has developed, various screens with ultra-high screen-to-body ratios have appeared, including waterdrop and notch designs, and the full screen has become a major trend for smart terminals. To achieve a true full screen, the under-display camera is a technical problem that must be solved: the hole position of the under-display camera must respond normally to screen touch operations, yet current waterdrop-screen and notch-screen technologies do not solve this problem.
Summary
The present disclosure provides a screen touch management method, comprising: when the front camera is not turned on and the proximity sensor detects an object approaching the front camera, activating the front camera and determining whether the approaching object is a user's finger; when it is determined that the approaching object is a user's finger, activating the tracking mode and acquiring the motion trajectory of the user's finger based on the front camera and the tracking mode; acquiring four-dimensional coordinates of sampling points on the motion trajectory at a preset time interval, the four-dimensional coordinates including a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate, and a trajectory time; and determining the touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and performing a touch operation based on the touch type.
The present disclosure further provides a screen touch management device, comprising: an activation module, configured to activate the front camera when the front camera is not turned on and the proximity sensor detects an object approaching the front camera, and to determine whether the approaching object is a user's finger; an acquisition module, configured to activate the tracking mode when it is determined that the approaching object is a user's finger, and to acquire the motion trajectory of the user's finger based on the front camera and the tracking mode; a sampling module, configured to acquire four-dimensional coordinates of sampling points on the motion trajectory at a preset time interval, the four-dimensional coordinates including a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate, and a trajectory time; and a determining module, configured to determine the touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and to perform a touch operation based on the touch type.
The present disclosure further provides a smart terminal, comprising a memory, a processor, and a screen touch management program stored in the memory and executable on the processor; when executed by the processor, the screen touch management program implements any screen touch management method described herein.
The present disclosure further provides a readable storage medium storing a screen touch management program; when executed by a processor, the program implements any screen touch management method described herein.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of a smart terminal in the hardware operating environment involved in an embodiment of the present disclosure;
FIG. 2 is a schematic flowchart of a screen touch management method according to an embodiment of the present disclosure;
FIG. 3 is a schematic flowchart of a screen touch management method according to an embodiment of the present disclosure;
FIG. 4 is a schematic flowchart of a screen touch management method according to an embodiment of the present disclosure;
FIG. 5 is a schematic flowchart of a screen touch management method according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of the functional modules of a screen touch management device according to an embodiment of the present disclosure.
Detailed Description
To make the objectives, technical solutions, and advantages of the present disclosure clearer, embodiments of the present disclosure are described in further detail below through specific implementations with reference to the accompanying drawings. It should be understood that the specific embodiments described here are intended only to explain the present disclosure, not to limit it. Moreover, the embodiments of the present disclosure and their features may be combined arbitrarily provided no conflict arises.
As shown in FIG. 1, FIG. 1 is a schematic structural diagram of a smart terminal in the hardware operating environment involved in an embodiment of the present disclosure. As shown in FIG. 1, the smart terminal may include a processor 1001 (for example, a CPU), a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002. The communication bus 1002 implements connection and communication among these components. The user interface 1003 may include a display and an input unit such as a keyboard, and in one implementation may further include standard wired and wireless interfaces. The network interface 1004 may include standard wired and wireless interfaces (such as a Wi-Fi interface). The memory 1005 may be a high-speed RAM or a stable non-volatile memory such as a magnetic-disk memory, and may also be a storage device independent of the aforementioned processor 1001.
In one implementation, the smart terminal may further include a camera, an RF (Radio Frequency) circuit, sensors, an audio circuit, a Wi-Fi module, and the like. The sensors include, for example, a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor adjusts the display brightness according to ambient light, and the proximity sensor turns off the display and/or backlight when the mobile terminal is moved to the ear. As one kind of motion sensor, an attitude sensor detects the magnitude of acceleration along each axis (generally three axes) and, at rest, the magnitude and direction of gravity; it can be used in applications that recognize terminal attitude (such as portrait/landscape switching, related games, and magnetometer attitude calibration) and in vibration-recognition functions (such as pedometers and tap detection). The smart terminal may also be equipped with a gyroscope, barometer, hygrometer, thermometer, infrared sensor, and other sensors, which are not detailed here.
Those skilled in the art will understand that the smart terminal structure shown in FIG. 1 does not limit the terminal, which may include more or fewer components than shown, combine certain components, or arrange components differently.
As shown in FIG. 1, the memory 1005, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and a screen touch management program.
In the terminal shown in FIG. 1, the network interface 1004 is mainly used to connect to and exchange data with a backend server; the user interface 1003 is mainly used to connect to and exchange data with a client (user side); and the processor 1001 may be used to call the screen touch management program stored in the memory 1005.
In this embodiment, the smart terminal includes a memory 1005, a processor 1001, and a screen touch management program stored in the memory 1005 and executable on the processor 1001. When the processor 1001 calls the screen touch management program stored in the memory 1005, it performs the screen touch management methods provided by the embodiments of the present disclosure.
The present disclosure further provides a screen touch management method. Referring to FIG. 2, FIG. 2 is a schematic flowchart of a screen touch management method according to an embodiment of the present disclosure.
Embodiments of the screen touch management method are provided below. It should be noted that although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from that given here.
In this embodiment, the screen touch management method includes the following steps S100 to S400.
In step S100, when the front camera is not turned on and the proximity sensor detects an object approaching the front camera, the front camera is activated and it is determined whether the approaching object is a user's finger.
In this embodiment, it is noted that the current "full screens" of smart terminals are only quasi-full screens; truly full-screen phones will appear in the future and are an inevitable trend. Full-screen phones have already overcome technical obstacles such as under-display fingerprint recognition and screen sound emission; the remaining obstacle is the under-display camera, which must solve two main problems: first, display at the hole position of the under-display camera; second, touch control at the hole position of the under-display camera. The technical solution of the present disclosure addresses the second problem: touch control at the hole position of the under-display camera.
Specifically, the smart terminal is provided with a proximity sensor at the camera hole position to detect whether an object is approaching the under-display camera, so as to trigger activation of the under-display camera. A proximity sensor is a device capable of sensing the approach of an object: it uses the sensitivity of a displacement sensor to approaching objects to recognize their approach and outputs a corresponding switch signal, which is why a proximity sensor is also commonly called a proximity switch. It is a general term for sensors that replace contact-based detection methods such as switches and that require no contact with the detected object; it can detect the movement and presence of an object and convert this information into an electrical signal. In addition, the smart terminal is provided with a tracking mode that captures the motion trajectory of the user's finger from the video images taken by the camera. When the front camera of the smart terminal is not turned on and the proximity sensor detects an object approaching the front camera, the front camera is activated to capture an image of the approaching object, and a preset image recognition algorithm determines whether the approaching object is a user's finger. It should be noted that if the front camera is in use, for example for photographing, facial recognition, video recording, or video calls, and the proximity sensor detects an object approaching the front camera, the terminal does not respond to touch events at the hole position corresponding to the under-display front camera.
In step S200, when it is determined that the approaching object is a user's finger, the tracking mode is activated, and the motion trajectory of the user's finger is acquired based on the front camera and the tracking mode.
In this embodiment, after the proximity sensor detects an object approaching the front camera, and the front camera together with the preset image recognition algorithm determines that the approaching object is a user's finger, the smart terminal's tracking mode is activated, and the motion trajectory of the user's finger is then captured from the video images taken by the camera. A motion trajectory is the spatial feature of a movement composed of the path a body part travels from its start position to its end position; it is characterized by its direction, form, and amplitude. In the present disclosure, the motion trajectory of the user's finger is the spatial feature of the movement composed of the path the finger travels within the shooting area, from start to end, after the front camera is turned on; the direction of the finger's trajectory changes continuously, and its form is a curve.
Specifically, moving-target tracking means finding the moving target of interest in real time in each image of a sequence, including motion parameters such as position, velocity, and acceleration. In the present disclosure, the smart terminal's tracking mode uses an existing tracking algorithm to recognize the user's finger from the video images taken by the front camera; the path the finger travels is the finger's motion trajectory.
In step S300, four-dimensional coordinates of sampling points on the motion trajectory are acquired at a preset time interval, the four-dimensional coordinates including a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate, and a trajectory time.
In this embodiment, the finger's motion trajectory can be represented in a four-dimensional coordinate system consisting of a horizontal X axis, a vertical Y axis, a near-far Z axis, and a time T axis. The origin of the coordinate system can be set according to the actual situation; for convenience of description, the present disclosure takes the lower-left corner of the hole position corresponding to the under-display front camera as the origin. Relative to the origin, horizontally rightward is the positive X direction, vertically upward is the positive Y direction, and perpendicular to the screen of the smart terminal, away from the screen, is the positive Z direction; the time T axis is real time. The finger's motion trajectory is a curve represented in this four-dimensional coordinate system, so it can be sampled at a preset time interval to obtain multiple sampling points, each represented by four-dimensional coordinates that include the horizontal X-direction coordinate, the vertical Y-direction coordinate, the near-far Z-direction coordinate, and the trajectory time. These sampling points are used to determine the touch type corresponding to the finger operation, for example whether the current finger operation is a leftward slide, a downward slide, and so on. The preset time interval is determined according to the actual situation; it determines the number of sampling points, and at least two sampling points must be guaranteed.
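The sampling scheme above can be sketched in a few lines of Python. This is a minimal illustration, not code from the patent; the names `sample_trajectory` and `position_at` are hypothetical, with `position_at(t)` standing in for whatever tracking algorithm reports the finger's (x, y, z) position at time t. The check for at least two sampling points mirrors the requirement stated in the description.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Sample:
    x: float  # horizontal X-direction coordinate
    y: float  # vertical Y-direction coordinate
    z: float  # near-far Z-direction coordinate (distance from the screen)
    t: float  # trajectory time in seconds

def sample_trajectory(position_at: Callable[[float], Tuple[float, float, float]],
                      duration: float, interval: float) -> List[Sample]:
    """Sample the finger trajectory at a fixed preset time interval.

    `position_at(t)` is assumed to return the tracked (x, y, z) position
    of the finger at time t. The interval must be short enough relative
    to the trajectory duration to yield at least two sampling points.
    """
    if duration < interval:
        raise ValueError("need at least two sampling points")
    samples, t = [], 0.0
    while t <= duration:
        x, y, z = position_at(t)
        samples.append(Sample(x, y, z, t))
        t += interval
    return samples
```

The interval here plays the role of the "preset time interval" in step S300: shorter intervals give a finer trajectory at the cost of more samples to match against the touch template.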
In step S400, the touch type corresponding to the motion trajectory is determined based on the four-dimensional coordinates of each sampling point and a preset touch template, and a touch operation is performed based on the touch type.
In this embodiment, after the four-dimensional coordinates of the sampling points on the motion trajectory are acquired at the preset time interval, the touch type corresponding to the motion trajectory is further determined from the four-dimensional coordinates of the sampling points and a preset touch template. Touch types include slide-left touch, slide-right touch, slide-up touch, slide-down touch, single-click touch, double-click touch, and long-press touch. The preset touch template stores left-right sliding threshold data, up-down sliding threshold data, and click threshold data. The left-right sliding threshold data is a three-dimensional array comprising X-direction, Y-direction, and Z-direction data and is used to judge whether a touch is a left-right slide; likewise, the up-down sliding threshold data and the click threshold data are also three-dimensional arrays comprising X-direction, Y-direction, and Z-direction data, used respectively to determine whether a touch is an up-down slide or a click.
Specifically, step S400 includes the following steps S410 and S420.
In step S410, the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference are determined based on the four-dimensional coordinate values of the sampling points.
In this embodiment, the X-direction, Y-direction, and Z-direction coordinate differences between the sampling points are computed separately; the maximum X-direction coordinate difference is taken from all X-direction differences, the maximum Y-direction coordinate difference from all Y-direction differences, and the maximum Z-direction coordinate difference from all Z-direction differences.
In step S420, the touch type corresponding to the motion trajectory is determined based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and the preset touch template.
In this embodiment, the preset touch template stores the left-right sliding threshold data, the up-down sliding threshold data, and the click threshold data. The maximum X-direction, Y-direction, and Z-direction coordinate differences are matched against the preset touch template, and the touch type corresponding to the motion trajectory is determined from the matching result.
In the screen touch management method of this embodiment, when the front camera is not turned on and the proximity sensor detects an object approaching the front camera, the front camera is activated and it is determined whether the approaching object is a user's finger; the motion trajectory of the user's finger is then acquired based on the front camera and the tracking mode; next, the four-dimensional coordinates of sampling points on the motion trajectory (the horizontal X-direction coordinate, the vertical Y-direction coordinate, the near-far Z-direction coordinate, and the trajectory time) are acquired at a preset time interval; and the touch type corresponding to the motion trajectory is determined from the four-dimensional coordinates of each sampling point and a preset touch template, after which a touch operation is performed based on the touch type. By acquiring the user's finger trajectory through the smart terminal's tracking mode and matching the four-dimensional coordinates of the trajectory's sampling points against the preset touch template, the touch type is determined, enabling the hole position of the under-display camera to respond to screen touch operations.
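Steps S410 and S420 can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the function names and the representation of the threshold data as per-axis (low, high) ranges are assumptions, since the description only says the template holds three-dimensional threshold arrays.

```python
def max_differences(points):
    """points: list of (x, y, z) coordinates of the sampled trajectory.

    Returns the maximum X-, Y-, and Z-direction coordinate differences.
    The maximum pairwise difference along one axis equals max - min on
    that axis, so no O(n^2) comparison of sampling points is needed.
    """
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

def matches(diffs, template):
    """template: per-axis (low, high) ranges, e.g. the left-right sliding
    threshold data. Returns True when every maximum coordinate difference
    falls inside its range; a non-match corresponds to an invalid touch."""
    return all(lo <= d <= hi for d, (lo, hi) in zip(diffs, template))
```

A horizontal swipe, for instance, would produce a large X-direction difference and only slight Y- and Z-direction differences, so it would match a template whose X range is wide and whose Y and Z ranges are narrow.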
Based on the above exemplary embodiment of the present disclosure, and referring to FIG. 3, another exemplary embodiment of the screen touch management method is provided. In this embodiment, step S420 includes the following steps S421 and S422.
In step S421, when it is determined that the maximum X-direction coordinate difference is greater than the maximum Y-direction coordinate difference, and the maximum X-direction coordinate difference is greater than the maximum Z-direction coordinate difference, the left-right sliding threshold data is acquired.
In this embodiment, when the finger touches the screen position of the front camera hole from right to left, and the area of the camera blocked by the finger changes from the right half, to all, to the left half, the event is judged to be a slide-left touch event; the X-direction coordinate of the corresponding motion trajectory changes the most, with only slight changes in the Y and Z directions. Likewise, when the finger touches the screen position of the front camera hole from left to right, and the blocked area changes from the left half, to all, to the right half, the event is judged to be a slide-right touch event; again the X-direction coordinate of the corresponding motion trajectory changes the most, with slight changes in the Y and Z directions. Therefore, when the maximum X-direction coordinate difference is greater than both the maximum Y-direction and maximum Z-direction coordinate differences, it can be preliminarily judged that the current touch is a slide-left or slide-right touch, so the left-right sliding threshold data in the preset touch template is acquired.
In step S422, when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data, the touch type is determined to be slide-left touch or slide-right touch.
In this embodiment, it is further determined whether the maximum X-direction, Y-direction, and Z-direction coordinate differences match the left-right sliding threshold data. If they match, the touch can be further determined to be slide-left or slide-right; if they do not match, the current motion trajectory corresponds to an invalid touch and the smart terminal performs no operation.
Specifically, step S422 includes the following steps a to c.
In step a, the first sampling point and the second sampling point corresponding to the maximum X-direction coordinate difference are obtained, where the X-direction coordinate of the first sampling point is greater than that of the second sampling point.
In this embodiment, when the maximum X-direction, Y-direction, and Z-direction coordinate differences match the left-right sliding threshold data, it is further determined whether the current touch is a slide-left or a slide-right. Specifically, the two sampling points corresponding to the maximum X-direction coordinate difference are obtained, namely the first and second sampling points, with the first sampling point's X-direction coordinate set to be greater than the second's; the times of the two sampling points are then compared.
In step b, when the trajectory time of the first sampling point is later than that of the second sampling point, the touch type is determined to be slide-right touch.
In this embodiment, for a slide-right touch, i.e. a slide in the positive X direction, the X-direction coordinate of the earlier sampling point is smaller than that of the later one; hence, when the trajectory time of the first sampling point is later than that of the second, the touch type corresponding to the current motion trajectory is determined to be slide-right touch.
In step c, when the trajectory time of the first sampling point is earlier than that of the second sampling point, the touch type is determined to be slide-left touch.
In this embodiment, for a slide-left touch, i.e. a slide in the negative X direction, the X-direction coordinate of the earlier sampling point is greater than that of the later one; hence, when the trajectory time of the first sampling point is earlier than that of the second, the touch type corresponding to the current motion trajectory is determined to be slide-left touch.
In the screen touch management method of this embodiment, when the maximum X-direction coordinate difference is determined to be greater than both the maximum Y-direction and maximum Z-direction coordinate differences, the left-right sliding threshold data is acquired; then, when the maximum X-direction, Y-direction, and Z-direction coordinate differences match the left-right sliding threshold data, the touch type is determined to be slide-left touch or slide-right touch. The touch type is thereby accurately determined, enabling the hole position of the under-display camera to respond to screen touch operations.
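Steps a to c above reduce to a small comparison, sketched here under the assumption (consistent with the description) that the maximum X-direction difference is attained between the samples with the largest and smallest X coordinates; the function name is hypothetical.

```python
def swipe_direction_x(samples):
    """samples: list of (x, t) pairs from the trajectory.

    Finds the two sampling points realizing the maximum X-direction
    coordinate difference (the extreme X values) and compares their
    trajectory times: if the larger-X point (the "first sampling point")
    is later, the finger moved in the positive X direction, i.e. a
    slide-right touch; otherwise a slide-left touch.
    """
    first = max(samples, key=lambda s: s[0])   # larger X coordinate
    second = min(samples, key=lambda s: s[0])  # smaller X coordinate
    return "right" if first[1] > second[1] else "left"
```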
Based on the above exemplary embodiments of the present disclosure, and referring to FIG. 4, another exemplary embodiment of the screen touch management method is provided. In this embodiment, step S420 further includes steps S423 and S424.
In step S423, when it is determined that the maximum Y-direction coordinate difference is greater than the maximum X-direction coordinate difference, and the maximum Y-direction coordinate difference is greater than the maximum Z-direction coordinate difference, the up-down sliding threshold data is acquired.
In this embodiment, when the finger touches the screen position of the front camera hole from bottom to top, and the area of the camera blocked by the finger changes from the lower half, to all, to the upper half, the event is judged to be a slide-up touch event; the Y-direction coordinate of the corresponding motion trajectory changes the most, with only slight changes in the X and Z directions. Likewise, when the finger touches the screen position of the front camera hole from top to bottom, and the blocked area changes from the upper half, to all, to the lower half, the event is judged to be a slide-down touch event; again the Y-direction coordinate of the corresponding motion trajectory changes the most, with slight changes in the X and Z directions. Therefore, when the maximum Y-direction coordinate difference is greater than both the maximum X-direction and maximum Z-direction coordinate differences, it can be preliminarily judged that the current touch is a slide-up or slide-down touch, so the up-down sliding threshold data in the preset touch template is acquired.
In step S424, when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data, the touch type is determined to be slide-up touch or slide-down touch.
In this embodiment, it is further determined whether the maximum X-direction, Y-direction, and Z-direction coordinate differences match the up-down sliding threshold data. If they match, the touch can be further determined to be slide-up or slide-down; if they do not match, the current motion trajectory corresponds to an invalid touch and the smart terminal performs no operation.
Specifically, step S424 includes the following steps d to f.
In step d, the third sampling point and the fourth sampling point corresponding to the maximum Y-direction coordinate difference are obtained, where the Y-direction coordinate of the third sampling point is greater than that of the fourth sampling point.
In this embodiment, when the maximum X-direction, Y-direction, and Z-direction coordinate differences match the up-down sliding threshold data, it is further determined whether the current touch is a slide-up or a slide-down. Specifically, the two sampling points corresponding to the maximum Y-direction coordinate difference are obtained, namely the third and fourth sampling points, with the third sampling point's Y-direction coordinate set to be greater than the fourth's; the times of the two sampling points are then compared.
In step e, when the trajectory time of the third sampling point is later than that of the fourth sampling point, the touch type is determined to be slide-up touch.
In this embodiment, for a slide-up touch, i.e. a slide in the positive Y direction, the Y-direction coordinate of the earlier sampling point is smaller than that of the later one; hence, when the trajectory time of the third sampling point is later than that of the fourth, the touch type corresponding to the current motion trajectory is determined to be slide-up touch.
In step f, when the trajectory time of the third sampling point is earlier than that of the fourth sampling point, the touch type is determined to be slide-down touch.
In this embodiment, for a slide-down touch, i.e. a slide in the negative Y direction, the Y-direction coordinate of the earlier sampling point is greater than that of the later one; hence, when the trajectory time of the third sampling point is earlier than that of the fourth, the touch type corresponding to the current motion trajectory is determined to be slide-down touch.
In the screen touch management method of this embodiment, when the maximum Y-direction coordinate difference is determined to be greater than both the maximum X-direction and maximum Z-direction coordinate differences, the up-down sliding threshold data is acquired; then, when the maximum X-direction, Y-direction, and Z-direction coordinate differences match the up-down sliding threshold data, the touch type is determined to be slide-up touch or slide-down touch. The touch type is thereby accurately determined, enabling the hole position of the under-display camera to respond to screen touch operations.
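Steps d to f are symmetric to steps a to c, so both can be expressed by one generic classifier over a chosen axis. This is an illustrative sketch with hypothetical names, not the patent's implementation.

```python
def swipe_direction(samples, axis):
    """samples: list of ((x, y), t) pairs; axis 0 = X, 1 = Y.

    Generalizes steps a-c and d-f: take the two sampling points with the
    extreme coordinates along the chosen axis (the pair realizing the
    maximum coordinate difference) and order them by trajectory time.
    """
    hi = max(samples, key=lambda s: s[0][axis])  # larger coordinate
    lo = min(samples, key=lambda s: s[0][axis])  # smaller coordinate
    later_is_hi = hi[1] > lo[1]                  # did it end high?
    if axis == 0:
        return "right" if later_is_hi else "left"
    return "up" if later_is_hi else "down"
```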
Based on the above exemplary embodiments of the present disclosure, and referring to FIG. 5, another exemplary embodiment of the screen touch management method is provided. In this embodiment, step S420 further includes the following steps S425 and S426.
In step S425, when it is determined that the maximum Z-direction coordinate difference is greater than the maximum X-direction coordinate difference, and the maximum Z-direction coordinate difference is greater than the maximum Y-direction coordinate difference, the click threshold data is acquired.
In this embodiment, when the finger approaches and touches the screen position of the front camera hole and then moves away within a short time, for example 1 s, the event is judged to be a single-click touch event; two consecutive clicks within a short time are judged to be a double-click touch event; and when the finger approaches and touches the screen position of the front camera hole, stays for a period of time, and then moves away, the event is judged to be a long-press touch event. Whether the event is a single-click, double-click, or long-press touch, the Z-direction coordinate of the corresponding motion trajectory changes the most, with only slight changes in the X and Y directions. Therefore, when the maximum Z-direction coordinate difference is greater than both the maximum X-direction and maximum Y-direction coordinate differences, it can be preliminarily judged that the current touch is a click touch, so the click threshold data in the preset touch template is acquired.
In step S426, when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data, the touch type is determined to be click touch.
In this embodiment, it is further determined whether the maximum X-direction, Y-direction, and Z-direction coordinate differences match the click threshold data. If they match, it can be further determined whether the touch is a single-click, double-click, or long-press touch; if they do not match, the current motion trajectory corresponds to an invalid touch and the smart terminal performs no operation.
Specifically, step S426 includes the following steps g to j.
In step g, the number of times the front camera is occluded, and the occlusion duration of each occlusion of the front camera, are determined based on the motion trajectory.
In this embodiment, click touch includes single-click touch, double-click touch, and long-press touch. A single-click or double-click touch occludes the front camera for a short time, for example less than 1 s, while a long-press touch occludes the front camera for a long time, for example 2 s or more. It is therefore necessary to determine from the motion trajectory the number of occlusions of the front camera and the duration of each occlusion, and then further determine whether the touch is a single-click, double-click, or long-press touch.
In step h, when it is determined that the occlusion duration satisfies the long-press condition, the touch type is determined to be long-press touch.
In this embodiment, when any of the occlusion durations exceeds a preset duration, the occlusion duration is determined to satisfy the long-press condition; the preset duration is set according to the actual situation, for example 2 s. When the occlusion duration satisfies the long-press condition, the current touch type is determined to be long-press touch.
In step i, when it is determined that the occlusion duration does not satisfy the long-press condition, and the number of occlusions equals a first preset value, the touch type is determined to be single-click touch.
In step j, when it is determined that the occlusion duration does not satisfy the long-press condition, and the number of occlusions is greater than or equal to a second preset value, the touch type is determined to be double-click touch, the second preset value being greater than the first preset value.
In this embodiment, when no occlusion duration exceeds the preset duration, the occlusion duration does not satisfy the long-press condition; the current touch type is then a single-click or double-click touch, and the touch type is further determined from the number of occlusions.
Specifically, when the number of occlusions equals the first preset value, the current touch type is determined to be single-click touch; when the number of occlusions is greater than or equal to the second preset value, the touch type is determined to be double-click touch, the second preset value being greater than the first. In an exemplary embodiment, the first preset value equals 1 and the second preset value equals 2.
In the screen touch management method of this embodiment, when the maximum Z-direction coordinate difference is determined to be greater than both the maximum X-direction and maximum Y-direction coordinate differences, the click threshold data is acquired; then, when the maximum X-direction, Y-direction, and Z-direction coordinate differences match the click threshold data, the touch type is determined to be click touch, and it is accurately determined whether the touch is a single-click, double-click, or long-press touch, enabling the hole position of the under-display camera to respond to screen touch operations.
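Steps g to j above can be sketched as a small decision function. This is an illustrative sketch: the function name and defaults are assumptions, with the 2 s long-press threshold and the preset values of 1 and 2 taken from the examples in the description.

```python
def classify_click(occlusions, long_press_s=2.0, single=1, double=2):
    """occlusions: list of occlusion durations in seconds, one entry per
    time the front camera was blocked by the finger.

    Returns 'long-press', 'single-click', 'double-click', or 'invalid',
    following steps g-j: any sufficiently long occlusion means a long
    press; otherwise the occlusion count decides between single-click
    (count == first preset value) and double-click (count >= second
    preset value).
    """
    if any(d >= long_press_s for d in occlusions):
        return "long-press"
    if len(occlusions) == single:
        return "single-click"
    if len(occlusions) >= double:
        return "double-click"
    return "invalid"
```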
The present disclosure further provides a screen touch management device. Referring to FIG. 6, FIG. 6 is a schematic diagram of the functional modules of an embodiment of the screen touch management device of the present disclosure. The screen touch management device of the present disclosure includes: an activation module 10, configured to activate the front camera when the front camera is not turned on and the proximity sensor detects an object approaching the front camera, and to determine whether the approaching object is a user's finger; an acquisition module 20, configured to activate the tracking mode when it is determined that the approaching object is a user's finger, and to acquire the motion trajectory of the user's finger based on the front camera and the tracking mode; a sampling module 30, configured to acquire four-dimensional coordinates of sampling points on the motion trajectory at a preset time interval, the four-dimensional coordinates including the horizontal X-direction coordinate, the vertical Y-direction coordinate, the near-far Z-direction coordinate, and the trajectory time; and a determining module 40, configured to determine the touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and to perform a touch operation based on the touch type.
Further, the determining module 40 is configured to determine the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference based on the four-dimensional coordinate values of the sampling points, and to determine the touch type corresponding to the motion trajectory based on these maximum coordinate differences and the preset touch template.
Further, the determining module 40 is configured to acquire the left-right sliding threshold data when it is determined that the maximum X-direction coordinate difference is greater than both the maximum Y-direction and maximum Z-direction coordinate differences, and to determine that the touch type is slide-left touch or slide-right touch when the maximum X-direction, Y-direction, and Z-direction coordinate differences match the left-right sliding threshold data.
Further, the determining module 40 is configured to obtain the first and second sampling points corresponding to the maximum X-direction coordinate difference, the X-direction coordinate of the first sampling point being greater than that of the second; when the trajectory time of the first sampling point is later than that of the second, the touch type is determined to be slide-right touch; when it is earlier, the touch type is determined to be slide-left touch.
Further, the determining module 40 is configured to acquire the up-down sliding threshold data when it is determined that the maximum Y-direction coordinate difference is greater than both the maximum X-direction and maximum Z-direction coordinate differences, and to determine that the touch type is slide-up touch or slide-down touch when the maximum X-direction, Y-direction, and Z-direction coordinate differences match the up-down sliding threshold data.
Further, the determining module 40 is configured to obtain the third and fourth sampling points corresponding to the maximum Y-direction coordinate difference, the Y-direction coordinate of the third sampling point being greater than that of the fourth; when the trajectory time of the third sampling point is later than that of the fourth, the touch type is determined to be slide-up touch; when it is earlier, the touch type is determined to be slide-down touch.
Further, the determining module 40 is configured to acquire the click threshold data when it is determined that the maximum Z-direction coordinate difference is greater than both the maximum X-direction and maximum Y-direction coordinate differences, and to determine that the touch type is click touch when the maximum X-direction, Y-direction, and Z-direction coordinate differences match the click threshold data.
Further, the determining module 40 is configured to determine, based on the motion trajectory, the number of times the front camera is occluded and the duration of each occlusion; when the occlusion duration satisfies the long-press condition, the touch type is determined to be long-press touch; when the occlusion duration does not satisfy the long-press condition and the number of occlusions equals a first preset value, the touch type is determined to be single-click touch; when the occlusion duration does not satisfy the long-press condition and the number of occlusions is greater than or equal to a second preset value, the touch type is determined to be double-click touch, the second preset value being greater than the first preset value.
Further, the determining module 40 is configured to determine that the occlusion duration satisfies the long-press condition when any of the occlusion durations exceeds a preset duration.
In addition, embodiments of the present disclosure further provide a readable storage medium storing a screen touch management program; when executed by a processor, the program implements the screen touch management methods of the above embodiments.
In the present disclosure, when the front camera is not turned on and the proximity sensor detects an object approaching the front camera, the front camera is activated and it is determined whether the approaching object is a user's finger; when it is determined that the approaching object is a user's finger, the tracking mode is activated and the motion trajectory of the user's finger is acquired based on the front camera and the tracking mode; the four-dimensional coordinates of sampling points on the motion trajectory (the horizontal X-direction coordinate, the vertical Y-direction coordinate, the near-far Z-direction coordinate, and the trajectory time) are then acquired at a preset time interval; finally, the touch type corresponding to the motion trajectory is determined from the four-dimensional coordinates of each sampling point and a preset touch template, and a touch operation is performed based on the touch type. By acquiring the user's finger trajectory through the smart terminal's tracking mode and matching the four-dimensional coordinates of the trajectory's sampling points against the preset touch template, the touch type is determined, enabling the hole position of the under-display camera to respond to screen touch operations.
It should be noted that, as used herein, the terms "include", "comprise", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or system that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or system. Absent further limitation, an element defined by the phrase "including a ..." does not exclude the existence of additional identical elements in the process, method, article, or system that includes it.
The serial numbers of the above embodiments of the present disclosure are for description only and do not represent the superiority or inferiority of the embodiments.
From the description of the above implementations, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware, though in many cases the former is the better implementation. Based on this understanding, the technical solution of the present disclosure, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a readable storage medium as described above (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to cause a system device (which may be a mobile phone, computer, server, air conditioner, network device, or the like) to execute the methods described in the embodiments of the present disclosure.
The above are merely exemplary embodiments of the present disclosure and are not intended to limit its patent scope. Any equivalent structural or process transformation made using the contents of the specification and drawings of the present disclosure, whether applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the present disclosure.

Claims (12)

  1. A screen touch management method applied to a smart terminal provided with a proximity sensor, the proximity sensor being installed within a preset range of a front camera of the smart terminal, and the smart terminal being provided with a tracking mode, wherein the screen touch management method comprises:
    when the front camera is not turned on and the proximity sensor detects an object approaching the front camera, activating the front camera and determining whether the approaching object is a user's finger;
    when it is determined that the approaching object is a user's finger, activating the tracking mode, and acquiring a motion trajectory of the user's finger based on the front camera and the tracking mode;
    acquiring four-dimensional coordinates of sampling points on the motion trajectory at a preset time interval, wherein the four-dimensional coordinates comprise a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate, and a trajectory time;
    determining a touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and performing a touch operation based on the touch type.
  2. The screen touch management method of claim 1, wherein the step of determining the touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and the preset touch template comprises:
    determining a maximum X-direction coordinate difference, a maximum Y-direction coordinate difference, and a maximum Z-direction coordinate difference based on the four-dimensional coordinate values of the sampling points;
    determining the touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and the preset touch template.
  3. The screen touch management method of claim 2, wherein the touch type comprises slide-left touch and slide-right touch, the preset touch template comprises left-right sliding threshold data, and the step of determining the touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and the preset touch template comprises:
    when it is determined that the maximum X-direction coordinate difference is greater than the maximum Y-direction coordinate difference, and the maximum X-direction coordinate difference is greater than the maximum Z-direction coordinate difference, acquiring the left-right sliding threshold data;
    when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data, determining that the touch type is slide-left touch or slide-right touch.
  4. The screen touch management method of claim 3, wherein the step of determining that the touch type is slide-left touch or slide-right touch when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data comprises:
    obtaining a first sampling point and a second sampling point corresponding to the maximum X-direction coordinate difference, the X-direction coordinate of the first sampling point being greater than the X-direction coordinate of the second sampling point;
    in response to the trajectory time of the first sampling point being later than the trajectory time of the second sampling point, determining that the touch type is slide-right touch;
    in response to the trajectory time of the first sampling point being earlier than the trajectory time of the second sampling point, determining that the touch type is slide-left touch.
  5. The screen touch management method of claim 2, wherein the touch type comprises slide-up touch and slide-down touch, the preset touch template comprises up-down sliding threshold data, and the step of determining the touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and the preset touch template comprises:
    when it is determined that the maximum Y-direction coordinate difference is greater than the maximum X-direction coordinate difference, and the maximum Y-direction coordinate difference is greater than the maximum Z-direction coordinate difference, acquiring the up-down sliding threshold data;
    when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data, determining that the touch type is slide-up touch or slide-down touch.
  6. The screen touch management method of claim 5, wherein the step of determining that the touch type is slide-up touch or slide-down touch when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data comprises:
    obtaining a third sampling point and a fourth sampling point corresponding to the maximum Y-direction coordinate difference, the Y-direction coordinate of the third sampling point being greater than the Y-direction coordinate of the fourth sampling point;
    in response to the trajectory time of the third sampling point being later than the trajectory time of the fourth sampling point, determining that the touch type is slide-up touch;
    in response to the trajectory time of the third sampling point being earlier than the trajectory time of the fourth sampling point, determining that the touch type is slide-down touch.
  7. The screen touch management method of claim 2, wherein the touch type comprises click touch, the preset touch template comprises click threshold data, and the step of determining the touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and the preset touch template comprises:
    when it is determined that the maximum Z-direction coordinate difference is greater than the maximum X-direction coordinate difference, and the maximum Z-direction coordinate difference is greater than the maximum Y-direction coordinate difference, acquiring the click threshold data;
    when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data, determining that the touch type is click touch.
  8. The screen touch management method of claim 7, wherein the click touch comprises single-click touch, double-click touch, and long-press touch, and the step of determining that the touch type is click touch when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data comprises:
    determining, based on the motion trajectory, the number of times the front camera is occluded and the occlusion duration of each occlusion of the front camera;
    when it is determined that the occlusion duration satisfies a long-press condition, determining that the touch type is long-press touch;
    when it is determined that the occlusion duration does not satisfy the long-press condition, and the number of occlusions equals a first preset value, determining that the touch type is single-click touch;
    when it is determined that the occlusion duration does not satisfy the long-press condition, and the number of occlusions is greater than or equal to a second preset value, determining that the touch type is double-click touch, the second preset value being greater than the first preset value.
  9. The screen touch management method of claim 8, wherein, when a long-press occlusion duration greater than a preset duration exists among all occlusion durations, the occlusion duration is determined to satisfy the long-press condition.
  10. A screen touch management device, comprising:
    an activation module, configured to activate the front camera when the front camera is not turned on and the proximity sensor detects an object approaching the front camera, and to determine whether the approaching object is a user's finger;
    an acquisition module, configured to activate the tracking mode when it is determined that the approaching object is a user's finger, and to acquire a motion trajectory of the user's finger based on the front camera and the tracking mode;
    a sampling module, configured to acquire four-dimensional coordinates of sampling points on the motion trajectory at a preset time interval, wherein the four-dimensional coordinates comprise a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate, and a trajectory time; and
    a determining module, configured to determine a touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and to perform a touch operation based on the touch type.
  11. A smart terminal, comprising a memory, a processor, and a screen touch management program stored in the memory and executable on the processor, wherein, when executed by the processor, the screen touch management program implements the screen touch management method of any one of claims 1 to 9.
  12. A readable storage medium, wherein the readable storage medium stores a screen touch management program, and, when executed by a processor, the screen touch management program implements the screen touch management method of any one of claims 1 to 9.
PCT/CN2020/116484 2019-09-23 2020-09-21 屏幕触控管理方法、智能终端、装置及可读存储介质 WO2021057654A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910900469.3 2019-09-23
CN201910900469.3A CN112540696A (zh) 2019-09-23 2019-09-23 屏幕触控管理方法、智能终端、装置及可读存储介质

Publications (1)

Publication Number Publication Date
WO2021057654A1 true WO2021057654A1 (zh) 2021-04-01

Family

ID=75013168

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/116484 WO2021057654A1 (zh) 2019-09-23 2020-09-21 屏幕触控管理方法、智能终端、装置及可读存储介质

Country Status (2)

Country Link
CN (1) CN112540696A (zh)
WO (1) WO2021057654A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115150550A (zh) * 2022-06-20 2022-10-04 湖北星纪时代科技有限公司 用于终端的拍照处理方法、装置、电子设备和存储介质
CN115150551A (zh) * 2022-06-20 2022-10-04 湖北星纪时代科技有限公司 用于终端的拍照处理方法、装置、电子设备和存储介质

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113538762B (zh) * 2021-09-16 2021-12-14 深圳市海清视讯科技有限公司 门禁平板设备菜单控制方法、装置、系统、介质及产品
CN114020192B (zh) * 2021-09-18 2024-04-02 特斯联科技集团有限公司 一种基于曲面电容实现非金属平面的互动方法和系统
CN114115673B (zh) * 2021-11-25 2023-10-27 海信集团控股股份有限公司 车载屏幕的控制方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150286281A1 (en) * 2014-04-04 2015-10-08 Alibaba Group Holding Limited Generating a screenshot
CN106055143A (zh) * 2016-05-20 2016-10-26 广州视睿电子科技有限公司 触摸点位置检测方法和系统
CN109298798A (zh) * 2018-09-21 2019-02-01 歌尔科技有限公司 触控板的操作控制方法、设备以及智能终端

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150286281A1 (en) * 2014-04-04 2015-10-08 Alibaba Group Holding Limited Generating a screenshot
CN106055143A (zh) * 2016-05-20 2016-10-26 广州视睿电子科技有限公司 触摸点位置检测方法和系统
CN109298798A (zh) * 2018-09-21 2019-02-01 歌尔科技有限公司 触控板的操作控制方法、设备以及智能终端

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115150550A (zh) * 2022-06-20 2022-10-04 湖北星纪时代科技有限公司 用于终端的拍照处理方法、装置、电子设备和存储介质
CN115150551A (zh) * 2022-06-20 2022-10-04 湖北星纪时代科技有限公司 用于终端的拍照处理方法、装置、电子设备和存储介质

Also Published As

Publication number Publication date
CN112540696A (zh) 2021-03-23

Similar Documents

Publication Publication Date Title
WO2021057654A1 (zh) 屏幕触控管理方法、智能终端、装置及可读存储介质
US10739854B2 (en) Terminal and touch response method and device
WO2019033957A1 (zh) 交互位置确定方法、系统、存储介质和智能终端
US20140270413A1 (en) Auxiliary device functionality augmented with fingerprint sensor
US9268407B1 (en) Interface elements for managing gesture control
US20170083741A1 (en) Method and device for generating instruction
US20140270414A1 (en) Auxiliary functionality control and fingerprint authentication based on a same user input
WO2022110614A1 (zh) 手势识别方法及装置、电子设备和存储介质
CN108920202B (zh) 应用预加载管理方法、装置、存储介质及智能终端
WO2013000381A1 (zh) 控制移动终端状态的方法及移动终端
CN109558000B (zh) 一种人机交互方法及电子设备
WO2005119591A1 (ja) 表示制御方法および装置、プログラム、並びに携帯機器
WO2014105012A1 (en) System and method for gesture based touchscreen control of displays
CN113253908B (zh) 按键功能执行方法、装置、设备及存储介质
CN110730298A (zh) 一种显示控制方法及电子设备
US12022190B2 (en) Photographing method and electronic device
CN106325623A (zh) 在触摸屏上监测触摸的方法、装置及终端设备
JP6911870B2 (ja) 表示制御装置、表示制御方法及びコンピュータプログラム
US11886643B2 (en) Information processing apparatus and information processing method
CN112749590B (zh) 目标检测方法、装置、计算机设备和计算机可读存储介质
CN113867550A (zh) 电子设备的姿态检测方法及装置、存储介质
WO2021204101A1 (zh) 显示方法及电子设备
JP7413546B2 (ja) 撮影方法及び電子機器
US9350918B1 (en) Gesture control for managing an image view display
CN106325622B (zh) 自电容式压力触摸装置及终端设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20870068

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20870068

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27.02.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 20870068

Country of ref document: EP

Kind code of ref document: A1