CN112540696A - Screen touch control management method, intelligent terminal, device and readable storage medium - Google Patents

Info

Publication number
CN112540696A
CN112540696A (application CN201910900469.3A)
Authority
CN
China
Prior art keywords
touch
maximum
direction coordinate
coordinate difference
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910900469.3A
Other languages
Chinese (zh)
Inventor
冯凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZTE Corp
Original Assignee
ZTE Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corp filed Critical ZTE Corp
Priority to CN201910900469.3A priority Critical patent/CN112540696A/en
Priority to PCT/CN2020/116484 priority patent/WO2021057654A1/en
Publication of CN112540696A publication Critical patent/CN112540696A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/13Sensors therefor
    • G06V40/1318Sensors therefor using electro-optical elements or layers, e.g. electroluminescent sensing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language

Abstract

The invention discloses a screen touch control management method, which comprises the following steps: when the front camera is not started and the proximity sensor detects an object approaching the front camera, starting the front camera and determining whether the approaching object is a user finger; when the approaching object is determined to be a user finger, starting the tracking mode and acquiring the motion track of the user finger based on the front camera and the tracking mode; acquiring four-dimensional coordinates of sampling points on the motion track at a preset time interval; and determining the touch type corresponding to the motion track based on the four-dimensional coordinates of the sampling points and a preset touch template, and executing a touch operation based on the touch type. The invention also discloses a device, an intelligent terminal and a readable storage medium. In this way, the hole position of the under-screen camera can respond to screen touch operations.

Description

Screen touch control management method, intelligent terminal, device and readable storage medium
Technical Field
The invention relates to the technical field of intelligent terminals, in particular to a screen touch control management method, an intelligent terminal, a device and a readable storage medium.
Background
With the rapid development of intelligent terminal technologies such as mobile phones and tablet computers, the applications of intelligent terminals have become increasingly extensive, and users' requirements for their screens have grown correspondingly.
With the development of intelligent terminal screen technology, various screens with ultra-high screen-to-body ratios have appeared, including waterdrop-notch, notch ("bang") and other designs, and the full screen has become a major trend for intelligent terminals. To achieve a true full screen, the under-screen camera is a technical problem that must be solved: the screen area at the camera hole position must be able to respond normally to touch operations. Existing technologies such as the waterdrop screen and the notch screen have not solved this problem.
Disclosure of Invention
The invention mainly aims to provide a screen touch control management method, an intelligent terminal, a device and a readable storage medium, so as to solve the technical problem that the hole position of existing under-screen cameras cannot respond to screen touch operations.
In order to achieve the above object, the present invention provides a screen touch management method, which includes the following steps:
when the front camera is not started and an object is detected to be close to the front camera based on the proximity sensor, starting the front camera and determining whether the close object is a finger of a user;
when the approaching object is determined to be a user finger, starting the tracking mode, and acquiring a motion track of the user finger based on the front camera and the tracking mode;
acquiring four-dimensional coordinates of sampling points on the motion track at a preset time interval, wherein the four-dimensional coordinates comprise a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate and a track time;
and determining a touch type corresponding to the motion track based on the four-dimensional coordinates of each sampling point and a preset touch template, and executing touch operation based on the touch type.
In addition, to achieve the above object, the present invention also provides an intelligent terminal, including: a memory, a processor, and a screen touch management program stored on the memory and executable on the processor, wherein the screen touch management program, when executed by the processor, implements the steps of the screen touch management method according to any one of the above aspects.
In addition, to achieve the above object, the present invention further provides a readable storage medium, where a screen touch management program is stored, and the screen touch management program, when executed by a processor, implements the steps of the screen touch management method according to any one of the above aspects.
According to the method, when the front camera is not started and the proximity sensor detects that an object approaches the front camera, the front camera is started and it is determined whether the approaching object is a user finger. When the approaching object is determined to be a user finger, the tracking mode is started and the motion track of the user finger is obtained based on the front camera and the tracking mode. Four-dimensional coordinates of sampling points on the motion track are then obtained at a preset time interval, the four-dimensional coordinates comprising a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate and a track time. Finally, the touch type corresponding to the motion track is determined based on the four-dimensional coordinates of each sampling point and a preset touch template, and a touch operation is executed based on the touch type. By obtaining the motion track of the user finger through the tracking mode of the intelligent terminal and matching the four-dimensional coordinates of the sampling points against the preset touch template, the touch type is determined, so that the hole position of the under-screen camera can respond to screen touch operations.
Drawings
Fig. 1 is a schematic structural diagram of an intelligent terminal in a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a first embodiment of a screen touch management method according to the present invention;
FIG. 3 is a flowchart illustrating a second embodiment of a screen touch management method according to the present invention;
FIG. 4 is a flowchart illustrating a third embodiment of a screen touch management method according to the present invention;
FIG. 5 is a flowchart illustrating a fourth embodiment of a screen touch management method according to the present invention;
FIG. 6 is a functional block diagram of an embodiment of a screen touch management device according to the invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, fig. 1 is a schematic structural diagram of an intelligent terminal in a hardware operating environment according to an embodiment of the present invention.
As shown in fig. 1, the smart terminal may include: a processor 1001 (such as a CPU), a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002, where the communication bus 1002 is used to realize connection and communication between these components. The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard); optionally, the user interface 1003 may also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.
Optionally, the smart terminal may further include a camera, a radio frequency (RF) circuit, sensors, an audio circuit, a WiFi module, and the like. The sensors include, for example, light sensors and motion sensors. Specifically, the light sensors may include an ambient light sensor, which can adjust the brightness of the display screen according to the brightness of ambient light, and a proximity sensor, which can turn off the display screen and/or the backlight when the mobile terminal is moved to the ear. As one kind of motion sensor, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes) and the magnitude and direction of gravity when stationary, and can be used for applications that recognize the attitude of the mobile terminal (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer and tapping). Of course, the intelligent terminal may also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described again here.
Those skilled in the art will appreciate that the intelligent terminal architecture shown in fig. 1 is not intended to be limiting of the terminal and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and a screen touch management program.
In the terminal shown in fig. 1, the network interface 1004 is mainly used for connecting to a backend server and performing data communication with it; the user interface 1003 is mainly used for connecting to a client (user side) and performing data communication with it; and the processor 1001 may be used to invoke the screen touch management program stored in the memory 1005.
In this embodiment, the intelligent terminal includes: a memory 1005, a processor 1001, and a screen touch management program stored in the memory 1005 and executable on the processor 1001, wherein when the processor 1001 calls the screen touch management program stored in the memory 1005, the steps of the screen touch management method provided by each embodiment of the present application are executed.
The invention also provides a screen touch control management method, and referring to fig. 2, fig. 2 is a flowchart illustrating a first embodiment of the screen touch control management method according to the invention.
While a logical order is shown in the flowchart, in some cases, the steps shown or described may be performed in an order different from that shown or described herein.
In this embodiment, the screen touch management method includes:
step S100, when the front camera is not started and an object is detected to be close to the front camera based on the proximity sensor, starting the front camera and determining whether the close object is a finger of a user;
in this embodiment, the full-screen of the current intelligent terminal is a quasi-full-screen, so that a real full-screen mobile phone will appear in the future, and the full-screen mobile phone will be a necessary trend in the future. The technical obstacles such as fingerprint under the screen, screen sound production have been solved at present to comprehensive screen cell-phone, and the technical obstacle that still needs to solve at present is exactly camera under the screen, and two main problems need to be solved to camera under the screen: firstly, displaying a screen at the position of an opening of a camera under the screen; and secondly, touch control of the opening position of the camera under the screen. The technical scheme of the invention solves the second problem: and (5) touch control of the opening position of the camera under the screen.
Specifically, the intelligent terminal is provided with a proximity sensor at the camera hole position, which detects whether an object approaches the under-screen camera so as to trigger its start-up. A proximity sensor is a device capable of sensing the approach of an object: it uses the sensitivity of a displacement sensor to approaching objects to recognize proximity and outputs a corresponding switch signal, and is therefore also commonly called a proximity switch. It is a general term for sensors that detect a target without touching it, replacing contact-based detection methods such as mechanical switches; it detects the movement and presence of an object and converts this information into an electrical signal. In addition, the intelligent terminal is provided with a tracking mode for capturing the motion track of the user finger from the video images shot by the camera. When the front camera of the intelligent terminal is not started and the proximity sensor detects that an object is close to the front camera, the front camera is started to shoot an image of the approaching object, and a preset image recognition algorithm determines whether the object is a user finger. It should be noted that if the front camera is already in use (for photographing, face recognition, video calls, and so on) when the proximity sensor detects an approaching object, the touch event at the corresponding under-screen hole position is not responded to.
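The trigger logic described above can be sketched as follows. This is a minimal illustration with hypothetical class and function names; the patent does not specify an implementation, and `is_finger` stands in for the preset image recognition step on the captured frame.

```python
class FrontCamera:
    """Minimal stand-in for the front camera state; names are illustrative."""
    def __init__(self):
        self.started = False    # under-screen camera powered on
        self.tracking = False   # tracking mode active
        self.in_use = False     # busy with photographing, face recognition, etc.

def handle_proximity_event(camera, is_finger):
    """Hypothetical handler for a proximity-sensor event at the camera hole.

    Returns True when tracking mode is entered, False otherwise.
    """
    if camera.in_use:
        # Camera already busy: the touch event at the hole position
        # is not responded to.
        return False
    camera.started = True        # start the front camera
    if is_finger:                # approaching object is a user finger
        camera.tracking = True   # enter tracking mode
        return True
    return False
```

A busy camera short-circuits the handler, matching the note above that touch events are ignored while the camera is in use.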
Step S200, when the approaching object is determined to be the finger of the user, starting the tracking mode, and acquiring the motion track of the finger of the user based on the front camera and the tracking mode;
In this embodiment, when the proximity sensor detects that an object is close to the front camera, and the object is determined to be a user finger by the front camera and a preset image recognition algorithm, the tracking mode of the intelligent terminal is started, and the motion track of the user finger is then captured from the video images shot by the camera. A motion track is the spatial feature of a motion, composed of the path that a body part traverses from its starting position to its end, and is characterized by its direction, form and amplitude. In the invention, the motion track of the user finger refers to the spatial feature formed by the path that the finger traverses within the shooting area, from its starting position to its end, after the front camera is started; the direction of the track changes continuously and its form is a curve.
Specifically, moving-target tracking means finding, in real time, the moving target of interest in each image of an image sequence, including motion parameters such as position, velocity and acceleration.
Step S300, acquiring four-dimensional coordinates of sampling points on the motion track at a preset time interval, wherein the four-dimensional coordinates comprise a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate and a track time;
in this embodiment, the movement locus of the finger may be represented by a four-dimensional coordinate system, which is an X axis in the horizontal direction, a Y axis in the vertical direction, a Z axis in the near-far direction, and a time T axis, respectively, where an origin of the coordinate system may be set according to an actual situation. The motion trajectory of the finger is a curve and is represented by a four-dimensional coordinate system, so that the motion trajectory can be sampled according to a preset time interval to obtain a plurality of sampling points, each sampling point is represented by a four-dimensional coordinate, the four-dimensional coordinate comprises a horizontal X-direction coordinate, a vertical Y-direction coordinate, a far-near Z-direction coordinate and trajectory time, and the sampling points are used for determining a touch type corresponding to the finger operation, for example, determining that the current finger operation is leftward sliding, downward sliding and the like. The preset time interval is determined according to the actual condition, the number of sampling points is determined by the preset time interval, and at least 2 sampling points are ensured.
Step S400, determining a touch type corresponding to the motion track based on the four-dimensional coordinates of each sampling point and a preset touch template, and executing touch operation based on the touch type.
In this embodiment, after the four-dimensional coordinates of the sampling points on the motion track are acquired at the preset time interval, the touch type corresponding to the motion track is determined from the four-dimensional coordinates of the sampling points and the preset touch template. The touch types include: left-sliding touch, right-sliding touch, up-sliding touch, down-sliding touch, single-click touch, double-click touch and long-press touch. The preset touch template stores left-right sliding threshold data, up-down sliding threshold data and clicking threshold data. The left-right sliding threshold data is a three-dimensional array comprising X-direction data, Y-direction data and Z-direction data, and is used to judge whether a touch is a left-right sliding touch; similarly, the up-down sliding threshold data and the clicking threshold data are also three-dimensional arrays comprising X-direction, Y-direction and Z-direction data, used respectively to judge whether a touch is an up-down sliding touch or a clicking touch.
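The preset touch template can be pictured as the following data structure. The field names and threshold values here are purely illustrative assumptions, not values from the patent; each entry holds the three-dimensional (X, Y, Z) threshold data for one gesture family.

```python
# Hypothetical preset touch template; values are in arbitrary
# screen-distance units chosen for illustration only.
PRESET_TOUCH_TEMPLATE = {
    # left/right slide: large X travel, small Y and Z travel
    "horizontal_slide": {"min_x": 20.0, "max_y": 8.0, "max_z": 8.0},
    # up/down slide: large Y travel, small X and Z travel
    "vertical_slide":   {"min_y": 20.0, "max_x": 8.0, "max_z": 8.0},
    # click: little X/Y travel, noticeable near-far Z travel
    "click":            {"max_x": 5.0, "max_y": 5.0, "min_z": 10.0},
}
```

Storing each family as a triple of per-axis thresholds mirrors the "three-dimensional array" description above.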
Specifically, step S400 includes:
step S410, determining a maximum X-direction coordinate difference value, a maximum Y-direction coordinate difference value and a maximum Z-direction coordinate difference value based on the four-dimensional coordinate values of the sampling points;
in this embodiment, an X-direction coordinate difference value, a Y-direction coordinate difference value, and a Z-direction coordinate difference value between each sampling point are calculated, respectively, and a maximum X-direction coordinate difference value among all the X-direction coordinate difference values, a maximum Y-direction coordinate difference value among all the Y-direction coordinate difference values, and a maximum Z-direction coordinate difference value among all the Z-direction coordinate difference values are obtained.
Step S420, determining a touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and a preset touch template.
In this embodiment, left-right sliding threshold data, up-down sliding threshold data, and clicking threshold data are stored in the preset touch template, the maximum X-direction coordinate difference value, the maximum Y-direction coordinate difference value, and the maximum Z-direction coordinate difference value are matched with the preset touch template, and the touch type corresponding to the motion trajectory is determined according to the matching result.
In the screen touch management method provided in this embodiment, when the front camera is not started and an object is detected approaching the front camera based on the proximity sensor, the front camera is started and it is determined whether the approaching object is a user finger. When it is, the tracking mode is started and the motion track of the user finger is obtained based on the front camera and the tracking mode. Four-dimensional coordinates of sampling points on the motion track are then obtained at a preset time interval, the four-dimensional coordinates comprising a horizontal X-direction coordinate, a vertical Y-direction coordinate, a near-far Z-direction coordinate and a track time. Finally, the touch type corresponding to the motion track is determined based on the four-dimensional coordinates of each sampling point and a preset touch template, and a touch operation is executed based on the touch type. By obtaining the motion track of the user finger through the tracking mode and matching the four-dimensional coordinates of the sampling points against the preset touch template to determine the touch type, the hole position of the under-screen camera can respond to screen touch operations.
Based on the first embodiment, referring to fig. 3, a second embodiment of the screen touch management method of the present invention is provided, in this embodiment, step S420 includes:
step S421, when it is determined that the maximum X-direction coordinate difference is greater than the maximum Y-direction coordinate difference and the maximum X-direction coordinate difference is greater than the maximum Z-direction coordinate difference, acquiring the left-right sliding threshold data;
in this embodiment, when a finger touches a screen position of the front camera opening from the right and left, and a range blocked by the finger changes from the right half- > all- > left half, it is determined as a left-sliding touch event, a coordinate change in the X direction of a motion trajectory corresponding to the left-sliding touch event is the largest, and there is a slight change in the Y direction and the Z direction. Similarly, when the finger touches the position of the screen with the opening of the front camera from left and right, and the range shielded by the finger changes from left half- > all- > right half, the right sliding touch event is determined to be a right sliding touch event, the coordinate change of the motion track corresponding to the right sliding touch event in the X direction is the largest, and the Y direction and the Z direction have small changes. Therefore, when it is determined that the maximum X-direction coordinate difference is greater than the maximum Y-direction coordinate difference and the maximum X-direction coordinate difference is greater than the maximum Z-direction coordinate difference, it may be preliminarily determined that the current touch is a left-sliding touch or a right-sliding touch, so that left-right sliding threshold data in the preset touch template is obtained.
Step S422, when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data, determining that the touch type is a left-sliding touch or a right-sliding touch.
In this embodiment, it is further determined whether the maximum X-direction, maximum Y-direction and maximum Z-direction coordinate differences match the left-right sliding threshold data. When they match, the touch can be further determined to be a left-sliding or right-sliding touch; when they do not match, the current motion track corresponds to an invalid touch, and the intelligent terminal performs no operation.
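Steps S421 and S422 can be sketched together as follows. The threshold field names are hypothetical; a non-match returns None, representing an invalid touch on which the terminal performs no operation.

```python
def match_horizontal_slide(dx, dy, dz, template):
    """Preliminary check (S421) plus threshold matching (S422) for the
    left/right-slide family. dx, dy, dz are the maximum coordinate
    differences in the X, Y and Z directions."""
    if not (dx > dy and dx > dz):
        return None                    # X is not the dominant axis
    th = template["horizontal_slide"]  # left-right sliding threshold data
    if dx >= th["min_x"] and dy <= th["max_y"] and dz <= th["max_z"]:
        return "horizontal_slide"      # left or right slide; direction decided next
    return None                        # no match: invalid touch
```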
Specifically, step S422 includes:
step a, acquiring a first sampling point and a second sampling point corresponding to the maximum X-direction coordinate difference value, wherein the X-direction coordinate of the first sampling point is larger than the X-direction coordinate of the second sampling point;
in this embodiment, when the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right sliding threshold data, it is further determined whether left-sliding touch or right-sliding touch is currently performed. Specifically, two sampling points corresponding to the maximum X-direction coordinate difference are obtained: and setting the X-direction coordinate of the first sampling point to be larger than that of the second sampling point, and further comparing the time corresponding to the two sampling points.
B, when the track time of the first sampling point is later than the track time of the second sampling point, determining that the touch type is right-sliding touch;
in this embodiment, if the touch type is right-slide touch, that is, if the touch type is right-slide touch, the X-direction coordinate of the sampling point with the earlier track time is smaller than the X-direction coordinate of the sampling point with the later track time.
And c, when the track time of the first sampling point is earlier than the track time of the second sampling point, determining that the touch type is left-sliding touch.
In this embodiment, if the touch is left-sliding touch, that is, if the touch slides along the negative X coordinate direction, the X-direction coordinate of the sampling point with the earlier track time is greater than the X-direction coordinate of the sampling point with the later track time, so that when the track time of the first sampling point is earlier than the track time of the second sampling point, it is determined that the touch type corresponding to the current motion track is left-sliding touch.
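Steps a to c reduce to comparing the track times of the two sampling points that realize the maximum X-direction coordinate difference, i.e. the points with the largest and smallest X coordinates. A sketch, with sampling points as (x, y, z, t) tuples:

```python
def horizontal_direction(samples):
    """Return 'right' or 'left' based on the track times of the pair of
    sampling points realizing the maximum X-direction coordinate difference."""
    second = min(samples, key=lambda p: p[0])  # smaller X: second sampling point
    first = max(samples, key=lambda p: p[0])   # larger X: first sampling point
    # Larger-X point later in time means motion along positive X: right slide.
    return "right" if first[3] > second[3] else "left"
```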
In the screen touch management method provided by this embodiment, when the maximum X-direction coordinate difference is greater than both the maximum Y-direction and the maximum Z-direction coordinate differences, the left-right sliding threshold data is obtained; then, when the maximum X-direction, maximum Y-direction and maximum Z-direction coordinate differences match the left-right sliding threshold data, the touch type is determined to be a left-sliding or right-sliding touch. The touch type is thus accurately determined, and the hole position of the under-screen camera can respond to screen touch operations.
Based on the second embodiment, referring to fig. 4, a third embodiment of the screen touch management method of the present invention is provided, in this embodiment, step S420 further includes:
step S423, when it is determined that the maximum Y-direction coordinate difference is greater than the maximum X-direction coordinate difference and the maximum Y-direction coordinate difference is greater than the maximum Z-direction coordinate difference, obtaining the up-down sliding threshold data;
in this embodiment, when a finger touches the screen position of the front camera opening from bottom to top, so that the range shielded by the finger changes from the lower half -&gt; all -&gt; upper half, the touch is determined to be a single slide-up touch event; the coordinate change of the corresponding motion trajectory is largest in the Y direction, with only slight changes in the X and Z directions. Similarly, when the finger contacts the screen position of the front camera opening from top to bottom, so that the shielded range changes from the upper half -&gt; all -&gt; lower half, the touch is determined to be a single slide-down touch event, whose motion trajectory likewise changes most in the Y direction with only slight changes in the X and Z directions. Therefore, when the maximum Y-direction coordinate difference is greater than the maximum X-direction coordinate difference and the maximum Y-direction coordinate difference is greater than the maximum Z-direction coordinate difference, it may be preliminarily determined that the current touch is a slide-up touch or a slide-down touch, so the up-down sliding threshold data in the preset touch template is obtained.
Step S424, when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data, it is determined that the touch type is an up-sliding touch or a down-sliding touch.
In this embodiment, it is further determined whether the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data. When they match, it may be further determined whether the touch is a slide-up touch or a slide-down touch; when they do not match, the current motion trajectory corresponds to an invalid touch, and the intelligent terminal performs no operation.
Specifically, step S424 includes:
step d, acquiring a third sampling point and a fourth sampling point corresponding to the maximum Y-direction coordinate difference value, wherein the Y-direction coordinate of the third sampling point is larger than that of the fourth sampling point;
in this embodiment, when the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data, it is further determined whether the current touch is a slide-up touch or a slide-down touch. Specifically, the two sampling points corresponding to the maximum Y-direction coordinate difference are obtained, the Y-direction coordinate of the third sampling point is set to be greater than that of the fourth sampling point, and the track times corresponding to the two sampling points are then compared.
Step e, when the track time of the third sampling point is later than the track time of the fourth sampling point, determining that the touch type is slide-up touch;
in this embodiment, if the touch is a slide-up touch, that is, the touch slides along the Y-positive coordinate direction, the Y-direction coordinate of the sampling point with the earlier track time is smaller than the Y-direction coordinate of the sampling point with the later track time, so that when the track time of the third sampling point is later than the track time of the fourth sampling point, it is determined that the touch type corresponding to the current motion track is a slide-up touch.
And f, when the track time of the third sampling point is earlier than the track time of the fourth sampling point, determining that the touch type is slide-down touch.
In this embodiment, if the touch is a slide-down touch, that is, if the touch slides along the negative Y coordinate direction, the Y-direction coordinate of the sampling point with the earlier track time is greater than the Y-direction coordinate of the sampling point with the later track time, so that when the track time of the third sampling point is earlier than the track time of the fourth sampling point, it is determined that the touch type corresponding to the current motion track is slide-down touch.
In the screen touch management method provided by this embodiment, when it is determined that the maximum Y-direction coordinate difference is greater than the maximum X-direction coordinate difference and the maximum Y-direction coordinate difference is greater than the maximum Z-direction coordinate difference, the up-down sliding threshold data is obtained; then, when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the up-down sliding threshold data, the touch type is determined to be slide-up touch or slide-down touch. The touch type is thereby accurately identified, so that the position of the under-screen camera opening can respond to screen touch operations.
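Steps d to f mirror the horizontal case along the Y axis; the following is a minimal sketch under an assumed `Sample` record (names are illustrative, not from the patent):

```python
from collections import namedtuple

Sample = namedtuple("Sample", "x y z t")  # X/Y/Z coordinates plus trajectory time

def classify_vertical_slide(samples):
    """Slide-up vs slide-down from the two samples realizing the maximum
    Y-direction coordinate difference (steps d to f)."""
    third = max(samples, key=lambda s: s.y)   # "third sampling point": larger Y
    fourth = min(samples, key=lambda s: s.y)  # "fourth sampling point": smaller Y
    # Larger-Y point sampled later => movement toward positive Y, a slide-up.
    return "slide-up" if third.t > fourth.t else "slide-down"
```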
Based on the third embodiment, referring to fig. 5, a fourth embodiment of the screen touch management method of the present invention is provided, in this embodiment, step S420 further includes:
step S425, when it is determined that the maximum Z-direction coordinate difference is greater than the maximum X-direction coordinate difference and the maximum Z-direction coordinate difference is greater than the maximum Y-direction coordinate difference, acquiring the click threshold data;
in this embodiment, when a finger touches the screen position of the front camera opening, approaching from far to near and then leaving from near to far within a short time, for example 1 s, the touch is determined to be a single click touch event; when two click touches are performed in quick succession within a short time, the touch is determined to be a single double-click touch event. Therefore, when the maximum Z-direction coordinate difference is greater than the maximum X-direction coordinate difference and the maximum Z-direction coordinate difference is greater than the maximum Y-direction coordinate difference, it may be preliminarily determined that the current touch is a click touch, so the click threshold data in the preset touch template is obtained.
Step S426, determining that the touch type is click touch when determining that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data.
In this embodiment, it is further determined whether the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data. When they match, it may be further determined whether the touch is a single-click, double-click, or long-press touch; when they do not match, the current motion trajectory corresponds to an invalid touch, and the intelligent terminal performs no operation.
Step g, determining the shielding times of the front camera and the shielding duration of the front camera for each shielding based on the motion track;
in this embodiment, the click touch includes single-click touch, double-click touch, and long-press touch. The duration for which a single-click or double-click touch shields the front camera is shorter, for example less than 1 s, while the duration for which a long-press touch shields the front camera is longer, for example greater than or equal to 2 s. Therefore, the number of times the front camera is shielded and the shielding duration of each shielding need to be determined according to the motion trajectory, after which the touch is further determined to be a single-click, double-click, or long-press touch.
Step h, when the shielding duration is determined to meet the long-press condition, determining that the touch type is long-press touch;
in this embodiment, when a shielding duration greater than a preset duration exists among all the shielding durations, it is determined that the shielding duration meets the long-press condition, where the preset duration is determined according to the actual situation, for example equal to 2 s. When the shielding duration is determined to meet the long-press condition, the current touch type is determined to be long-press touch.
Step i, when the shielding duration is determined not to meet the long-press condition and the shielding times are equal to a first preset value, determining that the touch type is single-click touch;
And j, when the shielding duration is determined not to meet the long-press condition and the shielding times are greater than or equal to a second preset value, determining that the touch type is double-click touch, wherein the second preset value is greater than the first preset value.
In this embodiment, when no shielding duration greater than the preset duration exists among all the shielding durations, it is determined that the shielding duration does not meet the long-press condition; the current touch type is then single-click touch or double-click touch and is further determined according to the shielding times.
Specifically, when the shielding times are equal to a first preset value, the current touch type is determined to be single-click touch, and when the shielding times are greater than or equal to a second preset value, the touch type is determined to be double-click touch, where the second preset value is greater than the first preset value; preferably, the first preset value is equal to 1 and the second preset value is equal to 2.
In the screen touch management method provided by this embodiment, when it is determined that the maximum Z-direction coordinate difference is greater than the maximum X-direction coordinate difference and the maximum Z-direction coordinate difference is greater than the maximum Y-direction coordinate difference, the click threshold data is obtained; then, when it is determined that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data, the touch type is determined to be click touch, and it is further accurately determined whether the touch is a single-click, double-click, or long-press touch, so that the position of the under-screen camera opening can respond to screen touch operations.
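Steps g to j can be sketched as follows; the 2 s long-press threshold and the preset values 1 and 2 are the examples given in the text, not normative limits:

```python
def classify_press(shielding_durations, long_press_s=2.0,
                   first_preset=1, second_preset=2):
    """Single-click / double-click / long-press from the number of times the
    front camera was shielded and the duration of each shielding (steps g to j).
    Threshold values follow the examples in the text."""
    # Long-press condition: any single shielding lasting at least long_press_s.
    if any(d >= long_press_s for d in shielding_durations):
        return "long-press"
    # Otherwise classify by the number of shieldings.
    if len(shielding_durations) >= second_preset:
        return "double-click"
    if len(shielding_durations) == first_preset:
        return "single-click"
    return "invalid"
```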
The invention further provides a screen touch management device, and referring to fig. 6, fig. 6 is a functional module schematic diagram of an embodiment of the screen touch management device of the invention.
The starting module 10 is configured to start the front-facing camera and determine whether the approaching object is a finger of a user when the front-facing camera is not turned on and the proximity sensor detects that the object approaches the front-facing camera;
the obtaining module 20 is configured to start the tracking mode when it is determined that the approaching object is a finger of a user, and obtain a motion trajectory of the finger of the user based on the front-facing camera and the tracking mode;
the sampling module 30 is configured to obtain four-dimensional coordinates of the sampling point on the motion trajectory based on a preset time interval, where the four-dimensional coordinates include a horizontal X-direction coordinate, a vertical Y-direction coordinate, a far-near Z-direction coordinate, and a trajectory time;
and the determining module 40 is configured to determine a touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template, and execute a touch operation based on the touch type.
Further, the determining module 40 is further configured to:
determining a maximum X-direction coordinate difference value, a maximum Y-direction coordinate difference value and a maximum Z-direction coordinate difference value based on the four-dimensional coordinate values of the sampling points;
and determining a touch type corresponding to the motion track based on the maximum X-direction coordinate difference value, the maximum Y-direction coordinate difference value, the maximum Z-direction coordinate difference value and a preset touch template.
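The two operations of the determining module, computing the three maximum coordinate differences and selecting which threshold data to fetch, might be sketched as follows, assuming each sample is an `(x, y, z, t)` tuple (a sketch, not the patented implementation):

```python
def dominant_axis(samples):
    """Compute the maximum X/Y/Z coordinate differences over the sampled
    points and pick which gesture family (and thus which threshold data)
    to test next. `samples` is an assumed list of (x, y, z, t) tuples."""
    xs = [s[0] for s in samples]
    ys = [s[1] for s in samples]
    zs = [s[2] for s in samples]
    dx = max(xs) - min(xs)  # maximum X-direction coordinate difference
    dy = max(ys) - min(ys)  # maximum Y-direction coordinate difference
    dz = max(zs) - min(zs)  # maximum Z-direction coordinate difference
    if dx > dy and dx > dz:
        return "horizontal", (dx, dy, dz)  # test left-right sliding thresholds
    if dy > dx and dy > dz:
        return "vertical", (dx, dy, dz)    # test up-down sliding thresholds
    if dz > dx and dz > dy:
        return "press", (dx, dy, dz)       # test click thresholds
    return "invalid", (dx, dy, dz)
```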
Further, the determining module 40 is further configured to:
when the maximum X-direction coordinate difference is determined to be larger than the maximum Y-direction coordinate difference and the maximum X-direction coordinate difference is determined to be larger than the maximum Z-direction coordinate difference, acquiring the left-right sliding threshold value data;
and when the maximum X-direction coordinate difference value, the maximum Y-direction coordinate difference value and the maximum Z-direction coordinate difference value are determined to be matched with the left-right sliding threshold data, determining that the touch type is left-sliding touch or right-sliding touch.
Further, the determining module 40 is further configured to:
acquiring a first sampling point and a second sampling point corresponding to the maximum X-direction coordinate difference value, wherein the X-direction coordinate of the first sampling point is larger than that of the second sampling point;
when the track time of the first sampling point is later than that of the second sampling point, determining that the touch type is right-sliding touch;
and when the track time of the first sampling point is earlier than the track time of the second sampling point, determining that the touch type is left-sliding touch.
Further, the determining module 40 is further configured to:
when the maximum Y-direction coordinate difference is determined to be larger than the maximum X-direction coordinate difference and the maximum Y-direction coordinate difference is determined to be larger than the maximum Z-direction coordinate difference, acquiring up-down sliding threshold data;
and when the maximum X-direction coordinate difference value, the maximum Y-direction coordinate difference value and the maximum Z-direction coordinate difference value are determined to be matched with the up-down sliding threshold data, determining that the touch type is up-sliding touch or down-sliding touch.
Further, the determining module 40 is further configured to:
acquiring a third sampling point and a fourth sampling point corresponding to the maximum Y-direction coordinate difference value, wherein the Y-direction coordinate of the third sampling point is greater than the Y-direction coordinate of the fourth sampling point;
when the track time of the third sampling point is later than that of the fourth sampling point, determining that the touch type is an upward-sliding touch;
and when the track time of the third sampling point is earlier than the track time of the fourth sampling point, determining that the touch type is a down-sliding touch.
Further, the determining module 40 is further configured to:
when the maximum Z-direction coordinate difference is determined to be larger than the maximum X-direction coordinate difference and the maximum Z-direction coordinate difference is determined to be larger than the maximum Y-direction coordinate difference, acquiring the click threshold value data;
and when the maximum X-direction coordinate difference value, the maximum Y-direction coordinate difference value and the maximum Z-direction coordinate difference value are determined to be matched with the click threshold value data, determining that the touch type is click touch.
Further, the determining module 40 is further configured to:
determining the shielding times of the front camera and the shielding duration of the front camera for each shielding based on the motion track;
when the shielding duration is determined to meet the long-press condition, determining that the touch type is long-press touch;
when the shielding duration is determined not to meet the long-press condition and the shielding times are equal to a first preset value, determining that the touch type is single-click touch;
and when the shielding duration is determined not to meet the long-press condition and the shielding times are greater than or equal to a second preset value, determining that the touch type is double-click touch, wherein the second preset value is greater than the first preset value.
Further, the determining module 40 is further configured to:
and when the long pressing shielding time length which is longer than the preset time length exists in all the shielding time lengths, determining that the shielding time length meets the long pressing condition.
In addition, an embodiment of the present invention further provides a readable storage medium, where a screen touch management program is stored on the readable storage medium, and the screen touch management program, when executed by a processor, implements the steps of the screen touch management method in the foregoing embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a readable storage medium (such as ROM/RAM, magnetic disk, optical disk) as described above, and includes several instructions for enabling a system device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (12)

1. A screen touch control management method is applied to an intelligent terminal provided with a proximity sensor, the proximity sensor is installed in a preset range of a front camera of the intelligent terminal, the intelligent terminal is provided with a tracking mode, and the screen touch control management method is characterized by comprising the following steps:
when the front camera is not started and an object is detected to be close to the front camera based on the proximity sensor, starting the front camera and determining whether the close object is a finger of a user;
when the approaching object is determined to be a user finger, starting the tracking mode, and acquiring a motion track of the user finger based on the front camera and the tracking mode;
acquiring four-dimensional coordinates of the sampling points on the motion trail based on a preset time interval, wherein the four-dimensional coordinates comprise horizontal X-direction coordinates, vertical Y-direction coordinates, far and near Z-direction coordinates and trail time;
and determining a touch type corresponding to the motion track based on the four-dimensional coordinates of each sampling point and a preset touch template, and executing touch operation based on the touch type.
2. The screen touch management method of claim 1, wherein the determining the touch type corresponding to the motion trajectory based on the four-dimensional coordinates of each sampling point and a preset touch template comprises:
determining a maximum X-direction coordinate difference value, a maximum Y-direction coordinate difference value and a maximum Z-direction coordinate difference value based on the four-dimensional coordinate values of the sampling points;
and determining a touch type corresponding to the motion track based on the maximum X-direction coordinate difference value, the maximum Y-direction coordinate difference value, the maximum Z-direction coordinate difference value and a preset touch template.
3. The screen touch management method of claim 2, wherein the touch types include left-slide touch and right-slide touch, the preset touch template includes left-right slide threshold data, and the determining the touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference value, the maximum Y-direction coordinate difference value, the maximum Z-direction coordinate difference value, and the preset touch template includes:
when the maximum X-direction coordinate difference is determined to be larger than the maximum Y-direction coordinate difference and the maximum X-direction coordinate difference is determined to be larger than the maximum Z-direction coordinate difference, acquiring the left-right sliding threshold value data;
and when the maximum X-direction coordinate difference value, the maximum Y-direction coordinate difference value and the maximum Z-direction coordinate difference value are determined to be matched with the left-right sliding threshold data, determining that the touch type is left-sliding touch or right-sliding touch.
4. The screen touch management method of claim 3, wherein the determining that the touch type is a left-slide touch or a right-slide touch when determining that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the left-right-slide threshold data comprises:
acquiring a first sampling point and a second sampling point corresponding to the maximum X-direction coordinate difference value, wherein the X-direction coordinate of the first sampling point is larger than that of the second sampling point;
when the track time of the first sampling point is later than that of the second sampling point, determining that the touch type is right-sliding touch;
and when the track time of the first sampling point is earlier than the track time of the second sampling point, determining that the touch type is left-sliding touch.
5. The screen touch management method of claim 2, wherein the touch types include an up-sliding touch and a down-sliding touch, the preset touch template includes up-down sliding threshold data, and the determining the touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, the maximum Z-direction coordinate difference, and the preset touch template includes:
when the maximum Y-direction coordinate difference is determined to be larger than the maximum X-direction coordinate difference and the maximum Y-direction coordinate difference is determined to be larger than the maximum Z-direction coordinate difference, acquiring up-down sliding threshold data;
and when the maximum X-direction coordinate difference value, the maximum Y-direction coordinate difference value and the maximum Z-direction coordinate difference value are determined to be matched with the up-down sliding threshold data, determining that the touch type is up-sliding touch or down-sliding touch.
6. The screen touch management method of claim 5, wherein the determining that the touch type is a swipe up touch or a swipe down touch when determining that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the swipe up and down threshold data comprises:
acquiring a third sampling point and a fourth sampling point corresponding to the maximum Y-direction coordinate difference value, wherein the Y-direction coordinate of the third sampling point is greater than the Y-direction coordinate of the fourth sampling point;
when the track time of the third sampling point is later than that of the fourth sampling point, determining that the touch type is an upward-sliding touch;
and when the track time of the third sampling point is earlier than the track time of the fourth sampling point, determining that the touch type is a down-sliding touch.
7. The screen touch management method of claim 2, wherein the touch type comprises a click touch, the preset touch template comprises click threshold data, and the determining the touch type corresponding to the motion trajectory based on the maximum X-direction coordinate difference value, the maximum Y-direction coordinate difference value, the maximum Z-direction coordinate difference value, and the preset touch template comprises:
when the maximum Z-direction coordinate difference is determined to be larger than the maximum X-direction coordinate difference and the maximum Z-direction coordinate difference is determined to be larger than the maximum Y-direction coordinate difference, acquiring the click threshold value data;
and when the maximum X-direction coordinate difference value, the maximum Y-direction coordinate difference value and the maximum Z-direction coordinate difference value are determined to be matched with the click threshold value data, determining that the touch type is click touch.
8. The screen touch management method of claim 7, wherein the click touch comprises a click touch, a double click touch, and a long click touch, and wherein the determining that the touch type is a click touch when determining that the maximum X-direction coordinate difference, the maximum Y-direction coordinate difference, and the maximum Z-direction coordinate difference match the click threshold data comprises:
determining the shielding times of the front camera and the shielding duration of the front camera for each shielding based on the motion track;
when the shielding duration is determined to meet the long-press condition, determining that the touch type is long-press touch;
when the shielding duration is determined not to meet the long-press condition and the shielding times are equal to a first preset value, determining that the touch type is single-click touch;
and when the shielding duration is determined not to meet the long-press condition and the shielding times are greater than or equal to a second preset value, determining that the touch type is double-click touch, wherein the second preset value is greater than the first preset value.
9. The screen touch management method of claim 8, wherein the occlusion duration is determined to satisfy a long-press condition when a long-press occlusion duration greater than a preset duration exists among all the occlusion durations.
10. A screen touch management device, comprising:
the starting module is used for starting the front camera when the front camera is not started and an object is detected to be close to the front camera based on the proximity sensor, and determining whether the close object is a finger of a user;
the acquisition module is used for starting the tracking mode when the approaching object is determined to be the finger of the user, and acquiring the motion track of the finger of the user based on the front camera and the tracking mode;
the sampling module is used for acquiring four-dimensional coordinates of sampling points on the motion trail based on a preset time interval, wherein the four-dimensional coordinates comprise horizontal X-direction coordinates, vertical Y-direction coordinates, far and near Z-direction coordinates and trail time;
and the determining module is used for determining a touch type corresponding to the motion track based on the four-dimensional coordinates of each sampling point and a preset touch template, and executing touch operation based on the touch type.
11. An intelligent terminal, characterized in that, intelligent terminal includes: a memory, a processor and a screen touch management program stored on the memory and executable on the processor, the screen touch management program when executed by the processor implementing the steps of the screen touch management method according to any one of claims 1 to 9.
12. A readable storage medium, wherein a screen touch management program is stored on the readable storage medium, and when executed by a processor, the screen touch management program implements the steps of the screen touch management method according to any one of claims 1 to 9.
CN201910900469.3A 2019-09-23 2019-09-23 Screen touch control management method, intelligent terminal, device and readable storage medium Pending CN112540696A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910900469.3A CN112540696A (en) 2019-09-23 2019-09-23 Screen touch control management method, intelligent terminal, device and readable storage medium
PCT/CN2020/116484 WO2021057654A1 (en) 2019-09-23 2020-09-21 Screen touch management method, smart terminal, device, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910900469.3A CN112540696A (en) 2019-09-23 2019-09-23 Screen touch control management method, intelligent terminal, device and readable storage medium

Publications (1)

Publication Number Publication Date
CN112540696A true CN112540696A (en) 2021-03-23

Family

ID=75013168

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910900469.3A Pending CN112540696A (en) 2019-09-23 2019-09-23 Screen touch control management method, intelligent terminal, device and readable storage medium

Country Status (2)

Country Link
CN (1) CN112540696A (en)
WO (1) WO2021057654A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113538762A (en) * 2021-09-16 2021-10-22 深圳市海清视讯科技有限公司 Entrance guard flat panel device menu control method, device, equipment, medium and product
CN114020192A (en) * 2021-09-18 2022-02-08 特斯联科技集团有限公司 Interaction method and system for realizing non-metal plane based on curved surface capacitor
CN114115673A (en) * 2021-11-25 2022-03-01 海信集团控股股份有限公司 Control method of vehicle-mounted screen
CN115150551A (en) * 2022-06-20 2022-10-04 湖北星纪时代科技有限公司 Photographing processing method and device for terminal, electronic equipment and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115150550A (en) * 2022-06-20 2022-10-04 湖北星纪时代科技有限公司 Photographing processing method and device for terminal, electronic equipment and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104978133A (en) * 2014-04-04 2015-10-14 阿里巴巴集团控股有限公司 Screen capturing method and screen capturing device for intelligent terminal
CN106055143B (en) * 2016-05-20 2019-03-26 广州视睿电子科技有限公司 Touch point method for detecting position and system
CN109298798B (en) * 2018-09-21 2021-08-17 歌尔科技有限公司 Operation control method and device of touch pad and intelligent terminal

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113538762A (en) * 2021-09-16 2021-10-22 深圳市海清视讯科技有限公司 Entrance guard flat panel device menu control method, device, equipment, medium and product
CN113538762B (en) * 2021-09-16 2021-12-14 深圳市海清视讯科技有限公司 Menu control method, device, system, medium and product of entrance guard flat panel device
CN114020192A (en) * 2021-09-18 2022-02-08 特斯联科技集团有限公司 Interaction method and system for realizing non-metal plane based on curved surface capacitor
CN114020192B (en) * 2021-09-18 2024-04-02 特斯联科技集团有限公司 Interaction method and system for realizing nonmetal plane based on curved surface capacitor
CN114115673A (en) * 2021-11-25 2022-03-01 海信集团控股股份有限公司 Control method of vehicle-mounted screen
CN114115673B (en) * 2021-11-25 2023-10-27 海信集团控股股份有限公司 Control method of vehicle-mounted screen
CN115150551A (en) * 2022-06-20 2022-10-04 湖北星纪时代科技有限公司 Photographing processing method and device for terminal, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2021057654A1 (en) 2021-04-01

Similar Documents

Publication Publication Date Title
CN112540696A (en) Screen touch control management method, intelligent terminal, device and readable storage medium
CN109062479B (en) Split screen application switching method and device, storage medium and electronic equipment
EP2817694B1 (en) Navigation for multi-dimensional input
US20170083741A1 (en) Method and device for generating instruction
US9268407B1 (en) Interface elements for managing gesture control
KR102165818B1 (en) Method, apparatus and recovering medium for controlling user interface using a input image
CN109558000B (en) Man-machine interaction method and electronic equipment
KR102321562B1 (en) Dynamic motion detection method, dynamic motion control method and apparatus
CN108920069B (en) Touch operation method and device, mobile terminal and storage medium
WO2022110614A1 (en) Gesture recognition method and apparatus, electronic device, and storage medium
CN113253908B (en) Key function execution method, device, equipment and storage medium
CN112068698A (en) Interaction method and device, electronic equipment and computer storage medium
JP2021531589A (en) Motion recognition method, device and electronic device for target
CN106325623A (en) Method and apparatus for monitoring touch on touch screen and terminal device
CN112486394A (en) Information processing method and device, electronic equipment and readable storage medium
CN112905136A (en) Screen projection control method and device and storage medium
CN110865765A (en) Terminal and map control method
CN107544740B (en) Application processing method and device, storage medium and electronic equipment
JP2023511156A (en) Shooting method and electronic equipment
JP7266667B2 (en) GESTURE RECOGNITION METHOD, GESTURE PROCESSING METHOD AND APPARATUS
US9350918B1 (en) Gesture control for managing an image view display
CN111107271B (en) Shooting method and electronic equipment
CN111093030B (en) Equipment control method and electronic equipment
CN111147750B (en) Object display method, electronic device, and medium
CN116204073A (en) Touch control method, touch control device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination