CN112346597A - Touch processing method and device and electronic equipment - Google Patents

Touch processing method and device and electronic equipment

Info

Publication number
CN112346597A
CN112346597A (application number CN202011254174.2A)
Authority
CN
China
Prior art keywords
touch
type
image
screen
touch screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011254174.2A
Other languages
Chinese (zh)
Inventor
吴诗乐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Original Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd, Guangzhou Shirui Electronics Co Ltd filed Critical Guangzhou Shiyuan Electronics Thecnology Co Ltd
Priority to CN202011254174.2A priority Critical patent/CN112346597A/en
Publication of CN112346597A publication Critical patent/CN112346597A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers

Abstract

The application provides a touch processing method, a touch processing apparatus, and an electronic device. When a first type of touch device and a second type of touch device touch a touch screen, a touch image corresponding to the first type of touch device is acquired. According to the shape of the touch area of the first type of touch device in the touch image, it is determined whether that touch area is a preset invalid area. When the touch area is the preset invalid area, the corresponding touch signal is not responded to; when it is not, the touch signal is responded to, rather than shielding all touch signals of the first type of touch device. The first type of touch device and the second type of touch device can therefore write at the same time, improving user experience.

Description

Touch processing method and device and electronic equipment
Technical Field
Embodiments of the present application relate to the technical field of image recognition, and in particular to a touch processing method, a touch processing apparatus, and an electronic device.
Background
For an electronic device provided with a touch screen, touches by a finger, a palm, or a pen tip on the touch screen can be recognized. When a user writes on the touch screen with a stylus, the side of the writing hand usually rests on the screen to make writing easier. As a result, the hand's support point can also leave or erase marks on the screen, interfering with the pen's writing.
Therefore, to prevent the hand's support point from interfering with stylus writing, the prior art usually separates stylus writing from hand writing, that is, it distinguishes stylus writing from hand writing by distinguishing the touch signal of the stylus from the touch signal of the hand.
As a result, the touch signal of the hand is shielded while the stylus is writing, and the touch signal of the stylus is shielded while the hand is writing, so the stylus and the hand cannot write at the same time.
Disclosure of Invention
Embodiments of the present application provide a touch processing method, a touch processing apparatus, and an electronic device, which enable at least two types of touch devices to write on a touch screen at the same time.
In a first aspect, an embodiment of the present application provides a touch processing method, including:
when a first type of touch device and a second type of touch device touch a touch screen, acquiring a touch image corresponding to the first type of touch device, wherein, in the touch image, the pixel values in the touch area of the first type of touch device differ from the pixel values in other areas;
and determining whether to respond to the touch signal of the first type of touch device touching the touch screen according to the touch image.
Optionally, the determining whether to respond to the touch signal of the first type of touch device touching the touch screen according to the touch image includes:
inputting the touch image into a preset image recognition model, where the image recognition model obtains the shape of the touch area in the touch image and outputs a result indicating whether to respond to the touch signal of the first type of touch device touching the touch screen.
Optionally, when the preset image recognition model recognizes that the shape of the touch area in the touch image is the shape of the preset invalid touch area, it outputs a result indicating that the touch signal of the first type of touch device touching the touch screen is not responded to;
otherwise, it outputs a result indicating that the touch signal of the first type of touch device touching the touch screen is responded to.
Optionally, before the inputting the touch image into a preset image recognition model, the method further includes:
acquiring a training sample set, wherein the training sample set comprises a plurality of touch images containing shapes of preset invalid touch areas;
and training an image recognition model according to the training sample set to obtain a preset image recognition model.
Optionally, the first type of touch device is a hand, the second type of touch device is a stylus, and the shape of the preset invalid touch area is a shape of a touch area formed when the hand supports the touch screen.
Optionally, the acquiring a touch image corresponding to the first type of touch device includes:
obtaining the touch image corresponding to the first type of touch device according to the touch signal generated when the first type of touch device touches the touch screen.
Optionally, the touch image is a grayscale image.
In a second aspect, an embodiment of the present application provides a touch processing apparatus, including:
an acquisition module, configured to acquire a touch image corresponding to a first type of touch device when the first type of touch device and a second type of touch device touch a touch screen, wherein, in the touch image, the pixel values in the touch area of the first type of touch device differ from the pixel values in other areas;
and the determining module is used for determining whether to respond to the touch signal of the first type of touch device touching the touch screen according to the touch image.
Optionally, when the determining module determines whether to respond to the touch signal of the first type of touch device touching the touch screen according to the touch image, the determining module is specifically configured to:
and inputting the touch image into a preset image recognition model, and outputting a result indicating whether to respond to a touch signal of the first type of touch device touching the touch screen according to the shape of the touch area in the touch image by the image recognition model.
Optionally, when the preset image recognition model recognizes that the shape of the touch area in the touch image is the shape of the preset invalid touch area, it outputs a result indicating that the touch signal of the first type of touch device touching the touch screen is not responded to;
otherwise, it outputs a result indicating that the touch signal of the first type of touch device touching the touch screen is responded to.
Optionally, before the determining module inputs the touch image to the preset image recognition model, the obtaining module is further configured to:
acquiring a training sample set, wherein the training sample set comprises a plurality of touch images containing shapes of preset invalid touch areas;
and training the image recognition model according to the training sample set to obtain a preset image recognition model.
Optionally, the first type of touch device is a hand, the second type of touch device is a stylus, and the shape of the preset invalid touch area is the shape of a touch area formed when the hand supports the touch screen.
Optionally, when the obtaining module obtains the touch image corresponding to the first type of touch device, the obtaining module is specifically configured to:
and obtaining a touch image corresponding to the first type of touch device according to the touch signal of the first type of touch device to the touch screen.
Optionally, the touch image is a grayscale image.
In a third aspect, an embodiment of the present application provides a chip, including: at least one processor and memory;
the memory stores computer-executable instructions; the at least one processor executes computer-executable instructions stored by the memory to perform the method of any one of the first aspect of the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides an electronic device, including: a touch screen, at least one processor, and a memory;
the memory stores computer-executable instructions; the at least one processor executes computer-executable instructions stored by the memory to perform the method of any one of the first aspect of the embodiments of the present application.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium storing program instructions that, when executed by a processor, implement the method according to any one of the first aspect of the embodiments of the present application.
In a sixth aspect, an embodiment of the present application provides a program product comprising a computer program stored in a readable storage medium. At least one processor of an electronic device can read the computer program from the readable storage medium and execute it, so that the electronic device implements the method according to any one of the first aspect of the embodiments of the present application.
Embodiments of the present application provide a touch processing method, a touch processing apparatus, and an electronic device. When a first type of touch device and a second type of touch device touch a touch screen, a touch image corresponding to the first type of touch device is acquired, wherein, in the touch image, the pixel values in the touch area of the first type of touch device differ from the pixel values in other areas. According to the shape of the touch area of the first type of touch device in the touch image, it is determined whether that touch area is a preset invalid area. When the touch area is the preset invalid area, the corresponding touch signal is not responded to; when it is not, the touch signal is responded to, rather than shielding all touch signals of the first type of touch device. The first type of touch device and the second type of touch device can therefore write at the same time.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of a touch processing method according to an embodiment of the present application;
FIG. 3 is a touch image corresponding to a finger according to an embodiment of the present disclosure;
FIG. 4 is a touch image corresponding to a side of a hand provided by an embodiment of the present application;
fig. 5 is a schematic flowchart of a touch processing method according to another embodiment of the present application;
fig. 6 is a schematic structural diagram of a touch processing device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 8 is a block diagram of a touch processing device 800 according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application. As shown in fig. 1, when a user writes on a touch screen with a stylus, the side of the hand holding the stylus rests on the screen to make writing easier, and the support point of the hand's side also leaves writing traces on the screen, which interfere with the stylus's writing.
In the prior art, to prevent the hand's support point from interfering with stylus writing, stylus writing and hand writing are separated: stylus writing is distinguished from hand writing by distinguishing the touch signal of the stylus from the touch signal of the hand.
However, in the prior art, the touch signal of the hand is shielded while the stylus is writing, and the touch signal of the stylus is shielded while the hand is writing, so the stylus and the hand cannot write at the same time. Thus, when a user needs to write with the stylus and the hand simultaneously, for example, when the user is writing on the screen with the stylus and also needs to write on it by hand, the hand's touch signal is still shielded. The user's need cannot be met, and the user experience is poor.
Therefore, to solve the problems in the prior art, the present application proposes the following observation: owing to the characteristics of the side of the hand and users' writing habits, the shape of the touch area formed when the side of the hand rests on the touch screen differs both from the shape of the touch area formed when the stylus writes on the screen and from the shape formed when the hand writes on the screen.
For example, when the side of the hand rests on the touch screen, its contact area with the screen is large, and because of the pen-holding posture the side of the hand is bent to some degree, so the touch area it forms on the screen has a certain arc as well as a certain length and width. When a finger touches the screen, the contact area is small, and the touch area it forms is roughly elliptical.
Therefore, while the stylus writes on the touch screen, it can be determined from the shape of a touch area whether it is the shape formed by the side of the hand resting on the screen. If it is, the corresponding touch signal is shielded; if it is not, the touch signal (for example, the signal produced when the hand writes on the screen) is responded to. In this way, only the hand's non-writing touch signals, such as those produced when the side of the hand rests on the screen, are shielded, while the hand's writing touch signals are responded to. The stylus and the hand can therefore write simultaneously, improving user experience.
Some embodiments of the present application are described in detail below with reference to the accompanying drawings. The features of the embodiments described below may be combined with each other when they do not conflict.
Fig. 2 is a flowchart illustrating a touch processing method according to an embodiment of the present application. The execution subject of the method in this embodiment may be an electronic device, a server, or the like. The method in this embodiment may be implemented by software, hardware, or a combination of software and hardware. As shown in fig. 2, the method may include:
s201, when the first-type touch device and the second-type touch device touch the touch screen, obtaining a touch image corresponding to the first-type touch device.
In the touch image, the pixel values in the touch area of the first type of touch device differ from the pixel values in other areas.
In this step, the first type of touch device is taken to be a hand, and the second type of touch device a stylus.
For example, when a user holds a stylus and prepares to write on the touch screen, the side of the hand and the stylus tip may contact the screen at the same time, or the side of the hand may contact the screen first and the stylus tip afterwards, depending on the user's writing habits. Alternatively, the stylus may already be writing on the screen when the hand also comes into contact with it. In all of these cases, because the touch signal generated when the stylus tip contacts the screen can be recognized, the touch image corresponding to the hand can be acquired.
Although the touch signal produced when the hand contacts the screen can also be recognized, it cannot be determined from the signal alone whether it was produced by the hand writing or by the side of the hand resting on the screen. Accordingly, the touch image corresponding to the hand is acquired, and whether the signal comes from writing or from the resting side of the hand is determined from the image, which in turn determines whether to respond to the signal.
The touch image contains the touch area of the hand on the touch screen and the non-touch area, which are distinguished by their pixel values.
Optionally, the touch image may be a grayscale image; for example, a binary image in which pixels in the touch area take the value 1 and pixels in the non-touch area take the value 0.
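The binary touch image described above can be sketched as a small array; the resolution and the position of the touch region here are illustrative assumptions, not values from the application.

```python
import numpy as np

# Hypothetical sketch: a binary touch image in which pixels inside the
# touch area take the value 1 and all other pixels take the value 0.
touch_image = np.zeros((8, 8), dtype=np.uint8)
touch_image[2:5, 3:6] = 1  # a small, roughly elliptical touch region

touch_pixels = int(touch_image.sum())               # pixels in the touch area
non_touch_pixels = touch_image.size - touch_pixels  # pixels elsewhere
```

The touch and non-touch areas are then fully described by the pixel values, which is what later shape analysis relies on.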
Optionally, one possible implementation of S201 is: obtaining the touch image corresponding to the first type of touch device according to the touch signal generated when the first type of touch device touches the touch screen.
Specifically, when any part of the hand touches the touch screen (for example, a finger writes on the screen, or the side of the hand rests on it), the touch module of the touch screen detects the touch and obtains a touch signal. The touch signal may be, for example, the coordinates of the points in the touch area, or pressure and voltage values for the touch area. The touch module sends the touch signal to the processor of the electronic device, and the processor obtains the touch image corresponding to the hand from the touch signal.
Alternatively, the touch module itself arranges the touch signal into a touch image and sends it to the processor of the electronic device, so that the processor acquires the touch image corresponding to the hand.
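The conversion from touch signal to touch image might look as follows; the point-list signal format and screen resolution are assumptions for illustration, since the application leaves the signal format open.

```python
import numpy as np

def touch_signal_to_image(points, width, height):
    """Rasterize touched coordinates (a hypothetical touch-signal format)
    into a binary touch image, as the processor or touch module might do."""
    img = np.zeros((height, width), dtype=np.uint8)
    for x, y in points:
        # Ignore coordinates outside the screen.
        if 0 <= x < width and 0 <= y < height:
            img[y, x] = 1
    return img
```

A pressure- or voltage-based signal could be rasterized the same way, with the pixel value carrying the measured intensity instead of 1.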
S202, determining whether a touch signal of the first type of touch device touching the touch screen is responded according to the touch image.
In this step, the touch image contains the touch area and the non-touch area of the hand on the touch screen. For example, when a finger writes on the touch screen, the corresponding touch image is shown in fig. 3, where area 1 is the finger's touch area and the remaining area is the non-touch area. When the side of the hand rests on the touch screen, the corresponding touch image is shown in fig. 4, where area 2 is the touch area of the hand's side. The shape of the touch area in the touch image is then compared with the shape of the preset invalid touch area, that is, with the shape of the touch area formed when the side of the hand rests on the touch screen. This determines whether the touch area in the image has that shape, and therefore whether to respond to the touch signal of the hand touching the screen.
If the shape of the touch area is the shape formed when the side of the hand rests on the touch screen, the touch signal of the hand touching the screen is not responded to; otherwise, it is responded to. For example, when a finger writes on the touch screen, the written content is displayed on the screen.
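The application does not fix a concrete shape test, so the following is a hypothetical heuristic only: it uses the observations above (a hand side produces a large, elongated region; a finger produces a small, roundish one) with made-up thresholds.

```python
import numpy as np

def is_invalid_touch_area(img, area_threshold=40, elongation_threshold=2.5):
    """Hypothetical heuristic (not from the application): treat a large or
    elongated touch region as the side of a hand resting on the screen
    (invalid area), and a small, roundish region as a writing finger."""
    ys, xs = np.nonzero(img)
    if len(xs) == 0:
        return False  # nothing touched
    area = len(xs)
    height = ys.max() - ys.min() + 1
    width = xs.max() - xs.min() + 1
    elongation = max(height, width) / min(height, width)
    return area >= area_threshold or elongation >= elongation_threshold
```

A real implementation would tune these thresholds per screen, or replace the rule with the trained model of the fig. 5 embodiment.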
In the touch processing method provided by this embodiment, when a first type of touch device and a second type of touch device touch the touch screen, a touch image corresponding to the first type of touch device is acquired, wherein, in the touch image, the pixel values in the touch area of the first type of touch device differ from the pixel values in other areas. According to the shape of the touch area of the first type of touch device in the touch image, it is determined whether that touch area is a preset invalid area. When the touch area is the preset invalid area, the corresponding touch signal is not responded to; when it is not, the touch signal is responded to, rather than shielding all touch signals of the first type of touch device. The first type of touch device and the second type of touch device can therefore write at the same time, improving user experience.
Fig. 5 is a flowchart illustrating a touch processing method according to another embodiment of the present application. As shown in fig. 5, on the basis of the embodiment shown in fig. 2, the method of the present embodiment includes:
s501, when the first type of touch device and the second type of touch device touch the touch screen, obtaining a touch image corresponding to the first type of touch device.
In this step, refer to S201 for the specific implementation of S501; details are not repeated here.
S502, the touch image is input into a preset image recognition model, which outputs, according to the shape of the touch area in the touch image, a result indicating whether to respond to the touch signal of the first type of touch device touching the touch screen.
In this step, the preset image recognition model is obtained by training an image recognition model on touch images captured while the side of a hand rests on the touch screen. The following description takes a convolutional neural network model as an example. The training process of the convolutional neural network model is as follows:
First, touch images corresponding to the side of a hand resting on the touch screen are collected to build a training data set, for example by capturing such images while users hold a stylus and write on the touch screen, or by downloading such images from a website.
It should be noted that, because different users hold the pen differently, the shape of the touch area formed when the side of the hand rests on the touch screen varies, and so do the corresponding touch images. The shape of this touch area is also related to the user's age. Therefore, when collecting these touch images, samples should be gathered from users of different ages and with different pen-holding habits.
Each touch image in the training data set is then processed so that the touch area of the hand's side can be distinguished from other areas, yielding the shape features of that touch area. For example, each touch image is converted into a binary image of uniform size, where pixels in the touch area take the value 1 and pixels in the non-touch area take the value 0, so that the shape of the touch area can be determined from the pixel values.
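The preprocessing step above can be sketched as follows; the 32x32 target size and zero threshold are assumptions, and a production pipeline would likely resample rather than pad or crop.

```python
import numpy as np

def preprocess(img, size=(32, 32), threshold=0):
    """Sketch of the preprocessing described above: binarize a raw touch
    image and bring it to a uniform size by cropping or zero-padding."""
    binary = (img > threshold).astype(np.uint8)
    out = np.zeros(size, dtype=np.uint8)
    h = min(size[0], binary.shape[0])
    w = min(size[1], binary.shape[1])
    out[:h, :w] = binary[:h, :w]
    return out
```

Applying this to every collected image yields a uniformly shaped training set in which the touch-area shape is encoded purely by the 0/1 pixel values.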
The grayscale images corresponding to the collected touch images are input into the convolutional neural network model, which is trained to obtain the preset convolutional neural network model. The preset convolutional neural network model outputs a result indicating whether to respond to the touch signal of the first type of touch device touching the touch screen.
When the stylus is writing on the touch screen and a hand also touches the screen, the touch image corresponding to the hand is acquired and converted to grayscale. The grayscale touch image is then input into the preset convolutional neural network model, which determines the shape of the touch area from the pixel values in the image and outputs a result indicating whether to respond to the touch signal of the hand touching the screen.
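The application does not disclose the network architecture, so the following is only a minimal sketch of the convolutional feature-extraction idea: one hand-crafted kernel stands in for learned filters, and a crude pooled score stands in for the classifier head.

```python
import numpy as np

def conv2d(img, kernel):
    """Naive 'valid' 2-D cross-correlation, enough to sketch the
    feature-extraction step of a convolutional model."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# Illustrative only: a "diagonal stroke" detector; a trained model would
# learn many such filters plus fully connected layers for the decision.
kernel = np.eye(3)
img = np.zeros((8, 8))
img[np.arange(8), np.arange(8)] = 1   # an elongated diagonal touch region
features = conv2d(img, kernel)
score = features.max()                 # crude global max pooling
responds = bool(score < 3)             # strong elongated response -> invalid area
```

In a real system this decision would come from a trained network's output layer; here the threshold of 3 is purely illustrative.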
Optionally, when the preset convolutional neural network model recognizes that the shape of the touch area in the grayscale touch image is the shape of the touch area formed when the side of the hand rests on the touch screen, it outputs a result indicating that the touch signal of the hand touching the screen is not responded to.
When the model recognizes that the shape of the touch area in the grayscale touch image is not the shape formed by the side of the hand resting on the screen (the touch may be, for example, a finger writing on the screen), it outputs a result indicating that the touch signal of the hand touching the screen is responded to.
In this embodiment, the preset image recognition model is obtained by training on touch images containing the shape of the preset invalid touch area, so the shape of the touch area of the first type of touch device in the touch screen can be recognized accurately and quickly. When the touch area of the first type of touch device is the preset invalid area, its touch signal is not responded to; when it is not, the touch signal is responded to, rather than shielding all touch signals of the first type of touch device. The first type of touch device and the second type of touch device can therefore write at the same time, improving user experience.
The embodiment shown in fig. 5 uses an artificial intelligence method: a preset image recognition model identifies the shape of the touch area in the touch image and outputs a result indicating whether to respond to the touch signal of the first type of touch device touching the touch screen. Optionally, the shape of the touch area in the touch image may instead be identified by a non-artificial-intelligence method, which is not described here.
It should be noted that, on the basis of any of the above embodiments, one possible implementation of S201 is: when the first type of touch device and the second type of touch device touch the touch screen, in addition to the touch image corresponding to the first type of touch device, an identifier of that touch image is also obtained. The identifier uniquely identifies the touch image corresponding to the first type of touch device.
For example, when the stylus touches the touch screen and the side of the hand is supported on the touch screen, a touch image corresponding to the side of the hand supported on the touch screen and the identifier of that touch image are acquired. The identifier may be in the form of numbers, letters, or the like.
Optionally, when the first type of touch device and the second type of touch device touch the touch screen, another touch device belonging to the first type of touch device may also touch the touch screen; at this time, the number of touch devices belonging to the first type of touch device touching the touch screen is two or more. The touch images of the different touch devices belonging to the first type of touch device can then be distinguished through the identifiers of the touch images.
Optionally, when the number of touch devices belonging to the first type of touch device touching the touch screen is two or more, the identifier of the touch image corresponding to each such touch device is related to the order in which the touch devices touch the touch screen.
For example, when the stylus touches the touch screen and the side of the hand is supported on the touch screen, a touch image corresponding to the side of the hand is acquired, and the identifier of the touch image is, for example, 01. Then, while the stylus is writing on the touch screen, if the user also writes on the touch screen with a finger, a touch image corresponding to the finger is acquired, and the identifier of that touch image is, for example, 02.
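The order-based identifier assignment in the example above can be sketched as follows. The class name and the two-digit identifier format are illustrative assumptions; the patent only requires that each touch image receive a unique identifier related to the order of arrival:

```python
import itertools


class TouchImageRegistry:
    """Assigns each acquired touch image a unique identifier in the order
    the corresponding touches arrive, e.g. "01" for the hand side supported
    on the screen and "02" for a finger that starts writing later."""

    def __init__(self):
        self._counter = itertools.count(1)
        self.images = {}                 # identifier -> touch image

    def register(self, touch_image):
        identifier = f"{next(self._counter):02d}"
        self.images[identifier] = touch_image
        return identifier
```

Looking an image up by its identifier lets later processing distinguish the touch images of different devices of the same type, as described above.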
Fig. 6 is a schematic structural diagram of a touch processing device according to an embodiment of the present application. As shown in fig. 6, the apparatus of the present embodiment includes: an acquisition module 61 and a determination module 62.
The acquiring module 61 is configured to acquire a touch image corresponding to a first type of touch device when the first type of touch device and a second type of touch device touch a touch screen, where, in the touch image, the values of the pixel points in the touch area of the first type of touch device on the touch screen differ from the values of the pixel points in other areas;
and a determining module 62, configured to determine whether to respond to a touch signal of the first type touch device touching the touch screen according to the touch image.
Optionally, when determining whether to respond to the touch signal of the first type of touch device touching the touch screen according to the touch image, the determining module 62 is specifically configured to:
and inputting the touch image into a preset image recognition model, and outputting a result indicating whether to respond to a touch signal of the first type of touch device touching the touch screen according to the shape of the touch area in the touch image by the image recognition model.
Optionally, when the preset image recognition model recognizes that the shape of the touch area in the touch image is the shape of the preset invalid touch area, it outputs a result indicating that the touch signal of the first type of touch device touching the touch screen is not responded to;
otherwise, it outputs a result indicating that the touch signal of the first type of touch device touching the touch screen is responded to.
Optionally, before the determining module 62 inputs the touch image to the preset image recognition model, the obtaining module 61 is further configured to:
acquiring a training sample set, wherein the training sample set comprises a plurality of touch images containing shapes of preset invalid touch areas;
and training the image recognition model according to the training sample set to obtain a preset image recognition model.
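A drastically simplified stand-in for the training step just described: rather than fitting a convolutional network, the sketch below learns a single area threshold from labelled touch images. The feature, labels, and function names are illustrative assumptions, not the patent's actual model:

```python
def touch_area(gray, threshold=128):
    """Number of touch pixels (assumed darker than the background) in a
    grayscale touch image given as a 2-D list of pixel values."""
    return sum(1 for row in gray for v in row if v < threshold)


def train_area_classifier(samples):
    """samples: list of (gray_image, label) pairs, with label "invalid"
    for images containing the preset invalid touch-area shape (hand
    side-support) and "valid" otherwise.

    Returns a classifier separating the two classes by a learned area
    threshold -- a toy proxy for the trained image recognition model."""
    invalid = [touch_area(g) for g, lab in samples if lab == "invalid"]
    valid = [touch_area(g) for g, lab in samples if lab == "valid"]
    cut = (min(invalid) + max(valid)) / 2   # midpoint between the classes
    return lambda gray: "invalid" if touch_area(gray) >= cut else "valid"
```

In the real embodiment this step would instead fit the weights of the preset image recognition model (e.g. a convolutional neural network) to the training sample set.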
Optionally, the first type of touch device is a hand, the second type of touch device is a stylus, and the shape of the preset invalid touch area is the shape of a touch area formed when the hand supports the touch screen.
Optionally, when the obtaining module 61 obtains the touch image corresponding to the first type of touch device, the obtaining module is specifically configured to:
and obtaining a touch image corresponding to the first type of touch device according to the touch signal of the first type of touch device to the touch screen.
Optionally, the touch image is a grayscale image.
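One way to realize "obtaining a touch image according to the touch signal" is to threshold the raw sensor readings (e.g. a matrix of capacitance deltas) into a grayscale image whose touch-area pixels differ from the background. The reading scale and pixel values below are illustrative assumptions:

```python
def touch_signal_to_gray(raw, touch_level=30, background=255, touched=0):
    """Maps a raw sensor matrix to a grayscale touch image: cells whose
    reading exceeds touch_level become dark touch pixels; all other
    pixels take the background value."""
    return [[touched if v > touch_level else background for v in row]
            for row in raw]
```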
The touch processing apparatus described above in this embodiment may be configured to execute the technical solutions in the above method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
Based on the inventive concept of the present application, the present application further provides a chip, including: at least one processor and memory;
the memory stores computer-executable instructions; the at least one processor executes the computer-executable instructions stored in the memory to perform the technical solutions in the above method embodiments. The implementation principles and technical effects are similar and are not described herein again.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 7, the electronic device of this embodiment may include: a touch screen 71, at least one processor 72 and a memory 72. Fig. 7 illustrates the electronic device with one processor as an example, wherein
the touch screen 71 is configured to receive a touch signal input by a user through the touch device, and output data corresponding to the touch signal to the user. For example, when a user writes through a touch device, the content written by the user is displayed on the touch screen.
And a memory 72 for storing programs. In particular, the program may include program code comprising computer operating instructions. The memory 72 may comprise a Random Access Memory (RAM) and may also include a non-volatile memory (e.g., at least one disk memory).
And a processor 72, configured to execute the computer-executable instructions stored in the memory 72 to implement the touch processing method of any of the above embodiments.
The processor 72 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits configured to implement embodiments of the present Application.
Alternatively, in a specific implementation, if the memory 72 and the processor 72 are implemented independently, the memory 72 and the processor 72 may be connected to each other via a bus and communicate with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The buses may be divided into address buses, data buses, control buses, etc., but do not represent only one bus or one type of bus.
Alternatively, in a specific implementation, if the memory 72 and the processor 72 are integrated on a single chip, the memory 72 and the processor 72 may communicate with each other through an internal interface.
The electronic device described above in this embodiment may be configured to execute the technical solutions in the above method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 8 is a block diagram of a touch processing device 800 according to an embodiment of the present disclosure. For example, the apparatus 800 may be a device with a touch screen, such as a mobile phone, a computer, a game console, a tablet device, a medical device, a personal digital assistant, and the like.
Referring to fig. 8, the apparatus 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power component 806 provides power to the various components of device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the device 800. For example, the sensor assembly 814 may detect the open/closed status of the device 800 and the relative positioning of components, such as the display and keypad of the device 800. The sensor assembly 814 may also detect a change in the position of the device 800 or a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in the temperature of the device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media capable of storing program codes, such as Read-Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disk, and the like.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A touch processing method is characterized by comprising the following steps:
when a first type of touch device and a second type of touch device touch a touch screen, acquiring a touch image corresponding to the first type of touch device, wherein values of pixel points in a touch area of the first type of touch device in the touch screen and values of pixel points in other areas in the touch image are different;
and determining whether to respond to the touch signal of the first type of touch device touching the touch screen according to the touch image.
2. The method of claim 1, wherein the determining whether to respond to the touch signal of the first type of touch device touching the touch screen according to the touch image comprises:
inputting the touch image into a preset image recognition model, acquiring the shape of the touch area in the touch image, and outputting, by the image recognition model, a result indicating whether to respond to a touch signal of the first type of touch device touching the touch screen.
3. The method according to claim 2, wherein when the preset image recognition model recognizes that the shape of the touch area in the touch image is the shape of a preset invalid touch area, it outputs a result indicating that the touch data of the first type of touch device touching the touch screen is not responded to;
otherwise, it outputs a result indicating that the touch data of the first type of touch device touching the touch screen is responded to.
4. The method according to claim 3, wherein before inputting the touch image into a preset image recognition model, the method further comprises:
acquiring a training sample set, wherein the training sample set comprises a plurality of touch images containing shapes of preset invalid touch areas;
and training an image recognition model according to the training sample set to obtain a preset image recognition model.
5. The method according to claim 3 or 4, wherein the first type of touch device is a hand, the second type of touch device is a stylus, and the predetermined invalid touch area has a shape of a touch area formed when the hand supports the touch screen.
6. The method according to any one of claims 1-4, wherein the acquiring the touch image corresponding to the first type of touch device comprises:
and obtaining a touch image corresponding to the first type of touch control device according to the touch signal of the first type of touch control device to the touch screen.
7. The method of any of claims 1-4, wherein the touch image is a grayscale image.
8. A touch processing apparatus, comprising:
an acquisition module, configured to acquire a touch image corresponding to a first type of touch device when the first type of touch device and a second type of touch device touch the touch screen, wherein, in the touch image, values of pixel points in a touch area of the first type of touch device in the touch screen differ from values of pixel points in other areas;
and the determining module is used for determining whether to respond to the touch signal of the first type of touch device touching the touch screen according to the touch image.
9. An electronic device, comprising: the touch screen, a memory for storing program instructions, and at least one processor for calling the program instructions in the memory to execute the touch processing method according to any one of claims 1 to 7.
10. A readable storage medium, characterized in that the readable storage medium has stored thereon a computer program; the computer program, when executed, implements a touch processing method as recited in any of claims 1-7.
CN202011254174.2A 2020-11-11 2020-11-11 Touch processing method and device and electronic equipment Pending CN112346597A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011254174.2A CN112346597A (en) 2020-11-11 2020-11-11 Touch processing method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN112346597A true CN112346597A (en) 2021-02-09

Family

ID=74363369

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011254174.2A Pending CN112346597A (en) 2020-11-11 2020-11-11 Touch processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112346597A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024066896A1 (en) * 2022-09-29 2024-04-04 荣耀终端有限公司 Touch control method and electronic device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1797305A (en) * 2004-11-23 2006-07-05 微软公司 Reducing accidental touch-sensitive device activation
CN104737116A (en) * 2012-10-17 2015-06-24 感知像素股份有限公司 Input classification for multi-touch systems
CN106462347A (en) * 2014-05-22 2017-02-22 索尼公司 Selective turning off/dimming of touch screen display region
CN107533430A (en) * 2014-09-12 2018-01-02 微软技术许可有限责任公司 Touch input is as unexpected or expected classification
CN110109563A (en) * 2018-02-01 2019-08-09 奇手公司 A kind of method and system of the contact condition of determining object relative to touch sensitive surface
CN111566604A (en) * 2018-02-13 2020-08-21 三星电子株式会社 Electronic device and operation method thereof



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210209