CN112764565A - Electronic device and object information identification method using touch data - Google Patents

Electronic device and object information identification method using touch data

Info

Publication number
CN112764565A
Authority
CN
China
Prior art keywords
frame
touch
cell
adjacent
touch sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911058745.2A
Other languages
Chinese (zh)
Other versions
CN112764565B (en)
Inventor
黄志文
艾瑞克·崔
徐文正
杨朝光
黄彦硕
曹凌帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Acer Inc
Original Assignee
Acer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to CN201911058745.2A priority Critical patent/CN112764565B/en
Publication of CN112764565A publication Critical patent/CN112764565A/en
Application granted granted Critical
Publication of CN112764565B publication Critical patent/CN112764565B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 - Control or interface arrangements specially adapted for digitisers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F 1/1637 - Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1643 - Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/10 - Image acquisition

Abstract

The invention provides an electronic device and an object information identification method using touch data. A current touch sensing frame is acquired through a touch panel. The current touch sensing frame includes a plurality of frame cells respectively corresponding to a plurality of touch sensing units, each frame cell has touch raw data, and the frame cells include a target frame cell. A new cell value corresponding to the target frame cell is generated according to the touch raw data of a plurality of adjacent frame cells adjacent to the target frame cell in the current touch sensing frame, and a conversion frame is generated according to the new cell value. The conversion frame is converted into a touch sensing image, and the touch sensing image is analyzed to identify object information of a touch object.

Description

Electronic device and object information identification method using touch data
Technical Field
The present disclosure relates to electronic devices, and particularly to an electronic device and an object information identification method using touch data.
Background
In today's information society, people rely increasingly on consumer electronic devices. To be convenient and user-friendly, many electronic products adopt a touch panel as an input device. Touch-enabled electronic products have become popular with consumers in recent years because of their convenient and intuitive operation. In most products on the market, a touch screen integrated with a display receives touch events from a user's hand or a stylus, so that the product can execute subsequent actions according to those events. Beyond detecting touch events made by a hand or a stylus, how to extend the touch sensing capability of a touch screen to other applications is also of interest to those skilled in the art.
Disclosure of Invention
In view of the above, the present invention provides an electronic device and an object information identification method using its touch data, which can accurately identify object information of a touch object on a touch panel.
An embodiment of the invention provides an object information identification method using touch data. The method comprises the following steps. A current touch sensing frame is acquired through a touch panel. The current touch sensing frame includes a plurality of frame cells respectively corresponding to a plurality of touch sensing units, each frame cell has touch raw data, and the frame cells include a target frame cell. A new cell value corresponding to the target frame cell is generated according to the touch raw data of a plurality of adjacent frame cells adjacent to the target frame cell in the current touch sensing frame, and a conversion frame is generated according to the new cell value. The conversion frame is converted into a touch sensing image, and the touch sensing image is analyzed to identify object information of a touch object.
An embodiment of the invention provides an electronic device, which includes a touch panel, a storage device storing a plurality of instructions, and a processor. The processor is coupled to the touch panel and the storage device and is configured to execute the instructions to perform the following steps. A current touch sensing frame is acquired through the touch panel. The current touch sensing frame includes a plurality of frame cells respectively corresponding to a plurality of touch sensing units, each frame cell has touch raw data, and the frame cells include a target frame cell. A new cell value corresponding to the target frame cell is generated according to the touch raw data of a plurality of adjacent frame cells adjacent to the target frame cell in the current touch sensing frame, and a conversion frame is generated according to the new cell value. The conversion frame is converted into a touch sensing image, and the touch sensing image is analyzed to identify object information of a touch object.
Based on the above, in the embodiments of the invention, the current touch sensing frame sensed by the touch panel can be converted into a touch sensing image, and the object information of the touch object can be identified according to the image features of the touch sensing image. Therefore, when the touch object contacts or approaches the touch panel, the electronic device can acquire the object information of the touch object and execute other functions according to that information, providing a new user operation experience and increasing the functionality of the electronic device. In addition, the embodiments of the invention also perform data normalization processing on the current touch sensing frame to eliminate the adverse effect of the touch panel's sensing noise on identification accuracy.
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
Fig. 1 is a schematic diagram of an electronic device according to an embodiment of the invention.
Fig. 2 is a flowchart illustrating an object information recognition method using touch data according to an embodiment of the invention.
Fig. 3A is a schematic view of a touch panel according to an embodiment of the invention.
FIG. 3B is a diagram of a touch sensing frame according to an embodiment of the invention.
Fig. 4 is a schematic diagram illustrating an object information identification method using touch data according to an embodiment of the invention.
Fig. 5 is a flowchart illustrating an object information recognition method using touch data according to an embodiment of the invention.
FIG. 6 is a diagram illustrating a target frame cell and its neighboring frame cells according to an embodiment of the invention.
FIG. 7 is a diagram illustrating a previous touch sensing frame and a current touch sensing frame according to an embodiment of the invention.
Description of reference numerals:
10: electronic device;
110: touch panel;
120: storage device;
130: processor;
20: touch sensing circuit;
111: sensing element array;
112: receiving sensing circuit;
113: scan driving circuit;
114: scan line;
115: sensing line;
116: timing generating circuit;
d1: touch raw data;
CS11, CS12, CS21: touch sensing units;
30: touch object;
N1: notification message;
F1, F51, F71: current touch sensing frames;
F72: previous touch sensing frame;
F52: conversion frame;
Img5: touch sensing image;
Info1: object information;
FC_T: target frame cell;
FC71: frame cell;
FC_N1-FC_N8: first-layer adjacent frame cells;
FC_N9-FC_N12: second-layer adjacent frame cells;
W1, W71, W72: sampling windows;
S201-S204, S501-S504: steps.
Detailed Description
Some embodiments of the invention will now be described in detail with reference to the drawings, wherein like reference numerals refer to like or similar elements throughout the several views. These embodiments are only a part of the invention and do not disclose all of its possible implementations; rather, they are merely examples of the methods and apparatuses recited in the claims.
It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In other words, unless specifically limited otherwise, the terms "connected" and "coupled" are intended to encompass both direct and indirect connection and coupling of two elements.
Fig. 1 is a schematic diagram of an electronic device according to an embodiment of the invention; it is provided for convenience of illustration and is not intended to limit the invention. Fig. 1 first introduces all the components and the configuration of the electronic device; their detailed functions and operations are disclosed together with fig. 2.
Referring to fig. 1, the electronic device 10 of the present embodiment is, for example, an electronic device with a touch function, such as a notebook computer, a smart phone, a tablet computer, an e-book reader, or a game console; the invention is not limited thereto. The electronic device 10 includes a touch panel 110, a storage device 120, and a processor 130, whose functions are described as follows.
The touch panel 110 is, for example, a capacitive touch panel, a resistive touch panel, an electromagnetic touch panel, or an optical touch panel. In one embodiment, the touch panel 110 can be integrated with a display device (not shown) as a touch screen. The display device is, for example, a Liquid Crystal Display (LCD), a Light-Emitting Diode (LED) display, a Field Emission Display (FED), or another type of display. The touch panel 110 includes a plurality of touch sensing units arranged in an array for performing touch sensing, so as to obtain a touch sensing frame including touch raw data respectively corresponding to the touch sensing units.
The storage device 120 is used for storing touch data, instructions, program codes, software elements, and other data, and may be any type of fixed or removable Random Access Memory (RAM), read-only memory (ROM), flash memory (flash memory), hard disk or other similar devices, integrated circuits, and combinations thereof.
The processor 130 is coupled to the touch panel 110 and the storage device 120 and controls the operations between the components of the electronic device 10. The processor 130 is, for example, a Central Processing Unit (CPU), another programmable general-purpose or special-purpose microprocessor, a Digital Signal Processor (DSP), a programmable controller, an Application-Specific Integrated Circuit (ASIC), a Programmable Logic Device (PLD), a Graphics Processing Unit (GPU), another similar device, or a combination of these devices. The processor 130 may execute the program codes, software modules, and instructions recorded in the storage device 120 to implement the object information identification method using touch data according to the embodiments of the invention.
However, in addition to the touch panel 110, the storage device 120, and the processor 130, the electronic device 10 may further include other elements not shown in fig. 1, such as a speaker, a microphone, a display device, a camera, a communication module, a keyboard, and the like, which is not limited in this respect.
Fig. 2 is a flowchart illustrating an object information recognition method using touch data according to an embodiment of the invention. Referring to fig. 1 and fig. 2, the method of the present embodiment is applied to the electronic device 10 in fig. 1, and the following describes a detailed process of the method of the present embodiment with various elements in the electronic device 10.
In step S201, the processor 130 obtains a current touch sensing frame through the touch panel 110. The current touch sensing frame includes touch raw data respectively corresponding to the touch sensing units of the touch panel 110; the amount of touch raw data is determined by the number of touch sensing units of the touch panel 110. For example, assuming that the touch panel 110 has m × n touch sensing units, the current touch sensing frame will include m × n frame cells corresponding to m × n pieces of touch raw data. It should be noted that, in general, a touch IC or other processing circuit in the electronic device 10 compares the touch raw data with a filtering threshold and filters out the touch raw data smaller than that threshold in order to detect touch events applied on the touch panel 110. However, in one embodiment, the processor 130 obtains the touch raw data that has not been filtered. That is, each frame cell in the current touch sensing frame acquired by the processor 130 has touch raw data.
Further, fig. 3A is a schematic diagram of a touch panel according to an embodiment of the invention, and FIG. 3B is a diagram of a touch sensing frame according to an embodiment of the invention. Referring to fig. 3A, the touch panel 110 may include a sensing element array 111 and a touch sensing circuit 20, wherein the sensing element array 111 includes a plurality of touch sensing units (e.g., touch sensing units CS11, CS12, CS21) arranged in an array. The touch sensing circuit 20 includes a scan driving circuit 113, a receiving sensing circuit 112, a timing generating circuit 116, and the like. The scan driving circuit 113 applies a driving signal to the touch sensing units column by column through the scan lines (e.g., the scan line 114). The receiving sensing circuit 112 senses the charge variations of the touch sensing units through the sensing lines (e.g., the sensing line 115) to receive touch sensing signals and output touch raw data d1. The receiving sensing circuit 112 may use an Analog-to-Digital Converter (ADC) to convert the touch sensing signals generated by the touch sensing units into the digital touch raw data d1 for output.
Referring to fig. 3B again, the current touch sensing frame F1 includes a plurality of frame cells (e.g., frame cells FC11, FC12, FC21) respectively corresponding to a plurality of touch sensing units (e.g., touch sensing units CS11, CS12, CS21). For example, assuming that the touch panel 110 has 44 × 76 touch sensing units, the current touch sensing frame F1 includes 44 × 76 frame cells, and each frame cell has corresponding touch raw data. For example, the frame cell FC11 has touch raw data '3379'; the frame cell FC12 has touch raw data '3323'; and the frame cell FC21 has touch raw data '3267'. In other words, the current touch sensing frame F1 can also be regarded as a 44 × 76 data array whose array elements are the touch raw data. However, fig. 3B is only exemplary, and the values shown are not intended to limit the invention.
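As an illustration of this data layout (not part of the patent), a touch sensing frame can be held as a two-dimensional integer array, one element per frame cell. The sketch below fills in the three sample values of FIG. 3B; the array type and names are assumptions.

```python
import numpy as np

# Minimal sketch of a 44 x 76 touch sensing frame: one array element of raw
# data per frame cell (values other than the FIG. 3B samples are placeholders).
frame = np.zeros((44, 76), dtype=np.int32)
frame[0, 0] = 3379  # frame cell FC11
frame[0, 1] = 3323  # frame cell FC12
frame[1, 0] = 3267  # frame cell FC21
```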
Next, in step S202, the processor 130 generates a new cell value corresponding to the target frame cell according to the touch raw data of a plurality of adjacent frame cells adjacent to the target frame cell in the current touch sensing frame, so as to generate a conversion frame according to the new cell value. Specifically, the frame cells include a target frame cell. In one embodiment, each frame cell in the current touch-sensing frame may be a target frame cell. The processor 130 sequentially takes each frame cell in the current touch-sensing frame as a target frame cell to generate a new cell value corresponding to each frame cell in the current touch-sensing frame. Here, the conversion frame is generated according to the new cell value of each frame cell in the current touch sensing frame. Alternatively, in one embodiment, the frame cells in the partial block of the current touch sensing frame may be target frame cells.
That is, by calculating the new cell value in the converted frame according to the touch raw data of a plurality of adjacent frame cells of the target frame cell, the processor 130 can perform a smoothing process on the touch raw data in the current touch sensing frame to smooth the touch sensing noise component in the current touch sensing frame. Therefore, the adverse effect of the touch sensing noise on the subsequent image identification processing can be reduced. The touch sensing noise may be caused by the manufacturing process of the touch panel 110, the installation manner of the touch panel 110 on the electronic device 10, environmental factors, or other factors. In other words, the converted frame can also be regarded as a result of the processor 130 performing some de-noising processing on the current touch sensing frame.
In step S203, the processor 130 converts the conversion frame into a touch sensing image. Based on a specific image format required by subsequent image processing (e.g., the image format required by a machine learning model), the processor 130 may render the conversion frame as a grayscale image or a color image (i.e., the touch sensing image) conforming to that format. For example, the processor 130 may generate an N-bit grayscale image based on the conversion frame; that is, the processor 130 normalizes the new cell values in the conversion frame into the gray-level range 0 to (2^N - 1). This normalization operation normalizes the value range of the new cell values in the conversion frame. In addition, in an embodiment, if the subsequent image processing has a particular requirement on image size, the processor 130 may also scale the image and/or pad it with dummy image blocks so that the size of the touch sensing image meets that requirement.
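If the downstream model fixes its input size, the padding step just described might look like the following sketch; the target size, anchor position, and fill value are assumptions, as the patent does not specify them.

```python
import numpy as np

def pad_to_size(img: np.ndarray, out_h: int = 76, out_w: int = 76,
                fill: int = 0) -> np.ndarray:
    """Pad a grayscale touch sensing image with dummy blocks so that its
    size meets the requirement of the subsequent image processing."""
    assert img.shape[0] <= out_h and img.shape[1] <= out_w
    padded = np.full((out_h, out_w), fill, dtype=img.dtype)
    padded[:img.shape[0], :img.shape[1]] = img  # original anchored top-left (assumed)
    return padded
```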
Then, in step S204, the processor 130 analyzes the touch sensing image to identify object information of the touch object. In one embodiment, the processor 130 may input the touch sensing image to a machine learning model, so as to identify the object information of the touch object through the machine learning model. The object information includes an object type, an object model, the position of a component of the touch object, or other information. In other words, by making the touch object contact or come sufficiently close to the touch panel, the touch sensing image can be used to identify information related to the touch object.
In one embodiment, the machine learning model is a neural network model constructed in advance by performing machine learning (e.g., deep learning) on a training image set, and it is stored in the storage device 120. In other words, the model parameters of the machine learning model (such as the number of neural network layers and the weights of each layer) have been determined by prior training and stored in the storage device 120. When the touch sensing image is input to the machine learning model, the model first performs feature extraction to generate a feature vector (Feature vector). The feature vector is then input to a classifier in the machine learning model, which classifies it to identify the object information of the touch object in the touch sensing image. The machine learning model may be a Convolutional Neural Network (CNN) model for object detection, such as R-CNN, Fast R-CNN, YOLO, or SSD. It should be noted that, in an embodiment, the training image set used to train the machine learning model is also generated in a manner similar to steps S201 to S203.
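The patent names CNN-based detectors (R-CNN, Fast R-CNN, YOLO, SSD) without fixing an architecture. As a hedged sketch only, a small convolutional classifier over a one-channel touch sensing image could look as follows; the layer sizes, class count, and all names are assumptions for illustration.

```python
import torch
import torch.nn as nn

class TouchObjectClassifier(nn.Module):
    """Illustrative CNN over a 1-channel touch sensing image (architecture assumed)."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage: classify a 44 x 76 grayscale touch sensing image into, say, two phone models.
model = TouchObjectClassifier(num_classes=2)
image = torch.rand(1, 1, 44, 76)  # stand-in for a normalized touch sensing image
predicted_class = model(image).argmax(dim=1)
```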
Fig. 4 is a schematic diagram illustrating an object information identification method using touch data according to an embodiment of the invention. Referring to fig. 4, the touch panel 110 and the display device of the electronic device 10 are integrated as a touch screen. When the touch object 30 (e.g., a mobile phone) is placed on the touch screen, the processor 130 may obtain a current touch sensing frame generated by the touch panel 110 within one frame period. Next, the processor 130 may perform denoising processing on the current touch sensing frame to generate a conversion frame and convert the conversion frame into a touch sensing image. Thus, in this example, the processor 130 can identify the phone model of the touch object 30 by using the machine learning model and the touch sensing image, and display a notification message N1 including the phone model on the touch screen. In detail, the internal/external components and housing of the touch object 30 produce corresponding charge changes when they approach the panel, and these changes are reflected in the touch raw data sensed by the touch sensing units; the touch sensing image therefore carries characteristic information of the touch object 30, and the object information of the touch object 30 can be identified based on the touch sensing image. For example, the configuration of the metal elements inside a 'model A' handset differs from that inside a 'model B' handset. Therefore, the image characteristics of the touch sensing image generated when a model A mobile phone is placed on the touch panel 110 differ from those of the touch sensing image generated when a model B mobile phone is placed on the touch panel 110.
However, the implementation of the present invention is not limited to the above description, and the contents of the above embodiments may be modified or extended as needed for practical needs. For the sake of clarity, the following description will be made by way of example of the components of the electronic device 10 of fig. 1. Fig. 5 is a flowchart illustrating an object information recognition method using touch data according to an embodiment of the invention.
Referring to fig. 1 and fig. 5, in step S501, the processor 130 obtains a current touch sensing frame F51 from the touch panel 110, where the current touch sensing frame F51 includes touch raw data generated by each touch sensing unit on the touch panel 110.
In step S502, the processor 130 performs frame processing on the current touch sensing frame F51 to generate a new cell value corresponding to each frame cell of the current touch sensing frame F51, so as to generate the conversion frame F52 according to the plurality of new cell values. In one embodiment, the processor 130 sequentially takes each frame cell of the current touch sensing frame F51 as the target frame cell and calculates its new cell value. The processor 130 generates the new cell value corresponding to the target frame cell according to the absolute difference between the touch raw data of the target frame cell and the touch raw data of the adjacent frame cells, together with the number of the adjacent frame cells. For example, the processor 130 may generate the new cell value corresponding to the target frame cell according to equation (1) below:
NewCellValue = |Num × Raw_T - (Raw_1 + Raw_2 + ... + Raw_Num)| / Num    (1)

In equation (1), Num represents the number of adjacent frame cells, Raw_T represents the touch raw data of the target frame cell, and Raw_i represents the touch raw data of the i-th adjacent frame cell. Equation (1) is equivalent to the absolute difference between the target frame cell's touch raw data and the average of the adjacent frame cells' touch raw data.
In one embodiment, the processor 130 may determine the target frame cell and its adjacent frame cells based on a sampling window (sampling mask). The target frame cell is located at the center of the sampling window, and the adjacent frame cells may be the frame cells located within the sampling window in the vertically adjacent, horizontally adjacent, or two diagonally adjacent directions of the target frame cell. The size of the sampling window may be configured as 3 × 3, 5 × 5, or another size, which the invention does not limit.
In one embodiment, the adjacent frame cells may include a plurality of first-layer adjacent frame cells located around and directly adjacent to the target frame cell. In other words, the first-layer adjacent frame cells may be the eight frame cells in the sampling window that are directly adjacent to the target frame cell, located in the vertically adjacent, horizontally adjacent, and diagonally adjacent directions of the target frame cell.
In one embodiment, besides the eight first-layer adjacent frame cells directly adjacent to the target frame cell, the adjacent frame cells used to calculate the new cell value also include a plurality of second-layer adjacent frame cells that are located in the vertically adjacent and horizontally adjacent directions of the target frame cell and are separated from the target frame cell by the first-layer adjacent frame cells. Referring to fig. 6, fig. 6 is a schematic diagram of a target frame cell and its adjacent frame cells according to an embodiment of the invention. The target frame cell FC_T is located at the center of the sampling window W1. The adjacent frame cells of the target frame cell FC_T include the first-layer adjacent frame cells FC_N1 and FC_N2, which are directly adjacent to the target frame cell FC_T in the vertically adjacent direction; the first-layer adjacent frame cells FC_N3 and FC_N4, which are directly adjacent to the target frame cell FC_T in the horizontally adjacent direction; and the first-layer adjacent frame cells FC_N5, FC_N6, FC_N7, and FC_N8, which are directly adjacent to the target frame cell FC_T in the two diagonally adjacent directions. In addition, the adjacent frame cells of the target frame cell FC_T further include the second-layer adjacent frame cells FC_N9, FC_N10, FC_N11, and FC_N12, which are located in the vertically adjacent and horizontally adjacent directions of the target frame cell. That is, in the example of fig. 6, the number of adjacent frame cells is 12.
In the example of fig. 6, according to equation (1) the processor 130 uses the calculation result of |12 × 3217 - 38713| / 12 (approximately 9.08) as the new cell value corresponding to the target frame cell FC_T, wherein 3217 is the touch raw data of the target frame cell FC_T and 38713 is the sum of the touch raw data of the first-layer adjacent frame cells FC_N1-FC_N8 and the second-layer adjacent frame cells FC_N9-FC_N12. However, the values shown in fig. 6 are only exemplary and are not intended to limit the invention. In one embodiment, by taking the averaged absolute difference between the touch raw data of the target frame cell and the touch raw data of its adjacent frame cells as the new cell value of the conversion frame, the touch sensing image generated based on the conversion frame can more accurately represent the object features of the touch object. Therefore, the accuracy of identifying object information according to the touch sensing image can be improved.
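Under the reconstruction of equation (1) above and the 12-cell window of fig. 6, step S502 can be sketched directly; the function names and the boundary handling (shrinking the window at the frame edges) are assumptions.

```python
import numpy as np

# (row, col) offsets of the 12 adjacent cells in FIG. 6: eight first-layer
# cells directly around the target, plus four second-layer cells two steps
# away in the vertical and horizontal directions.
NEIGHBOR_OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1),
                    (1, -1), (1, 0), (1, 1), (-2, 0), (2, 0), (0, -2), (0, 2)]

def new_cell_value(frame: np.ndarray, row: int, col: int) -> float:
    """Equation (1): |Num * Raw_T - sum(Raw_i)| / Num for one target frame cell."""
    h, w = frame.shape
    neighbors = [int(frame[row + dr, col + dc]) for dr, dc in NEIGHBOR_OFFSETS
                 if 0 <= row + dr < h and 0 <= col + dc < w]
    num = len(neighbors)
    return abs(num * int(frame[row, col]) - sum(neighbors)) / num

def convert_frame(frame: np.ndarray) -> np.ndarray:
    """Step S502: apply equation (1) to every frame cell to build the conversion frame."""
    out = np.empty(frame.shape, dtype=np.float64)
    for r in range(frame.shape[0]):
        for c in range(frame.shape[1]):
            out[r, c] = new_cell_value(frame, r, c)
    return out

# Sanity check against the FIG. 6 example: a target cell of 3217 whose 12
# neighbors sum to 38713 gives |12 * 3217 - 38713| / 12 = 109 / 12 ≈ 9.08.
```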
Returning to the flow of fig. 5, in step S503, the processor 130 converts the conversion frame F52 into the touch sensing image Img5. In one embodiment, the processor 130 may perform a numerical normalization operation on the new cell values within the conversion frame F52: the processor 130 normalizes the new cell values in the conversion frame F52 according to the maximum cell value and the minimum cell value in the conversion frame F52 to generate the pixel values of the touch sensing image Img5. The touch sensing image Img5 is, for example, an 8-bit grayscale image.
The numerical normalization operation may be used to normalize the value range of the new cell values in the conversion frame F52. For example, the processor 130 may perform the numerical normalization operation according to equation (2) below:

X' = ((X - X_min) / (X_max - X_min)) × 255    (2)

In equation (2), X represents a new cell value in the conversion frame F52, X_min represents the minimum cell value in the conversion frame F52, X_max represents the maximum cell value in the conversion frame F52, and X' represents the corresponding pixel value in the touch sensing image Img5. According to equation (2), each new cell value in the conversion frame F52 can be adjusted to a gray-scale value between 0 and 255.
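A minimal sketch of the numerical normalization of equation (2), mapping the new cell values of a conversion frame onto an 8-bit grayscale image (the function name and the zero-range guard are assumptions):

```python
import numpy as np

def to_grayscale_image(converted: np.ndarray) -> np.ndarray:
    """Equation (2): min-max normalize the new cell values to gray levels 0-255."""
    x_min, x_max = float(converted.min()), float(converted.max())
    if x_max == x_min:  # degenerate frame: avoid division by zero
        return np.zeros(converted.shape, dtype=np.uint8)
    return ((converted - x_min) / (x_max - x_min) * 255).astype(np.uint8)
```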
Finally, in step S504, the processor 130 may input the touch sensing image Img5 into a machine learning model to identify the position of the touch object relative to the touch panel 110 and the object information Info1 of the touch object. For example, the processor 130 may identify a device model of the touch object according to the touch sensing image Img5, identify a camera lens position of the touch object, and so on.
However, the aforementioned embodiments all use a single current touch sensing frame to generate the touch sensing image for identifying object information; the invention is not limited thereto. In one embodiment, the processor 130 may generate the touch sensing image for identifying object information according to the current touch sensing frame and at least one previous touch sensing frame.
Specifically, in one embodiment, the processor 130 may generate the new cell value corresponding to the target frame cell according to the touch raw data of a previous touch sensing frame and the touch raw data of the current touch sensing frame. FIG. 7 is a diagram illustrating a previous touch sensing frame and a current touch sensing frame according to an embodiment of the invention. Referring to fig. 7, the processor 130 may obtain a current touch sensing frame F71 corresponding to a current time point t1 and a previous touch sensing frame F72 corresponding to a previous time point t2. When the new cell value corresponding to the target frame cell FC_T is to be calculated, the processor 130 may perform the calculation according to the touch raw data of the adjacent frame cells within the sampling window W71 on the current touch sensing frame F71 and the touch raw data of the reference frame cells within the sampling window W72 on the previous touch sensing frame F72. In one embodiment, the cell position of the frame cell FC71 in the sampling window W72 is the same as the cell position of the target frame cell FC_T, and the cell positions of the reference frame cells within the sampling window W72 may be the same as those of the adjacent frame cells within the sampling window W71.
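The patent does not spell out how the two frames are combined. One plausible reading, sketched below purely as an assumption, pools the adjacent cells within window W71 of the current frame with the same-position reference cells within window W72 of the previous frame (including the previous frame's cell FC71) before applying the form of equation (1).

```python
import numpy as np

# Same 12-cell window as in FIG. 6 (see the earlier sketch).
NEIGHBOR_OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1),
                    (1, -1), (1, 0), (1, 1), (-2, 0), (2, 0), (0, -2), (0, 2)]

def new_cell_value_two_frames(current: np.ndarray, previous: np.ndarray,
                              row: int, col: int) -> float:
    """Assumed variant of equation (1) that pools samples from sampling
    windows W71 (current frame) and W72 (previous frame)."""
    h, w = current.shape
    samples = [int(previous[row, col])]  # frame cell FC71 at the target position
    for dr, dc in NEIGHBOR_OFFSETS:
        r, c = row + dr, col + dc
        if 0 <= r < h and 0 <= c < w:
            samples.append(int(current[r, c]))   # adjacent frame cell in W71
            samples.append(int(previous[r, c]))  # reference frame cell in W72
    num = len(samples)
    return abs(num * int(current[row, col]) - sum(samples)) / num
```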
In summary, in the embodiments of the invention, the raw touch data in a touch sensing frame can be used to identify object information of a touch object, rather than merely detecting touch gestures or stylus input. By bringing the touch object into contact with or sufficiently close to the touch panel, the electronic device can accurately identify the related object information of the touch object and then execute subsequent applications using that information. Therefore, the user can trigger other applications of the electronic device through a more intuitive operation mode, which improves the operation convenience of the electronic device. In addition, by denoising the touch sensing frame, the adverse effect of touch sensing noise on image recognition can be eliminated and the recognition accuracy of machine learning can be improved.
Although the present invention has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the invention.

Claims (16)

1. An object information identification method using touch data, the method comprising:
acquiring a current touch sensing frame through a touch panel, wherein the current touch sensing frame comprises a plurality of frame cells respectively corresponding to a plurality of touch sensing units, each of the plurality of frame cells has touch raw data, and the plurality of frame cells comprises a target frame cell;
generating a new cell value corresponding to the target frame cell according to touch raw data of a plurality of adjacent frame cells adjacent to the target frame cell in the current touch sensing frame, so as to generate a conversion frame according to the new cell value;
converting the conversion frame into a touch sensing image; and
analyzing the touch sensing image to identify object information of a touch object.
2. The object information identifying method using touch data according to claim 1, wherein the converting the conversion frame into the touch sensing image comprises:
normalizing the new cell value in the converted frame according to the maximum cell value and the minimum cell value in the converted frame to generate the pixel value of the touch sensing image.
3. The method of claim 1, wherein generating the new cell value corresponding to the target frame cell according to touch raw data of the plurality of adjacent frame cells adjacent to the target frame cell in the current touch-sensing frame to generate the converted frame according to the new cell value comprises:
generating the new cell value corresponding to the target frame cell according to an absolute difference between the touch raw data of the target frame cell and the touch raw data of the plurality of adjacent frame cells and the number of the plurality of adjacent frame cells.
4. The method of claim 1, wherein the plurality of adjacent frame cells comprises a plurality of first-layer adjacent frame cells located around and directly adjacent to the target frame cell.
5. The method of claim 4, wherein the plurality of adjacent frame cells further comprises a plurality of second-layer adjacent frame cells located in the vertically adjacent and horizontally adjacent directions of the target frame cell and separated from the target frame cell by the plurality of first-layer adjacent frame cells.
6. The method of claim 1, wherein generating the new cell value corresponding to the target frame cell according to touch raw data of the plurality of adjacent frame cells adjacent to the target frame cell in the current touch-sensing frame to generate the converted frame according to the new cell value comprises:
generating the new cell value corresponding to the target frame cell according to touch raw data of a previous touch sensing frame.
7. The object information identification method using touch data according to claim 1, wherein the step of analyzing the touch sensing image to identify the object information of the touch object comprises:
inputting the touch sensing image to a machine learning model to identify the object information of the touch object through the machine learning model.
8. The method of claim 1, wherein the object information includes an object type, an object model, or a position of a component of the touch object.
9. An electronic device, comprising:
a touch panel including a plurality of touch sensing units;
a storage device storing a plurality of instructions; and
a processor coupled to the touch panel and the storage device and configured to execute the plurality of instructions to perform the following:
acquiring a current touch sensing frame through the touch panel, wherein the current touch sensing frame comprises a plurality of frame cells respectively corresponding to the plurality of touch sensing units, each of the plurality of frame cells has touch raw data, and the plurality of frame cells comprises a target frame cell;
generating a new cell value corresponding to the target frame cell according to touch raw data of a plurality of adjacent frame cells adjacent to the target frame cell in the current touch sensing frame, so as to generate a conversion frame according to the new cell value;
converting the conversion frame into a touch sensing image; and
analyzing the touch sensing image to identify object information of a touch object.
10. The electronic device of claim 9, wherein the processor is further configured to:
normalizing the new cell value in the converted frame according to the maximum cell value and the minimum cell value in the converted frame to generate the pixel value of the touch sensing image.
11. The electronic device of claim 9, wherein the processor is further configured to:
generating the new cell value corresponding to the target frame cell according to an absolute difference between the touch raw data of the target frame cell and the touch raw data of the plurality of adjacent frame cells and the number of the plurality of adjacent frame cells.
12. The electronic device of claim 9, wherein the plurality of adjacent frame cells comprises a plurality of first-layer adjacent frame cells located around and directly adjacent to the target frame cell.
13. The electronic device of claim 9, wherein the plurality of adjacent frame cells further comprises a plurality of second-layer adjacent frame cells located in the vertically adjacent and horizontally adjacent directions of the target frame cell and separated from the target frame cell by the plurality of first-layer adjacent frame cells.
14. The electronic device of claim 9, wherein the processor is further configured to:
generating the new cell value corresponding to the target frame cell according to touch raw data of a previous touch sensing frame.
15. The electronic device of claim 9, wherein the processor is further configured to:
inputting the touch sensing image to a machine learning model to identify the object information of the touch object through the machine learning model.
16. The electronic device of claim 9, wherein the object information comprises an object type, an object model, or a location of a component of the touch object.
CN201911058745.2A 2019-11-01 2019-11-01 Electronic device and object information identification method using touch data thereof Active CN112764565B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911058745.2A CN112764565B (en) 2019-11-01 2019-11-01 Electronic device and object information identification method using touch data thereof


Publications (2)

Publication Number Publication Date
CN112764565A true CN112764565A (en) 2021-05-07
CN112764565B CN112764565B (en) 2023-11-14

Family

ID=75691936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911058745.2A Active CN112764565B (en) 2019-11-01 2019-11-01 Electronic device and object information identification method using touch data thereof

Country Status (1)

Country Link
CN (1) CN112764565B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101408821A (en) * 2008-08-19 2009-04-15 友达光电股份有限公司 Method and device of multi-point tracing
TW201232738A (en) * 2010-09-14 2012-08-01 Samsung Mobile Display Co Ltd Touch screen panel and display device having the same
TW201612875A (en) * 2014-09-26 2016-04-01 Acer Inc Display device and operating method thereof
CN106687967A (en) * 2014-09-25 2017-05-17 齐科斯欧公司 Method and apparatus for classifying contacts with a touch sensitive device
US20180329536A1 (en) * 2017-04-28 2018-11-15 Synaptics Japan Gk Device and method for touch sensing on display panel
CN109964236A (en) * 2016-11-01 2019-07-02 斯纳普公司 Neural network for the object in detection image


Also Published As

Publication number Publication date
CN112764565B (en) 2023-11-14

Similar Documents

Publication Publication Date Title
TWI731442B (en) Electronic apparatus and object information recognition method by using touch data thereof
JP4790653B2 (en) Image processing apparatus, control program, computer-readable recording medium, electronic apparatus, and control method for image processing apparatus
KR20150130554A (en) Optimized adaptive thresholding for touch sensing
WO2020220809A1 (en) Action recognition method and device for target object, and electronic apparatus
JP6877446B2 (en) Systems and methods for recognizing multiple object structures
US11861808B2 (en) Electronic device, image processing method, and computer-readable recording medium
CN114402356A (en) Network model training method, image processing method and device and electronic equipment
CN111382644A (en) Gesture recognition method and device, terminal equipment and computer readable storage medium
CN112764565B (en) Electronic device and object information identification method using touch data thereof
US20170085784A1 (en) Method for image capturing and an electronic device using the method
CN113641292B (en) Method and electronic equipment for operating on touch screen
US10713463B2 (en) Display method of user interface and electronic apparatus thereof
TWI715252B (en) Electronic apparatus and object information recognition method by using touch data thereof
CN112764594B (en) Electronic device and object information identification method using touch data thereof
KR102569170B1 (en) Electronic device and method for processing user input based on time of maintaining user input
CN104978016A (en) Electronic device with virtual input function
KR102643243B1 (en) Electronic device to support improved visibility for user interface
CN113127272A (en) Screen detection method and electronic equipment for screen detection
KR102570007B1 (en) Method and electronic device for correcting handwriting input
KR20210073196A (en) Electronic device and method for processing writing input
CN113806532B (en) Training method, device, medium and equipment for metaphor sentence judgment model
US20220415089A1 (en) Data processing system and data processing method
CN111813639B (en) Method and device for evaluating equipment operation level, storage medium and electronic equipment
US10606419B2 (en) Touch screen control
US11460928B2 (en) Electronic device for recognizing gesture of user from sensor signal of user and method for recognizing gesture using the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant