WO2023146380A1 - A system and method to identify hover data patterns as operational on touch screen panel

A system and method to identify hover data patterns as operational on touch screen panel

Info

Publication number
WO2023146380A1
Authority
WO
WIPO (PCT)
Prior art keywords
hover
cluster
touch screen
screen panel
operational
Prior art date
Application number
PCT/KR2023/001379
Other languages
French (fr)
Inventor
Vijayanand KUMAR
Sachin Mittal
Pradeep Singh
Rameez RAJA
Vivek GULATI
Original Assignee
Samsung Electronics Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Publication of WO2023146380A1 publication Critical patent/WO2023146380A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F 3/04166: Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G06F 3/044: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/041: Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F 2203/04101: 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup

Abstract

The invention relates to a system and method to identify hover data patterns as operational on a touch screen panel of a mobile device. The system comprises a sensing panel to sense a user finger touching or hovering over the touch screen panel. The system detects hover events and registers hover data patterns when the user's finger is in contact with the touch screen panel for a predefined time interval. The system identifies a cluster size based on the number of hover data patterns registered on the touch screen panel. The system identifies the hover data patterns as an operational hover area when the cluster density is low. The system predicts the user's finger type and age group based on hover data extracted from the operational hover area.

Description

A SYSTEM AND METHOD TO IDENTIFY HOVER DATA PATTERNS AS OPERATIONAL ON TOUCH SCREEN PANEL
The present invention relates to a system and method to identify hover data patterns as operational on a touch screen panel of a mobile device, so as to predict the user's finger type and age group.
The use of touch screen systems has become increasingly popular. Touch screens typically include a touch panel, a controller, and a software driver. The touch panel is characteristically an optically clear panel with a touch sensitive surface that is positioned in front of a display screen. The touch panel registers touch events and sends signals indicative of these events to the controller. The controller processes these signals and sends the resulting data to the software driver. The software driver, in turn, translates the resulting data into events recognizable by the electronic system, for example, finger movements and selections, contact patches modelled as ellipses, and fingerprint images.
Unlike earlier input devices, touch-surfaces are now capable of simultaneously detecting multiple objects as they approach the touch-surface. Some can detect object shapes in much more detail (e.g., finger contact-patch values and/or fingerprint attributes). Conventional methods of controlling a child's access to various types of content include placing time limits on usage or forbidding certain types of usage. However, such methods may be time consuming, tedious, and repetitive for an adult user of a computing device that implements them. Hence, there exists a need to obtain a user's age biometrically and to implement parental controls accordingly.
For instance, Korean Patent Application No. KR20130129914A titled "FINGER IDENTIFICATION ON A TOUCHSCREEN" discloses a method and system that use the touch pointer and the device orientation with respect to the user to determine finger identity. The method comprises the steps of receiving, at a mobile computing device, a first input indicating that the user has touched a touchscreen display of the mobile computing device with a pointer. The method also includes identifying the pointer as a particular finger or type of finger of the user, based at least on the determined position of the mobile computing device relative to the user, and interpreting the input using the identified finger or type of finger.
For instance, US Patent Application No. US2014330650A1 titled "SETTING COMPUTING DEVICE FUNCTIONALITY BASED ON TOUCH-EVENT PROPERTIES" discloses a method for receiving a finger-contact patch attribute from a user of a touch screen system. The method determines the user's age group based on the received finger-contact patch attribute. The user's age group is provided to a server. An advertisement to display on a computing device of the touch-screen system is received from the server. The advertisement is filtered when the current user is in the child age group and the appropriate age group of the advertisement is an adult age group.
Some touch sensitive devices can also recognize a hover event, i.e., an object near but not touching the touch sensor panel, and the position of the hover event at the panel. The touch sensitive device can then process the hover event in a manner similar to that for a touch event, where the computing system can interpret the hover event in accordance with the display appearing at the time of the hover event, and thereafter can perform one or more actions based on the hover event.
For instance, US Patent No. US8614693B2 titled "Touch and hover signal drift compensation" discloses a touch and hover sensing device that senses an object touching or hovering over a sensing panel. The touch and hover sensing device comprises a control system to measure the capacitance of the panel either when there has been no touching or hovering object or when there is a substantially stationary touching or hovering object at the panel for a determinative time period.
However, the existing prior art documents accept either touch or hover as input; concurrent operation of touch and hover is not supported. Further, eliminating the detection of false hover events is essential to providing stable hover data for finger/user identity.
Thus, stable hover data is essential to distinguish finger distributions across age groups and to profile the user from the finger.
Aspects of the disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the disclosure is to provide a system and method to identify hover data patterns as operational on a touch screen panel of a mobile device.
The present invention relates to a system and method to identify hover data patterns as operational on a touch screen panel. The system to identify hover data patterns as operational on a touch screen panel mainly comprises a sensing panel, a hover detection module, a clustering analysis module, a hover feature extraction module, a finger identity module, and a user identity module. The system comprises a sensing panel to sense a user finger touching or hovering over the touch screen panel. In an embodiment, the hover detection module is configured to detect hover events and register hover data patterns on the touch screen panel when the user's finger is in contact with the touch screen panel for a predefined time interval. The clustering analysis module is configured to identify a cluster size based on the number of hover data patterns registered on the touch screen panel. The clustering analysis module then identifies a cluster centroid based on the cluster size, the distances between clusters (inter-cluster), and the distances within each cluster (intra-cluster).
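For illustration only, such a clustering step can be sketched as follows in Python. The function name cluster_hover_points, the use of scikit-learn's KMeans, and the area-based density definition are assumptions of this sketch, not details taken from the disclosure.

```python
# Illustrative sketch only: the disclosure does not publish code.
import numpy as np
from sklearn.cluster import KMeans
from scipy.spatial.distance import pdist

def cluster_hover_points(points: np.ndarray, k: int):
    """Cluster (x, y) hover samples and derive the metrics the clustering
    analysis module is described as computing (assumed realisation)."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(points)
    centroids = km.cluster_centers_
    inter = pdist(centroids)  # pairwise inter-cluster (centroid) distances
    intra, density = [], []
    for i in range(k):
        members = points[km.labels_ == i]
        d = np.linalg.norm(members - centroids[i], axis=1)
        radius = d.max() if d.size else 1e-6
        intra.append(radius)                               # cluster spread
        density.append(d.size / (np.pi * radius ** 2))     # points per area
    return centroids, inter, np.array(intra), np.array(density)
```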
In an embodiment, the hover identity module identifies the hover data patterns as an operational hover area when the cluster density is low and the hover data points are close to their centroid within a cluster (intra-cluster) while being far from other clusters (inter-cluster).
In another embodiment, the hover identity module identifies the hover data patterns as a non-operational hover area when the cluster density is high or very low and the inter-cluster proximity is close. The hover feature extraction module collects hover data from the operational hover area and converts it into hover features. Here, the hover features comprise the user's operational finger type and its dimensions.
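A minimal sketch of this operational/non-operational decision rule follows; all threshold values (LOW_DENSITY, HIGH_DENSITY, NEAR, FAR) are illustrative assumptions, since the disclosure does not specify numeric thresholds.

```python
# Assumed thresholds for illustration; not values from the disclosure.
LOW_DENSITY, HIGH_DENSITY = 0.2, 5.0
NEAR, FAR = 30.0, 150.0  # pixels (assumed units)

def classify_cluster(density: float, intra_dist: float,
                     min_inter_dist: float) -> str:
    """Return 'operational' or 'non-operational' for one hover cluster."""
    tight = intra_dist <= NEAR          # points close to their own centroid
    isolated = min_inter_dist >= FAR    # far from every other cluster
    if LOW_DENSITY <= density < HIGH_DENSITY and tight and isolated:
        return "operational"            # stable hover area
    # Very high or very low density, or clusters crowding each other.
    return "non-operational"
```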
In another embodiment, the finger identity module and the user identity module use artificial intelligence techniques to predict the user's finger type and age group based on the hover features extracted by the hover feature extraction module.
In an embodiment, the method to identify hover data patterns as operational on a touch screen panel comprises the steps of sensing a user finger touching or hovering over the touch screen panel.
The method detects hover events and registers hover data patterns on the touch screen panel when the user's finger is in proximity to the touch screen panel for a predefined time interval by using a hover detection module. The method identifies a cluster size based on the number of hover data patterns registered simultaneously on the touch screen panel by using a clustering analysis module. The method also identifies a cluster centroid for each cluster, the inter-cluster distances, and the intra-cluster distances.
The method identifies hover data patterns as an operational hover area when the cluster density is low and the hover data points are close to their centroid within a cluster while being far from other clusters, by using a hover identity module. The method identifies the hover data patterns as a non-operational hover area when the cluster density is high or very low and the inter-cluster proximity is close, by using the hover identity module.
The method collects hover data from the operational hover area and converts it into hover features by using a hover feature extraction module. Here, the hover features comprise the user's operational finger type and its dimensions. The method predicts the user's finger type and age group based on the extracted hover features by using a finger identity module and a user identity module.
Thus, detecting hover events and registering hover data patterns on the touch screen panel only when the user's finger is in contact with the touch screen panel provides stable hover data for finger/user identity. Further, the system and method of the present invention eliminate the detection of false hover events by using stable hover data for finger/user identity.
The foregoing and other features of embodiments will become more apparent from the following detailed description of embodiments when read in conjunction with the accompanying drawings.
FIG 1 illustrates a system to identify hover data patterns as operational on touch screen panel, according to one embodiment of the present invention.
FIG 2 illustrates a cluster size generated by a clustering analysis module, according to one embodiment of the present invention.
FIG 3A and FIG 3B illustrate tables depicting results of the clustering analysis module, according to one embodiment of the present invention.
FIG 3C is a table depicting the functionality of the results of the hover identity module, according to one embodiment of the present invention.
FIG 3D is a screenshot depicting the functionality of the results of the hover identity module, according to one embodiment of the present invention.
FIG 4 illustrates a screenshot with operational hover area and non-operational hover area, according to one embodiment of the present invention.
FIG 5 is a screenshot illustrating hover data distribution when the user's finger is above a touch screen panel, according to one embodiment of the present invention.
FIG 6 is a screenshot illustrating hover data distribution when the user's finger is in contact with the touch screen panel, according to one embodiment of the present invention.
FIG 7 illustrates the sensing lines of the touch screen panel, according to one embodiment of the present invention.
FIG 8A and FIG 8B illustrate a method to identify hover data patterns as operational on a touch screen panel, according to one embodiment of the present invention.
FIG 9 illustrates a block diagram of a mobile device, according to one embodiment of the present invention.
FIG 10 illustrates the application of stable hover data, according to one embodiment of the present invention.
In order to make the matter of the invention clear and concise, the following definitions are provided for specific terms used in the following description.
The term "hover event" refers to detecting an object that is near the touch sensor panel, and the position of the hover event at the touch screen panel.
FIG 1 illustrates a system to identify hover data patterns as operational on touch screen panel, according to one embodiment of the present invention.
As exemplarily illustrated in FIG 1, the system to identify hover data patterns as operational on a touch screen panel mainly comprises a sensing panel, a hover detection module (102), a clustering analysis module (103), a hover feature extraction module (105), a finger identity module (106), and a user identity module (107). The system comprises a sensing panel to sense a user finger touching or hovering over the touch screen panel (201). In an embodiment, the system comprises a touch detection module (101) to sense a user finger touching the touch screen panel (201).
In an embodiment, the hover detection module (102) is configured to detect hover events and register hover data patterns on the touch screen panel (201) when the user's finger is in contact with the touch screen panel (201) for a predefined time interval. The clustering analysis module (103) is configured to identify a cluster size based on the number of hover data patterns registered on the touch screen panel (201) simultaneously. The clustering analysis module (103) then identifies the cluster centroids (103a, 103b, 103c, 103d, as exemplarily illustrated in FIG 2) based on the cluster size, the inter-cluster distances (between 103a, 103b, 103c, 103d, as exemplarily illustrated in FIG 2), and the intra-cluster distances (within 103a, 103b, 103c, or 103d, as exemplarily illustrated in FIG 2).
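As a hedged illustration of this dwell-time gating, the following sketch registers hover samples only after the finger has remained in contact for a predefined interval; the frame format and the DWELL_MS value are assumptions of this sketch.

```python
DWELL_MS = 300  # predefined time interval (assumed value)

def register_hover_patterns(frames):
    """Register hover samples only while the finger has stayed in contact
    with the panel for at least DWELL_MS milliseconds.

    frames: iterable of (timestamp_ms, is_contact, hover_xy) tuples,
    an assumed representation of the touch controller's output stream.
    """
    registered, contact_since = [], None
    for t_ms, is_contact, hover_xy in frames:
        if not is_contact:
            contact_since = None          # contact broken: reset the timer
            continue
        if contact_since is None:
            contact_since = t_ms          # contact just began
        elif t_ms - contact_since >= DWELL_MS and hover_xy is not None:
            registered.append(hover_xy)   # stable hover sample
    return registered
```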
In an embodiment, the hover identity module (104) identifies the hover data patterns as an operational hover area (201a, FIG 4) when the cluster density is low and the hover data points are close to their centroid within a cluster while being far from the other clusters (103a, 103b, 103c, or 103d).
In another embodiment, the hover identity module (104) identifies the hover data patterns as a non-operational hover area (201b, FIG 4) when the cluster density is high or very low and the inter-cluster proximity is close. The hover feature extraction module (105) collects hover data from the operational hover area (201a) and converts it into hover features. Here, the hover features comprise the user's operational finger type and its dimensions.
In another embodiment, the finger identity module (106) and the user identity module (107) use artificial intelligence techniques to predict the user's finger type and age group based on the hover features extracted by the hover feature extraction module. In an embodiment, the finger identity module (106) and the user identity module (107) are trained using hover distribution data collected for various finger orientations and user age groups when the user's finger is in contact with the touch screen panel (201). The finger orientation includes, for example, finger length and diameter for various age groups. Thus, the finger identity module (106) and the user identity module (107) use the trained data to predict the user's finger type and age group based on the extracted hover features. Profiling the user by age group using the stable hover data makes it possible to show advertisements appropriate to the age group and to provide age-wise recommendations for health, entertainment, accessories, and clothing. Further, profiling the user by age group using stable hover data also allows run-time permission for content with respect to age groups. It also allows parents to limit screen time and change the user interface based on a kid's usage, such as enlarged icons, reading modes, dialer pads, etc.
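The disclosure only states that "artificial intelligence techniques" are used, so the following sketch uses a RandomForest classifier as a stand-in; the feature layout (finger length, diameter, hover spread) and the toy training rows are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# X: one row per operational hover area, with assumed features
#    [finger_length_mm, finger_diameter_mm, hover_spread_px];
# y: age-group labels (toy data, purely illustrative).
X = np.array([[55.0, 12.0, 18.0],
              [70.0, 16.0, 25.0],
              [48.0, 10.0, 15.0]])
y = np.array(["child", "adult", "child"])

age_model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(age_model.predict([[52.0, 11.0, 17.0]]))  # e.g. ['child']
```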
In an embodiment, the combination of touch and stable hover data can be used as a joystick in gaming applications. In another embodiment, the combination of a touch event and an operational hover area can be used to open a folder in secure mode without a nearby person knowing.
In an embodiment, the modules of the system, namely the touch detection module (101), the hover detection module (102), the clustering analysis module (103), the hover feature extraction module (105), the finger identity module (106), and the user identity module (107), are stored in a memory (200a, as exemplarily illustrated in FIG 9) of the mobile device (200). The processor (200b, as exemplarily illustrated in FIG 9) executes the instructions stored in the memory (200a) to detect hover events and register hover data patterns on the touch screen panel (201) only when the user's finger is in contact/proximity with the touch screen panel (201), thereby providing stable hover data for finger/user identity.
FIG 3A and FIG 3B illustrate tables depicting results of the clustering analysis module.
As exemplarily illustrated in FIG 3A and FIG 3B, the clustering analysis module (103) uses a cluster size of K = 10, where K represents the maximum number of hovers present on the touch screen panel (201) simultaneously. The clustering analysis module (103) then finds the cluster centers (103a, 103b, 103c, 103d) for the K clusters using the K-means clustering algorithm. The cluster centroids are represented as hover data patterns (103a, 103b, 103c, 103d, as exemplarily illustrated in FIG 2) on the touch screen panel (201). The clustering analysis module (103) then identifies the inter-cluster distances (between 103a, 103b, 103c, 103d, as exemplarily illustrated in FIG 2) and the intra-cluster distances (within 103a, 103b, 103c, or 103d, as exemplarily illustrated in FIG 2). The clustering analysis module (103) also identifies the cluster density for each cluster (103a, 103b, 103c, 103d). The clustering analysis module (103) then checks whether the cluster formation with cluster size K is good (using the Dunn index), and then accepts the clusters (103a, 103b, 103c, or 103d), the cluster density, the inter-cluster distances, and the intra-cluster distances.
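The Dunn index referenced above is the ratio of the smallest inter-cluster distance to the largest intra-cluster diameter; a larger value indicates compact, well-separated clusters. A minimal sketch (assuming at least two clusters; not code from the disclosure):

```python
import numpy as np

def dunn_index(points: np.ndarray, labels: np.ndarray) -> float:
    """Dunn index = min inter-cluster distance / max intra-cluster diameter."""
    clusters = [points[labels == i] for i in np.unique(labels)]
    # Smallest distance between points belonging to different clusters.
    min_inter = min(
        np.linalg.norm(a[:, None] - b[None, :], axis=2).min()
        for ai, a in enumerate(clusters)
        for b in clusters[ai + 1:])
    # Largest pairwise distance within any single cluster.
    max_intra = max(
        np.linalg.norm(c[:, None] - c[None, :], axis=2).max()
        for c in clusters)
    return min_inter / max_intra
```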
FIG 3C is a table depicting the functionality of the results of the hover identity module, according to one embodiment of the present invention.
As depicted in FIG 3C, the hover identity module (104) identifies the hover data patterns as an operational hover area (201a, FIG 4) when the cluster density is low or mid and the hover data points are close within a cluster while being far from the other clusters (103a, 103b, 103c, or 103d).
In another embodiment, the hover identity module (104) identifies the hover data patterns as a non-operational hover area (201b, FIG 4) when the cluster density is high or very low and the inter-cluster proximity is close. The hover feature extraction module (105) collects hover data from the operational hover area (201a) and converts it into hover features. Here, the hover features comprise the user's operational finger type and its dimensions.
FIG 3D is a screenshot depicting the functionality of the results of the hover identity module, according to one embodiment of the present invention.
As exemplarily illustrated in FIG 3D, the hover identity module (104) identifies the hover data pattern (103b) as an operational hover area (201a, FIG 4) because its cluster density is low and its hover data points are close within the cluster while being far from the other clusters (103a, 103c, 103d).
In another embodiment, the hover identity module (104) identifies the hover data patterns (103a, 103c, 103d) as a non-operational hover area (201b, FIG 4) when the cluster density is high or very low and the inter-cluster proximity is close. The hover feature extraction module (105) collects hover data from the operational hover area (201a) and converts it into hover features. Here, the hover features comprise the user's operational finger type and its dimensions.
FIG 5 is a screenshot illustrating hover data distribution when the user's finger is above a touch screen panel, according to one embodiment of the present invention.
As exemplarily illustrated in FIG 5, when the user's finger is above a touch screen panel (201), the hover data distribution is wide.
FIG 6 is a screenshot illustrating hover data distribution when the user's finger is in contact with the touch screen panel, according to one embodiment of the present invention.
As exemplarily illustrated in FIG 6, the spread of the hover data distribution is smaller when the user's finger is in contact with the touch screen panel (201) than when the user's finger is above the touch screen panel (201, as exemplarily illustrated in FIG 5).
Thus, detecting hover events and registering hover data patterns on the touch screen panel (201) only when the user's finger is in contact/proximity with the touch screen panel (201) provides stable hover data for finger/user identity.
FIG 7 illustrates the sensing lines of the touch screen panel, according to one embodiment of the present invention.
As exemplarily illustrated in FIG 7, the sensing lines (701, 702) for sensing hover events and touch events are separated using the capacitance difference, so that a hover event and a touch event can be detected concurrently.
Thus, in the presence of a weak hover due to the thumb on sensing line 1 (702) and another hover due to the index finger (in air) on sensing line 2 (701), the two can be virtually separated using the capacitance difference between sensing line 1 and sensing line 2, so that the in-air hover action can be performed.
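Conceptually, this separation amounts to band-classifying each sensing line by the magnitude of its capacitance signal; the following sketch and its threshold values are illustrative assumptions, not values from the disclosure.

```python
# Assumed thresholds in arbitrary capacitance units, for illustration only.
TOUCH_THRESHOLD = 200.0   # strong coupling: finger in contact with the line
HOVER_THRESHOLD = 40.0    # weak coupling: finger in air above the line

def classify_line(capacitance: float) -> str:
    """Classify one sensing line reading as touch, hover, or idle."""
    if capacitance >= TOUCH_THRESHOLD:
        return "touch"
    if capacitance >= HOVER_THRESHOLD:
        return "hover"
    return "idle"

# Thumb resting on line 1 and index finger hovering over line 2 are
# reported concurrently because their capacitance bands do not overlap.
readings = {"line1": 250.0, "line2": 65.0}
print({line: classify_line(c) for line, c in readings.items()})
```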
FIG 8A and FIG 8B illustrate a method to identify hover data patterns as operational on a touch screen panel, according to one embodiment of the present invention.
As exemplarily illustrated in FIG 8A and FIG 8B, the method to identify hover data patterns as operational on a touch screen panel comprises the step of sensing a user finger touching or hovering over the touch screen panel (201) at step 801.
At step 802, the method detects hover events and registers hover data patterns on the touch screen panel (201) when the user's finger is in proximity to the touch screen panel (201) for a predefined time interval by using a hover detection module (102). At step 803, the method identifies a cluster size based on the number of hover data patterns registered on the touch screen panel (201) simultaneously by using a clustering analysis module (103). The method also identifies the cluster centroids based on the cluster size. The cluster centroids are represented as hover data patterns (103a, 103b, 103c, 103d) on the touch screen panel (201) (as exemplarily illustrated in FIG 2). The method identifies the cluster density, the inter-cluster distances (between 103a, 103b, 103c, or 103d, as exemplarily illustrated in FIG 2), and the intra-cluster distances (within 103a, 103b, 103c, or 103d).
At step 804, the method identifies the hover data pattern (103b) as an operational hover area (201a) when the cluster density is low and the hover data points are close to their centroid within the cluster while being far from the other clusters, by using a hover identity module (104). At step 805, the method identifies the hover data patterns (103a, 103c, 103d) as a non-operational hover area (201b) when the cluster density is high or very low and the inter-cluster proximity is close, by using the hover identity module (104).
At step 806, the method collects hover data from the operational hover area (201a) and converts it into hover features by using a hover feature extraction module (105). Here, the hover features comprise the user's operational finger type and its dimensions. At step 807, the method predicts the user's finger type and age group based on the extracted hover features by using a finger identity module (106) and a user identity module (107).
In an embodiment, the method senses a user finger touching the touch screen panel (201) by using a touch detection module (101).
FIG 10 illustrates the application of stable hover data obtained from operational hover area, according to one embodiment of the present invention.
As exemplarily illustrated in FIG 10, a combination of touch and hover data collected from the operational hover area (201a) can be used for text-editing actions such as selecting a word, changing its font, or highlighting it on the touch screen panel (201). Further, the hover data obtained from the non-operational hover area (201b) can also be used for other text-editing actions such as deleting the selected text, changing the font size of the selected text, etc.
Thus, detecting hover events and registering hover data patterns only when the user's finger is in contact with the touch screen panel (201) provides stable hover data for finger/user identity. Further, the system and method of the present invention eliminate the detection of false hover events by using stable hover data for finger/user identity.
At least one of the plurality of modules may be implemented through an AI model. A function associated with AI may be performed through the non-volatile memory, the volatile memory, and the processor.
The processor may include one or a plurality of processors. At this time, the one or the plurality of processors may be a general-purpose processor, such as a central processing unit (CPU) or an application processor (AP), a graphics-only processing unit such as a graphics processing unit (GPU) or a visual processing unit (VPU), and/or an AI-dedicated processor such as a neural processing unit (NPU).
The one or a plurality of processors control the processing of the input data in accordance with a predefined operating rule or artificial intelligence (AI) model stored in the non-volatile memory and the volatile memory. The predefined operating rule or artificial intelligence model is provided through training or learning.
Here, being provided through learning means that a predefined operating rule or AI model of a desired characteristic is made by applying a learning algorithm to a plurality of learning data. The learning may be performed in the device itself in which AI according to an embodiment is performed, and/or may be implemented through a separate server/system.
The AI model may consist of a plurality of neural network layers. Each layer has a plurality of weight values and performs a layer operation by applying its weight values to the output of the previous layer. Examples of neural networks include, but are not limited to, convolutional neural networks (CNN), deep neural networks (DNN), recurrent neural networks (RNN), restricted Boltzmann machines (RBM), deep belief networks (DBN), bidirectional recurrent deep neural networks (BRDNN), generative adversarial networks (GAN), and deep Q-networks.
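As a minimal illustration of such layered computation (independent of the disclosed embodiments), a two-layer forward pass in NumPy; the layer shapes and the ReLU nonlinearity are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two layers of weight values: 3 input features -> 8 hidden -> 2 outputs.
weights = [rng.standard_normal((3, 8)), rng.standard_normal((8, 2))]

def forward(x: np.ndarray, weights) -> np.ndarray:
    # Hidden layers: weight multiplication followed by a ReLU nonlinearity.
    for w in weights[:-1]:
        x = np.maximum(x @ w, 0.0)
    return x @ weights[-1]  # output layer, e.g. age-group scores

print(forward(np.array([55.0, 12.0, 18.0]), weights))
```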
The learning algorithm is a method for training a predetermined target device (for example, a robot) using a plurality of learning data to cause, allow, or control the target device to make a determination or prediction. Examples of learning algorithms include, but are not limited to, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning.

Claims (10)

  1. A system to identify hover data patterns as operational on a touch screen panel of a mobile device, wherein the system comprises:
    a sensing panel to sense a user finger touching or hovering over the touch screen panel;
    a hover detection module to:
    detect hover events and register hover data patterns on the touch screen panel when the user’s finger is in contact with the touch screen panel for a predefined time interval;
    a clustering analysis module to:
    identify a cluster size based on the number of hover data patterns registered on the touch screen panel simultaneously;
    identify cluster centroid based on the cluster size;
    identify distance between the inter-clusters;
    identify distance within the intra-cluster;
    a hover identity module to:
    identify the hover data patterns as operational hover area when the cluster density is low, and proximity of the hover data pattern and centroid are close for intra-cluster and far for inter-cluster;
    identify the hover data patterns as non-operational hover area when the cluster density is high or very low, and inter-cluster proximity is close;
    a hover feature extraction module to collect hover data from the operational hover area and convert the collected hover data to hover features, wherein the hover features comprise the user's operational finger type, and dimensions; and
    a finger identity module and a user identity module to predict, using artificial intelligence techniques, user finger type and age group based on the extracted hover features from the hover feature extraction module.
  2. The system as claimed in claim 1, wherein the sensing lines for sensing hover events and touch events are separated using capacitance difference to detect hover event and touch event concurrently.
  3. The system as claimed in claim 1, wherein the system comprises a touch detection module to sense a user finger touching the touch screen panel.
  4. A method to identify hover data patterns as operational on a touch screen panel of a mobile device, the method comprising the steps of:
    sensing a user finger touching or hovering over the touch screen panel;
    detecting hover events and registering hover data patterns on the touch screen panel when the user’s finger is in proximity/contact with the touch screen panel for a predefined time interval;
    identifying a cluster size based on the number of hover data patterns registered on the touch screen panel simultaneously, identifying a cluster centroid based on the cluster size, wherein the cluster centroids are represented as hover data patterns on the touch screen panel, and identifying the cluster density, the inter-cluster distances, and the intra-cluster distances;
    identifying the hover data patterns as operational hover area when the cluster density is low, and proximity of the hover data pattern and centroid are close for intra-cluster and far for inter-cluster;
    identifying the hover data patterns as non-operational hover area when the cluster density is high or very low, and inter-cluster proximity is close;
    collecting hover data from the operational hover area and converting it to hover features by using a hover feature extraction module, wherein the hover features comprise the user's operational finger type, and dimensions; and
    predicting user finger type and age group based on the extracted hover features by using artificial intelligence techniques.
  5. The method as claimed in claim 4, wherein the method further comprises the steps of:
    detecting hover events and registering hover data patterns on the touch screen panel when the user’s finger is in proximity/contact with the touch screen panel for a predefined time interval by using a hover detection module.
  6. The method as claimed in claim 4, wherein the method further comprises the steps of:
    identifying a cluster size based on the number of hover data patterns registered on the touch screen panel simultaneously by using a clustering analysis module, identifying cluster centroid based on the cluster size, wherein cluster centroid are represented as hover data patterns on the touch screen panel, identifying the cluster density, distance between the inter-clusters, identifying distance within the intra-cluster by using a clustering analysis module.
  7. The method as claimed in claim 4, wherein the method further comprises the steps of:
    identifying the hover data patterns as operational hover area when the cluster density is low, and proximity of the hover data pattern and centroid are close for intra-cluster and far for inter-cluster by using a hover identity module; and
    identifying the hover data patterns as non-operational hover area when the cluster density is high or very low, and inter-cluster proximity is close by using the hover identity module.
  8. The method as claimed in claim 4, wherein the method further comprises the steps of:
    collecting hover data from the operational hover area and converting to hover features by using a hover feature extraction module, wherein the hover features comprise users operational finger type, and dimensions.
  9. The method as claimed in claim 4, wherein the method further comprises the steps of:
    predicting user finger type and age group based on the extracted hover features from the hover feature extraction module by using a finger identity module and a user identity module.
  10. The method as claimed in claim 4, wherein the method further comprises the steps of:
    sensing a user finger touching the touch screen panel by using a touch detection module.
PCT/KR2023/001379 2022-01-31 2023-01-31 A system and method to identify hover data patterns as operational on touch screen panel WO2023146380A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202241005047 2022-01-31
IN202241005047 2022-01-31

Publications (1)

Publication Number Publication Date
WO2023146380A1 true WO2023146380A1 (en) 2023-08-03

Family

ID=87472315

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/001379 WO2023146380A1 (en) 2022-01-31 2023-01-31 A system and method to identify hover data patterns as operational on touch screen panel

Country Status (1)

Country Link
WO (1) WO2023146380A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9851829B2 (en) * 2010-08-27 2017-12-26 Apple Inc. Signal processing for touch and hover sensing display device
US9203835B2 (en) * 2013-03-01 2015-12-01 Paypal, Inc. Systems and methods for authenticating a user based on a biometric model associated with the user
US20150033194A1 (en) * 2013-07-25 2015-01-29 Yahoo! Inc. Multi-finger user identification
JP2015228231A (en) * 2015-07-10 2015-12-17 カシオ計算機株式会社 User identification device and program
US20210055814A1 (en) * 2019-08-23 2021-02-25 Samsung Electronics Co., Ltd. Method for determining proximity of at least one object using electronic device


Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23747421

Country of ref document: EP

Kind code of ref document: A1