WO2017131251A1 - Display device and touch input processing method therefor - Google Patents

Display device and touch input processing method therefor Download PDF

Info

Publication number
WO2017131251A1
WO2017131251A1 PCT/KR2016/000894 KR2016000894W WO2017131251A1 WO 2017131251 A1 WO2017131251 A1 WO 2017131251A1 KR 2016000894 W KR2016000894 W KR 2016000894W WO 2017131251 A1 WO2017131251 A1 WO 2017131251A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch input
key
region
area
gaussian
Prior art date
Application number
PCT/KR2016/000894
Other languages
English (en)
Korean (ko)
Inventor
김태호
이슬
이동욱
정대웅
Original Assignee
주식회사 노타
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 노타 filed Critical 주식회사 노타
Priority to KR1020187019858A priority Critical patent/KR102122438B1/ko
Priority to PCT/KR2016/000894 priority patent/WO2017131251A1/fr
Publication of WO2017131251A1 publication Critical patent/WO2017131251A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • G06F3/0236Character input methods using selection techniques to select from displayed items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present invention relates to a display device and a touch input processing method thereof, and more particularly, to a method of generating a probability model by analyzing touch input data for a displayed soft keyboard interface and mapping the user's touch inputs to the intended keys according to the generated probability model.
  • display devices of various sizes are becoming popular. Such devices may include a touch sensor to recognize a user's touch on the display panel and perform an operation accordingly.
  • various portable devices such as mobile phones, notebook computers, PDAs, tablet PCs, and the like provide such a touch recognition function.
  • Touch-sensitive display devices can display a soft keyboard interface for user's character input.
  • typing errors occur due to the user's touch input style, hand shape, posture, the size of the display panel, and the like.
  • touch input error correction methods based on language models have been developed.
  • the device may predict the character to be input according to the input characters, and correct the touch input error accordingly.
  • error correction based on language analysis/prediction requires a large amount of data covering the many words that can be formed from characters, and suffers from poor portability because a different model is required for each language. Therefore, the present specification proposes a method that can reduce the user's typing errors while imposing fewer language-dependent constraints.
  • the touch input processing method includes: storing touch input data for the keyboard interface; generating at least one Gaussian model for each of at least one key region included in the keyboard interface; generating a Gaussian mixture model for the keyboard interface using the at least one Gaussian model; and mapping a received touch input to a key value using the Gaussian mixture model.
  • the generating of the Gaussian model is performed using first touch input data included in a base area in the key area, and the base area may be a predefined area at the center of the key area.
  • the generating of the Gaussian mixture model may be performed using both the first touch input data included in the base area and second touch input data not included in the base area.
  • in the mapping of the received touch input to the key value, the received touch input may be mapped to a first key value of a first key area in which the received touch input is located or to a second key value of a second key area adjacent to the first key area.
  • the mapping of the received touch input to the key value may include: when the position of the received touch input is within a high reliability area inside the first key area, mapping the received touch input to a first key value of the first key area that includes the high reliability area; and when the received touch input is located outside the high reliability area within the first key area, mapping the received touch input, based on the Gaussian mixture model, to the first key value of the first key area or to a second key value of a second key area adjacent to the first key area, wherein the high reliability area may be a predefined area centered within the corresponding key area.
  • the mapping of the received touch input based on the Gaussian mixture model may include: mapping the received touch input to a key value of the Gaussian mixture model when the likelihood of the received touch input is greater than or equal to a preset threshold value; and mapping the received touch input to the first key value of the first key area when the likelihood is below the threshold value.
  • the touch input data may be stored by randomizing the input order.
  • the method may further include deleting touch input data included in the key area that contains the base area.
  • At least one of the size, shape, and position of at least one of the base area and the high reliability area may be variable.
  • the touch input processing method may display, together with the keyboard interface, a GUI indicating the key mapping areas that reflect the high reliability areas and the Gaussian mixture model areas.
  • a display device for solving the above technical problem includes: a display unit for displaying a keyboard interface; a sensor unit for sensing a touch input to the display unit; a memory unit for storing touch input data for the keyboard interface and an application for processing the touch input; and a processing unit configured to drive the application to process the touch input, wherein the display device generates at least one Gaussian model for each of the at least one key region included in the keyboard interface, generates a Gaussian mixture model for the keyboard interface using the at least one Gaussian model, and maps a received touch input to a key value using the Gaussian mixture model.
  • the display device may perform the touch input processing method according to the embodiment of the present invention.
  • the display device according to the embodiment of the present invention may store a program / application for performing this touch input processing method.
  • the display device may perform the touch input processing method of the present invention by driving such a program / application.
  • typos may be effectively corrected when the user provides touch input.
  • the present invention mathematically models the touch distribution and uses the distribution model to infer user intentions to map the received touch input to key values. Therefore, the present invention does not depend on the language of the keyboard interface, and thus can be applied to keyboard interfaces for various languages more universally.
  • typos may be corrected with a single model per keyboard layout; a plurality of models is not required for one keyboard layout.
  • the touch distribution for the base region is first modeled, and the entire touch distribution is modeled using the parameters obtained in the modeling. Therefore, the touch distribution can be modeled according to the user's intention. In particular, when the user's touch input is biased in a specific direction with a pattern, this can be effectively modeled.
  • the present invention can correct typos by applying the modeled result and key-mapping touch inputs accordingly. Even in this case, by using the high reliability region together with the modeling result, it is possible to prevent excessive typo correction by the model, that is, mistakenly "correcting" inputs the user actually intended.
  • the present invention corrects touch inputs and provides a UI that can show the distribution, statistics, and corrections of the touch inputs produced by this process. Since the user can see his or her own touch distribution and typo corrections through the UI, the user can more easily recognize the usability and the effects of the method / device / application / software according to the present invention.
  • FIG. 1 illustrates a display device and a soft keyboard interface provided by the display device according to an embodiment of the present invention.
  • FIG. 2 illustrates touch input data and error occurrences for the keyboard interface.
  • FIG. 3 illustrates two key areas among the key areas included in a keyboard interface.
  • FIG. 4 illustrates two key areas included in a keyboard interface and user input for the two key areas.
  • FIG. 5 illustrates a method of initializing touch input data according to an embodiment of the present invention.
  • FIG. 6 illustrates a Gaussian model generation method for each key region according to an embodiment of the present invention.
  • FIG. 7 illustrates a Gaussian mixture model generation method for a keyboard interface according to an embodiment of the present invention.
  • FIG. 8 illustrates a GMM region determination method according to an embodiment of the present invention.
  • FIG. 9 illustrates a mapping method of a touch input using a GMM and a high reliability region according to an embodiment of the present invention.
  • FIG. 10 illustrates a mapping method of a touch input using a GMM and a high reliability region according to an embodiment of the present invention.
  • FIG. 11 illustrates a key mapping method of a received touch input according to an embodiment of the present invention.
  • FIG. 12 illustrates a display device according to an embodiment of the invention.
  • FIG. 13 is a flowchart illustrating a touch input processing method according to an embodiment of the present invention.
  • FIG. 14 illustrates a graphical user interface (GUI) provided by a device or an application according to an embodiment of the present invention.
  • FIG. 15 illustrates a keyboard interface and a calibrated keyboard interface in accordance with an embodiment of the present invention.
  • the present disclosure relates to a display device and a touch input processing method of the display device.
  • the display device is meant to include a variety of electronic devices, for example, mobile phones, personal digital assistants (PDAs), notebooks, tablet PCs, MP3 players, CD players, DVD players, head mounted displays (HMDs), smart watches, watch phones, televisions, kiosks, and other electronic devices capable of displaying visual information.
  • the display device may be referred to as a 'device'.
  • the device is a device that runs an application or software implementing the present invention
  • the device implements the present invention by running that application or software.
  • the description of the invention should be considered to serve as a description of the application / software implementing the invention.
  • the description of the method of the following specification and claims should be considered as describing the operation of the application / software in accordance with an embodiment of the invention.
  • the following description of the present invention naturally applies to software / applications coded to perform the method of the present invention separately from the device.
  • the operations of the devices described below may all be understood as operations of the software / application.
  • FIG. 1 illustrates a display device and a soft keyboard interface provided by the display device according to an embodiment of the present invention.
  • FIG. 1 illustrates a display device 1010 according to an embodiment of the invention.
  • although the display device 1010 is shown as a tablet PC as an embodiment, the display device 1010 of the present invention is defined by its configuration and operation, not by the type or shape of the device.
  • the device 1010 may display the soft keyboard interface 1020.
  • the soft keyboard interface 1020 refers to a keyboard interface, including at least one key region, that is provided purely by being displayed, without a physical mechanical key arrangement.
  • the soft keyboard interface 1020 may be referred to herein as a virtual keyboard interface, a keyboard interface, or a soft keyboard.
  • the key area means an area corresponding to one displayed key.
  • an area recognized as a corresponding key value may be represented as a key area.
  • the key area may indicate a displayed key layout or may indicate a default recognition area recognized by the corresponding key.
  • the keyboard interface 1020 may be provided in various forms.
  • a keyboard interface 1020 with a commonly provided key arrangement is illustrated as an example, but the shape of the keyboard interface 1020, the number of keys included, the arrangement of keys, the layout of the keyboard, and the like may vary according to embodiments.
  • although FIG. 1 illustrates an embodiment in which the keyboard interface 1020 is displayed at the bottom of the display, the display position, orientation, and size of the keyboard interface 1020 are not limited thereto.
  • the keyboard interface 1020 includes a plurality of key areas. And basically, the device maps touch inputs for key areas to the corresponding key values. For example, a touch input with coordinates in the key region of A is mapped to a key value of A.
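  • As an illustration of this default behavior, the sketch below (not taken from the patent text; the region names and sizes are made up) maps a touch to the key whose displayed rectangle contains its coordinates.

```python
from dataclasses import dataclass

@dataclass
class KeyRegion:
    key: str    # key value, e.g. "A"
    x: float    # left edge of the displayed key rectangle
    y: float    # top edge
    w: float    # width
    h: float    # height

    def contains(self, tx: float, ty: float) -> bool:
        return self.x <= tx < self.x + self.w and self.y <= ty < self.y + self.h

def naive_key_lookup(regions, tx, ty):
    """Return the key value of the displayed region containing (tx, ty), or None."""
    for r in regions:
        if r.contains(tx, ty):
            return r.key
    return None

# Example: two neighboring keys; a touch at (58, 10) falls inside the "S" rectangle.
regions = [KeyRegion("A", 0, 0, 50, 50), KeyRegion("S", 50, 0, 50, 50)]
print(naive_key_lookup(regions, 58, 10))  # -> "S"
```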
  • FIG. 2 illustrates touch input data and error occurrences for the keyboard interface.
  • the device may include a touch sensor.
  • the method by which the touch sensor recognizes user input on the display is a well-known technique and will not be described in detail.
  • the touch input may be converted into data by a predetermined protocol between the touch sensor and the processor. That is, when the user applies a touch input to the display, the device may recognize the touch input with the touch sensor, and the recognized touch input may be processed as touch input data.
  • the touch input data may include coordinate values (x, y).
  • the touch input data may further include touch start coordinates, touch end coordinates, touch input time, touch duration, touch area, additional sensor information of the touch time, environment information of the device, and the like.
  • the touch input data may include additional sensor information and may include tilt information and acceleration information of the device when the corresponding touch input is stored.
  • the displayed keyboard interface includes a plurality of key areas.
  • the key area may indicate an area corresponding to one displayed key.
  • the device may recognize the touch input by mapping the touch input to the key value of the key area containing the touch input coordinates. For example, when the received touch input is located in the key region 2010, the device may map the received touch input to the key value corresponding to that region. However, as shown in FIG. 2, an input for a key area not intended by the user may occur at the boundary between key areas.
  • the dots in FIG. 2 represent stored touch input data. That is, the coordinate values of the touch input data are displayed as dots.
  • the touch inputs marked with X are touch inputs recognized near the boundary of the key areas.
  • touch inputs to the key region 2010 are biased to the right. Therefore, for the key region 2020 located to the right of the key region 2010, it matches the user's intention to regard touch inputs near the left boundary of the key region 2020 as touch inputs for the key region 2010.
  • accordingly, the touch inputs near the left boundary of the key region 2020 may be determined to be typos, and those typos may be corrected by mapping them to the key value of the key region 2010 rather than to the key value of the key region 2020.
  • the present invention proposes a method of mapping touch inputs to the boundary of the key areas to key values suitable for the user's intention through mathematical modeling.
  • a method of mapping a touch input for a key region of the keyboard interface to a key value according to a user's intention will be described in detail.
  • FIG. 3 illustrates two key areas among the key areas included in a keyboard interface.
  • FIG. 3 illustrates a first key area 3010 for inputting a character A and a second key area 3020 for inputting a character S.
  • the characters A and S are selected as embodiments, and the first key region 3010 and the second key region 3020 may be mapped to key values corresponding to arbitrary numbers and characters according to the configuration of the keyboard interface.
  • FIG. 4 illustrates two key areas included in a keyboard interface and touch input data for the two key areas.
  • user input data for the keyboard interface may be collected.
  • the user input data may include touch coordinates.
  • the device may store more than a preset number of touch input data for a preset period of time, and perform touch input processing by analyzing and modeling the stored touch input data. The analysis and modeling of such data may be set to be performed at a time when the user does not use the device or at a time when the device is connected to the charger.
  • the device may perform data analysis and modeling when the amount of touch input data for the entire keyboard interface is greater than or equal to a predetermined number. For example, when 3000 or more touch coordinate values of the touch input data have been collected, the device may perform data analysis and modeling. However, even when the total number of data points is 3000 or more, the touch coordinates may be concentrated in specific key areas. Therefore, the device of the present invention can consider not only the number of global coordinate values but also the number of data points for each key area. For example, when the number of recently received touch inputs for the keyboard interface is 3000 or more and the number of received touch inputs for each key area is 25 or more, the device may perform data analysis and modeling, as sketched below.
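  • A minimal sketch of such a sufficiency check, assuming the example values above (3000 total touches, 25 per key area) and an assumed helper that maps a coordinate to its displayed key:

```python
from collections import Counter

def enough_data(touches, key_of, all_keys, total_min=3000, per_key_min=25):
    """touches: list of (x, y) coordinates; key_of(x, y): displayed key for a coordinate."""
    if len(touches) < total_min:          # global count check
        return False
    per_key = Counter(key_of(x, y) for x, y in touches)
    return all(per_key[k] >= per_key_min for k in all_keys)   # per-key count check
```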
  • the device may randomize the storage order.
  • the present invention performs touch input processing by analyzing / modeling only touch positions without performing language based touch input guessing. Therefore, even if the touch input data is not stored in order, it does not affect the analysis / modeling performance of the data. By randomizing the storage order, exposure of personal information such as passwords and specific word sequences can be minimized.
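  • One hedged way to achieve such an order-randomized store (an assumption, not the patent's stated implementation) is to insert each new sample at a random position in the stored buffer:

```python
import random

def store_randomized(buffer, sample):
    """Insert a new (x, y) sample at a random position so the typing order is not kept."""
    buffer.insert(random.randrange(len(buffer) + 1), sample)
    return buffer
```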
  • an input to a personal information input interface such as a password field may not be stored.
  • Touch input data is used locally only by the device and is not transmitted externally. However, if necessary, the touch input data may be transmitted to a server linked with the device.
  • the server may be a server in which a program that manages / controls an app driven to implement the present invention in a device is stored and driven.
  • the user input for the first key area 3010 (the input for A) and the user input for the second key area 3020 (the input for S) may each have a distribution biased to the right.
  • the touch inputs 4010 located on the left side of the second key area 3020, that is, near the boundary with the first key area 3010, are more consistent with the user's intent when regarded as inputs to the first key area 3010 rather than as inputs to the second key area 3020.
  • the mapping of inputs to the boundaries of these adjacent key regions can have a decisive effect on reducing typing errors.
  • a method of mapping such key inputs to key values suitable for user intention will be described in detail.
  • FIG. 5 illustrates a method of initializing touch input data according to an embodiment of the present invention.
  • the present invention sets the base areas 5010 and 5020 in the key areas 3010 and 3020 of the keyboard interface, and starts modeling the touch input data in the base area.
  • the device may map touch input data in the base area to key values of the corresponding area, and this mapping may be referred to as labeling.
  • Labeling is the act of assigning a correct answer, which means mapping the received input to the correct result.
  • the operation of mapping the received touch input to a correct key value may correspond to labeling.
  • the device may map touch inputs in the first base area 5010 to a key value of 'A' and may map touch inputs in the second base area 5020 to a key value of 'S'.
  • the first base area 5010 is positioned at the center of the displayed first key area 3010 and may have an area equal to a certain percentage of the key area. In an embodiment, the first base area 5010 occupies 80% of the area of the first key area 3010, and its shape can be set to the same shape as that of the first key area 3010. In an embodiment, the relative area, shape, and position of the base area with respect to its key area may be the same for all key areas, or may differ from key to key.
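  • A short sketch of this base-area construction, assuming a rectangular key region; scaling each side by the square root of the area fraction is my interpretation of "occupies 80% of the area":

```python
import math

def base_area(x, y, w, h, area_fraction=0.8):
    """Return (bx, by, bw, bh): a rectangle centered in the key covering area_fraction of it."""
    s = math.sqrt(area_fraction)   # scale each side so the area shrinks by area_fraction
    bw, bh = w * s, h * s
    return x + (w - bw) / 2, y + (h - bh) / 2, bw, bh
```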
  • the reason for using the base area is as follows.
  • if all touch inputs were used directly, the touch inputs 4010 intended for the first key area would be modeled as inputs to the second key area 3020. Since it is therefore difficult to map inputs near the boundary to key values according to the user's intention, the present invention proposes a method that first performs modeling based on the touch inputs of the base region, and then performs additional modeling using the touch inputs outside the base region.
  • the base area represents an area for which it can be assumed, with high probability, that recognizing a touch there as an input to the corresponding key does not deviate from the user's intention.
  • the device may delete the touch input data when the number of touch input data included in the base area of a key area is less than or equal to a preset number. That is, if the number of touch input data in the base area is below the preset number, there are too few samples for proper Gaussian modeling, so Gaussian modeling for the corresponding key area may not be performed. In this case, if only the touch input data of the base area were deleted, the touch data remaining in the key area could still affect the generation of the GMM for the entire keyboard interface. Therefore, when the number of touch input data for the base area is below the preset number, the device may delete the data for the entire key area, not just the base area, and generate the GMM without the deleted data.
  • FIG. 6 illustrates a Gaussian model generation method for each key region according to an embodiment of the present invention.
  • the present invention maps the touch input data 6010 in the first base area 5010 of the first key area 3010 to 'A'. That is, the touch input data 6010 in the first base area 5010 is mapped to a key value corresponding to the first key area 3010.
  • the present invention can learn a Gaussian distribution using the mapped data. Training of the Gaussian distribution may be referred to as Gaussian model generation.
  • Gaussian probabilistic models are probabilistic models that represent a distribution of aggregated observations around a mean. The mathematical description of the Gaussian model itself is not detailed.
  • the present invention can generate a Gaussian model for each of the key areas included in the keyboard interface. For example, after the Gaussian distribution learning for the first key region, the present invention learns the second Gaussian distribution using the touch input data 6020 in the second base region 5020 of the second key region 3020. This operation can be performed for all key areas included in the keyboard interface. However, Gaussian modeling may be skipped for key regions where the number of sample data is less than a certain number.
  • Gaussian distribution can be learned / generated based on Maximum Likelihood.
  • the learned parameters may be used as parameters of the GMM to be described later.
  • the present invention learns the Gaussian distribution of each individual key region based on its base region, so the Gaussian distribution of the first region 3010 and the Gaussian distribution of the second region 3020 can each be expected to be shifted to the right. That is, the mean of the Gaussian distribution (model) of the first region 3010 may be modeled close to the positions of the touch input data 6010 within the first base region 5010. Similarly, the mean of the Gaussian distribution (model) of the second region 3020 will be modeled close to the positions of the touch input data 6020 within the second base region 5020.
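  • A minimal sketch of this per-key step: fitting a 2-D Gaussian by maximum likelihood (sample mean and covariance) to the touches inside one key's base area, and skipping keys with too few samples (the value 25 reuses the earlier example and is an assumption):

```python
import numpy as np

def fit_key_gaussian(base_touches, min_samples=25):
    """base_touches: (N, 2) touch coordinates inside one key's base area."""
    pts = np.asarray(base_touches, dtype=float)
    if len(pts) < min_samples:
        return None                                  # too few samples: skip this key
    mean = pts.mean(axis=0)                          # maximum-likelihood mean
    cov = np.cov(pts, rowvar=False, bias=True)       # 2 x 2 ML covariance estimate
    return mean, cov
```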
  • FIG. 7 illustrates a Gaussian mixture model generation method for a keyboard interface according to an embodiment of the present invention.
  • the present invention now generates a Gaussian Mixture Model (GMM) for the entire keyboard interface.
  • the present invention models the GMM using all of the stored touch input data, including the data not used in the process up to FIG. 6, as shown in FIG. 7.
  • the present invention can model the GMM using an Expectation Maximization (EM) technique.
  • the GMM may indicate the probability of which key value the touch input for each key region of the keyboard interface should be mapped to.
  • Gaussian probabilistic models are probabilistic models that represent a distribution of observations aggregated around a mean. Therefore, a single Gaussian model has the limitation that it can express only a unimodal form in which the data cluster around one mean.
  • the present invention uses GMM to represent a plurality of probability distributions for a plurality of keys included in a keyboard interface.
  • the expectation-maximization (EM) technique is a method used to estimate a probability model from observed random variables. That is, the EM algorithm is an iterative algorithm that finds parameters with maximum likelihood or maximum a posteriori (MAP) estimates in a probability model that depends on unobserved latent variables. The EM algorithm alternates between an expectation (E) step, which calculates the expected value of the log likelihood given the current parameter estimates, and a maximization (M) step, which obtains the parameter values that maximize this expectation. The parameter values calculated in the maximization step are then used as the estimates for the next expectation step.
  • the GMM is generated using a plurality of Gaussian models for each key region generated in the process up to FIG. 6.
  • the GMM includes weight, mean, and covariance parameters; among these, the mean and covariance of each component use the parameters obtained for the corresponding per-key Gaussian distribution, and the weight can be initialized to 1/n.
  • n represents the number of keys modeled by GMM.
  • the parameters are optimized by repeatedly performing the EM steps during the GMM modeling process, and a parameter value may drift out of an appropriate range or converge to a poor local optimum.
  • for example, the mean of the Gaussian model for region A in FIG. 7 may drift into the key region of S, or the covariance may become too wide.
  • the present invention may limit the width at which the mean and covariance can vary for each key region.
  • the present invention may limit the area in which the mean of the Gaussian model for each key area may lie to an internal area such as the base area described above. Limits on the allowable range of the covariance may also be set.
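  • A hedged sketch of such a constrained EM fit: each component is initialized from its per-key Gaussian, and its mean is clamped back into the key's base area after every M-step, as the text above suggests. The equal initial weights, the fixed iteration count, and the covariance regularization term are assumptions, not the patent's exact scheme.

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_keyboard_gmm(points, init, base_rects, n_iter=30, reg=1e-3):
    """
    points:     (N, 2) array of all stored touch coordinates.
    init:       dict key -> (mean, cov) from the per-key Gaussian fits.
    base_rects: dict key -> (bx, by, bw, bh) base area used to clamp each component mean.
    """
    keys = list(init)
    n = len(keys)
    means = np.array([init[k][0] for k in keys], dtype=float)
    covs = np.array([init[k][1] for k in keys], dtype=float)
    weights = np.full(n, 1.0 / n)                    # equal initial weights (assumption)

    for _ in range(n_iter):
        # E-step: responsibility of each component for each touch point.
        dens = np.stack([w * multivariate_normal.pdf(points, m, c)
                         for w, m, c in zip(weights, means, covs)], axis=1)
        resp = dens / np.clip(dens.sum(axis=1, keepdims=True), 1e-12, None)

        # M-step: update weights, means, and covariances from the responsibilities.
        nk = resp.sum(axis=0) + 1e-9
        weights = nk / len(points)
        for j, k in enumerate(keys):
            mu = resp[:, j] @ points / nk[j]
            bx, by, bw, bh = base_rects[k]
            mu[0] = np.clip(mu[0], bx, bx + bw)      # keep the mean inside the key's base area
            mu[1] = np.clip(mu[1], by, by + bh)
            d = points - mu
            covs[j] = (resp[:, j, None] * d).T @ d / nk[j] + reg * np.eye(2)
            means[j] = mu

    return {k: (weights[j], means[j], covs[j]) for j, k in enumerate(keys)}
```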
  • the likelihood of the Gaussian distribution with respect to the coordinates of the received touch input may be calculated.
  • the present invention may perform the comparison of the probabilities only for the input key and the adjacent keys.
  • for example, for an input to S, the present invention may compare likelihoods only for S, A, and D, or only for S, A, D, W, and X; that is, the likelihood comparison may be performed only for the key region in which the user input is located and two to eight adjacent key regions.
  • FIG. 8 illustrates a GMM region determination method according to an embodiment of the present invention.
  • Fig. 8 shows a method for determining a GMM area by taking three key areas A, S, and D as an example.
  • the three layout areas 8010 of FIG. 8A represent three key areas of the displayed keyboard interface.
  • FIG. 8 shows a GMM for the three key regions 8010 and a method for configuring the GMM regions accordingly.
  • FIG. 8 (b) shows a GMM for three key areas A, S, and D.
  • the GMM contains Gaussian distributions for each key region.
  • the GMM includes a Gaussian distribution 8020 for A, a Gaussian distribution 8030 for S, and a Gaussian distribution 8040 for D.
  • the present invention can map the coordinates to the key values of the Gaussian model if the probability of a particular coordinate is greater than or equal to a predetermined threshold based on each Gaussian model of the GMM.
  • the coordinates of the specific touch input 8050 correspond to the position of x.
  • the x position has its highest likelihood under the Gaussian model 8040 of D in the GMM, and this likelihood is above the threshold at which the x coordinate can be mapped to the key value of D. Therefore, although the touch input 8050 lies within the key area of S in the displayed key layout 8010, the device may map the touch input 8050 to the key value of D instead of S.
  • the present invention may configure a GMM region by collecting the coordinates whose likelihood under the GMM is greater than or equal to a threshold value.
  • the GMM regions include a first region 8060 mapped to the key value of A, a second region 8070 mapped to the key value of S, and a third region 8080 mapped to the key value of D.
  • the threshold value can be determined / adjusted to reflect the user's intent. For example, increasing the threshold reduces the size of the GMM area within which typos can be corrected, so that only touch inputs close to the mean of each Gaussian model are corrected. The accuracy of the typo correction therefore increases, but the frequency of correction decreases. Lowering the threshold allows typos to be corrected for touch inputs that are relatively far from each mean, near the boundaries. Instead, there is a risk that intended touch inputs are mistakenly recognized as typos. The accuracy of the typo correction therefore decreases while the frequency of correction increases. A decision procedure along these lines is sketched below.
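  • A sketch of this thresholded decision, restricted to the touched key and its adjacent keys as described above; the adjacency map and the threshold value are illustrative assumptions, and `fit_keyboard_gmm` refers to the earlier sketch:

```python
import numpy as np
from scipy.stats import multivariate_normal

def gmm_candidate_key(gmm, touched_key, neighbors, point, threshold):
    """
    gmm:       dict key -> (weight, mean, cov), e.g. from fit_keyboard_gmm above.
    neighbors: dict key -> list of adjacent key names (two to eight per key).
    Returns the best candidate key if its likelihood is >= threshold, else None.
    """
    candidates = [touched_key] + neighbors.get(touched_key, [])
    best_key, best_like = None, -np.inf
    for k in candidates:
        if k not in gmm:
            continue                    # keys skipped during modeling have no component
        _, mean, cov = gmm[k]
        like = multivariate_normal.pdf(point, mean, cov)
        if like > best_like:
            best_key, best_like = k, like
    return best_key if best_like >= threshold else None
```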
  • the present invention is characterized by setting an appropriate threshold value in consideration of such a tradeoff.
  • the present invention is intended to correct typing errors not intended by the user, and mistakenly treating an intended keystroke as an error is what must be avoided most. Therefore, instead of performing key mapping with the GMM and a threshold alone, a high reliability region can be set within a key region, and a touch input within this region can always be mapped to the key value of that region.
  • FIG. 9 illustrates a mapping method of a touch input using a GMM and a high reliability region according to an embodiment of the present invention.
  • FIG. 9 (a) shows three key areas of the keyboard interface displayed as shown in FIG. 8 (a).
  • the high reliability area refers to an area in which a received touch input should always be determined to be an input for the displayed key. As shown in FIG. 9 (b), a touch input whose coordinates fall within the high reliability region 9010 of the A key region is recognized as the A key regardless of the GMM.
  • FIG. 9C is a diagram illustrating the GMM region described above with reference to FIG. 8, and overlapping descriptions thereof will be omitted.
  • FIG. 9 (d) shows final key mapping regions for mapping a touch input to a key value.
  • the high reliability regions are mapped to the corresponding key values irrespective of the GMM, and the regions at or above the threshold in the GMM are added outside the high reliability regions.
  • the high reliability regions and the regions remaining outside the GMM regions may be mapped according to the key values of the displayed key areas.
  • the finally determined regions may be referred to as key mapping regions.
  • the key mapping area of A, the key mapping area of S, and the key mapping area of D are determined as shown in FIG. 9 (d).
  • the entire keyboard interface including the plurality of key mapping regions may be referred to as a calibrated key mapping interface.
  • the high reliability region is located in the central region within the key region, and its size can be changed according to the setting. For example, when the user's touch input pattern, that is, the means of the GMM components, deviates largely from the center of each key region, the size of the high reliability region may be reduced to increase the error correction capability. Alternatively, when the user's touch input pattern is generally located close to the center of each key region, the size of the high reliability region may be increased to reduce incorrect corrections and to improve typo correction accuracy.
  • the location of the high reliability region may also be adjusted according to the user's touch input pattern to improve the error correction capability and accuracy. For example, in the key region of 'D' of FIG. 9 according to the user's touch input pattern, the high reliability region may be moved to the left to improve both error correction capability and accuracy.
  • FIG. 10 illustrates a mapping method of a touch input using a GMM and a high reliability region according to an embodiment of the present invention.
  • FIG. 10 shows the method of FIG. 9 projected onto the x axis.
  • Fig. 10 (a) shows the displayed layout area of Fig. 9 (a)
  • Fig. 10 (b) shows the high reliability area of Fig. 9 (b)
  • Fig. 10 (c) shows the GMM area of Fig. 9 (c).
  • FIG. 10 (d) shows, along the x axis, the finally determined key mapping areas corresponding to FIG. 9 (d).
  • in determining the final regions, the present invention considers the GMM regions but does not let them override the high reliability regions. That is, the present invention configures the final key mapping regions so that the GMM region for 'D' in FIG. 10 (c) does not encroach on the high reliability region for 'S' in FIG. 10 (b).
  • the present invention configures a key mapping area according to the GMM area for an area greater than or equal to a specific threshold in the GMM, and follows the mapping of the displayed key area for the high reliability area and the outside of the GMM area.
  • FIG. 11 illustrates a key mapping method of a received touch input according to an embodiment of the present invention.
  • the present invention may configure a key mapping area for the entire keyboard interface, and perform key mapping according to a position in the key mapping area of the coordinates of the received touch input.
  • configuring the key mapping area for the entire keyboard interface in advance may increase the amount of pre-computation. Therefore, as another embodiment, the present invention may process each received touch input coordinate directly without configuring the entire key mapping area. FIG. 11 is a flowchart illustrating this method.
  • the device may receive a touch input for the displayed keyboard interface in operation S11010.
  • the received touch input includes coordinate information.
  • the device may first determine whether the coordinates of the received touch input are located in the high reliability area (S11020). If the touch input coordinates exist in the high reliability region, the device may map the received touch input to a key value of the high reliability region (S11030).
  • the device may determine whether the GMM likelihood corresponding to the coordinates of the touch input is equal to or greater than the threshold (S11040). If the GMM likelihood corresponding to the touch input coordinates is equal to or larger than the threshold, the device may map the received touch input to the key value of the corresponding GMM region (S11050). In other words, the device may map the received touch input to the key value of a Gaussian model when the likelihood of the received touch input coordinates under that one of the plurality of Gaussian models included in the GMM is equal to or larger than the threshold value.
  • otherwise, the device may map the touch input to the key value of the key region where the touch input is located (S11060).
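  • A compact sketch of this per-touch flow (S11020 → S11040 → S11060), reusing `gmm_candidate_key` from the earlier sketch; `in_high_reliability` is an assumed helper that tests whether the point lies in the high reliability region of the touched key:

```python
def map_touch(point, touched_key, in_high_reliability, gmm, neighbors, threshold):
    # S11020 / S11030: inside the key's high reliability area -> always that key.
    if in_high_reliability(touched_key, point):
        return touched_key
    # S11040 / S11050: otherwise accept a GMM remapping if its likelihood clears the threshold.
    remapped = gmm_candidate_key(gmm, touched_key, neighbors, point, threshold)
    if remapped is not None:
        return remapped
    # S11060: fall back to the displayed key region containing the touch.
    return touched_key
```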
  • FIG. 12 illustrates a display device according to an embodiment of the invention.
  • the display device 12000 may include a display unit 12010, a communication unit 12020, a processing unit 12030, a sensor unit 12040, and a memory unit 12050. As indicated by the dotted lines in FIG. 12, the communication unit 12020 may optionally be included.
  • the display unit 12010 may display visual information on the display screen.
  • the visual information may represent a still image, a moving picture, an application execution screen, various interfaces, or other visually expressible information that can be displayed by the display unit 12010.
  • the display unit 12010 may output various visual information on the display screen based on the control command of the processing unit 12030.
  • the display unit 12010 of the present invention can display a soft keyboard interface as shown in FIG. 1, and display GUIs according to other embodiments of the present invention.
  • the communication unit 12020 may perform wired or wireless communication with an external device of the device.
  • the communication unit 12020 may not be provided depending on the configuration of the device, or may be configured of a plurality of communication chipsets.
  • the communication unit 12020 includes a communication module and may perform communication using various communication protocols such as 3G, 4G (LTE), 5G, WIFI, Bluetooth, and NFC.
  • the sensor unit 12040 may sense a user input or an environment of the device by using at least one sensor mounted on the device.
  • the at least one sensor may include a variety of sensors, such as touch sensors, fingerprint sensors, motion sensors, pressure sensors, camera (image) sensors, tilt sensors, gyro sensors, gyroscope sensors, angular velocity sensors, illuminance sensors, and angle sensors.
  • the above-described sensors may be included in the device as a separate element, or may be integrated into at least one or more elements and included in the device.
  • the sensor unit 12040 of the present invention includes a touch sensor capable of sensing a touch input to the display unit 12010.
  • the touch sensor may sense a touch input and transmit data in an agreed format, such as coordinate values, to the processing unit 12030.
  • the touch input includes both contact and non-contact touch inputs (eg, hovering inputs) to the display unit 12010.
  • the touch input includes not only inputs made by directly contacting or hovering over the display unit 12010 with a part of the user's body, but also inputs made by contacting or hovering over the display unit 12010 with a touch input means (for example, a stylus pen or touch pen).
  • the memory unit 12050 may store data including various information.
  • the memory unit 12050 collectively refers to volatile and nonvolatile memories.
  • the memory unit 12050 may store the received touch input data.
  • the memory unit may store an application / software for driving the above-described method of the present invention.
  • the processing unit 12030 may control at least one other unit included in the device.
  • the processing unit 12030 may process data inside the device.
  • the processing unit 12030 may control at least one unit included in the device based on the detected user input.
  • Processing unit 12030 may implement the method of the present invention by running an application / software for performing the method of the present invention.
  • FIG. 13 is a flowchart illustrating a touch input processing method according to an embodiment of the present invention.
  • the touch input processing method according to the present invention may be performed by a device or by an application / software running on the device.
  • the device may store touch input data for the keyboard interface in operation S13010.
  • the displayed keyboard interface may include at least one key area, which may be referred to as a layout area.
  • the touch input data may include at least one of coordinates of the touch input, an area of the touch input, a duration, and additional sensor information at the time of the touch input.
  • the device may randomize the storage order of the touch input data.
  • the device may generate a Gaussian model for each key region (S13020).
  • the device may generate a Gaussian model / distribution using the stored touch input data.
  • the device may generate a Gaussian model using only touch input data in the base area of each key area.
  • the device may generate a Gaussian model for all or some of the key areas included in the keyboard interface.
  • when the keyboard interface includes n key areas, the device may generate up to n Gaussian models.
  • Gaussian model generation may be omitted for m key areas, depending on the number of touch input data.
  • in that case, the device may generate n - m Gaussian models and use them to generate a GMM for the keyboard interface.
  • the device may map the touch input according to the displayed key area for the key area in which the Gaussian model is not generated.
  • the device may generate a Gaussian mixture model (GMM) for the keyboard interface using the generated Gaussian model (S13030).
  • the device may generate a Gaussian mixture model by using touch input data in the base area and touch input data other than the base area together.
  • the description of FIGS. 3 to 8 applies to the device's Gaussian model generation and to the method of generating the Gaussian mixture model from those models, and overlapping descriptions will not be repeated.
  • the device may map the received touch input to a key value using the Gaussian mixture model (S13040).
  • the device may map the received touch input to a key value of a key area where the received touch input is located or a key value of an adjacent key area of the key area where the received touch input is located.
  • the key value may represent a character, a number, a symbol, or the like displayed on the keyboard interface, or a digital value corresponding to the character, number, symbol, or the like.
  • the device may perform key value mapping of the received touch input as described with reference to FIGS. 9 through 11.
  • the device may map the touch input based on the key mapping area reflecting the high reliability area and the GMM area.
  • the device may map the touch input based on the coordinate of the touch input as in the method of FIG. 11.
  • the device of the present invention is characterized in that, depending on the key mapping region in which a touch input is located, the touch input is mapped either to the key value of the key region where it is located or to the key value of an adjacent key region. That is, an input within a specific key region may be mapped to the key value of an adjacent key region according to the distribution of the received touch inputs, without changing the layout of the displayed key regions.
  • the device may first determine whether the location of the received touch input is a high reliability area of a particular key area. If the received touch input is located in the high reliability region, the device may map the received touch input to a key value of a key region including the corresponding high reliability region. When the received touch input is located outside the high reliability region, the device may map the received touch input to a key value of a key region where the received touch input is located or a key value of an adjacent key region based on the Gaussian mixture model.
  • the method of mapping the received touch input based on the Gaussian mixture model is as described with reference to FIGS. 8 to 11.
  • the device may map the received touch input to the key value of the GMM component whose likelihood is greater than or equal to the threshold value.
  • that key value is the key value of the Gaussian model, among the plurality of Gaussian models represented by the GMM, whose likelihood at the corresponding coordinate is greater than or equal to the threshold value.
  • otherwise, the device may map the received touch input to the key value of the key region in which the received touch input is located.
  • the device may randomly store an input order of touch input data.
  • when the number of touch inputs in a base area is below a preset number, the touch input data included in the key area containing that base area may be deleted. In that case, Gaussian modeling of the corresponding key region is not performed, and the touch data of that key region does not affect the generation of the GMM for the keyboard interface.
  • the size, shape, position, etc. of at least one of the base region and the high reliability region may be variable.
  • the base region and the high reliability region may be set identically.
  • GUI 14 illustrates a graphical user interface (GUI) provided by a device or an application according to an embodiment of the present invention.
  • the device may provide an effect of notifying the user that a typo has been corrected. This effect may be provided as a tactile effect such as vibration, a sound effect, or a visual effect.
  • FIG. 14 illustrates an embodiment of a GUI illustrating typo correction.
  • the device may display the corrected key area by coloring or highlighting it. In this specification, correcting a typo refers to the case where, according to the GMM regions, a received touch input is mapped to the key value of an adjacent key region instead of the key region in which the touch input is located. That is, when a touch input is received outside a given key area as shown in FIG. 14 (b) but is nevertheless mapped to that key's value, the device (app) may apply a visual effect to that key area.
  • FIG. 14 (c) illustrates a visual effect applied when a touch input is normally received within a key region and the touch input is mapped to that key region's key value.
  • the user can know when a typo was corrected by the present invention through the GUI as shown in FIG.
  • FIG. 15 illustrates a keyboard interface and a calibrated keyboard interface in accordance with an embodiment of the present invention.
  • the touch input data may be displayed as a dot to show a user input pattern.
  • a visual effect may be added to and displayed on the touch input data determined to be a typo.
  • touch input data recognized as a typo is denoted by x.
  • the device (app) may display the two kinds of data with different markers so that they can be distinguished.
  • the GUI of the present invention may provide a distribution of key mapping areas and touch input data to the keyboard interface.
  • the device (app) may provide, as a GUI, the recognition areas 15010 of the keyboard interface before applying the present invention and the key mapping areas 15020 of the calibrated keyboard interface, and may also provide statistics according to the calibration.
  • the error rate is reduced from 2.03% to 1.39%
  • the typing speed is increased from 163 CPM (Characters Per Minute) to 281 CPM
  • the frequency of backspace use per 100 keystrokes is reduced from 9.4 to 5.1.
  • the device (app) may provide a calibrated key mapping interface 15020 to the user and also provide statistical data / tables / graphs according to the calibration.
  • the device (app) may further include a separate key in the keyboard interface that can trigger the display of the additional GUIs of the present invention, such as typo correction statistics and the distribution of touch input data.
  • the present invention can be used in the field of touch recognition of display devices and display devices.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Input From Keyboards Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a display device and a touch input processing method therefor. A touch input processing method according to an embodiment of the present invention comprises the steps of: storing touch input data for a keyboard interface; generating at least one Gaussian model for each of at least one key region included in the keyboard interface; generating a Gaussian mixture model for the keyboard interface using the at least one Gaussian model; and mapping a received touch input to a key value using the Gaussian mixture model.
PCT/KR2016/000894 2016-01-27 2016-01-27 Dispositif d'affichage et procédé de traitement d'entrée tactile associé WO2017131251A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020187019858A KR102122438B1 (ko) 2016-01-27 2016-01-27 디스플레이 디바이스 및 그의 터치 입력 프로세싱 방법
PCT/KR2016/000894 WO2017131251A1 (fr) 2016-01-27 2016-01-27 Dispositif d'affichage et procédé de traitement d'entrée tactile associé

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2016/000894 WO2017131251A1 (fr) 2016-01-27 2016-01-27 Dispositif d'affichage et procédé de traitement d'entrée tactile associé

Publications (1)

Publication Number Publication Date
WO2017131251A1 true WO2017131251A1 (fr) 2017-08-03

Family

ID=59399045

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/000894 WO2017131251A1 (fr) 2016-01-27 2016-01-27 Dispositif d'affichage et procédé de traitement d'entrée tactile associé

Country Status (2)

Country Link
KR (1) KR102122438B1 (fr)
WO (1) WO2017131251A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019098534A1 (fr) * 2017-11-15 2019-05-23 삼성전자주식회사 Dispositif électronique et procédé de commande associé

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102231511B1 (ko) * 2019-05-29 2021-03-23 엘지전자 주식회사 가상 키보드 제어방법 및 제어장치
WO2023146077A1 (fr) * 2022-01-27 2023-08-03 삼성전자 주식회사 Dispositif électronique et procédé de reconnaissance d'intention d'utilisateur à partir d'une entrée tactile sur un clavier virtuel, et support de stockage lisible par ordinateur non transitoire

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110201387A1 (en) * 2010-02-12 2011-08-18 Microsoft Corporation Real-time typing assistance
US20130067382A1 (en) * 2011-09-12 2013-03-14 Microsoft Corporation Soft keyboard interface
US20140074458A1 (en) * 2008-08-05 2014-03-13 Nuance Communications, Inc. Probability-based approach to recognition of user-entered data
WO2014110595A1 (fr) * 2013-01-14 2014-07-17 Nuance Communications, Inc. Réduction des taux d'erreurs pour claviers tactiles
WO2015130040A1 (fr) * 2014-02-28 2015-09-03 Lg Electronics Inc. Terminal mobile et son procédé de commande

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140074458A1 (en) * 2008-08-05 2014-03-13 Nuance Communications, Inc. Probability-based approach to recognition of user-entered data
US20110201387A1 (en) * 2010-02-12 2011-08-18 Microsoft Corporation Real-time typing assistance
US20130067382A1 (en) * 2011-09-12 2013-03-14 Microsoft Corporation Soft keyboard interface
WO2014110595A1 (fr) * 2013-01-14 2014-07-17 Nuance Communications, Inc. Réduction des taux d'erreurs pour claviers tactiles
WO2015130040A1 (fr) * 2014-02-28 2015-09-03 Lg Electronics Inc. Terminal mobile et son procédé de commande

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019098534A1 (fr) * 2017-11-15 2019-05-23 삼성전자주식회사 Dispositif électronique et procédé de commande associé
KR20190055489A (ko) * 2017-11-15 2019-05-23 삼성전자주식회사 전자 장치 및 그 제어 방법
KR102397414B1 (ko) * 2017-11-15 2022-05-13 삼성전자주식회사 전자 장치 및 그 제어 방법
US11599204B2 (en) 2017-11-15 2023-03-07 Samsung Electronics Co., Ltd. Electronic device that provides a letter input user interface (UI) and control method thereof

Also Published As

Publication number Publication date
KR20180105643A (ko) 2018-09-28
KR102122438B1 (ko) 2020-06-12

Similar Documents

Publication Publication Date Title
US10585490B2 (en) Controlling inadvertent inputs to a mobile device
WO2015163674A1 (fr) Procédé de fourniture d'interaction d'utilisateur à un dispositif portable, et son dispositif portable
JP5507494B2 (ja) タッチ・スクリーンを備える携帯式電子機器および制御方法
WO2021057337A1 (fr) Procédé de fonctionnement et dispositif électronique
WO2015023136A1 (fr) Procédé et appareil de reconnaissance d'état de préhension dans un dispositif électronique
US20130201155A1 (en) Finger identification on a touchscreen
WO2014030934A1 (fr) Procédé d'exploitation de fonction de stylo et dispositif électronique le prenant en charge
US20120038652A1 (en) Accepting motion-based character input on mobile computing devices
WO2020151519A1 (fr) Procédé d'entrée d'informations, dispositif terminal et support d'enregistrement lisible par ordinateur
US11314411B2 (en) Virtual keyboard animation
WO2013133524A1 (fr) Commande basée sur un geste d'entrée tactile
WO2019022567A2 (fr) Procédé de fourniture automatique de suggestions d'achèvement automatique sur la base de gestes et dispositif électronique associé
WO2017131251A1 (fr) Dispositif d'affichage et procédé de traitement d'entrée tactile associé
EP4160370A1 (fr) Procédé d'agencement d'icône, dispositif électronique et support de stockage
WO2019164098A1 (fr) Appareil et procédé permettant de fournir une fonction associée à une disposition de clavier
WO2017003068A1 (fr) Dispositif électronique pour afficher un clavier et procédé d'affichage de clavier associé
WO2018143566A1 (fr) Procédé et dispositif électronique d'affichage d'objets graphiques pour entrée d'empreintes digitales
WO2019107799A1 (fr) Procédé et appareil de déplacement d'un champ d'entrée
KR102216127B1 (ko) 문자 입력 방법 및 장치
WO2013191408A1 (fr) Procédé pour améliorer une reconnaissance tactile et dispositif électronique correspondant
KR102175853B1 (ko) 동작 제어 방법 및 그 전자 장치
KR102079985B1 (ko) 터치 입력 프로세싱 방법 및 디바이스
EP3249878A1 (fr) Systèmes et procédés de détection de la direction d'objets sur un dispositif électronique
WO2013172522A1 (fr) Terminal pouvant composer un message texte et procédé de commande
WO2022080739A1 (fr) Appareil d'affichage et procédé de commande associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16888228

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20187019858

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 1020187019858

Country of ref document: KR

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 301118)

122 Ep: pct application non-entry in european phase

Ref document number: 16888228

Country of ref document: EP

Kind code of ref document: A1