WO2013038630A1 - Information input device and information input method - Google Patents
Information input device and information input method
- Publication number
- WO2013038630A1 (PCT/JP2012/005672)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- detected
- position information
- unit
- contact
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- the present invention relates to an information input device and an information input method using a touch sensor such as a touch pad or a touch panel. More specifically, it relates to an information input device and an information input method that identify the type of motion of a touch input, that is, an input made by touching a touch sensor, and control an information device according to the identified type.
- a touch pad is well known as one of the input devices for operating a GUI screen displayed on a display.
- the user can perform a pointing operation that moves the cursor on the GUI screen by tracing the surface of the touch pad with a finger, which allows intuitive operation that follows the movement of the user's finger.
- gesture operations are also known, such as tracing the surface of the touchpad as if drawing a circle with a finger (rotation operation) and flicking the surface of the touchpad with a finger (flick operation).
- the present invention solves the above-described conventional problems, and an object of the present invention is to provide an information input device having operation means that match the user's intention, with improved identification performance for gestures such as flicks and rotations.
- an information input device according to one aspect is an information input device that identifies a touch input from a user as a touch gesture, and includes a touch sensor and an input processing unit that identifies a touch gesture using a plurality of specific contact positions. The specific contact positions are selected from a plurality of contact positions, at which the touch input to the touch sensor is performed, sequentially detected at a plurality of different timings during a predetermined time, by excluding, among the contact positions detected after a predetermined number of contact positions have been detected since the start of the touch input, those belonging to a predetermined region.
- the input processing unit identifies a touch gesture using at least the predetermined number of contact positions detected since the start of the touch input, and excludes contact positions that are detected after the predetermined number has been reached and that fall within a predetermined region where erroneous detections are frequent. The adoption or rejection of the touch sensor's detection information can therefore be determined more appropriately, touch gesture detection can better match the gesture the user intended, and the performance of touch gesture input can be improved.
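The selection rule described above can be sketched as follows. This is a hedged illustration, not the patent's implementation: the names, the predetermined number, and the edge-band exclusion region in the usage are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Contact:
    t: float  # detection timing (ms)
    x: float
    y: float

def select_specific_positions(contacts, predetermined_number, in_exclusion_region):
    """Keep every contact position until `predetermined_number` have been
    detected; after that, drop positions falling in the exclusion region."""
    kept = []
    for i, c in enumerate(contacts):
        if i < predetermined_number or not in_exclusion_region(c.x, c.y):
            kept.append(c)
    return kept
```

With an edge band as the exclusion region, a flick whose tail slides along the bezel keeps its initial, intentional samples and sheds the trailing edge-bound ones.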
- the present invention may also be implemented as a method, an integrated circuit, a computer program, or a computer-readable recording medium such as a CD-ROM, or as any combination of the method, the integrated circuit, the computer program, and the recording medium.
- the information input device of the present invention can make touch gesture detection better match the touch gesture intended by the user, and can improve the performance of touch gesture input.
- FIG. 1 is a block diagram illustrating a configuration of the information input device according to the first embodiment.
- FIG. 2 is a diagram illustrating how a touch input by the user is performed on the touch sensor.
- FIG. 3 is a diagram illustrating an example of the appearance of the information input device when the information input device is a remote controller.
- FIG. 4 is a diagram illustrating how a touch input by the user is performed on the touch sensor.
- FIG. 5 is a diagram illustrating how a touch input by the user is performed on the touch sensor.
- FIG. 6 is a diagram for explaining a first specific example of a predetermined area set as an exclusion area.
- FIG. 7 is a diagram for explaining a plurality of detection points at which the detection unit detects a locus of finger movement on the touch sensor.
- FIG. 8 is a graph in which a plurality of detection points are mapped onto two-dimensional coordinates with time t on the horizontal axis and position y on the vertical axis.
- FIG. 9 is a diagram for explaining detection points at which the detection unit detects a locus of finger movement on the touch sensor.
- FIG. 10 is a diagram for explaining how to specify a touch gesture of a rotation operation by using a plurality of detection points.
- FIG. 11 is a diagram for explaining how to specify a touch gesture of a rotation operation by using a plurality of detection points.
- FIG. 12 is a flowchart showing a flow of information input processing performed by the input processing unit of the information input device.
- FIG. 13 is a diagram for explaining an example in which a problem occurs when the technique of Patent Document 2 is applied to a touch sensor having a protrusion.
- FIG. 14 is a block diagram showing a configuration of an information input device according to another embodiment of the first embodiment.
- FIG. 15 is a diagram for explaining a specific behavior of the region changing unit.
- the information input device described in Patent Literature 1 includes an operation unit in which a touch sensor is arranged, a storage unit that stores the positions of two points, a start point and an end point, input to the operation unit, an input direction determination unit that determines an input direction from the positions of the two points, and a control unit that changes the arrangement of operation regions arranged in the operation unit based on the determined input direction.
- in such devices, it is common to identify the type of touch gesture movement based on series information of two or more contact positions detected by the touch sensor, and to select the type of information device control according to the identified type.
- the "series information" referred to here is a collection of a plurality of detection points identified as one touch gesture. For this reason, when some of the stored contact position series information is stored at positions different from the user's intention, the touch gesture identification result differs from the touch gesture the user intended, which is a problem.
- FIG. 2 is a diagram illustrating how a touch input by the user is performed on the touch sensor 10.
- the detection points 100 to 140 are points indicating positions where contact by the user was detected; the reference numbers are assigned in ascending order of detection time.
- the touch input shown in FIG. 2 is an example in which the user moves a right-hand finger from left to right through the detection points 100 to 130 and releases the right-hand finger from the touch sensor, then unintentionally touches the detection point 140 with the left hand, after which the release of the left hand is detected.
- from the detection points 100 to 130, it can be determined that the user moved the finger of the right hand from left to right.
- however, the contact start point of the detected touch input is recognized as the detection point 100, and the end point is recognized as the detection point 140.
- therefore, when the detection points 100 to 140 are detected as shown in FIG. 2 and a technique such as that of Patent Document 1, which detects the start point and end point of a touch input and identifies the touch input by the moving direction between those two points, is applied, the input is identified as a downward movement from the upper area of the touch sensor 10.
- in other words, when even one point among the series information constituting the touch input is detected at a position different from the user's intention, there is a problem that the input is identified as information input different from the user's intention.
- in Patent Document 2, for an information input method using a touch sensor, a method for selectively rejecting touch events in an end region of the touch sensor panel is described in order to minimize unintended operations. In other words, a method is described for selectively rejecting touches performed in the end region of the touch sensor panel, where inputs different from the user's intention easily occur, so that contact at a position different from the user's intention is not erroneously detected. Some exceptions to the rejection of touches at the edge of the touch panel have also been proposed: contact that starts in the main area of the touch sensor and is recognized as part of a specific gesture is not rejected even if the finger moves to the edge.
- however, the method of Patent Document 2 has the problems described below, depending in particular on the shape and properties of the touch sensor.
- FIG. 3 is a diagram illustrating an example of the appearance of the information input device when the information input device is a remote controller.
- the information input device 200 is a remote controller for home appliances, and includes a touch sensor 210 together with buttons for input.
- on such a device, when the user's finger contacts the peripheral portion of the sensor, the finger may move in a direction along the edge of the touch sensor.
- FIG. 4 is a diagram illustrating the surface of the touch sensor 10.
- the touch sensor 10 has a protrusion 380 around its periphery.
- a boundary line 390 indicated by a broken line separates the main area A1 and the end area A2 of the touch sensor 10.
- the left and upper portions beyond the boundary line 390 form the end region A2, and the remaining portion forms the main region A1.
- the detection points 300 to 360 are an example of the detection points detected by the touch sensor 10 along the locus of the actual finger contact positions when the user performs an upward flick operation (see the white arrow in FIG. 4) from the lower area of the touch sensor 10. From the detection point 300 to the detection point 330, the user moves the finger from bottom to top. At the detection point 330, the user's finger 370 hits the protrusion 380 around the touch sensor 10, and from the detection point 330 to the detection point 360 the finger moves along the protrusion 380.
- if a method that rejects all contact information in the end region A2, as in Patent Document 2, is applied, the touch input of FIG. 4 is not detected, because the start point at which the user touches the touch sensor 10 with the fingertip and the points immediately after it (the detection points 300 to 330) lie at the left end, and the end point and the points immediately before it (the detection points 330 to 360) lie at the upper end.
- on the other hand, if the exception of Patent Document 2 is applied and the detection points 300 to 360 are identified as part of a gesture, all points are treated as valid as an exception to contact rejection.
- in that case, the direction is detected from the detection point 300 as the start point and the detection point 360 as the end point, and the input is identified as a gesture that moved from the lower left region toward the upper right. That is, although the user intended to input an upward flick operation, the information input device identifies it as a flick operation from the lower left toward the upper right.
- as a result, the input is identified as information input different from the user's intention. If the user then tries to perform the intended operation, an operation to correct the erroneously identified input is required, which impairs usability.
- FIG. 5 is a diagram for explaining a problem that occurs when the contact time is short on the touch sensor 10 having, as in FIG. 4, a protrusion 460 around it.
- the detection points 400 to 440 are points indicating positions where contact by the user was detected; the reference numbers are assigned in ascending order of detection time.
- a boundary line 470 indicated by a broken line separates the end region A12 from the main region A11 of the touch sensor 10; the portion of the touch sensor 10 above the boundary line 470 is the end region of the touch sensor 10 and is therefore a candidate region for contact rejection.
- the portion of the touch sensor 10 below the boundary line 470 is the main area A11 of the touch sensor.
- the user moves the finger from bottom to top, from the detection point 400, which is the start point, to the detection point 420. At this time, it is assumed that the user intends an upward flick operation. However, at the detection point 420 the finger hits the protrusion 460 around the touch sensor 10, and it is detected that the user's finger then moved rightward along the protrusion 460.
- as a result, the movement direction from the detection point 400, the start point, to the detection point 440, the end point, is obliquely upward from the lower left, and the input is identified as a flick operation in that direction.
- since the flick operation intended by the user is an upward flick operation, in this case as well the identification result differs from the user's intention.
- as a result, the input is identified as information input different from the user's intention. If the user then tries to perform the intended operation, an operation to correct the erroneously identified input is required, which impairs usability.
- in order to solve such problems, an information input device according to one aspect of the present invention is an information input device that identifies a touch input from a user as a touch gesture, and includes a touch sensor and an input processing unit that identifies a touch gesture using a plurality of specific contact positions. The specific contact positions are obtained from a plurality of contact positions, at which the touch input to the touch sensor is performed, sequentially detected at a plurality of different timings during a predetermined time, by excluding, among the contact positions detected after a predetermined number of contact positions have been detected since the start of the touch input, those belonging to a predetermined region.
- in other words, among the contact positions detected after the predetermined number has been reached, those belonging to a predetermined area are excluded, and the remaining plurality of positions are used as the specific positions for identifying the touch gesture.
- the input from the start of a touch input up to a certain time is usually what the user intends, whereas the input after that may include unintended input, and many unintended inputs occur in the predetermined area. Because the touch gesture is identified using at least the predetermined number of contact positions detected since the start of the touch input, and because, among the contact positions detected after the predetermined number has been reached, those included in a predetermined area with many erroneous detections are excluded, the adoption or rejection of the touch sensor's detection information can be determined more appropriately. Touch gesture detection can therefore better match the gesture the user intended, and the performance of touch gesture input can be improved.
- for example, the input processing unit may include: a detection unit that detects the plurality of contact positions; a first storage unit that stores, as position information, information associating the plurality of contact positions detected by the detection unit during the predetermined time with the timings at which the plurality of contact positions were detected; a second storage unit that stores the predetermined region among the regions on the touch sensor; an exclusion unit that performs an exclusion process of excluding a contact position that is detected by the detection unit after the number of contact positions indicated by the position information stored in the first storage unit exceeds the predetermined number and that belongs to the predetermined region stored in advance in the second storage unit; and a specifying unit that specifies the touch gesture using two or more pieces of position information among the plurality of pieces of position information stored in the first storage unit after the exclusion process.
- for example, the input processing unit may further include a region changing unit that changes the predetermined region stored in the second storage unit when at least one of the plurality of contact positions detected by the detection unit belongs to a specific region, and the exclusion unit may perform the exclusion process, using the predetermined region changed by the region changing unit, on the contact positions detected by the detection unit after a contact position belongs to the specific region.
- for example, the touch sensor may have a protrusion formed on its surface, and the region changing unit may change the predetermined region stored in the second storage unit when at least one of the plurality of contact positions detected by the detection unit belongs to the specific region.
- for example, the input processing unit may further include a region changing unit that changes the predetermined region stored in the second storage unit based on the position information stored after an arbitrary first piece of position information among the plurality of pieces of position information stored in the first storage unit, and the exclusion unit may perform the exclusion process, using the predetermined region changed by the region changing unit, on the contact positions detected by the detection unit after a contact position belongs to a specific region.
- for example, the second storage unit may store, as the predetermined region, an end region predetermined as the end of the touch sensor.
- another aspect is an information input device that identifies a touch input from a user as a touch gesture, including a touch sensor and an input processing unit that identifies a touch gesture using a plurality of specific contact positions, which are obtained from a plurality of contact positions, at which the touch input to the touch sensor is performed, sequentially detected and stored at a plurality of different timings during a predetermined time, by excluding contact positions detected after a predetermined number of the contact positions have been detected since the start of the touch input.
- for example, the specifying unit may specify, as the touch gesture, a linear motion having parameter information indicating the direction of the touch input, calculated by approximating the two or more specific positions with a line segment, and the speed of the touch input, calculated from the position information indicating the two or more specific positions.
- for example, the specifying unit may specify, as the touch gesture, a rotational motion having parameter information indicating the direction of the touch input, calculated by approximating three or more specific positions among the plurality of specific positions stored in the first storage unit with an arc, and the speed of the touch input, calculated from the position information indicating the three or more specific positions.
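Approximating three specific positions with an arc can be sketched with standard circumcircle geometry. This is a hedged illustration: the function names, and the use of a cross product to obtain the rotation direction, are assumptions rather than the patent's stated method.

```python
def circumcenter(p1, p2, p3):
    """Center of the circle through three non-collinear points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    return ux, uy

def rotation_sense(p1, p2, p3):
    """z-component of the cross product of the two successive moves:
    positive = counter-clockwise, negative = clockwise (y-axis upward)."""
    return ((p2[0] - p1[0]) * (p3[1] - p2[1])
            - (p2[1] - p1[1]) * (p3[0] - p2[0]))
```

The center gives the arc's radius, and the sign of `rotation_sense` gives the direction of the rotational motion; speed would follow from the timestamps associated with the three positions.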
- these aspects may also be implemented as a method, an integrated circuit, a computer program, or a computer-readable recording medium such as a CD-ROM, or as any combination of the method, the integrated circuit, the computer program, and the recording medium.
- FIG. 1 is a block diagram of an information input device according to an embodiment of the present invention.
- the information input device 1 includes a touch sensor 10 and an input processing unit 20.
- the touch sensor 10 is a device that detects a position where a user's finger contacts the surface of the touch sensor 10, and outputs an electrical signal corresponding to the touched position.
- the touch sensor 10 can be realized by a touch pad, a touch panel, or the like.
- the touch sensor 10 may detect the user's finger using any of a capacitance method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and the like. Here, the description uses a capacitive touch sensor.
- the input processing unit 20 includes a detection unit 21, a first storage unit 22, a second storage unit 23, an exclusion unit 24, and a specifying unit 25.
- the detection unit 21 uses the electrical signal obtained from the capacitive touch sensor 10 to detect the position at which the user's finger contacts the touch sensor 10, at a predetermined sampling period during a predetermined time.
- for example, the detection unit 21 repeatedly detects the touched position (contact position) on the touch sensor 10 at a sampling period of 60 ms.
- note that the detection unit 21 is not limited to detecting the touched position at a constant period, and may detect it at varying periods. For example, when the movement of the touch input by the user is faster than a predetermined threshold, the sampling period may be shortened; conversely, when the movement is slower than the predetermined threshold, the sampling period may be lengthened. The sampling period may also be shortened continuously as the movement of the touch input becomes faster. By dynamically changing the sampling period according to the moving speed of the touch input in this way, the intervals between the plurality of detected contact positions can be kept closer to a constant value.
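A minimal sketch of this dynamic sampling-period rule: choose the next period so the finger travels a roughly constant distance between samples. All constants (target spacing, clamping range) are illustrative assumptions, not values from the text.

```python
def next_sampling_period_ms(speed_px_per_ms, target_spacing_px=3.0,
                            min_period_ms=15.0, max_period_ms=120.0):
    """Pick the next sampling period so that the finger travels roughly
    target_spacing_px between samples (period = spacing / speed),
    clamped to a sane range."""
    if speed_px_per_ms <= 0.0:
        return max_period_ms  # finger at rest: sample slowly
    period = target_spacing_px / speed_px_per_ms
    return max(min_period_ms, min(max_period_ms, period))
```

A fast flick thus gets densely spaced samples while a slow drag is sampled sparsely, keeping the spatial spacing of contact positions near constant.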
- the detection unit 21 may take any form as long as it can detect the positions at which touch input to the touch sensor is performed at a plurality of different timings during a predetermined time.
- the first storage unit 22 stores, as position information, information associating a plurality of contact positions detected by the detection unit 21 during a predetermined time with timings at which the plurality of contact positions are detected.
- the second storage unit 23 stores a predetermined area of the touch sensor 10 as an exclusion area.
- this exclusion area indicates candidate positions that are not to be stored in the first storage unit 22.
- for example, a portion where the accuracy of contact detection may drop is set as the exclusion region.
- the exclusion area may be an area that is predetermined at the time of factory shipment.
- alternatively, the user may later set, as an exclusion area, a part of the area where it is difficult for the user to perform a touch operation.
- FIG. 6 is a diagram for explaining a first specific example of a predetermined area set as an exclusion area.
- a region outside the boundary line 500 indicated by a rectangular broken line on the touch sensor 10 is set as an exclusion region A22.
- the region inside the rectangular broken line on the touch sensor 10 is the main area.
- since the protrusion around the touch sensor 10 is raised, when the user performs a flick operation, for example, the direction of the flick operation is likely to change if part of the finger hits the peripheral protrusion.
- for this reason, setting the region outside the rectangular broken line (the end region) of the touch sensor 10 as the exclusion region is effective in reducing false detections.
- a second specific example of the exclusion region, when the touch sensor is of the capacitance type as in the first embodiment, is an area where the conductive film arranged to detect the capacitance of the finger is sparsely distributed.
- the exclusion unit 24 performs an exclusion process of excluding a contact position when the position detected by the detection unit 21, after the number of contact positions indicated by the position information stored in the first storage unit 22 has exceeded the predetermined number, belongs to the predetermined area stored in the second storage unit 23. Specifically, the exclusion unit 24 determines whether the position (contact position) detected by the detection unit 21 is included in the exclusion region stored in the second storage unit 23 (area determination). When the contact position is included in the exclusion region, the exclusion unit 24 further determines whether the number of contact positions indicated by the position information stored in the first storage unit 22 exceeds the predetermined number (number determination).
- the exclusion unit 24 performs the exclusion process, excluding the contact position, when the conditions of both the area determination and the number determination are satisfied.
- for example, the exclusion unit 24 may, as the exclusion process, prevent the position information associated with a contact position that satisfies both the area determination and the number determination from being stored in the first storage unit 22 as a detection result of the touch input; the form of the exclusion process is not limited.
- the predetermined number may be changed according to the complexity of the menu of the device to be operated by the information input device 1.
- for example, the complexity is small for an operation that changes the TV program guide to the next day. That is, when the TV program guide is displayed and the complexity of the menu is determined to be small, it is easy to return to the original screen even if an input is erroneously detected.
- the predetermined number is set to a small number (for example, three).
- on the other hand, an operation for editing a recorded program has high complexity.
- in this case, the predetermined number is set to a large number (for example, ten). In this way, the predetermined number may be set to increase as the complexity of the menu for operating the device increases.
- a first example of the timing at which the exclusion unit 24 performs the exclusion process is the timing at which the user lifts the finger from the touch sensor 10; the exclusion unit 24 then determines, in order from the latest piece of position information stored in the first storage unit 22, whether or not to exclude it. That is, the exclusion process is performed when one touch input by the user is completed: both the area determination and the number determination are applied sequentially from the latest of the contact positions detected in that touch input, and deleting the position information associated with contact positions satisfying both conditions from the first storage unit 22 is performed as the exclusion process.
- a second example of the timing is the point at which a contact position is detected by the detection unit 21; it is then determined whether the detected contact position is to be stored in the first storage unit 22 as position information. That is, the exclusion unit 24 performs, as the exclusion process, a process of not storing in the first storage unit 22 a contact position detected by the detection unit 21 that satisfies both the area determination and the number determination conditions.
- a third example of the timing is when the number of pieces of position information stored in the first storage unit 22 reaches or exceeds a predetermined count; the exclusion unit 24 then applies the area determination and the number determination to the stored position information, in order from the latest, to decide whether to exclude it, and, as the exclusion process, deletes from the first storage unit 22 the position information associated with any contact position that satisfies both conditions.
- a fourth example of the timing is when the predetermined number of pieces of position information has been accumulated in the first storage unit 22; thereafter, the exclusion unit 24 applies the area determination to each contact position detected by the detection unit 21 and, as the exclusion process, does not store in the first storage unit 22 any contact position that satisfies the area determination condition.
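The second timing example above, where exclusion is decided at the moment of detection, can be sketched as a small stateful unit. The class and method names are illustrative assumptions; the two checks mirror the number determination and the area determination.

```python
class ExclusionUnit:
    """Applies the number determination and the area determination at
    detection time (the second timing example)."""
    def __init__(self, exclusion_region, predetermined_number):
        self.in_region = exclusion_region   # role of the second storage unit
        self.n = predetermined_number
        self.stored = []                    # role of the first storage unit

    def on_detect(self, t, x, y):
        # Number determination: the first n positions are always stored.
        # Area determination: later positions inside the exclusion
        # region are rejected instead of being stored.
        if len(self.stored) >= self.n and self.in_region(x, y):
            return False
        self.stored.append((t, x, y))
        return True
```

Because the decision is made per sample, no deletion pass over the stored series is needed when the finger lifts.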
- in other words, once the position information stored in the first storage unit 22 for one touch input exceeds the predetermined number, the exclusion unit 24 may perform the exclusion process on the positions subsequently detected by the detection unit 21.
- note that the condition for the exclusion process by the exclusion unit 24 is that the position information associated with the contact position detected by the detection unit 21 is included in the exclusion region, which may be an arbitrary region; the area determination on the end region need not be the only condition for the exclusion process.
- an example of a method for detecting a sudden change in the direction of finger movement is as follows: consider a first vector connecting the point of an arbitrary first piece of position information stored in the first storage unit 22 to the point of a second piece of position information stored after the first, and a second vector connecting the point of the second piece of position information to the point of a third piece of position information stored after the second. When the absolute value of the angle between the first vector and the second vector is equal to or greater than a predetermined angle, a sudden change in the direction of finger movement is detected.
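The vector-angle test above can be sketched as follows; the 60-degree limit is an illustrative assumption, not a value given in the text.

```python
import math

def sudden_direction_change(p1, p2, p3, limit_deg=60.0):
    """True when the angle between vector p1->p2 and vector p2->p3 has an
    absolute value of at least limit_deg (illustrative threshold)."""
    a1 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    a2 = math.atan2(p3[1] - p2[1], p3[0] - p2[0])
    diff = math.degrees(a2 - a1)
    diff = (diff + 180.0) % 360.0 - 180.0  # wrap into (-180, 180]
    return abs(diff) >= limit_deg
```

A finger sliding straight produces near-zero angles, while the finger being deflected along a peripheral protrusion produces a single large angle at the deflection point.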
- the specifying unit 25 uses the position information of at least two points among the position information stored in the first storage unit 22 to identify which of the predetermined touch gesture types the touch input corresponds to.
- FIG. 7 is a diagram illustrating detection points 800 to 830, at which the detection unit 21 detected the trajectory of the finger movement on the touch sensor 10.
- the detection points 800 to 830 shown in FIG. 7 indicate, in time order, the contact positions associated with the position information stored in the first storage unit 22. That is, the detection points 800 to 830 are contact positions touched by the user and detected by the detection unit 21.
- the specifying unit 25 identifies that the touch input by the user is a flick operation.
- a first example of the operation of the specifying unit 25 when specifying a flick-operation touch gesture uses two points among the position information stored in the first storage unit 22: the start point, which is the oldest detection point 800 indicated by the position information, and the end point, which is the latest detection point 830.
- the specifying unit 25 divides the movement vector from the detection point 800 to the detection point 830, along the line segment connecting the two points, by the time difference between the time at which the detection point 800 was stored and the time at which the detection point 830 was stored, thereby calculating a velocity vector.
- the specifying unit 25 identifies the touch gesture as a flick operation when the direction and the norm of the calculated velocity vector fall within predetermined ranges.
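A minimal sketch of this first method, assuming pixel coordinates and times in seconds; the speed threshold, the 30° direction tolerance, and the four cardinal directions are illustrative choices, not values fixed by the description:

```python
import math

def classify_flick(start, end, t_start, t_end,
                   min_speed=0.5, direction_tolerance_deg=30.0):
    """Classify a flick from the oldest (start) and newest (end) detection
    points: divide the movement vector by the elapsed time to obtain a
    velocity vector, then accept the gesture when the speed (norm) and
    direction fall within the expected ranges."""
    dt = t_end - t_start
    if dt <= 0:
        return None
    vx = (end[0] - start[0]) / dt
    vy = (end[1] - start[1]) / dt
    speed = math.hypot(vx, vy)
    if speed < min_speed:
        return None  # too slow to count as a flick
    # mathematical convention: 0 deg = right, 90 deg = up (screen APIs
    # often grow y downward, in which case the sign of vy flips)
    angle = math.degrees(math.atan2(vy, vx))
    for name, target in (("right", 0), ("up", 90), ("left", 180), ("down", -90)):
        diff = (angle - target + 180) % 360 - 180  # shortest angular distance
        if abs(diff) <= direction_tolerance_deg:
            return name
    return None
```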
- a second example of the operation of the specifying unit 25 when specifying a flick-operation touch gesture is to specify the touch gesture by deriving an approximate line or an approximate curve using all of the position information.
- the specifying unit 25 applies the least-squares method to the position information (that is, the time information t and the coordinates (x, y)) of each of the detection points 800 to 830 stored in the first storage unit 22, and derives a relational expression x(t) between x and t and a relational expression y(t) between y and t.
- the graph 900 in FIG. 8 is obtained by mapping the detection points 800 to 830 onto two-dimensional coordinates with time t on the horizontal axis and the coordinate y on the vertical axis.
- the curve 910 can be derived by approximating the coordinate y with a quadratic expression in time t. Similarly, a curve (not shown) in which the coordinate x is approximated by a quadratic expression in time t can be derived.
- the x and y components of the velocity are obtained by differentiating the relational expression x(t) and the relational expression y(t) with respect to time t; that is, the velocity vector at each time can be obtained.
- the specifying unit 25 identifies the touch gesture as a flick operation when the direction and the norm of the velocity vector at a predetermined time fall within predetermined ranges.
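The quadratic least-squares fit and its analytic derivative can be sketched without external libraries as follows; the solver and the function names are illustrative assumptions, and the description fixes nothing beyond the quadratic approximation shown in FIG. 8:

```python
def fit_quadratic(ts, vs):
    """Least-squares fit v(t) = a*t^2 + b*t + c, solving the 3x3 normal
    equations by Gaussian elimination with partial pivoting."""
    s = [sum(t**k for t in ts) for k in range(5)]            # sums of t^0..t^4
    r = [sum(v * t**k for t, v in zip(ts, vs)) for k in range(3)]
    # augmented normal equations, rows ordered for coefficients (a, b, c)
    m = [[s[4], s[3], s[2], r[2]],
         [s[3], s[2], s[1], r[1]],
         [s[2], s[1], s[0], r[0]]]
    for i in range(3):                                       # forward elimination
        p = max(range(i, 3), key=lambda k: abs(m[k][i]))
        m[i], m[p] = m[p], m[i]
        for k in range(i + 1, 3):
            f = m[k][i] / m[i][i]
            for j in range(i, 4):
                m[k][j] -= f * m[i][j]
    c = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                                      # back substitution
        c[i] = (m[i][3] - sum(m[i][j] * c[j] for j in range(i + 1, 3))) / m[i][i]
    return c  # [a, b, c]

def velocity(coeffs, t):
    """Derivative of a*t^2 + b*t + c at time t: v(t) = 2*a*t + b."""
    a, b, _ = coeffs
    return 2 * a * t + b
```

Fitting x(t) and y(t) separately with `fit_quadratic` and differentiating each gives the x and y velocity components at any time, as the description outlines.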
- FIG. 9 is a diagram illustrating detection points 1000 to 1040, at which the detection unit 21 detected the locus of the finger movement on the touch sensor 10.
- the detection points 1000 to 1040 shown in FIG. 9 indicate, in time order, the contact positions associated with the position information stored in the first storage unit 22. That is, the detection points 1000 to 1040 are contact positions touched by the user and detected by the detection unit 21.
- the specifying unit 25 identifies that the touch input by the user is a rotation operation.
- the specifying unit 25 obtains the angle formed by a first vector, connecting the point of first position information stored in the first storage unit 22 to the point of second position information stored after the first position information, and a second vector, connecting the point of the second position information to the point of third position information stored after the second position information.
- FIG. 10 will be described as an example.
- Detection points 1000 to 1040 in FIG. 10 are the same as detection points 1000 to 1040 in FIG.
- An angle 1120 formed by a vector 1100 from the detection point 1000 to the detection point 1010 and a vector 1110 from the detection point 1010 to the detection point 1020 is obtained.
- FIG. 11 will be described as an example. Detection points 1000 to 1040 in FIG. 11 are the same as detection points 1000 to 1040 in FIGS. 9 and 10. The angles formed by the vectors connecting the successive points are obtained as angle 1120, angle 1210, angle 1220, and angle 1230.
- the specifying unit 25 can regard the input as a rotational motion when the angles between the successive vectors connecting the position information points, obtained by the above method, all turn in the same direction and each falls within a certain range, and the sum of the angles reaches a predetermined value.
- the above is a specific example of the operation of the specifying unit 25 when specifying a rotation-operation touch gesture.
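A sketch of this rotation test follows; the same-sign and angle-sum conditions mirror the description, while the per-step and total angle thresholds and the function name are illustrative assumptions:

```python
import math

def is_rotation(points, max_step_deg=90.0, total_deg=270.0):
    """Treat the trajectory as a rotation when every turning angle between
    consecutive movement vectors has the same sign (consistent turning
    direction), stays within a per-step range, and the angles sum to at
    least the required total."""
    angles = []
    for p1, p2, p3 in zip(points, points[1:], points[2:]):
        v1 = (p2[0] - p1[0], p2[1] - p1[1])
        v2 = (p3[0] - p2[0], p3[1] - p2[1])
        # signed turning angle from v1 to v2 via the cross and dot products
        cross = v1[0] * v2[1] - v1[1] * v2[0]
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        angles.append(math.degrees(math.atan2(cross, dot)))
    if not angles:
        return False
    same_sign = all(a > 0 for a in angles) or all(a < 0 for a in angles)
    in_range = all(abs(a) <= max_step_deg for a in angles)
    return same_sign and in_range and abs(sum(angles)) >= total_deg
```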
- FIG. 12 is a flowchart showing a flow of information input processing performed by the input processing unit 20 of the information input device.
- here, the first example of the timing described above is used as the timing at which the exclusion unit 24 performs the exclusion process.
- the detection unit 21 detects contact positions at a predetermined sampling cycle for a predetermined time (S101).
- the first storage unit 22 stores, as position information, all of the plurality of contact positions detected by the detection unit 21 during the predetermined time, associated with their detection timings.
- the exclusion unit 24 reads the latest position information among the position information stored in the first storage unit 22 (S103).
- the exclusion unit 24 determines whether the contact position associated with the read position information was detected after more than a predetermined number of contact positions had already been detected (S104). That is, the exclusion unit 24 performs the number determination on the read position information.
- when the number determination is satisfied, the exclusion unit 24 determines whether the contact position associated with the position information belongs to the exclusion region (S105). That is, the exclusion unit 24 performs the region determination on the contact position.
- when the region determination finds that the contact position associated with the read position information belongs to the exclusion region (S105: Yes), the exclusion unit 24 deletes that position information from the first storage unit 22 (S106). As a result, position information satisfying both the number determination and the region determination is deleted from the first storage unit 22.
- next, the position information detected immediately before the previously read position information is read (S107), and the process returns to step S104.
- when the number determination or the region determination is not satisfied, the specifying unit 25 specifies the touch gesture using two or more pieces of position information stored in the first storage unit 22 (S108).
- in this way, when a contact position detected by the detection unit 21 after the number of contact positions indicated by the position information stored in the first storage unit 22 has exceeded a predetermined number belongs to a predetermined region, the exclusion unit 24 performs an exclusion process that excludes that contact position from the position information stored in the first storage unit 22. The specifying unit 25 then specifies the touch gesture using two or more of the specific positions associated with the position information remaining in the first storage unit 22 after the exclusion process.
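One plausible reading of the S101 to S108 flow walks backwards from the newest stored sample; the data layout, the predicate-based region test, and the choice to stop at the first sample that fails either determination are assumptions made for illustration:

```python
def exclude_positions(samples, exclusion_region, max_count):
    """Walk the stored samples from the newest backwards, deleting every
    sample that (a) was detected after more than `max_count` contact
    positions had accumulated (number determination) and (b) lies inside
    the exclusion region (region determination); stop at the first sample
    failing either check.

    `samples` is a list of (t, x, y) ordered oldest-first;
    `exclusion_region` is a predicate on (x, y)."""
    kept = list(samples)
    i = len(kept) - 1
    while i >= 0:
        _, x, y = kept[i]
        if i + 1 <= max_count:          # number determination failed: stop
            break
        if not exclusion_region(x, y):  # region determination failed: stop
            break
        del kept[i]                     # both conditions met: exclude
        i -= 1
    return kept
```

With a hypothetical exclusion region of `y > 100` and `max_count=3`, the trailing samples that drifted into the region are dropped and the gesture is specified from the remaining points.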
- suppose the user moves his or her finger from the detection point 300 to the detection point 330 to perform a bottom-to-top flick operation, and the finger then continues to touch the touch sensor beyond the detection point 330.
- assume that the exclusion area stored in the second storage unit 23 is the area to the left of and above the boundary line 390, and that, once four or more pieces of position information are stored in the first storage unit 22, the exclusion unit 24 does not store subsequently detected position information in the first storage unit 22.
- the specifying unit 25 specifies the upward flick operation from the movement direction from the detection point 300, the start point, to the detection point 330, the end point.
- the user moves the finger performing the touch input from the detection point 400 to the detection point 420 to perform an upward flick operation, and then, at the detection point 420, the finger reaches the protrusion formed around the touch sensor.
- it is assumed that the exclusion unit 24 does not store in the first storage unit 22 the position information detected from the fourth contact position onward. In other words, it is assumed that the area above the boundary line 470 is set as the predetermined area and the predetermined number is set to 3.
- the position information used by the specifying unit 25 to specify the touch gesture therefore consists of the three points, detection points 400 to 420.
- the specifying unit 25 specifies the upward flick operation from the movement direction from the detection point 400, the start point, to the detection point 420, the end point.
- thus, a flick operation and its direction that would otherwise not be detected as the user intended can be detected correctly by the information input apparatus 1 according to the first embodiment.
- in the above description, the exclusion area is set around the bulge at the periphery of the touch sensor; however, the exclusion area is not limited to such a bulge.
- for example, when the touch sensor 11 has a protrusion formed on its surface, the exclusion region may be the region around the protrusion.
- protrusions may be provided on the surface of the touch sensor as positional guides so that the user can identify the position of the touching finger by tactile sense without looking at the touch sensor.
- detection points 610 to 650 are the points detected by the touch sensor 11 along the trajectory of the finger's contact position when the user intends an upward flick operation.
- detection points 640 and 650 represent the movement after the finger 660 collides with the protrusion 600 near the detection point 630 and, because of the collision, slides off to the right.
- if the flick operation were specified based on the movement direction from the start point, detection point 610, to the end point, detection point 650, among the detection points 610 to 650 detected by the detection unit 21, it would be identified as a flick skewed toward the right of the upward direction the user intended.
- the detection point 640 and the detection point 650 are candidates for position information that the exclusion unit 24 does not store in the first storage unit 22.
- suppose the exclusion unit 24 is set to exclude position information included in the exclusion region once the first storage unit 22 stores three or more pieces of position information.
- with the settings above, the specifying unit 25 specifies the touch gesture based on the series of position information of the detection points 610 to 630, the effective points that the exclusion unit 24 did not exclude from the first storage unit 22. In this case, the specifying unit 25 specifies an upward flick operation. In this way, to remove the influence of finger movements that differ from the user's intention from the material used to identify the touch gesture, it is useful to set as the exclusion region the periphery of the protrusion 600 and the region of the expected finger trajectory after the finger contacts the protrusion.
- in the above description, the exclusion area is fixed to a predetermined area; however, the exclusion area is not limited to this and may be changed dynamically.
- for example, when contact with the protrusion 600 is detected, the exclusion area can be set around the protrusion 600 for a predetermined time thereafter.
- the input processing unit 20a further includes a region changing unit 26. That is, an information input device 1a having such an input processing unit 20a may be used.
- the area changing unit 26 sets the exclusion area around the protrusion for a predetermined time thereafter.
- the region changing unit 26 changes the predetermined region stored in the second storage unit 23 when at least one of the plurality of contact positions detected by the detection unit 21 belongs to a specific region. Note that a sudden change in the direction of finger movement is considered unlikely to occur unless the finger contacts the protrusion 600, so contact with a protrusion can be inferred when a sudden change in the direction of finger movement is detected.
- an example of a method for detecting a sudden change in the direction of finger movement is, as described above, to obtain the angle between the first vector, connecting a first position information point to a second position information point stored after it, and the second vector, connecting the second position information point to a third position information point stored after it, and to detect a sudden change when the absolute value of the angle is equal to or greater than a predetermined angle.
- the exclusion unit 24 then performs the exclusion process using the region changed by the region changing unit 26.
- in that case, the region changing unit 26 regards the finger as having touched the protrusion, and thereafter sets the inside of the boundary line 670 as the exclusion region.
- before that, since no exclusion region is set, the exclusion unit 24 does not exclude the position information stored in the first storage unit 22. This reduces the chance that touch input intended by the user is discarded because of a permanently fixed exclusion area, so the touch gesture can be recognized from more position information.
- the input processing unit 20 includes the detection unit 21, the first storage unit 22, the second storage unit 23, the exclusion unit 24, and the specifying unit 25.
- however, as long as the input processing unit specifies a touch gesture using a plurality of specific contact positions, selected from a plurality of contact positions sequentially detected at different timings during a predetermined time at which touch input was performed on the touch sensor, by excluding contact positions that were detected after a predetermined number of contact positions had been detected since the start of the touch input and that belong to a predetermined area, the input processing unit 20 is not limited to the above configuration.
- each of the above devices can be realized by a computer system including a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like.
- a computer program is stored in the RAM or the hard disk unit.
- Each device achieves its functions by the microprocessor operating according to the computer program.
- the computer program is configured by combining a plurality of instruction codes indicating instructions for the computer in order to achieve a predetermined function.
- the system LSI is an ultra-multifunctional LSI manufactured by integrating a plurality of components on a single chip; specifically, it is a computer system including a microprocessor, a ROM, a RAM, and the like.
- a computer program is stored in the ROM.
- the system LSI achieves its functions by the microprocessor loading the computer program from the ROM into the RAM and performing operations in accordance with the loaded computer program.
- each of the above devices may be configured from an IC card that can be attached to and detached from each device or a single module.
- the IC card or module is a computer system that includes a microprocessor, ROM, RAM, and the like.
- the IC card or the module may include the super multifunctional LSI described above.
- the IC card or the module achieves its functions by the microprocessor operating according to the computer program. This IC card or this module may have tamper resistance.
- the present invention may be realized by the methods described above. Further, these methods may be realized by a computer program executed by a computer, or by a digital signal consisting of such a computer program.
- the present invention may also be realized by recording the computer program or the digital signal on a computer-readable recording medium such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, or a BD (Blu-ray Disc).
- a computer program or a digital signal may be transmitted via an electric communication line, a wireless or wired communication line, a network represented by the Internet, a data broadcast, or the like.
- the present invention may also be realized as a computer system including a microprocessor and a memory, in which the memory stores the computer program and the microprocessor operates according to the computer program.
- the program or the digital signal may be recorded on a recording medium and transferred, or transferred via a network or the like, so as to be implemented by another independent computer system.
- the information input device according to the present invention performs touch gesture recognition that eliminates finger movements differing from the user's intention in touch gesture operation using a touch sensor, thereby enabling more reliable operation, and is useful as an information input device and an information input method for electronic equipment.
Abstract
Description
The inventor found that the stylus detection device described in the "Background Art" section has the following problems.
FIG. 1 is a block diagram of an information input device according to an embodiment of the present invention.
10, 11, 210 Touch sensor
20, 20a Input processing unit
21 Detection unit
22 First storage unit
23 Second storage unit
24 Exclusion unit
25 Specifying unit
26 Region changing unit
Claims (12)
- An information input device that specifies touch input from a user as a touch gesture, the information input device comprising:
a touch sensor; and
an input processing unit that specifies a touch gesture using a plurality of specific contact positions, the specific contact positions being contact positions at which the touch input was performed on the touch sensor, selected from a plurality of contact positions sequentially detected at a plurality of different timings during a predetermined time, and excluding contact positions that are detected after a predetermined number of the contact positions have been detected since the start of the touch input and that belong to a predetermined region.
- The information input device according to claim 1, wherein the input processing unit includes:
a detection unit that detects the plurality of contact positions;
a first storage unit that stores, as position information, information associating the plurality of contact positions detected by the detection unit during the predetermined time with the timings at which each of the contact positions was detected;
a second storage unit that stores the predetermined region among the regions on the touch sensor;
an exclusion unit that, if a contact position detected by the detection unit after the number of contact positions indicated by the position information stored in the first storage unit has exceeded the predetermined number belongs to the predetermined region stored in the second storage unit, performs an exclusion process of excluding that contact position; and
a specifying unit that specifies the touch gesture using two or more pieces of position information among the plurality of pieces of position information stored in the first storage unit after the exclusion process.
- The information input device according to claim 1 or 2, wherein the input processing unit further includes a region changing unit that changes the predetermined region stored in the second storage unit when at least one of the plurality of contact positions detected by the detection unit belongs to a specific region, and
the exclusion unit performs the exclusion process, using the predetermined region changed by the region changing unit, on contact positions detected by the detection unit after a contact position has belonged to the specific region.
- The information input device according to claim 3, wherein the touch sensor has a protrusion formed on its surface, and
the region changing unit changes the predetermined region stored in the second storage unit when at least one of the plurality of contact positions detected by the detection unit belongs to the specific region, which is the region where the protrusion is formed.
- The information input device according to claim 1 or 2, wherein the input processing unit further includes a region changing unit that changes the predetermined region stored in the second storage unit when the absolute value of the angle formed by a first vector, connecting the point of any first position information among the plurality of pieces of position information stored in the first storage unit to the point of second position information stored after the first position information, and a second vector, connecting the point of the second position information to the point of third position information stored after the second position information, is equal to or greater than a predetermined angle, and
the exclusion unit performs the exclusion process, using the predetermined region changed by the region changing unit, on contact positions detected by the detection unit after a contact position has belonged to the specific region.
- The information input device according to any one of claims 1 to 5, wherein the second storage unit stores, as the predetermined region, an edge region predetermined as an edge of the touch sensor.
- An information input device that specifies touch input from a user as a touch gesture, the information input device comprising:
a touch sensor; and
an input processing unit that specifies a touch gesture using a plurality of specific contact positions, the specific contact positions being contact positions at which the touch input was performed on the touch sensor, selected from a plurality of contact positions sequentially detected at a plurality of different timings during a predetermined time, and excluding a contact position that is detected after a predetermined number of the contact positions have been detected since the start of the touch input and for which the absolute value of the angle formed by a first vector, connecting the point of any first position information stored for the plurality of contact positions to the point of second position information stored after the first position information, and a second vector, connecting the point of the second position information to the point of third position information stored after the second position information, is equal to or greater than a predetermined angle.
- The information input device according to any one of claims 1 to 7, wherein the specifying unit specifies, as the touch gesture, a linear motion having parameter information indicating the direction of the touch input, calculated by approximating the two or more specific positions with a line segment, and the speed of the touch input, calculated from the position information indicating those two or more specific positions.
- The information input device according to any one of claims 1 to 7, wherein the specifying unit specifies, as the touch gesture, a rotational motion having parameter information indicating the direction of the touch input, calculated by approximating three or more of the specific positions stored in the first storage unit with a circular arc, and the speed of the touch input, calculated from the position information indicating those three or more specific positions.
- An information input method for specifying touch input from a user to a touch sensor as a touch gesture, the method comprising:
a detection step of detecting a plurality of contact positions at which the touch input was performed on the touch sensor, the contact positions being detected at a plurality of different timings;
a storage step of storing in a first storage unit, as position information, information associating the plurality of contact positions detected during a predetermined time in the detection step with the timings at which each of the contact positions was detected;
an exclusion step of performing an exclusion process of excluding a contact position detected in the detection step after the number of contact positions indicated by the position information stored in the storage step has exceeded a predetermined number, if the contact position belongs to the predetermined region; and
a specifying step of specifying a touch gesture using position information of two or more points among the plurality of pieces of position information stored in the first storage unit after the exclusion process.
- A program for causing a computer to execute each of the steps according to claim 10.
- An integrated circuit provided in an information input device that specifies touch input from a user as a touch gesture, the integrated circuit comprising:
a detection unit that detects a plurality of contact positions at which the touch input was performed on the touch sensor, the contact positions being detected at a plurality of different timings;
a first storage unit that stores, as position information, information associating the plurality of contact positions detected by the detection unit during the predetermined time with the timings at which each of the contact positions was detected;
a second storage unit that stores the predetermined region among the regions on the touch sensor;
an exclusion unit that, if a contact position detected by the detection unit after the number of contact positions indicated by the position information stored in the first storage unit has exceeded the predetermined number belongs to the predetermined region stored in the second storage unit, performs an exclusion process of excluding that contact position; and
a specifying unit that specifies the touch gesture using two or more pieces of position information among the plurality of pieces of position information stored in the first storage unit after the exclusion process.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013508301A JP5841590B2 (ja) | 2011-09-13 | 2012-09-06 | 情報入力装置及び情報入力方法 |
US13/989,213 US20130300704A1 (en) | 2011-09-13 | 2012-09-06 | Information input device and information input method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-199931 | 2011-09-13 | ||
JP2011199931 | 2011-09-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013038630A1 true WO2013038630A1 (ja) | 2013-03-21 |
Family
ID=47882883
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/005672 WO2013038630A1 (ja) | 2011-09-13 | 2012-09-06 | 情報入力装置及び情報入力方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20130300704A1 (ja) |
JP (1) | JP5841590B2 (ja) |
WO (1) | WO2013038630A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58103744A (ja) * | 1981-12-16 | 1983-06-20 | Hitachi Ltd | 螢光ランプの製造方法 |
JP2014235524A (ja) * | 2013-05-31 | 2014-12-15 | グリー株式会社 | 情報処理方法、情報処理システム及びプログラム |
JP2017004565A (ja) * | 2016-09-29 | 2017-01-05 | グリー株式会社 | 情報処理方法、情報処理システム及びプログラム |
JP2017509977A (ja) * | 2014-02-21 | 2017-04-06 | クゥアルコム・インコーポレイテッドQualcomm Incorporated | タッチデバイス上の電力消費を改善するための方法および装置 |
JP6389581B1 (ja) * | 2018-05-16 | 2018-09-12 | 株式会社Cygames | プログラム、電子装置、及び方法 |
JP2018156589A (ja) * | 2017-03-21 | 2018-10-04 | 富士ゼロックス株式会社 | 入力装置、画像形成装置及びプログラム |
JP2019200765A (ja) * | 2018-07-13 | 2019-11-21 | 株式会社Cygames | プログラム、電子装置、及び方法 |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9507454B1 (en) * | 2011-09-19 | 2016-11-29 | Parade Technologies, Ltd. | Enhanced linearity of gestures on a touch-sensitive surface |
US20150149801A1 (en) * | 2013-11-26 | 2015-05-28 | Synaptics Incorporated | Complex wakeup gesture framework |
KR102282498B1 (ko) * | 2014-05-19 | 2021-07-27 | 삼성전자주식회사 | 디스플레이를 이용한 입력 처리 방법 및 장치 |
CN104731498B (zh) * | 2015-01-30 | 2016-07-13 | 努比亚技术有限公司 | 移动终端防误触控方法及装置 |
US10318128B2 (en) * | 2015-09-30 | 2019-06-11 | Adobe Inc. | Image manipulation based on touch gestures |
US10282579B2 (en) | 2016-01-29 | 2019-05-07 | Synaptics Incorporated | Initiating fingerprint capture with a touch screen |
US10592717B2 (en) | 2016-01-29 | 2020-03-17 | Synaptics Incorporated | Biometric imaging with hover detection |
EP3640783B1 (en) | 2017-09-11 | 2023-12-13 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Touch operation response method and device |
US10698533B2 (en) | 2017-09-11 | 2020-06-30 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method for responding to touch operation and electronic device |
WO2019047234A1 (zh) * | 2017-09-11 | 2019-03-14 | 广东欧珀移动通信有限公司 | 触摸操作响应方法及装置 |
WO2019047226A1 (zh) | 2017-09-11 | 2019-03-14 | 广东欧珀移动通信有限公司 | 触摸操作响应方法及装置 |
JP7218567B2 (ja) * | 2018-12-21 | 2023-02-07 | 京セラドキュメントソリューションズ株式会社 | 情報入力装置 |
JP7205236B2 (ja) * | 2019-01-08 | 2023-01-17 | トヨタ自動車株式会社 | リモート走行システム及びリモート走行アプリケーションソフト |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02112013A (ja) * | 1988-10-21 | 1990-04-24 | Toshiba Corp | タッチパネル式入力装置 |
JPH07114621A (ja) * | 1993-10-15 | 1995-05-02 | Hitachi Ltd | ジェスチャ認識方法およびそれを用いたジェスチャ認識装置 |
JP2009217814A (ja) * | 2008-01-04 | 2009-09-24 | Apple Inc | タッチ表面の端部領域におけるタッチ接触の選択的拒否 |
US20110069021A1 (en) * | 2009-06-12 | 2011-03-24 | Hill Jared C | Reducing false touchpad data by ignoring input when area gesture does not behave as predicted |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8279180B2 (en) * | 2006-05-02 | 2012-10-02 | Apple Inc. | Multipoint touch surface controller |
US8854316B2 (en) * | 2010-07-16 | 2014-10-07 | Blackberry Limited | Portable electronic device with a touch-sensitive display and navigation device and method |
WO2012037664A1 (en) * | 2010-09-24 | 2012-03-29 | Research In Motion Limited | Portable electronic device and method of controlling same |
CN102622120B (zh) * | 2011-01-31 | 2015-07-08 | 宸鸿光电科技股份有限公司 | 多点触控面板的触碰轨迹追踪方法 |
US8994670B2 (en) * | 2011-07-21 | 2015-03-31 | Blackberry Limited | Electronic device having touch-sensitive display and method of controlling same to identify touches on the touch-sensitive display |
-
2012
- 2012-09-06 US US13/989,213 patent/US20130300704A1/en not_active Abandoned
- 2012-09-06 WO PCT/JP2012/005672 patent/WO2013038630A1/ja active Application Filing
- 2012-09-06 JP JP2013508301A patent/JP5841590B2/ja active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02112013A (ja) * | 1988-10-21 | 1990-04-24 | Toshiba Corp | タッチパネル式入力装置 |
JPH07114621A (ja) * | 1993-10-15 | 1995-05-02 | Hitachi Ltd | ジェスチャ認識方法およびそれを用いたジェスチャ認識装置 |
JP2009217814A (ja) * | 2008-01-04 | 2009-09-24 | Apple Inc | タッチ表面の端部領域におけるタッチ接触の選択的拒否 |
US20110069021A1 (en) * | 2009-06-12 | 2011-03-24 | Hill Jared C | Reducing false touchpad data by ignoring input when area gesture does not behave as predicted |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58103744A (ja) * | 1981-12-16 | 1983-06-20 | Hitachi Ltd | 螢光ランプの製造方法 |
JP2014235524A (ja) * | 2013-05-31 | 2014-12-15 | グリー株式会社 | 情報処理方法、情報処理システム及びプログラム |
JP2017509977A (ja) * | 2014-02-21 | 2017-04-06 | クゥアルコム・インコーポレイテッドQualcomm Incorporated | タッチデバイス上の電力消費を改善するための方法および装置 |
JP2017004565A (ja) * | 2016-09-29 | 2017-01-05 | グリー株式会社 | 情報処理方法、情報処理システム及びプログラム |
JP2018156589A (ja) * | 2017-03-21 | 2018-10-04 | 富士ゼロックス株式会社 | 入力装置、画像形成装置及びプログラム |
JP6389581B1 (ja) * | 2018-05-16 | 2018-09-12 | 株式会社Cygames | プログラム、電子装置、及び方法 |
WO2019220873A1 (ja) * | 2018-05-16 | 2019-11-21 | 株式会社Cygames | プログラム、電子装置、及び方法 |
JP2019200595A (ja) * | 2018-05-16 | 2019-11-21 | 株式会社Cygames | プログラム、電子装置、及び方法 |
CN112424739A (zh) * | 2018-05-16 | 2021-02-26 | Cy游戏公司 | 程序、电子装置和方法 |
JP2019200765A (ja) * | 2018-07-13 | 2019-11-21 | 株式会社Cygames | プログラム、電子装置、及び方法 |
JP7250451B2 (ja) | 2018-07-13 | 2023-04-03 | 株式会社Cygames | プログラム、電子装置、及び方法 |
Also Published As
Publication number | Publication date |
---|---|
JP5841590B2 (ja) | 2016-01-13 |
US20130300704A1 (en) | 2013-11-14 |
JPWO2013038630A1 (ja) | 2015-03-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5841590B2 (ja) | 情報入力装置及び情報入力方法 | |
JP6429981B2 (ja) | ユーザ入力の意図の分類 | |
US9557852B2 (en) | Method of identifying palm area of a touch panel and a updating method thereof | |
JP5728008B2 (ja) | 情報入力装置、情報入力方法及びプログラム | |
TWI514229B (zh) | 圖形編輯方法以及電子裝置 | |
US9348458B2 (en) | Gestures for touch sensitive input devices | |
US20140300559A1 (en) | Information processing device having touch screen | |
CN105117056B (zh) | 一种操作触摸屏的方法和设备 | |
US20120032903A1 (en) | Information processing apparatus, information processing method, and computer program | |
JP2010244132A (ja) | タッチパネル付きユーザインタフェース装置、ユーザインタフェース制御方法およびユーザインタフェース制御プログラム | |
EP1774429A2 (en) | Gestures for touch sensitive input devices | |
US9477398B2 (en) | Terminal and method for processing multi-point input | |
TWI526952B (zh) | 電容式觸控裝置及其物件辨識方法 | |
WO2012111227A1 (ja) | タッチ式入力装置、電子機器および入力方法 | |
US20140298275A1 (en) | Method for recognizing input gestures | |
US9367169B2 (en) | Method, circuit, and system for hover and gesture detection with a touch screen | |
US20170308177A1 (en) | Capacitive Keyboard Having Variable Make Points | |
CN104679312A (zh) | 电子装置及其触控系统、触控方法 | |
KR20070079858A (ko) | 터치패드를 이용한 드래그 기능 구현 방법 | |
KR20110047556A (ko) | 멀티 터치 입력 기반 스트로크 특징 데이터에 의한 전자 기기의 동작 제어 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2013508301 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12831044 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13989213 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12831044 Country of ref document: EP Kind code of ref document: A1 |