US20190250814A1 - Segment Length Measurement Using a Touch Screen System in Response to Gesture Input - Google Patents
- Publication number
- US20190250814A1 (U.S. application Ser. No. 16/395,143)
- Authority
- US
- United States
- Prior art keywords
- location
- touch screen
- gesture
- screen display
- processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04101—2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/203—Drawing of straight lines or curves
Abstract
Disclosed embodiments relate to processing of gestures to cause computation of measurements of a line using a touch screen. A system includes a processor, a touch screen coupled to the processor, and a gesture module coupled to the processor for executing a gesturing method. The method includes determining a gesture shape and whether the gesture shape selects a first line segment by intersecting the first line segment. When the gesture shape selects a first line segment, it is then determined whether the gesture shape also selects an additional line segment different from the first line segment. When an additional line segment is not selected, the method calculates the length measurement from the beginning point to the end point of the selected first line segment. The method further displays the length measurement on a display.
Description
- The present application is a continuation of and claims priority to U.S. patent application Ser. No. 15/606,527, filed on May 26, 2017, entitled "Segment Length Measurement Using a Touch Screen System in Response to Gesture Input", which is a continuation of and claims priority to U.S. patent application Ser. No. 14/638,735, filed on Mar. 4, 2015, now U.S. Pat. No. 9,690,478, granted on Jun. 27, 2017, which claims priority to U.S. Provisional Patent Application No. 61/947,747, entitled "Gesture to Cause Computation of Angle Measurement in a Touch System", filed on Mar. 4, 2014, all of which are hereby incorporated by reference in their entireties.
- Embodiments of the present invention generally relate to a method and system for processing gestures to cause computation of measurements of an angle or a segment using a touch system.
- In a touch system, measuring segments and angles is cumbersome. Angles are generally defined by three (3) points. Calculating the measurement of an angle generally involves multiple steps—accessing the menu, choosing a measurement tool, and then touching the three points defining the angle.
- Therefore, there is a need for a method and/or apparatus for processing gestures to cause computation of measurements of an angle or a line using a touch system.
- Embodiments of the present invention relate to a method and a system for processing gestures to cause computation of measurements of a line segment using a touch screen. The system includes a processor, a touch screen coupled to the processor, and a gesture module coupled to the processor for executing a gesturing method. The method includes determining a gesture shape and whether the gesture shape selects a first line segment by intersecting the first line segment. When the gesture shape selects a first line segment, it is then determined whether the gesture shape also selects an additional line segment different from the first line segment. When an additional line segment is not selected, the method calculates the length measurement from the beginning point to the end point of the selected first line segment. The method further displays the length measurement on a display.
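The selection test described above — a gesture shape "selects" a line segment by intersecting it — can be sketched with a standard orientation-based segment-intersection check over the sampled points of the gesture stroke. This is an illustrative sketch only; the function names are invented, not taken from the patent:

```python
def orientation(p, q, r):
    """Sign of the cross product (q - p) x (r - p):
    > 0 counter-clockwise, < 0 clockwise, 0 collinear."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def segments_intersect(a, b, c, d):
    """True when segment ab properly crosses segment cd."""
    o1, o2 = orientation(a, b, c), orientation(a, b, d)
    o3, o4 = orientation(c, d, a), orientation(c, d, b)
    return (o1 * o2 < 0) and (o3 * o4 < 0)

def gesture_selects(stroke, segment):
    """A gesture stroke (a list of sampled touch points) selects a
    displayed segment when any consecutive pair of stroke points
    crosses it."""
    c, d = segment
    return any(segments_intersect(stroke[i], stroke[i + 1], c, d)
               for i in range(len(stroke) - 1))
```

For example, a V-shaped stroke through a horizontal segment selects it, while a stroke that stays on one side does not.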
- So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
-
FIG. 1 is an embodiment of a diagram depicting a gesture for triggering calculating an angle using a touch screen; -
FIG. 2 is an embodiment of a diagram depicting a calculation of the angle of FIG. 1; -
FIG. 3 is an embodiment of a diagram depicting a gesture for triggering calculating a segment using a touch screen; -
FIG. 4 is an embodiment of a diagram depicting a calculation of the segment of FIG. 3; -
FIG. 5 is an embodiment of a flow diagram of a gesturing method for processing gestures to cause computation of measurements of an angle or a segment using a touch system; and -
FIG. 6 is an embodiment of a system diagram utilizing the method of FIG. 5. - Utilizing a touch screen, the user slides a finger, a pointer, or the like from one line to another. In some embodiments, a touch screen is a screen that is capable of recognizing a gesture without necessarily requiring an actual (e.g., physical) touch of the screen, such as a gesture of a hand, finger, stylus, motion, etc. In other embodiments, a touch screen is a screen that recognizes a gesture from actually touching the screen.
FIG. 1 is an embodiment of a diagram depicting a gesture for triggering calculating a measurement of an angle using a touch screen. In FIG. 1, an embodiment of an arc or a line drawn from one line of the angle to the other is depicted. As shown in FIG. 1, the user gestures an arc or a line from line AC to line AB or from AB to line AC to calculate angle CAB or BAC. While lines AC and AB are shown as touching (e.g., meeting at point A) in the depicted embodiment, it is understood that in other embodiments, such lines forming an angle need not necessarily touch. -
FIG. 2 is an embodiment of a diagram depicting a calculation of the angle of FIG. 1. As shown in FIG. 2, a method or system coupled to the touch screen recognizes the gesture and computes the angle's measurement, which is shown in FIG. 2 to be 56.4°. - In accordance with a further aspect of the present disclosure, a user utilizing a touch screen provides a gesture by sliding a finger, a pointer or the like across a segment.
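The angle measurement shown in FIG. 2 follows from elementary vector math on the angle's three defining points. A minimal sketch — the helper name and sample coordinates are illustrative, not the patent's implementation:

```python
import math

def angle_measure(a, b, c):
    """Measure of angle BAC in degrees: the angle at vertex a
    between rays a->b and a->c."""
    v1 = (b[0] - a[0], b[1] - a[1])  # ray toward b
    v2 = (c[0] - a[0], c[1] - a[1])  # ray toward c
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cross = v1[0] * v2[1] - v1[1] * v2[0]
    # atan2 form is numerically stable near 0° and 180°.
    return math.degrees(math.atan2(abs(cross), dot))

# Sanity check with a right angle at the vertex.
print(angle_measure((0, 0), (1, 0), (0, 1)))  # 90.0
```

Because only the two ray directions matter, the same computation works whether or not lines AC and AB actually meet at point A, matching the note above that the lines forming an angle need not touch.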
FIG. 3 is an embodiment of a diagram depicting a gesture for triggering calculating measurement of a segment's length using a touch screen. In FIG. 3, an embodiment of a line drawn across a segment is depicted. As shown in FIG. 3, the user gestures a line crossing segment DE to calculate the measurement of segment DE. -
FIG. 4 is an embodiment of a diagram depicting a calculation of the segment of FIG. 3. As shown in FIG. 4, a method or system coupled to the touch screen recognizes the gesture and computes the measurement of the segment from the segment's beginning point to its end point, which is shown in FIG. 4 to be 7.17 cm. -
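A length measurement like the one in FIG. 4 is the Euclidean distance between the segment's beginning and end points, scaled into display units. A sketch, where the centimetres-per-unit calibration factor is an assumption (the patent does not specify how screen coordinates map to physical units):

```python
import math

def segment_length(begin, end, cm_per_unit=1.0):
    """Distance from a segment's beginning point to its end point,
    converted with an assumed display calibration factor."""
    return math.hypot(end[0] - begin[0], end[1] - begin[1]) * cm_per_unit

# A 3-4-5 triangle makes the distance easy to verify by eye.
print(segment_length((0, 0), (3, 4)))  # 5.0
```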
FIG. 5 is an embodiment of a flow diagram of a gesturing method 500 for processing gestures to cause computation of measurements of an angle or a segment using a touch system. The method starts at step 502 and proceeds to step 504, wherein the method 500 determines the shape of the gesture. At step 506, the method 500 determines if the gestured shape is an arc or line between two touching lines. As stated above, in some embodiments, the two lines need not necessarily be touching. If a gestured arc or line is between two such lines, then the method 500 proceeds to step 508, wherein the angle between the lines and behind the arc or line is calculated. Otherwise, the method 500 proceeds to step 510. - At step 510, the method 500 determines if the gestured shape is a line that intersects a segment (e.g., one line as opposed to two lines). If it is, then the method 500 proceeds to step 512, wherein the measurement between the beginning point and the end point of the segment is calculated. Otherwise, the method 500 proceeds to step 516. From steps 508 and 512, the method 500 proceeds to step 514, wherein the calculated measurements are displayed and the method 500 proceeds to step 516. The method 500 ends at step 516. -
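The branching of method 500 can be sketched as a small dispatch routine. The callables stand in for the gesture recognizer, the measurement math, and the display; all names are hypothetical placeholders, not the patent's API:

```python
def process_gesture(stroke, lines, selects, angle_of, length_of, display):
    """Dispatch loosely following steps 504-516 of method 500:
    gather the displayed lines the gesture stroke selects, then
    compute an angle (two lines) or a length (one segment)."""
    hit = [seg for seg in lines if selects(stroke, seg)]  # steps 504/506
    if len(hit) >= 2:
        display(angle_of(hit[0], hit[1]))                 # step 508: angle
    elif len(hit) == 1:
        display(length_of(hit[0]))                        # step 512: length
    # nothing selected: fall through to the end (step 516)

# Example with stub callables: the stroke "selects" segments by membership.
shown = []
process_gesture({"DE"}, ["DE", "AB"], lambda st, sg: sg in st,
                lambda a, b: f"angle {a}-{b}", lambda s: f"length {s}",
                shown.append)
print(shown)  # ['length DE']
```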
FIG. 6 is an embodiment of a gesture system diagram that performs or otherwise utilizes the method of FIG. 5. The gesture system comprises a processor, a gesture module and a touch screen. The processor is capable of executing instructions to perform functions, such as calculating measurements, determining gestures from a touch screen, and the like. The gesture module performs a method such as the method 500 of FIG. 5. The touch screen is coupled to the processor directly, indirectly or wirelessly to facilitate gesture recognition and/or determination by the gesture system. - While the foregoing is directed to embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.
Claims (19)
1-18. (canceled)
19. A system comprising:
a touch screen display for displaying at least one location and having a touch screen for receiving a gesture input;
a processor communicatively coupled to the touch screen; and
a memory communicatively coupled to the processor and storing program instructions that, when executed by the processor, cause the processor to:
determine a gesture shape indicated by the gesture input;
determine whether the gesture input selects a first location displayed on the touch screen display, wherein selection of the first location is indicated by the gesture shape intersecting the first location;
when the gesture input selects the first location, determine whether the gesture input selects a second location displayed on the touch screen display, wherein selection of the second location is indicated by the gesture shape intersecting the second location; and
when the gesture input does not select a second location, determine a beginning point and an end point of the selected first location, determine a length measurement between the beginning point and the end point, and display the determined length measurement on the touch screen display.
20. The system as claimed in claim 19, wherein the gesture shape is an arc.
21. The system as claimed in claim 19, wherein the gesture shape is a line.
22. The system as claimed in claim 19, wherein the determined length measurement is displayed on the touch screen display in close proximity to the selected first location.
23. The system as claimed in claim 19, wherein the program instructions, when executed by the processor, further cause the processor to:
when the gesture input does select a second location displayed on the touch screen display, identify an angle formed between the selected first location and the selected second location, determine an angle measurement of the angle, and display the determined angle measurement on the touch screen display.
24. The system as claimed in claim 23, wherein the determined angle measurement is displayed on the touch screen display in close proximity to the angle.
25. A method for initiating a measurement calculation of a location displayed on a touch screen display of a processor-based system, the method comprising:
using the touch screen display to receive a gesture input;
using a processor of the processor-based system to determine a gesture shape indicated by the gesture input;
using the processor to determine whether the gesture input selects the location, wherein selection of the location is indicated by the gesture shape intersecting the location;
when the location is selected by the gesture shape, using the processor to determine whether the gesture input selects at least one other location displayed on the touch screen display, wherein selection of the at least one other location is indicated by the gesture shape also intersecting the at least one other location; and
when the gesture input does not select at least one other location displayed on the touch screen display, using the processor to identify a beginning point and an end point of the selected location, determine a length measurement between the beginning point and the end point, and display the determined length measurement on the touch screen display.
26. The method as claimed in claim 25, comprising, when the gesture input selects one other location displayed on the touch screen display, using the processor to identify an angle formed between the selected location and the selected one other location, determine an angle measurement of the angle, and display the determined angle measurement on the touch screen display.
27. The method as claimed in claim 26, wherein the determined angle measurement is displayed on the touch screen display in close proximity to the angle.
28. The method as claimed in claim 25, wherein the gesture shape is an arc.
29. The method as claimed in claim 25, wherein the gesture shape is a line.
30. The method as claimed in claim 25, wherein the determined length measurement is displayed on the touch screen display in close proximity to the selected location.
31. A non-transitory computer-readable medium comprising instructions, that when executed by a processor, cause the processor to:
determine a gesture shape indicated by a gesture input received by a touch screen display;
determine whether the gesture input selects a first location displayed on the touch screen display, wherein selection of the first location is indicated by the gesture shape intersecting the first location;
when the gesture input selects the first location, determine whether the gesture input selects a second location displayed on the touch screen display, wherein selection of the second location is indicated by the gesture shape intersecting the second location; and
when the gesture input does not select a second location, determine a beginning point and an end point of the selected first location, determine a length measurement between the beginning point and the end point, and display the determined length measurement on the touch screen display.
32. The non-transitory computer-readable medium as claimed in claim 31, wherein the gesture shape is an arc.
33. The non-transitory computer-readable medium as claimed in claim 31, wherein the gesture shape is a line.
34. The non-transitory computer-readable medium as claimed in claim 31, wherein the instructions cause the processor to display the determined length measurement on the touch screen display in close proximity to the selected first location.
35. The non-transitory computer-readable medium as claimed in claim 31, wherein the instructions, when executed by the processor, further cause the processor to:
when the gesture input does select a second location displayed on the touch screen display, identify an angle formed between the selected first location and the selected second location, determine an angle measurement of the angle, and display the determined angle measurement on the touch screen display.
36. The non-transitory computer-readable medium as claimed in claim 35, wherein the instructions cause the processor to display the determined angle measurement on the touch screen display in close proximity to the angle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/395,143 US20190250814A1 (en) | 2014-03-04 | 2019-04-25 | Segment Length Measurement Using a Touch Screen System in Response to Gesture Input |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461947747P | 2014-03-04 | 2014-03-04 | |
US14/638,735 US9690478B2 (en) | 2014-03-04 | 2015-03-04 | Method and system for processing gestures to cause computation of measurement of an angle or a segment using a touch system |
US15/606,527 US10318150B2 (en) | 2014-03-04 | 2017-05-26 | Segment length measurement using a touch screen system in response to gesture input |
US16/395,143 US20190250814A1 (en) | 2014-03-04 | 2019-04-25 | Segment Length Measurement Using a Touch Screen System in Response to Gesture Input |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/606,527 Continuation US10318150B2 (en) | 2014-03-04 | 2017-05-26 | Segment length measurement using a touch screen system in response to gesture input |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190250814A1 true US20190250814A1 (en) | 2019-08-15 |
Family
ID=54017395
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/638,735 Active 2035-03-06 US9690478B2 (en) | 2014-03-04 | 2015-03-04 | Method and system for processing gestures to cause computation of measurement of an angle or a segment using a touch system |
US15/606,527 Active US10318150B2 (en) | 2014-03-04 | 2017-05-26 | Segment length measurement using a touch screen system in response to gesture input |
US16/395,143 Abandoned US20190250814A1 (en) | 2014-03-04 | 2019-04-25 | Segment Length Measurement Using a Touch Screen System in Response to Gesture Input |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/638,735 Active 2035-03-06 US9690478B2 (en) | 2014-03-04 | 2015-03-04 | Method and system for processing gestures to cause computation of measurement of an angle or a segment using a touch system |
US15/606,527 Active US10318150B2 (en) | 2014-03-04 | 2017-05-26 | Segment length measurement using a touch screen system in response to gesture input |
Country Status (1)
Country | Link |
---|---|
US (3) | US9690478B2 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108804016B (en) * | 2018-06-29 | 2020-12-29 | 江苏特思达电子科技股份有限公司 | Object identification method and device based on touch screen and electronic equipment |
CN111443802B (en) * | 2020-03-25 | 2023-01-17 | 维沃移动通信有限公司 | Measurement method and electronic device |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020113784A1 (en) * | 2000-12-29 | 2002-08-22 | Feilmeier Michael Leon | Portable computer aided design apparatus and method |
US20100309115A1 (en) * | 2009-06-03 | 2010-12-09 | Honda Motor Co., Ltd. | Drawing assist device, drawing assist program, and drawing assist method |
US20130205242A1 (en) * | 2012-02-06 | 2013-08-08 | Michael K. Colby | Character-String Completion |
US20140104177A1 (en) * | 2012-10-16 | 2014-04-17 | Google Inc. | Multi-gesture text input prediction |
US20140108989A1 (en) * | 2012-10-16 | 2014-04-17 | Google Inc. | Character deletion during keyboard gesture |
US20140115521A1 (en) * | 2012-10-19 | 2014-04-24 | Google Inc. | Decoding imprecise gestures for gesture-keyboards |
US20140218299A1 (en) * | 2013-02-05 | 2014-08-07 | Google Inc. | Gesture keyboard input of non-dictionary character strings |
US20140237356A1 (en) * | 2013-01-21 | 2014-08-21 | Keypoint Technologies (Uk) Limited | Text input method and device |
US20140317547A1 (en) * | 2013-04-22 | 2014-10-23 | Google Inc. | Dynamically-positioned character string suggestions for gesture typing |
US20140327622A1 (en) * | 2013-05-03 | 2014-11-06 | Google Inc. | Alternative hypothesis error correction for gesture typing |
US20140368434A1 (en) * | 2013-06-13 | 2014-12-18 | Microsoft Corporation | Generation of text by way of a touchless interface |
US20160224240A1 (en) * | 2015-02-03 | 2016-08-04 | Google Inc. | User state-adaptive text input |
Family Cites Families (65)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5555363A (en) * | 1993-09-30 | 1996-09-10 | Apple Computer, Inc. | Resetting the case of text on a computer display |
US7401299B2 (en) * | 2001-09-05 | 2008-07-15 | Autodesk, Inc. | Method and apparatus for providing a presumptive drafting solution |
JPH1011208A (en) * | 1996-06-24 | 1998-01-16 | Sharp Corp | Coordinate input device |
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US6532018B1 (en) * | 1999-04-19 | 2003-03-11 | Microsoft Corporation | Combined floating-point logic core and frame buffer |
AU2001251344A1 (en) * | 2000-04-05 | 2001-10-23 | Dimensional Media Associates, Inc. | Methods and apparatus for virtual touchscreen computer interface controller |
US7859519B2 (en) * | 2000-05-01 | 2010-12-28 | Tulbert David J | Human-machine interface |
JP4176299B2 (en) * | 2000-09-29 | 2008-11-05 | FUJIFILM Corporation | Medical image display system |
US7095401B2 (en) * | 2000-11-02 | 2006-08-22 | Siemens Corporate Research, Inc. | System and method for gesture interface |
US7030861B1 (en) * | 2001-02-10 | 2006-04-18 | Wayne Carl Westerman | System and method for packing multi-touch gestures onto a hand |
US20020146175A1 (en) * | 2001-02-12 | 2002-10-10 | Josh Goldfoot | Method of shape recognition using postulated lines |
US20020141643A1 (en) * | 2001-02-15 | 2002-10-03 | Denny Jaeger | Method for creating and operating control systems |
US7254787B2 (en) * | 2001-02-15 | 2007-08-07 | Denny Jaeger | Method for formatting text by hand drawn inputs |
JP4126610B2 (en) * | 2002-12-26 | 2008-07-30 | LG Display Co., Ltd. | Liquid crystal display |
US7098896B2 (en) * | 2003-01-16 | 2006-08-29 | Forword Input Inc. | System and method for continuous stroke word-based text input |
US7453439B1 (en) * | 2003-01-16 | 2008-11-18 | Forward Input Inc. | System and method for continuous stroke word-based text input |
US8745541B2 (en) * | 2003-03-25 | 2014-06-03 | Microsoft Corporation | Architecture for controlling a computer using hand gestures |
FI114831B (en) * | 2003-05-07 | 2004-12-31 | Tekla Corp | Computer-aided model design |
US7310091B2 (en) * | 2003-09-16 | 2007-12-18 | Acer Incorporated | Handwriting pen capable of simulating different strokes |
KR20060113930A (en) * | 2003-12-30 | 2006-11-03 | 리포소닉스 인코포레이티드 | Systems and methods for the destruction of adipose tissue |
CN100594473C (en) * | 2004-01-19 | 2010-03-17 | 皇家飞利浦电子股份有限公司 | Method and apparatus providing flexible measurement functionality for medical images |
US20060001654A1 (en) * | 2004-06-30 | 2006-01-05 | National Semiconductor Corporation | Apparatus and method for performing data entry with light based touch screen displays |
US20060001656A1 (en) * | 2004-07-02 | 2006-01-05 | Laviola Joseph J Jr | Electronic ink system |
JP4903371B2 (en) * | 2004-07-29 | 2012-03-28 | Nintendo Co., Ltd. | Game device and game program using touch panel |
US20060028457A1 (en) * | 2004-08-08 | 2006-02-09 | Burns David W | Stylus-Based Computer Input System |
US20080072234A1 (en) * | 2006-09-20 | 2008-03-20 | Gerald Myroup | Method and apparatus for executing commands from a drawing/graphics editor using task interaction pattern recognition |
JP2009176114A (en) * | 2008-01-25 | 2009-08-06 | Mitsubishi Electric Corp | Touch panel device and user interface device |
JP5065175B2 (en) * | 2008-06-19 | 2012-10-31 | NIC Autotec, Inc. | CAD program, structure design system, and design method |
US20090322701A1 (en) * | 2008-06-30 | 2009-12-31 | Tyco Electronics Corporation | Method and apparatus for detecting two simultaneous touches and gestures on a resistive touchscreen |
US8604364B2 (en) * | 2008-08-15 | 2013-12-10 | Lester F. Ludwig | Sensors, algorithms and applications for a high dimensional touchpad |
US8888604B2 (en) * | 2008-10-09 | 2014-11-18 | Golf Impact, Llc | Golf swing measurement and analysis system |
WO2010113397A1 (en) * | 2009-03-31 | 2010-10-07 | Mitsubishi Electric Corporation | Display input device |
JP5086394B2 (en) * | 2009-07-07 | 2012-11-28 | Rohm Co., Ltd. | Touch panel control circuit, control method, touch panel input device using them, and electronic device |
JP5351283B2 (en) * | 2009-10-16 | 2013-11-27 | Pioneer Corporation | Map display device, map display method, and map display program |
DE102009058802B4 (en) * | 2009-12-18 | 2018-03-29 | Airbus Operations Gmbh | Arrangement for the combined representation of a real and a virtual model |
US20110157083A1 (en) * | 2009-12-31 | 2011-06-30 | Nuvoton Technology Corporation | Resistive touch apparatus |
US20110221701A1 (en) * | 2010-03-10 | 2011-09-15 | Focaltech Systems Ltd. | Multi-touch detection method for capacitive touch screens |
US8943701B2 (en) * | 2010-06-28 | 2015-02-03 | Trimble Navigation Limited | Automated layout and point transfer system |
US8972467B2 (en) * | 2010-08-31 | 2015-03-03 | Sovanta Ag | Method for selecting a data set from a plurality of data sets by means of an input device |
US8767019B2 (en) * | 2010-08-31 | 2014-07-01 | Sovanta Ag | Computer-implemented method for specifying a processing operation |
GB2486445B (en) * | 2010-12-14 | 2013-08-14 | Epson Norway Res And Dev As | Camera-based multi-touch interaction apparatus system and method |
US20120216113A1 (en) * | 2011-02-18 | 2012-08-23 | Google Inc. | Touch gestures for text-entry operations |
US8686943B1 (en) * | 2011-05-13 | 2014-04-01 | Imimtek, Inc. | Two-dimensional method and system enabling three-dimensional user interaction with a device |
JP2013003596A (en) * | 2011-06-10 | 2013-01-07 | Sony Corp | Information processing apparatus, program, and information processing method |
JP2012256270A (en) * | 2011-06-10 | 2012-12-27 | Sony Corp | Information processor, program, and information processing method |
US8959459B2 (en) * | 2011-06-15 | 2015-02-17 | Wms Gaming Inc. | Gesture sensing enhancement system for a wagering game |
GB201110159D0 (en) * | 2011-06-16 | 2011-07-27 | Light Blue Optics Ltd | Touch sensitive display devices |
EP2761251B1 (en) * | 2011-09-27 | 2018-05-09 | Leica Geosystems AG | Measuring system and method for marking a destination known in a coordinate system |
KR20130085094A (en) * | 2012-01-19 | 2013-07-29 | Samsung Electro-Mechanics Co., Ltd. | User interface device and user interface providing thereof |
JP6004716B2 (en) * | 2012-04-13 | 2016-10-12 | Canon Inc. | Information processing apparatus, control method therefor, and computer program |
US9418672B2 (en) * | 2012-06-05 | 2016-08-16 | Apple Inc. | Navigation application with adaptive instruction text |
JP5342040B1 (en) * | 2012-06-07 | 2013-11-13 | NTT Docomo, Inc. | Display device, display method, and program |
US9235310B2 (en) * | 2012-06-25 | 2016-01-12 | Texas Instruments Incorporated | Apparatus to detect dual gesture on a resistive screen |
JP5974685B2 (en) * | 2012-07-04 | 2016-08-23 | Fuji Xerox Co., Ltd. | Display device and program |
KR101452053B1 (en) * | 2012-11-26 | 2014-10-22 | Samsung Electro-Mechanics Co., Ltd. | Touchscreen device and screen zooming method thereof |
AU2013363975A1 (en) * | 2012-12-19 | 2015-07-23 | Willem Morkel Van Der Westhuizen | User control of the trade-off between rate of navigation and ease of acquisition in a graphical user interface |
KR102157332B1 (en) * | 2013-03-15 | 2020-09-17 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling zoom function in an electronic device |
JP2014203351A (en) * | 2013-04-08 | 2014-10-27 | 船井電機株式会社 | Drawing device, drawing method, and drawing program |
DE102013009009B4 (en) * | 2013-05-17 | 2023-08-03 | Elektrobit Automotive Gmbh | System and method for data selection using a touch-sensitive surface |
EP3005041A1 (en) * | 2013-05-29 | 2016-04-13 | Brainlab AG | Gesture feedback for non-sterile medical displays |
US9891812B2 (en) * | 2013-09-14 | 2018-02-13 | Changwat TUMWATTANA | Gesture-based selection and manipulation method |
US9176657B2 (en) * | 2013-09-14 | 2015-11-03 | Changwat TUMWATTANA | Gesture-based selection and manipulation method |
EP3047354A1 (en) * | 2013-09-17 | 2016-07-27 | Koninklijke Philips N.V. | Gesture enabled simultaneous selection of range and value |
US9477403B2 (en) * | 2013-11-26 | 2016-10-25 | Adobe Systems Incorporated | Drawing on a touchscreen |
US10795558B2 (en) * | 2015-06-07 | 2020-10-06 | Apple Inc. | Device, method, and graphical user interface for providing and interacting with a virtual drawing aid |
- 2015-03-04: US application US14/638,735, patent US9690478B2 (en), status: Active
- 2017-05-26: US application US15/606,527, patent US10318150B2 (en), status: Active
- 2019-04-25: US application US16/395,143, publication US20190250814A1 (en), status: Abandoned
Also Published As
Publication number | Publication date |
---|---|
US10318150B2 (en) | 2019-06-11 |
US20150253981A1 (en) | 2015-09-10 |
US9690478B2 (en) | 2017-06-27 |
US20170262170A1 (en) | 2017-09-14 |
Similar Documents
Publication | Title |
---|---|
US9678639B2 (en) | Virtual mouse for a touch screen device |
WO2017074827A3 (en) | Touch sensing of user input device |
US9170680B2 (en) | Method, system and computer program product for operating a touchscreen |
US10514802B2 (en) | Method for controlling display of touchscreen, and mobile device |
US10956030B2 (en) | Multi-touch based drawing input method and apparatus |
US20110216094A1 (en) | Display device and screen display method |
US8954873B2 (en) | Information processing apparatus, information processing method and computer readable medium |
US20190250814A1 (en) | Segment Length Measurement Using a Touch Screen System in Response to Gesture Input |
TW201525849A (en) | Method, apparatus and computer program product for polygon gesture detection and interaction |
MX361297B (en) | Probabilistic touch sensing |
US20150242034A1 (en) | Measuring method and a measuring device with fingertip zoom |
JP2016200860A5 (en) | |
CN104656903A (en) | Processing method for display image and electronic equipment |
US20230384923A1 (en) | Method, apparatus, electronic device and storage medium for invoking touch screen magnifier |
WO2016197743A3 (en) | Touch screen control method and device, and terminal |
KR101422447B1 (en) | Method and apparatus for changing page of e-book using pressure modeling |
JP6724172B2 (en) | Coordinate input device |
US9947081B2 (en) | Display control system and display control method |
US20200371681A1 (en) | Method for zooming an image displayed on a touch-sensitive screen of a mobile terminal |
TWI460649B (en) | System and method for providing zoom function for visual objects displayed on screen |
JP2011181104A5 (en) | |
TWI611343B (en) | Operating method of touch panel and display module using the same |
US20150363354A1 (en) | Calculator |
JP2016218782A (en) | Touch panel device, coordinate output method of touch panel, and input device |
JP2014013609A5 (en) | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |