US10838550B2 - Changing specification of operation based on start position - Google Patents
Changing specification of operation based on start position
- Publication number
- US10838550B2 US10838550B2 US15/421,134 US201715421134A US10838550B2 US 10838550 B2 US10838550 B2 US 10838550B2 US 201715421134 A US201715421134 A US 201715421134A US 10838550 B2 US10838550 B2 US 10838550B2
- Authority
- US
- United States
- Prior art keywords
- area
- start position
- button
- threshold value
- case
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/12—Digital output to print unit, e.g. line printer, chain printer
- G06F3/1201—Dedicated interfaces to print systems
- G06F3/1202—Dedicated interfaces to print systems specifically adapted to achieve a particular effect
- G06F3/1203—Improving or facilitating administration, e.g. print management
- G06F3/1204—Improving or facilitating administration, e.g. print management resulting in reduced user or operator actions, e.g. presetting, automatic actions, using hardware token storing data
Definitions
- the present invention relates to an electronic apparatus and a control program thereof.
- A tap operation and a swipe operation are distinguished from each other according to the amount of movement from the position at which the touch first occurs (the operation start position). That is, when the amount of movement from the operation start position is smaller than a predetermined threshold, the operation is determined to be a tap operation, and when it is equal to or greater than the threshold, it is determined to be a swipe operation.
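The discrimination described above can be sketched as follows. This is an illustrative reconstruction, not code from the patent; the function name and the use of Euclidean distance as the "amount of movement" are assumptions.

```python
import math

def classify_gesture(start, end, threshold):
    """Classify a touch gesture by the movement distance from its start.

    start, end: (x, y) coordinates of the touch start and end positions.
    threshold: movement distance separating a tap from a swipe.
    """
    distance = math.hypot(end[0] - start[0], end[1] - start[1])
    # Per the passage: movement smaller than the threshold is a tap
    # (selection operation); equal or greater is a swipe (movement operation).
    return "tap" if distance < threshold else "swipe"
```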
- For example, a tap operation may be recognized on the touch panel side when a swipe operation was intended; in that case, a process starts that treats the thumbnail image displayed at the touch position as selected, instead of scrolling the list.
- Conversely, a tap operation performed on a button displayed on the screen may be recognized as a swipe operation, in which case the process corresponding to the button is not performed.
- An advantage of some aspects of the invention is to provide an electronic apparatus in which it is difficult for a user to erroneously perform an operation.
- an electronic apparatus including: a display; a change unit that changes a specification algorithm for specifying an operation with respect to the display according to a start position of the operation; and a specification unit that specifies the operation according to the specification algorithm.
- The operations a user is most likely to perform (or performs frequently, or with high priority) differ according to the attributes of the start position of the operation within the screen. Therefore, by changing the specification algorithm for specifying the operation according to the start position of the operation, an operation the user is likely to intend can be specified more easily. As a result, it is possible to provide an electronic apparatus in which it is difficult for a user to erroneously perform an operation.
- FIG. 1 is a block diagram showing a configuration of a printer.
- FIG. 2 is a diagram showing a configuration of a displayed screen.
- FIG. 3 is a flowchart showing a specification algorithm change process.
- FIGS. 4A to 4D are explanatory diagrams for explaining a threshold to be changed.
- FIG. 5 is a diagram showing a tap operation.
- FIGS. 6A and 6B are diagrams for explaining a specification method of a selection position in a text area.
- FIG. 1 is a block diagram showing a configuration of a printer 1 as an electronic apparatus of the invention.
- the printer 1 includes a control unit 10 , a print unit 20 , an image read unit 30 , a communication unit 40 , and a user I/F unit 50 , and functions as a printer having an image read function.
- the print unit 20 includes actuators, sensors, driving circuits, and mechanical parts for performing printing on print media such as photographic papers, plain papers, and OHP sheets by a well-known printing method such as an inkjet method and an electrophotographic method.
- The image read unit 30 includes a well-known color image sensor that emits light onto a document placed on a document platen and decomposes the light reflected from the document into R, G, and B components to produce scanned image data, along with an actuator for transporting the document, a drive circuit, and mechanical parts.
- the communication unit 40 includes various communication interfaces for performing wired or wireless communication with external devices.
- the communication unit 40 includes an interface for performing communication with various removable memories mounted in the printer 1 .
- The user I/F unit 50 includes a touch panel display 51 (hereinafter simply referred to as display 51) and a key input unit (not shown).
- The display 51 includes a display panel that displays various kinds of information under the control of the control unit 10 and a touch detection panel overlaid on the display panel, and detects the touch of a user's finger by a well-known technology such as a capacitive, resistive, or optical technology. The display 51 then outputs touch information (for example, the coordinates of the touch start position and touch end position) indicating the touch to the control unit 10.
- the control unit 10 is configured by a CPU, a RAM, a ROM, a non-volatile memory, and the like (not shown), and the CPU can execute a control program 11 stored in the ROM or the non-volatile memory by using the RAM or the non-volatile memory.
- The control program 11 displays various screen constituent elements on the display 51 and, when an operation on the display 51 (for example, a tap operation, corresponding to a selection operation, or a swipe operation, corresponding to a movement operation) is detected based on touch information obtained from the display 51, causes the printer 1 to perform the process corresponding to that operation by controlling each unit of the printer 1.
- The control program 11 includes a change function that changes the specification algorithm for specifying the kind of operation performed on the display 51 according to the touch start position (the start position of the operation), and a specification function that specifies the kind of operation according to that algorithm.
- the control unit 10 functions as “a change unit” and “a specification unit”.
- FIG. 2 shows a screen 511 displayed on the display 51 .
- an axis extending in parallel with a horizontal direction of the screen 511 is an x axis
- an axis extending in parallel with a vertical direction of the screen 511 is a y axis.
- In the list area 511a of the screen 511, a list of thumbnail images (including thumbnail image 511a1) of the images recorded on a memory card connected through the communication unit 40 and a scroll bar 511a2 are displayed.
- the scroll bar 511 a 2 is moved in a direction in parallel with y axis.
- buttons including a button 511 b 1 are displayed in the button area 511 b of the screen 511 .
- In the list area 511a, when it is specified that the swipe operation is performed in the direction along which the scroll bar 511a2 moves, the list of thumbnail images scrolls in the direction parallel to the y axis.
- When it is specified that the tap operation is performed on a thumbnail image, a predetermined process is performed in which the corresponding image is determined to be the selected image.
- When it is specified that the tap operation is performed on a button, a process corresponding to the button is performed.
- FIG. 3 is a flowchart showing a specification algorithm change process.
- The specification algorithm change process changes the threshold value used to discriminate between the swipe operation and the tap operation, according to the touch start position (the start position of the operation) on the display 51.
- the specification algorithm change process is performed in response to start of touch on the display 51 .
- The control unit 10 specifies an operation by using the threshold value set by the specification algorithm change process, and performs the process corresponding to the specified operation (no process is performed if there is no process corresponding to the specified operation).
- the control unit 10 determines whether or not the touch start position is included in the button area 511 b (step S 100 ). In a case where it is not determined that the touch start position is included in the button area 511 b , it is determined whether or not the touch start position is included in the list area 511 a (step S 105 ). In step S 105 , in a case where it is not determined that the touch start position is included in the list area 511 a , the control unit 10 sets a threshold value to a default value TH 0 (step S 110 ).
- FIGS. 4A to 4D are diagrams showing a threshold value for specifying the tap operation and the swipe operation.
- the control unit 10 specifies that the tap operation is performed when a movement distance from the touch start position is equal to or less than the default value TH 0 , and determines that the swipe operation is performed when the movement distance exceeds the default value TH 0 .
- In step S105, in a case where it is determined that the touch start position is included in the list area 511a, the control unit 10 sets the threshold value to a value TH1 as shown in FIG. 4B (step S115).
- the value TH 1 is a value smaller than a value TH 0 .
- As a result, it is easier to determine that a swipe operation is performed than when the values shown in FIG. 4A, or in FIG. 4C and FIG. 4D described below, are set as the threshold value.
- Consequently, a user can scroll the list by moving a finger beyond the value TH1 in the direction parallel to the y axis while keeping the finger in contact with the display 51 from the touch start position, and can easily search for a desired image in the list of thumbnail images.
- In step S100, when the touch start position is included in the button area 511b, the control unit 10 determines whether or not the touch start position is included in the center area of the button (step S120).
- A center area, defined with reference to the center of the button, is set for each button, and it is determined whether or not the touch start position lies within it. More specifically, in the case of the button 511b1 shown in FIG. 2, the range within ±X1 in the x coordinate and ±Y1 in the y coordinate of the center c1 is set as the center area a1 of the button 511b1.
- the center area a 1 and an outer edge area which will be described below are set within the button 511 b 1 .
- Alternatively, the area within a predetermined distance from the center of the button may be set as the center area.
- In step S120, in a case where it is determined that the touch start position is not within the center area of the button, that is, where it is included in the outer edge area (the area of the button other than the center area), the control unit 10 sets the threshold value to a value TH2 as shown in FIG. 4C (step S125).
- In step S120, when the touch start position is within the center area of the button, the control unit 10 sets the threshold value to a value TH3 as shown in FIG. 4D (step S130).
- the value TH 2 is a value greater than the default value TH 0 .
- The value TH3 is still greater than the value TH2.
- Because the threshold value becomes TH3, greater than TH2, it is easier to determine that a tap operation is performed when the touch start position is near the center of the button than when it is not.
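The area-dependent threshold selection of FIG. 3 (steps S100-S130) might be sketched as below. All names, the rectangle representation, and the concrete threshold values are hypothetical; the patent only fixes the ordering TH1 < TH0 < TH2 < TH3.

```python
def contains(rect, pos):
    """True if position (x, y) lies inside rect = (x, y, width, height)."""
    rx, ry, rw, rh = rect
    return rx <= pos[0] <= rx + rw and ry <= pos[1] <= ry + rh

def in_center_area(pos, center, x1, y1):
    # Center area of a button: within ±x1 and ±y1 of the center c1.
    return abs(pos[0] - center[0]) <= x1 and abs(pos[1] - center[1]) <= y1

# Hypothetical values; the text only requires TH1 < TH0 < TH2 < TH3.
TH0, TH1, TH2, TH3 = 10, 5, 15, 20

def select_threshold(start, list_area, button_area, button_center, x1, y1):
    """Choose the tap/swipe threshold from the touch start position,
    mirroring steps S100-S130 of the flowchart in FIG. 3."""
    if contains(button_area, start):                      # step S100
        if in_center_area(start, button_center, x1, y1):  # step S120
            return TH3                                    # step S130: tap easiest
        return TH2                                        # step S125
    if contains(list_area, start):                        # step S105
        return TH1                                        # step S115: swipe easiest
    return TH0                                            # step S110: default
```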
- FIG. 5 is a diagram showing an example of the tap operation.
- an upper diagram shows an operation on a touch detection surface 51 a of the display 51 at the time of the start of the touch
- a lower diagram shows an operation on the touch detection surface 51 a of the display 51 at the time of the end of the touch.
- An area a 2 indicates a touch area at the time of the start of the touch
- a position g 2 indicates the touch start position that is the center of the area a 2 .
- An area a 3 indicates a touch area at the time of the end of the touch
- a position g 3 indicates the touch end position that is the center of the area a 3 .
- If the threshold value were the default value TH0, such movement of the touch position during a tap could cause the tap operation to be recognized as a swipe operation.
- In the button area, however, the threshold value is greater than the default value TH0, and in the center area of the button it is still greater than in the outer edge area, so it is possible to reduce the occurrence of this phenomenon when the user taps a button.
- The value TH1 may be smaller than the default value TH0, as in the embodiment.
- the value TH 1 may be set to the same value as the default value TH 0 .
- The change unit may change the specification method of the selection position in a selection operation, within the specification algorithm, so that the selection position is the start position of the operation in a case where the start position is included in the text area, and is the center of the touch area from the start to the end of the operation in a case where the start position is not included in the text area.
- the specification unit may specify the selection position in the selection operation according to the specification method. A specific example will be described with reference to FIG. 6 .
- FIG. 6 shows a plurality of characters entered by the user over two rows displayed in the text area 512a, with a cursor cs positioned at the end of the second line. For example, a case will be described in which the user taps between "o" and "." of the first line so as to move the cursor cs there.
- An area a 4 of FIG. 6A indicates a touch area which is detected by the control unit 10 for the first time, and the position g 4 indicates the center (that is, start position of operation) of the area a 4 .
- the area a 5 of FIG. 6B indicates a trajectory of the touch area from the start to the end of the touch on the display 51 by the user, and a position g 5 indicates the center of the area a 5 .
- Since the start position of the operation is specified as the selection position, the position g4 shown in FIG. 6A is used, and the cursor cs can be moved to the position intended by the user.
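The two selection-position specification methods might be sketched as follows. The function name and the centroid computation over sampled touch points are assumptions; the patent speaks only of "the center of the touch area from the start to the end of the operation".

```python
def selection_position(start, touch_points, in_text_area):
    """Pick the selection position per the text-area rule: in a text
    area, use the touch start position (position g4 in FIG. 6A);
    elsewhere, use the center of the whole touch trajectory (g5)."""
    if in_text_area:
        return start
    # Centroid of all sampled touch points from start to end of the touch.
    xs = [p[0] for p in touch_points]
    ys = [p[1] for p in touch_points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```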
- This method of specifying the selection position is particularly effective for a user who has the habit, illustrated in FIG. 5, of moving the touch position during a tap operation.
- the button area is an area on which the button is displayed.
- the button enables at least the selection operation.
- the list area is an area on which a list of a plurality of objects is displayed.
- the list area enables at least the movement operation for moving a display position of the object.
- the text area is an area on which characters and symbols are displayed. The text area enables at least the selection operation.
- No process corresponding to the swipe operation is allocated to a button within the button area.
- Alternatively, a process (for example, movement of the display position of the button, display of a pull-down menu or drop-down list, or enlargement or reduction of the display area of the button) may be allocated to the swipe operation.
- a first operation corresponds to the swipe operation (movement operation) for moving the display position of an operation target object in the first embodiment
- a second operation corresponds to the tap operation (selection operation) for selecting the operation target object.
- However, the embodiment is not limited to this correspondence relationship.
- For example, a configuration may be implemented in which both the first operation and the second operation are swipe operations, and different processes are allocated to the first operation, in which the movement distance from the operation start position exceeds the threshold value, and to the second operation, in which the movement distance is equal to or smaller than the threshold value.
- Likewise, both the first operation and the second operation may be tap operations, with different processes allocated according to the movement distance from the operation start position.
- The specification algorithm may also be changed so that it is easier to specify that the current operation is the same as the immediately preceding operation.
- the control unit 10 determines the threshold value according to the flowchart of FIG. 3 .
- For example, when the immediately preceding operation in the list area was a swipe operation, the threshold value may be set to TH4, smaller than TH1, so that it is easier to determine that the current operation is also a swipe operation.
- Conversely, when the immediately preceding operation was a tap operation, the threshold value may be set to TH5, greater than TH1, so that it is easier to determine that the current operation is also a tap operation.
- the threshold value may be determined for each movement direction of the touch area in consideration of a movement direction of the touch area.
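The bias toward the immediately preceding operation described above could be sketched as below. The function name and the concrete values are hypothetical; the text only requires TH4 < TH1 < TH5.

```python
# Hypothetical values; the text only requires TH4 < TH1 < TH5.
TH1, TH4, TH5 = 5, 3, 8

def threshold_for_list_area(previous_op):
    """Threshold for the list area, biased toward repeating the
    immediately preceding operation (the modification described above)."""
    if previous_op == "swipe":
        return TH4  # smaller than TH1: easier to recognize another swipe
    if previous_op == "tap":
        return TH5  # greater than TH1: easier to recognize another tap
    return TH1      # no preceding operation: the value set per FIG. 3
```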
- Each unit described in the aspects is realized by hardware resources whose functions are specified by the configuration itself, by hardware resources whose functions are specified by programs, or by a combination thereof.
- the functions of these units are not limited to those realized by physically independent hardware resources.
- the invention is applicable to various electronic apparatuses including touch panel type displays such as smart phones and tablet terminals in addition to printers.
- the display included in the electronic apparatus is not limited to the touch panel display. Even in a case where an operation is performed by using a pointing device such as a mouse with respect to a display not including the touch panel, it is possible to apply the invention. In this case, the selection operation corresponds to click or double click, and the movement operation corresponds to drag.
Abstract
Description
Claims (7)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016-018637 | 2016-02-03 | ||
JP2016018637A JP6812639B2 (en) | 2016-02-03 | 2016-02-03 | Electronic devices, control programs for electronic devices |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170220198A1 US20170220198A1 (en) | 2017-08-03 |
US10838550B2 true US10838550B2 (en) | 2020-11-17 |
Family
ID=59386697
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/421,134 Active 2038-07-01 US10838550B2 (en) | 2016-02-03 | 2017-01-31 | Changing specification of operation based on start position |
Country Status (3)
Country | Link |
---|---|
US (1) | US10838550B2 (en) |
JP (1) | JP6812639B2 (en) |
CN (1) | CN107085477B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111164546B (en) * | 2017-10-11 | 2023-09-26 | 三菱电机株式会社 | Operation input device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6583781B1 (en) * | 2000-10-17 | 2003-06-24 | International Business Machines Corporation | Methods, systems and computer program products for controlling events associated with user interface elements by capturing user intent based on pointer movements |
JP2009284468A (en) * | 2008-04-23 | 2009-12-03 | Sharp Corp | Personal digital assistant, computer readable program and recording medium |
CN102841745A (en) * | 2012-06-28 | 2012-12-26 | 宇龙计算机通信科技(深圳)有限公司 | Page display method and communication terminal |
-
2016
- 2016-02-03 JP JP2016018637A patent/JP6812639B2/en active Active
-
2017
- 2017-01-31 US US15/421,134 patent/US10838550B2/en active Active
- 2017-02-03 CN CN201710063078.1A patent/CN107085477B/en active Active
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101963862A (en) | 2010-09-26 | 2011-02-02 | 苏州瀚瑞微电子有限公司 | Coordinate anti-trembling method on touch screen |
US20120113007A1 (en) * | 2010-11-05 | 2012-05-10 | Jonathan Koch | Device, Method, and Graphical User Interface for Manipulating Soft Keyboards |
JP2014153951A (en) | 2013-02-08 | 2014-08-25 | Shimane Prefecture | Touch type input system and input control method |
US20150015507A1 (en) | 2013-07-09 | 2015-01-15 | Canon Kabushiki Kaisha | Information processing apparatus, method for controlling same, and recording medium |
JP2015018325A (en) | 2013-07-09 | 2015-01-29 | キヤノン株式会社 | Information processor and control method thereof, program, recording medium |
US20150095846A1 (en) * | 2013-09-30 | 2015-04-02 | Microsoft Corporation | Pan and selection gesture detection |
US20150130718A1 (en) * | 2013-11-12 | 2015-05-14 | Kabushiki Kaisha Tokai Rika Denki Seisakusho | Information processor |
JP2015138287A (en) | 2014-01-20 | 2015-07-30 | Alpine Electronics, Inc. | Information processing apparatus |
US20150253952A1 (en) | 2014-03-10 | 2015-09-10 | Toyota Jidosha Kabushiki Kaisha | Vehicle operation apparatus |
CN104915134A (en) | 2014-03-10 | 2015-09-16 | 丰田自动车株式会社 | Vehicle operation apparatus |
Also Published As
Publication number | Publication date |
---|---|
CN107085477A (en) | 2017-08-22 |
JP2017138759A (en) | 2017-08-10 |
JP6812639B2 (en) | 2021-01-13 |
US20170220198A1 (en) | 2017-08-03 |
CN107085477B (en) | 2020-12-01 |
Similar Documents
Publication | Title |
---|---|
US9292188B2 (en) | Information processing apparatus, control method thereof, and storage medium |
US9479658B2 (en) | Image forming apparatus interface where user selections are displayed in a hierarchical manner |
US9998617B2 (en) | Control program for providing automatic notifications regarding setting change events |
US9288345B2 (en) | Data processing apparatus and method for processing data |
US10402080B2 (en) | Information processing apparatus recognizing instruction by touch input, control method thereof, and storage medium |
US9524029B2 (en) | Indeterminable gesture recognition using accumulated probability factors |
US9565324B2 (en) | Apparatus, non-transitory computer readable medium, and method |
US10838550B2 (en) | Changing specification of operation based on start position |
US20160227057A1 (en) | Methods for Optimizing Display Space of a User Interface of an Imaging Apparatus |
US10416870B2 (en) | Display control device and non-transitory computer-readable storage medium having program recorded thereon |
US10338808B2 (en) | Information processing apparatus and storage medium |
US11775237B2 (en) | Display device capable of displaying preview image |
CN111182166B (en) | Display device and computer-readable non-transitory recording medium storing display control program |
US20160224214A1 (en) | Methods for Optimizing Display Space of a User Interface of an Imaging Apparatus |
US10911619B2 (en) | Input device, image forming apparatus, and non-transitory computer readable medium for allocating a function to a visually unascertainable detection region |
JP6834644B2 (en) | Input device, image forming device and program |
US10712926B2 (en) | Display input device, image forming apparatus, and control method for display input device |
US20140089843A1 (en) | Information processing apparatus and method of controlling the same, and storage medium thereof |
JP6455476B2 (en) | Display operation device and operation instruction receiving program |
JP6418119B2 (en) | Display device and image forming apparatus having the same |
JP6379893B2 (en) | Display system and display program |
US20180292947A1 (en) | Detection device and apparatus |
JP2018156589A (en) | Input device, image forming apparatus, and program |
JP2018160079A (en) | Input device and program |
JP2017130708A (en) | Electronic apparatus, control program for the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: SEIKO EPSON CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: SUGAHARA, TATSUYA; REEL/FRAME: 041138/0102. Effective date: 20170116 |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |