EP1645946A2 - An apparatus for manipulating an object displayed on a display device - Google Patents

An apparatus for manipulating an object displayed on a display device

Info

Publication number
EP1645946A2
EP1645946A2 (Application EP05027103A)
Authority
EP
European Patent Office
Prior art keywords
information
display
manipulation
touch panel
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP05027103A
Other languages
German (de)
French (fr)
Other versions
EP1645946A3 (en)
EP1645946B1 (en)
Inventor
Yu Minakuchi
Satoshi Okuyama
Hajime Kamata
Akiko Fukue
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Fujitsu Ltd
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd, Apple Inc
Publication of EP1645946A2
Publication of EP1645946A3
Application granted
Publication of EP1645946B1
Anticipated expiration
Legal status: Expired - Lifetime

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F3/04845: Interaction techniques based on GUIs for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0485: Scrolling or panning
    • G06F3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on GUIs using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques based on GUIs using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Definitions

  • Fig. 11 is a diagram illustrating a distort-restore manipulation.
  • The system controller 50 calculates an amount of distortion of the object based on the pressure reported by the touch report 3R. It stores in the display information table 1T a special-state file name specifying one of the special-state files (for displaying a distorted state of the object in turn-over indication) corresponding to the calculated amount of distortion. Then, the system controller 50 sends a display update request 4Q to the display controller 52, commanding that the special-state file be displayed at the current display position.
  • The system controller 50 sends a display update request 4Q (with a normal display file name specified) to the display controller 52, commanding that a normal display file (normal indication) be displayed at the current display position.
  • A plurality of special-state files are provided in the hard disk 7, corresponding to the amount of distortion of the object, which results from the pressure applied to the touch screen 11.
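The distortion step described above can be sketched as a simple mapping from the reported pressure to one of the prepared special-state files. This is an illustrative Python sketch only: the linear mapping, the maximum pressure, and the file names are assumptions, since the patent states only that a file corresponding to the calculated amount of distortion is selected.

```python
# Special-state files prepared on the hard disk, one per distortion level
# (names are hypothetical).
SPECIAL_FILES = ["distort_1.dat", "distort_2.dat", "distort_3.dat"]
MAX_PRESSURE = 1.0  # assumed full-scale pressure of the touch screen

def select_special_file(pressure: float) -> str:
    """Map the reported pressure to the special-state file for that
    amount of distortion (assumed linear mapping)."""
    distortion = min(pressure / MAX_PRESSURE, 1.0)              # 0.0 .. 1.0
    level = min(int(distortion * len(SPECIAL_FILES)),
                len(SPECIAL_FILES) - 1)                         # clamp to last file
    return SPECIAL_FILES[level]

print(select_special_file(0.2))   # distort_1.dat
print(select_special_file(0.95))  # distort_3.dat
```

When the pressure changes, the system controller would re-run this selection and issue a fresh display update request, so the on-screen distortion tracks the finger pressure.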
  • The present invention regards a display screen as a virtual space. It defines conditions and physical properties of an object (e.g. weight, hardness, frictional resistance, center of gravity) in the display information table 1T. It also receives touch screen information 2I, indicating a finger-touched position and pressure, from the touch screen unit 1. Based on the touch screen information 2I and the display information table 1T, the present invention determines a manipulation to be conducted on the displayed object, e.g. scrolling, picking (up), pushing, rolling or distorting the object on the display surface of the display unit 3.
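The recognition step above can be sketched as a dispatch on the object's defined properties and the observed touch behaviour. The rules below are simplified assumptions for illustration; the patent names scrolling, picking (up), pushing, rolling and distorting as example manipulations but does not give the decision logic.

```python
def recognize_manipulation(object_type: str, moved: bool, pressure: float) -> str:
    """Decide which manipulation to perform, given the object type from the
    display information table and the time-based touch information.
    Thresholds and type names are hypothetical."""
    if pressure > 0.8:              # hard press: distort the object
        return "distort"
    if moved:                       # touch position changed over time
        if object_type == "scrollable":
            return "scroll"
        if object_type == "round":
            return "roll"
        return "push"
    return "pick"                   # light, stationary touch

print(recognize_manipulation("round", True, 0.3))  # roll
```

The point of the table 1T in this scheme is that the same finger motion yields different manipulations depending on the object's recorded physical properties, so the screen behaves like a virtual physical space.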

Abstract

An apparatus for use with a display device to manipulate an object displayed on the display device. The apparatus comprises a touch panel (11), mounted to the display device (3) and representing a display surface of the display device, which is sensitive to characteristics of a touching contact on the touch panel, the characteristics including at least coordinate positions of the touching contact, for outputting corresponding touch panel information representing the characteristics of the touching contact on the touch panel, and object manipulation means (50-52) for manipulating and displaying the object on the display device (3) in response to the touch panel information. The touch panel information includes time-based changes of the position of the touching with regard to moving manipulation of the object. The object manipulation means (50-52) comprises: file storage means (7) for storing at least one data file which stores object data for displaying the object; display information storage means (1T) for storing object information including an object type which specifies the shape or physical properties of the object, and display position information which specifies the position of the object on the display device; and display control means (52) responsive to the touch panel information and to the object information included in said display information storage means, for recognizing the type of manipulation of the object in accordance with the touch panel information, the object type and the display position information, and performing the manipulation by moving the object from one position to another on the display device (3) in accordance with the recognized type of manipulation.

Description

  • The present invention relates to an apparatus for use with a display device to manipulate an object displayed on the display device.
  • As use of computer systems for data processing has become widespread in recent years, more and more users are being required to input data to and converse with data processors such as work stations and personal computers. A vast range of application programs are available for recent data processors and even a complicated application can be processed by using such application programs in combination. However, there is a problem that such data processors are very difficult to handle, especially to manipulate an object displayed on a display device, for those who have little knowledge of computers.
  • Therefore, an apparatus for manipulating an object displayed on a display device, which is easy to use even for a person who has no special knowledge of computers, is in great demand.
  • Fig. 1 illustrates a computer system with a conventional user interface.
  • A computer system with a conventional user interface consists mainly of a central processing unit (CPU) 4, a main memory 5, a keyboard/mouse 2, a frame memory 6 and a hard disk interface 71, which are interconnected via a system bus interface, and also a hard disk 7 and a display unit 3, which are connected to the system bus interface via the hard disk interface and the frame memory 6, respectively. The main memory 5 stores a system control program and application programs which handle graphics processing, and provides a work area for use by the programs. The CPU 4 performs display operations under control of the programs. The hard disk 7 stores a data file for graphics to be displayed on the display unit 3. The frame memory 6 stores a frame of picture (or object) data to be displayed on the display unit 3.
  • To manipulate an object displayed on the display unit 3 in the above system, an operator is required to input a command for manipulating the object by using the keyboard/mouse 2, or to select an icon (a symbolic representation of a computer function) displayed on the display unit 3 by using the keyboard/mouse 2, in order to command a desired function. However, it is troublesome and annoying to use a keyboard/mouse and icons, and a person with little knowledge of computers tends to be reluctant even to touch a keyboard/mouse.
  • Therefore, it is a great problem that such data processors are very difficult to handle for those who have little knowledge of computers.
  • It is therefore desirable to provide an apparatus which can easily manipulate an object displayed on a display unit.
  • It is also desirable to provide a user interface with which a user can easily manipulate an object displayed on a display unit.
  • IBM Technical Disclosure Bulletin, vol. 33, No. 1B, June 1990, pages 277-278 discloses an apparatus according to the preamble of accompanying claim 1. In this prior art, icons resembling three-dimensional push buttons are displayed on the panel (screen). These icons can be "manipulated" by pushing them in or out, the icon display changing in appearance to reflect the amount of depression of the push button. There is only one type of object (icon) and they are displayed at fixed positions on the screen.
  • According to a first aspect of the present invention, there is provided an apparatus for use with a display device to manipulate an object displayed on the display device, the apparatus comprising:- a touch panel, mounted to the display device and representing a display surface of the display device, which is sensitive to characteristics of a touching contact on the touch panel, the characteristics including at least co-ordinate positions of the touching contact, for outputting corresponding touch panel information representing the characteristics of the touching contact on the touch panel; and object manipulation means for manipulating and displaying the object on the display device in response to said touch panel information; characterised in that: said touch panel information includes time-based changes of the position of the touching with regard to a moving manipulation of the object; and in that said object manipulation means comprises: file storage means for storing at least one data file which stores object data for displaying the object; display information storage means for storing object information including an object type which specifies the shape or physical properties of the object, and display position information which specifies the position of the object on the display device; and display control means responsive to the touch panel information and to the object information included in said display information storage means, for recognizing the type of manipulation of the object in accordance with the touch panel information, the object type and the display position information, and performing the manipulation by moving the object from one position to another on the display device in accordance with the recognized type of manipulation.
  • According to a second aspect of the invention, there is provided an apparatus for use with a display device to manipulate an object displayed on the display device, the apparatus comprising: a touch panel, mounted to the display device and representing a display surface of the display device, which is sensitive to characteristics of a touching contact on the touch panel, the characteristics including at least co-ordinate positions of the touching contact, for outputting corresponding touch panel information representing the characteristics of the touching contact on the touch panel; and object manipulation means for manipulating and displaying the object on the display device in response to said touch panel information; characterised in that: said touch panel information includes time-based changes of the position of the touching contact with regard to the manipulated object; and in that said object manipulation means comprises: file storage means for storing a plurality of data files which store object data for displaying the object; display information storage means for storing a plurality of items of object information including an object type which specifies the shape or physical properties of the object, and display position information which specifies the position of the object on the display device; and display control means responsive to the touch information and to the object information for recognizing the type of manipulation of the object in accordance with the touch panel information, the object type corresponding to the manipulated object and the display position information corresponding to the manipulated object, and performing the manipulation by changing the display of the object on the display device in accordance with the recognized type of manipulation.
  • According to a third aspect of the invention, there is provided an apparatus for use with a display device to manipulate an object displayed on the display device, the apparatus comprising: a touch panel, representing a display surface of the display device, which is sensitive to characteristics of a touching contact on the touch panel, the characteristics including at least co-ordinate positions of the touching contact, for outputting corresponding touch panel information representing the characteristics of the touching contact on the touch panel; and object manipulation means for manipulating and displaying the object of the display device in response to said touch panel information; characterised in that: said touch panel information represents changes of the position of the touching contact; and in that said object manipulation means comprises: file storage means for storing at least one data file which stores object data for displaying the object; display information storage means for storing object information including an object type which specifies the shape or physical properties of the object, and display position information which specifies the position of the object on the display device; and display control means responsive to the touch panel information and to the object information included in said display information storage means, for recognizing the type of manipulation of the object in accordance with the touch panel information and the object type, and for performing the manipulation of the object on the display device in accordance with the recognized type of manipulation.
  • Thus, the present invention can include a touch-sensitive panel (e.g. touch screen), means storing a plurality of data files, display information storage means and display control means, which can be used with a display device of a computer system or workstation.
  • Reference is made, by way of example, to the accompanying drawings, in which:
    • Fig. 1 illustrates a computer system with a conventional user interface;
    • Fig. 2 is a configuration diagram of a touch-screen-equipped workstation, to which the present invention may be applied;
    • Fig. 3 is a schematic diagram illustrating the principle of the present invention;
    • Fig. 4(a) shows a display information table;
    • Fig. 4(b) shows touch screen information;
    • Fig. 5 is a flowchart illustrating a pick manipulation;
    • Fig. 6 is a diagram illustrating a pick manipulation;
    • Fig. 7 is a diagram illustrating a scroll manipulation;
    • Fig. 8 is a diagram illustrating a push manipulation;
    • Fig. 9 is a diagram illustrating a flip manipulation;
    • Fig. 10 is a diagram illustrating a roll manipulation; and
    • Fig. 11 is a diagram illustrating a distort-restore manipulation.
  • Throughout the above-mentioned drawings, identical reference numerals are used to designate the same or similar component parts.
  • Fig. 2 is a configuration diagram of a touch screen-equipped workstation for implementing the present invention.
  • In addition to the conventional system shown in Fig. 1, the system includes an input-output (abbreviated to I/O) port 8, a touch screen controller 15 and a touch screen unit 1 with a touch screen 11. The touch screen controller 15, connected to the input-output port 8 through an RS-232C interface, controls the touch screen unit 1. The touch screen unit 1, which is sensitive to a position or positions (X-Y coordinates) where it is touched, and preferably also to a pressure applied to it, acts as a user interface that allows a user to send signals to the CPU by touching an area thereon with a body, such as a finger or a pencil.
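The patent does not specify the record format carried over the RS-232C link between the touch screen controller 15 and the I/O port 8; it states only that the unit reports touch coordinates and, preferably, pressure. As a purely illustrative sketch, assuming a simple ASCII record of the form `x,y,pressure`, the host side might parse each record as follows (all names and the wire format are hypothetical):

```python
def parse_touch_record(line: bytes) -> tuple:
    """Parse one assumed controller record "x,y,pressure" into
    (x, y, pressure). The record layout is an assumption for
    illustration, not the format defined by the patent."""
    x, y, p = line.decode("ascii").strip().split(",")
    return int(x), int(y), float(p)

print(parse_touch_record(b"120,340,0.42\r\n"))  # (120, 340, 0.42)
```

In the real system this parsing would sit behind the touch screen controller 15, which delivers the resulting touch screen information to the software described below.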
  • Fig. 3 is a schematic diagram illustrating the principle of the present invention.
  • For easy understanding of the principle, the input-output port 8, touch screen controller 15 and touch screen unit 1 shown in Fig. 2 are represented by the touch screen unit 1; and the frame memory 6 and display unit 3 are represented by the display unit 3.
  • A system controller 50, touch discriminator 51, display controller 52 and display information table 1T, which are stored in the main memory 5, control the display operations featured by the present invention.
  • Fig. 4(a) shows a display information table. Fig. 4(b) shows touch screen information.
  • A display information table 1T, which is provided in the main memory 5 and corresponds to objects, includes an object type, display position information, file information, a normal-display file name and a special-state file name. The object type defines the type, including the shape, properties, circumstances, etc., of the object. The display position information defines the size of the object (width, height), the position (top-left coordinates X, Y) and the angle at which the object is displayed on the display unit 3. The file information, which is used for an object so large that it requires scrolling to view the whole object, defines the size (width W, height H) of the whole object relative to the display screen size, and also the position (top-left coordinates X, Y) of the object portion being displayed on the display device, relative to the whole object data. The normal-display file name specifies a normal display file where object data for displaying a normal state of the object is stored. The special-state file name specifies a special-state file where object data for displaying a special state of the object (e.g. turn-over indication of display color, used for displaying an intermediate process of manipulating the object) is stored.
  • Touch-screen information 21, which is sent from the touch screen unit 1, includes a touch position (X-Y coordinates) where the touch screen 11 is touched and a pressure applied thereon.
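The two records just described can be sketched as data structures. The patent gives no source code, so the following Python sketch is purely illustrative; every field name is an assumption drawn from the description of table 1T and touch-screen information 21.

```python
# Hypothetical layout of the records described above; field names are
# illustrative assumptions, not taken from the patent (which shows no code).
from dataclasses import dataclass

@dataclass
class DisplayInfoEntry:
    """One entry of the display information table 1T."""
    object_type: str       # e.g. "out-screen", "gravity", "rollable", "elastic"
    x: int                 # top-left X of the object on the display
    y: int                 # top-left Y of the object on the display
    width: int             # displayed object width
    height: int            # displayed object height
    angle: int             # display angle in degrees
    file_x: int = 0        # position of the displayed portion in the whole data
    file_y: int = 0
    file_w: int = 0        # size (W, H) of the whole object data
    file_h: int = 0
    normal_file: str = ""  # normal-display file name
    special_file: str = "" # special-state (e.g. turn-over) file name

@dataclass
class TouchInfo:
    """Touch-screen information 21 sent by the touch screen unit 1."""
    x: int                 # X-Y coordinates where the touch screen is touched
    y: int
    pressure: float        # pressure applied on the touch screen
```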
  • The following are embodiments of the present invention.
    1. (1) The touch discriminator 51, based on the touch screen information 21 from the touch screen unit 1, discriminates the type of touch an operator's finger makes on the touch screen 11, that is, a touch type such as the "continuous touch start" and "continuous touch end" explained later. The touch discriminator 51 sends to the system controller 50 the result of the discrimination as a touch report 3R, which includes a touch type and touch coordinates.
      Based on the touch report 3R from the touch discriminator 51 and the display information table 1T, the system controller 50 determines the type of a manipulation conducted by an operator and, according to the determination, updates the display information table 1T. Then, the system controller 50 sends to the display controller 52 a display update request 4Q along with "display update data" which includes the contents of the updated display information table 1T (including display position information, file information, normal display file name and special-state file name).
      On receipt of the display update request 4Q from the system controller 50, the display controller 52 reads display file data (including object data) specified by the file name from the hard disk 7 and stores the data into the main memory 5. The display controller 52 then updates the object data in accordance with the display update data from the system controller 50 and loads the thus-updated object data into the frame memory 6 to display the object, as manipulated by the operator on the touch screen unit 1.
      Thus, the present invention determines a manipulation to be conducted on the object displayed, based on the touch screen information 21 which results from an operator's touching the touch screen 11 and the display information table 1T which defines the object's shape, physical properties, display position, etc. It then displays the object according to the manipulation determined, as intended by the operator.
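The discriminator-to-controller flow of this embodiment can be sketched roughly as follows. The touch-type strings follow the text; the function names, the dictionary-based table entry, and the classification rules shown are hypothetical simplifications, not the patent's actual logic.

```python
# Illustrative sketch (assumed structure) of the flow described above: the
# touch discriminator classifies raw touch samples, and the system controller
# maps the resulting report plus the object type to a manipulation.
def discriminate(prev, cur, idle_ms):
    """Touch discriminator 51: classify a sample into a touch type."""
    if prev is None:
        return "continuous touch start"
    if idle_ms > 100:              # no sample for >100 ms ends the touch
        return "continuous touch end"
    return "continuous touch in progress"

def determine_manipulation(touch_type, entry):
    """System controller 50: combine the touch report with the object type
    from the display information table 1T (rules here are examples only)."""
    if entry["object_type"] == "out-screen" and touch_type == "continuous touch start":
        return "scroll"            # large object extending beyond the screen
    if entry["object_type"] == "elastic":
        return "distort-restore"
    return "push"

entry = {"object_type": "out-screen"}
manip = determine_manipulation(discriminate(None, (800, 800), 0), entry)
```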
    2. (2) Pick manipulation (see Figs. 5 and 6.)
      A pick manipulation is conducted in such a way that an object is picked up at a position on the display surface of the display unit 3 and placed at another position.
      Fig. 5 is a flowchart illustrating a pick manipulation. Fig. 6 is a diagram illustrating a pick manipulation.
      A pick manipulation is carried out according to the following steps (S1-S8) in Fig. 5:
      • (S1) The system controller 50 receives a touch report 3R from the touch discriminator 51.
      • (S2) The system controller 50 checks the touch report 3R to see whether the object-finger relation is a pick manipulation as shown in Fig. 6(a), based on the touch report 3R and contents of the display information table 1T shown in Fig. 6(c). When the relation is not a pick manipulation, the system controller 50 checks the touch report 3R for other manipulation.
      • (S3) When the relation is a pick manipulation, the system controller 50 sends a display update request 4Q including "display update data", commanding that the special-state file (turn-over indication) be displayed at the position specified by the display information table 1T.
      • (S4) The system controller 50 receives a touch report 3R.
      • (S5) The system controller 50 determines whether the touch report 3R includes a "continuous touch end", which occurs when the finger-object relation is as in Fig. 6(b). When a "continuous touch end" is reported, the operation goes to step (S8).
      • (S6) Otherwise, the system controller 50 updates the display position information "coordinates (X, Y)" of the display information table 1T so that the object is positioned between the two fingers.
      • (S7) The system controller 50 sends display update request 4Q to the display controller 52, commanding that the special-state file be displayed according to the display information table 1T updated, and returns to step (S4).
      • (S8) When "continuous touch end" is reported by a touch report 3R, the system controller 50 sends a display update request 4Q to the display controller 52, commanding that the normal-display file be displayed at the position specified in the display information table 1T.
      The following manipulations are carried out in the same way as described in the above flowchart of the pick manipulation.
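Steps S1-S8 above can be condensed into a short sketch. This is a non-authoritative illustration: the report and table formats are assumptions, and the display update requests 4Q are reduced to comments.

```python
# Sketch of the pick manipulation loop (S1-S8), with the touch reports
# supplied as a list. All field names are illustrative assumptions.
def pick_manipulation(reports, table):
    """Show the special-state (turn-over) file while the pick is in
    progress, and the normal-display file once the touch ends."""
    shown = table["special_file"]                  # S3: turn-over indication
    for rpt in reports:                            # S4: next touch report 3R
        if rpt["type"] == "continuous touch end":  # S5: fingers released?
            shown = table["normal_file"]           # S8: back to normal display
            break
        # S6: position the object between the two fingers
        (x1, y1), (x2, y2) = rpt["fingers"]
        table["x"], table["y"] = (x1 + x2) // 2, (y1 + y2) // 2
        # S7: a display update request 4Q would be sent here
    return shown
```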
    3. (3) Scroll manipulation (see Fig. 7.)
      A scroll manipulation is conducted in such a way that an object extending outside of the display surface of the display unit 3 is moved into and out of the display surface.
      Fig. 7 is a diagram illustrating a scroll manipulation.
      On determining, based on the touch screen information 21 from the touch screen unit 1, that the finger moves while touching the touch screen 11, the discriminator 51 sends to the system controller 50 a touch report 3R including "continuous touch start" for the touch type and "coordinates (800, 800)" for the touch position. As further touch screen information 21 comes in, the discriminator 51 sends a touch report 3R including "continuous touch in progress" and coordinates (780, 800). When no touch screen information 21 is sent for more than 100 milliseconds, for example, the discriminator 51 sends a touch report 3R including "continuous touch end" and coordinates (700, 800) to the system controller 50.
      When a "continuous touch start" is reported and the "object type" is defined as "out-screen" in the display information table 1T, the system controller 50 recognizes the object as a large one extending beyond the display screen. Then, the system controller 50 determines the speed at which the finger has moved from right to left, for example, based on a change in the X-coordinate in the touch report 3R.
      Depending on whether the finger has moved more (high-speed) or less (normal-speed) than, for example, 20 dots (pixels) since the last check, the display screen is first scrolled at an interval of 100 or 500 milliseconds, respectively. Then the interval at which the display update request 4Q is sent to the display controller 52 is increased by a factor of 1.5 at each touch report 3R and, when the interval reaches 2 seconds, the scrolling is stopped.
      Practically, the screen is so controlled that it starts scrolling at an above-mentioned speed after a finger has moved a distance of 4 dots or more. That is, on recognizing that the finger has moved that distance, the system controller 50 updates the file information "display position X" of the display information table 1T so that the object is displayed to the left by 10 dots, for example. Then, it sends to the display controller 52 a display update request including display position information, file information and normal display file name from the updated display information table 1T.
      The display controller 52 reads from the hard disk a display file specified by the normal display file name and loads it in the main memory 5. The display controller 52 then transfers only the part of the display file specified by the file information "display position X" of the display information table 1T, from the main memory 5 to the appropriate location of the frame memory 6.
      In the same way, the system controller 50 sends a display update request 4Q to the display controller 52 every time it receives a touch report 3R.
      When another "continuous touch" is reported before the scroll currently in progress comes to a stop, a new scroll can start from this point and at the first speed described above.
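The scroll timing rule described above can be sketched as follows: start at 100 ms (finger moved more than 20 dots since the last check) or 500 ms, multiply the interval by 1.5 per update, and stop once it reaches 2 seconds. The numbers come from the text; the function name is an assumption.

```python
# Sketch of the scroll deceleration schedule described above.
def scroll_intervals(dots_moved):
    """Return the successive intervals (ms) at which display update
    requests 4Q are sent, until the 2-second stop threshold is reached."""
    interval = 100 if dots_moved > 20 else 500   # high-speed vs normal-speed
    intervals = []
    while interval < 2000:                       # stop when interval hits 2 s
        intervals.append(interval)
        interval *= 1.5                          # grow by a factor of 1.5
    return intervals
```

With a high-speed move the schedule runs 100, 150, 225, ... ms and the scroll coasts to a stop; a normal-speed move starts at 500 ms and stops sooner.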
    4. (4) Scroll-stop manipulation (see Fig. 7.)
      Fig. 7 is a diagram illustrating a scroll manipulation.
      When a touch position given by a touch report 3R is the same as, or within approximately 5 dots of, the position of the scrolling currently in progress, the system controller 50 doubles the frequency with which display update requests 4Q are sent to the display controller 52, in order to put an end to the scrolling.
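The stop rule above amounts to halving the update interval when a touch lands at (or within about 5 dots of) the scrolling position. A minimal sketch, with the function name and one-axis comparison as assumptions:

```python
# Sketch of the scroll-stop rule: touching near the scrolling position
# doubles the display-update frequency, i.e. halves the interval.
def on_stop_touch(touch_x, scroll_x, interval_ms, tolerance=5):
    if abs(touch_x - scroll_x) <= tolerance:
        return interval_ms / 2   # double the frequency of requests 4Q
    return interval_ms           # touch elsewhere: schedule unchanged
```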
    5. (5) Push manipulation (see Fig. 8.)
      A push manipulation is conducted in such a way that an object is pushed on the display surface of the display unit 3.
      Fig. 8 is a diagram illustrating a push manipulation.
      The system controller 50 determines the type of a manipulation, based on the touch report 3R and contents of the display information table 1T shown in Fig. 8(c). When the manipulation is a push manipulation as shown in Fig. 8(a), the system controller 50 sends to the display controller 52, a display update request 4Q including display position information, file information and normal display file name so that the object is displayed close to the finger position reported by the touch report 3R. The above display operation is repeated until a "continuous touch end" is reported by a touch report 3R.
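The repeated display operation above can be sketched as a loop that redraws the object near each reported finger position until the touch ends. The report format and the fixed offset are illustrative assumptions; the patent only says "close to the finger position".

```python
# Sketch of the push manipulation: each touch report 3R repositions the
# object next to the finger until a "continuous touch end" is reported.
def push(reports, table, offset=5):
    for rpt in reports:
        if rpt["type"] == "continuous touch end":
            break
        # redraw the object close to the reported finger position
        # (a display update request 4Q would be sent here)
        table["x"] = rpt["x"] + offset
        table["y"] = rpt["y"]
    return table
```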
    6. (6) Push-while-rotate manipulation (see Fig. 8.)
      A push-while-rotate manipulation is conducted in such a way that an object is pushed at a position off its center (or its center of gravity) and moves while rotating on the display surface of the display unit 3.
      Fig. 8 is a diagram illustrating a push manipulation.
      The system controller 50 determines the type of a manipulation, based on the touch report 3R and contents of the display information table 1T shown in Fig. 8(c). When the manipulation is a push-while-rotate manipulation as shown in Fig. 8(b), the system controller 50 sends to the display controller 52, display update requests 4Q with the angle of rotation increasing by 2 degrees, i.e., while increasing the angle in the display information table 1T shown in Fig. 8(c).
      The display controller 52 reads the display file from the hard disk and loads the data in the main memory 5, rotates the object by the angle and with the left-top coordinates (X, Y) as a rotational center, as specified by the display update request 4Q, and transfers the data with the object rotated, from the main memory 5 to the frame memory 6.
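The angle updates described above can be sketched directly: each display update request carries an angle 2 degrees greater than the last, applied about the top-left coordinates. The function name and the wrap-around at 360 degrees are assumptions.

```python
# Sketch of the push-while-rotate rule: successive display update requests
# 4Q carry the display angle increased by 2 degrees each time.
def rotate_updates(start_angle, n_updates, step=2):
    """Angles (degrees) written into the display information table 1T
    for each of n_updates successive requests."""
    return [(start_angle + step * i) % 360 for i in range(1, n_updates + 1)]
```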
    7. (7) Flip manipulation (see Fig. 9.)
      A flip manipulation is conducted in such a way that a finger flips an object, or touches the object arriving from a remote position at a high speed, on the display surface of the display unit 3.
      Fig. 9 is a diagram illustrating a flip manipulation.
      When a touch report 3R is input from the touch discriminator 51, the system controller 50 discriminates the type of a manipulation based on the touch report 3R and the contents of the display information table 1T shown in Fig. 9(c). When the manipulation is a flip manipulation as shown in Fig. 9(a), the system controller 50 obtains a finger speed based on the touch report 3R and also an object speed (i.e., the interval at which display update requests 4Q are sent to the display controller 52), in the same way as described in item (3). The system controller 50 sends display update requests 4Q to the display controller 52, while updating the display position information left-top coordinates (X, Y) of the display information table 1T so that the object moves in the direction the finger moves. The system controller 50 stops moving the object when the above-mentioned interval reaches 2 seconds.
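The flipped object's coasting motion can be sketched with the same interval schedule as the scroll manipulation: positions are emitted while the update interval grows by a factor of 1.5, and motion stops at 2 seconds. The 100 ms starting interval and a constant per-update step are simplifying assumptions.

```python
# Sketch of the flip manipulation: the object keeps moving in the finger's
# direction while the update interval grows 1.5x per step, stopping at 2 s.
def flip_positions(x, y, dx, dy):
    """Positions written into the display information table 1T for each
    display update request 4Q after a flip."""
    positions, interval = [], 100         # assumed 100 ms starting interval
    while interval < 2000:                # stop when the interval reaches 2 s
        x, y = x + dx, y + dy             # continue in the finger's direction
        positions.append((x, y))
        interval *= 1.5
    return positions
```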
    8. (8) Flip-under-gravity manipulation (see Fig. 9.)
      A flip-under-gravity manipulation is conducted in such a way that an object which is subjected to gravity is flipped by a finger on the display surface of the display unit 3.
      Fig. 9 is a diagram illustrating a flip manipulation.
      When the finger manipulation is a flip as in the above item (7) and the display information table 1T defines the object type as "gravity", meaning that the object is subjected to gravity, for example, the object moves under the combined influences of inertia and simulated gravity, i.e. "falls" as shown in Fig. 9(b). Therefore, the system controller 50 sends display update requests 4Q to the display controller 52, while updating the display position information left-top coordinates (X, Y) by adding a value to the Y-coordinate of the display information table 1T. The value is represented by 2 to the Nth power (N: the number of display update requests 4Q sent). In this case, too, the system controller 50 stops moving the object when the above-mentioned interval reaches 2 seconds. The resulting trajectory may be a parabola.
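The Y-increment of 2 to the Nth power stated above can be worked through in a short sketch; the horizontal step per update is an assumption standing in for the flip-derived inertia.

```python
# Sketch of the flip-under-gravity rule: each update N adds 2**N dots to
# the Y-coordinate while inertia continues the horizontal motion, so the
# path approximates a falling parabola.
def fall_positions(x, y, dx, n_updates):
    positions = []
    for n in range(1, n_updates + 1):
        x += dx            # inertia from the flip (assumed constant step)
        y += 2 ** n        # simulated gravity: 2 to the Nth power, per text
        positions.append((x, y))
    return positions
```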
    9. (9) Roll manipulation (see Fig. 10.)
      A roll manipulation is conducted in such a way that a rollable object is rolled by a finger on the display surface of the display unit 3.
      Fig. 10 is a diagram illustrating a roll manipulation.
      When a touch report 3R is input from the touch discriminator 51 and the display information table 1T defines the object type as "rollable", meaning that the object is constructed such that it rolls when flipped, like a globe or a cylinder, as shown in Fig. 10(a), the system controller 50 sends display update requests 4Q to the display controller 52, while updating the display position information left-top coordinates (X, Y) of the display information table 1T so that the object moves in the direction the finger moves, over a distance 10 per cent shorter than the distance the finger moves.
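The 10 per cent lag above translates to moving the object 90 per cent of the finger's displacement in the finger's direction. A minimal sketch, with the function name as an assumption:

```python
# Sketch of the roll rule: a "rollable" object follows the finger's
# direction but covers only 90 per cent of the finger's distance.
def roll_target(obj_x, obj_y, finger_dx, finger_dy, lag=0.10):
    return (obj_x + finger_dx * (1 - lag),
            obj_y + finger_dy * (1 - lag))
```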
    10. (10) Distort-restore manipulation (see Fig. 11.)
      A distort-restore manipulation is conducted in such a way that an "elastic" object is pressed by a finger on the display surface of the display unit 3, thereby deforming the displayed object.
  • Fig. 11 is a diagram illustrating a distort-restore manipulation.
  • When a touch report 3R is input from the touch discriminator 51 and the display information table 1T defines the object type as "elastic" meaning that the object can be distorted and restored according to a pressure applied thereon by a finger, as shown in Fig. 11(a), the system controller 50 calculates an amount of distortion of the object based on the pressure reported by the touch report 3R. It stores in the display information table 1T, a special-state file name specifying one of special-state files (for displaying a distorted state of the object in turn-over indication) corresponding to the amount of distortion calculated. Then, the system controller 50 sends a display update request 4Q to the display controller 52, commanding that the special-state file be displayed at the current display position. When the above operation is repeated as necessary and a "continuous touch end" is reported by a touch report 3R, the system controller 50 sends a display update request 4Q (with a normal display file name specified) to the display controller 52, commanding that a normal display file (normal indication) be displayed at the current display position. A plurality of special-state files are provided in the hard disk 7, corresponding to the amount of distortion of the object, which results from a pressure applied on the touch screen 11.
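The file selection described above, where the reported pressure picks one of several special-state files graded by amount of distortion, can be sketched as follows. The thresholds, level count, and file names are illustrative assumptions; the patent only says the files correspond to the calculated amount of distortion.

```python
# Sketch of the distort-restore rule: the touch pressure selects a
# special-state file; releasing the touch restores the normal-display file.
def distortion_file(pressure, max_pressure=1.0, n_levels=4):
    """Pick a file name for the given touch pressure (names are assumed)."""
    if pressure <= 0:
        return "normal.dat"       # touch ended: normal display restored
    level = min(n_levels, 1 + int(pressure / max_pressure * n_levels))
    return f"distort_{level}.dat" # deeper press -> more distorted state
```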
  • As is apparent from the above description, the present invention regards a display screen as a virtual space. It defines conditions and physical properties of an object (e.g., weight, hardness, frictional resistance, center of gravity) in the display information table 1T. It also receives, from the touch screen unit 1, touch screen information 21 indicating a finger-touched position and pressure. Based on the touch screen information 21 and the display information table 1T, the present invention determines a manipulation to be conducted on the object displayed, e.g. scrolling, picking (up), pushing, rolling, or distorting the object on the display surface of the display unit 3. Thus, the present invention allows a user to manipulate an object displayed on a display device quite easily, even when the user has little knowledge of computers.

Claims (6)

  1. An apparatus for use with a display device to manipulate an object displayed on the display device, the apparatus comprising:-
    a touch panel, mounted to the display device and representing a display surface of the display device, which is sensitive to characteristics of a touching contact on the touch panel, the characteristics including at least co-ordinate positions of the touching contact, for outputting corresponding touch panel information representing the characteristics of the touching contact on the touch panel; and
    object manipulation means for manipulating and displaying the object on the display device in response to said touch panel information;
    characterised in that:
    said touch panel information includes time-based changes of the position of the touching with regard to a moving manipulation of the object;
    and in that said object manipulation means comprises:
    file storage means for storing at least one data file which stores object data for displaying the object;
    display information storage means for storing object information including an object type which specifies the shape or physical properties of the object, and display position information which specifies the position of the object on the display device; and
    display control means responsive to the touch panel information and to the object information included in said display information storage means, for recognizing the type of manipulation of the object in accordance with the touch panel information, the object type and the display position information, and performing the manipulation by moving the object from one position to another on the display device in accordance with the recognized type of manipulation.
  2. A controller for controlling an apparatus according to claim 1, the controller comprising:
    outputting means for outputting touch panel information representing a position of the touching; and
    object manipulation means for manipulating and displaying the object on the display device in response to said touch panel information;
    characterised in that:
    said touch panel information includes time-based changes of the position of the touching with regard to a moving manipulation of the object;
    and in that said object manipulation means comprises:
    display information storage means for storing object information including an object type which specifies the shape or physical properties of the object, and display position information which specifies the position of the object on the display device; and
    display control means responsive to the touch information and to the object information for recognizing the type of manipulation of the object in accordance with the touch panel information, the object type and the display position information, and performing the manipulation by moving the object from one position to another on the display device in accordance with the recognized type of manipulation.
  3. An apparatus for use with a display device to manipulate an object displayed on the display device, the apparatus comprising:
    a touch panel, mounted to the display device and representing a display surface of the display device, which is sensitive to characteristics of a touching contact on the touch panel, the characteristics including at least co-ordinate positions of the touching contact, for outputting corresponding touch panel information representing the characteristics of the touching contact on the touch panel; and
    object manipulation means for manipulating and displaying the object on the display device in response to said touch panel information;
    characterised in that:
    said touch panel information includes time-based changes of the position of the touching contact with regard to the manipulated object;
    and in that said object manipulation means comprises:
    file storage means for storing a plurality of data files which store object data for displaying the object;
    display information storage means for storing a plurality of items of object information including an object type which specifies the shape or physical properties of the object, and display position information which specifies the position of the object on the display device; and
    display control means responsive to the touch information and to each item of object information for recognizing the type of manipulation of the object in accordance with the touch panel information, the object type corresponding to the manipulated object and the display position information corresponding to the manipulated object, and performing the manipulation by changing the display of the object on the display device in accordance with the recognized type of manipulation.
  4. A controller for controlling an apparatus according to claim 1, the controller comprising:
    outputting means for outputting touch panel information representing a position of the touching; and
    object manipulation means for manipulating and displaying the object on the display device in response to said touch panel information;
    characterised in that:
    said touch panel information includes time-based changes of the position of the touching contact with regard to the manipulated object;
    and in that said object manipulation means comprises:
    file storage means for storing a plurality of data files which store object data for displaying the object;
    display information storage means for storing a plurality of items of object information including an object type which specifies the shape or physical properties of each object, and display position information which specifies the position of each object on the display device; and
    display control means responsive to the touch information and to each item of object information included in said display information storage means, for recognizing the type of manipulation of the object in accordance with the touch panel information, the object type corresponding to the manipulated object and the display position information, and performing the manipulation by changing the display of the object on the display device in accordance with the recognized type of manipulation.
  5. An apparatus for use with a display device to manipulate an object displayed on the display device, the apparatus comprising:
    a touch panel, representing a display surface of the display device, which is sensitive to characteristics of a touching contact on the touch panel, the characteristics including at least coordinate positions of the touching contact, for outputting corresponding touch panel information representing the characteristics of the touching contact on the touch panel; and
    object manipulation means for manipulating and displaying the object on the display device in response to said touch panel information;
    characterised in that:
    said touch panel information represents changes of the position of the touching contact;
    and in that said object manipulation means comprises:
    file storage means for storing at least one data file which stores object data for displaying the object;
    display information storage means for storing object information including an object type which specifies the shape or physical properties of the object, and display position information which specifies the position of the object on the display device; and
    display control means responsive to the touch panel information and to the object information included in said display information storage means, for recognizing the type of manipulation of the object in accordance with the touch panel information and the object type, and for performing the manipulation of the object on the display device in accordance with the recognized type of manipulation.
  6. A controller for controlling an apparatus according to claim 3, the controller comprising:
    outputting means for outputting touch panel information representing a position of the touching; and
    object manipulation means for manipulating and displaying the object on the display device in response to said touch panel information;
    characterised in that:
    said touch panel information represents changes of touching; and in that said object manipulation means comprises:
    display control means responsive to the touch information and to the object information for recognizing the type of manipulation of the object in accordance with the touch panel information and the object type, and for performing the manipulation of the object on the display device in accordance with the recognized type of manipulation.
EP05027103A 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device Expired - Lifetime EP1645946B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP25823291A JP2827612B2 (en) 1991-10-07 1991-10-07 A touch panel device and a method for displaying an object on the touch panel device.
EP99109587A EP0938039B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device
EP92117111A EP0536715B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device

Related Parent Applications (3)

Application Number Title Priority Date Filing Date
EP92117111.2 Division 1992-10-07
EP99109587A Division EP0938039B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device
EP99109587.8 Division 1999-05-14

Publications (3)

Publication Number Publication Date
EP1645946A2 true EP1645946A2 (en) 2006-04-12
EP1645946A3 EP1645946A3 (en) 2006-06-07
EP1645946B1 EP1645946B1 (en) 2011-06-08

Family

ID=17317357

Family Applications (6)

Application Number Title Priority Date Filing Date
EP99109484A Expired - Lifetime EP0938037B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device
EP92117111A Expired - Lifetime EP0536715B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device
EP99109587A Expired - Lifetime EP0938039B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device
EP99109601A Expired - Lifetime EP0938040B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device
EP05027103A Expired - Lifetime EP1645946B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device
EP99109584A Expired - Lifetime EP0938038B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device

Family Applications Before (4)

Application Number Title Priority Date Filing Date
EP99109484A Expired - Lifetime EP0938037B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device
EP92117111A Expired - Lifetime EP0536715B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device
EP99109587A Expired - Lifetime EP0938039B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device
EP99109601A Expired - Lifetime EP0938040B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP99109584A Expired - Lifetime EP0938038B1 (en) 1991-10-07 1992-10-07 An apparatus for manipulating an object displayed on a display device

Country Status (5)

Country Link
US (1) US5844547A (en)
EP (6) EP0938037B1 (en)
JP (1) JP2827612B2 (en)
KR (1) KR950012490B1 (en)
DE (5) DE69233133T2 (en)

Families Citing this family (216)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5452240A (en) * 1993-11-23 1995-09-19 Roca Productions, Inc. Electronically simulated rotary-type cardfile
CN1059303C (en) * 1994-07-25 2000-12-06 国际商业机器公司 Apparatus and method for marking text on a display screen in a personal communications device
KR100426295B1 (en) * 1995-07-20 2004-05-17 가부시키가이샤 세가 Image processing unit and game machine
US6243619B1 (en) 1996-05-10 2001-06-05 Amada Company, Ltd. Control method and apparatus for plate material processing machine
EP0902920A1 (en) * 1996-06-07 1999-03-24 Amada Company Limited Control method and apparatus for plate material processing machine
US6417844B1 (en) * 1996-06-25 2002-07-09 Seiko Epson Corporation Data processing apparatus and data processing method
JPH10111776A (en) * 1996-10-08 1998-04-28 Sharp Corp Information processor
JP4044169B2 (en) * 1997-02-26 2008-02-06 株式会社アマダ Display method of information setting screen along process flow and multi-window type NC device having the function
KR100224998B1 (en) * 1997-04-09 1999-10-15 구자홍 Apparatus and method for remote control user interface of pc system
JP3504464B2 (en) * 1997-07-30 2004-03-08 インターナショナル・ビジネス・マシーンズ・コーポレーション Data input device and method
US6920619B1 (en) * 1997-08-28 2005-07-19 Slavoljub Milekic User interface for removing an object from a display
USRE43318E1 (en) * 1997-08-28 2012-04-17 Flatworld Interactives, Llc User interface for removing an object from a display
EP0910011A3 (en) * 1997-10-14 2000-12-06 Canon Kabushiki Kaisha Apparatus and method for displaying job list, and storage medium for such a program
GB9722766D0 (en) 1997-10-28 1997-12-24 British Telecomm Portable computers
WO1999028811A1 (en) * 1997-12-04 1999-06-10 Northern Telecom Limited Contextual gesture interface
US6131047A (en) 1997-12-30 2000-10-10 Ericsson Inc. Radiotelephones having contact-sensitive user interfaces and methods of operating same
JPH11203044A (en) * 1998-01-16 1999-07-30 Sony Corp Information processing system
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US7614008B2 (en) 2004-07-30 2009-11-03 Apple Inc. Operation of a computer with touch screen interface
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
EP1717682B1 (en) * 1998-01-26 2017-08-16 Apple Inc. Method and apparatus for integrating manual input
US6038333A (en) * 1998-03-16 2000-03-14 Hewlett-Packard Company Person identifier and management system
US6610917B2 (en) 1998-05-15 2003-08-26 Lester F. Ludwig Activity indication, external source, and processing loop provisions for driven vibrating-element environments
US6396477B1 (en) 1998-09-14 2002-05-28 Microsoft Corp. Method of interacting with a computer using a proximity sensor in a computer input device
US7358956B2 (en) * 1998-09-14 2008-04-15 Microsoft Corporation Method for providing feedback responsive to sensing a physical presence proximate to a control of an electronic device
US7256770B2 (en) 1998-09-14 2007-08-14 Microsoft Corporation Method for displaying information responsive to sensing a physical presence proximate to a computer input device
US6456275B1 (en) * 1998-09-14 2002-09-24 Microsoft Corporation Proximity sensor in a computer input device
US6333753B1 (en) 1998-09-14 2001-12-25 Microsoft Corporation Technique for implementing an on-demand display widget through controlled fading initiated by user contact with a touch sensitive input device
US20020018051A1 (en) * 1998-09-15 2002-02-14 Mona Singh Apparatus and method for moving objects on a touchscreen display
US6933919B1 (en) * 1998-12-03 2005-08-23 Gateway Inc. Pointing device with storage
US7469381B2 (en) * 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
SE515077C2 (en) * 1999-04-27 2001-06-05 Kjell Soederlund Data Processing device
US6793619B1 (en) 1999-06-09 2004-09-21 Yaacov Blumental Computer-implemented method and system for giving a user an impression of tactile feedback
KR100866264B1 (en) * 1999-10-20 2008-11-03 Koninklijke Philips Electronics N.V. Information processing device
JP2001134382A (en) * 1999-11-04 2001-05-18 Sony Corp Graphic processor
US7142205B2 (en) * 2000-03-29 2006-11-28 Autodesk, Inc. Single gesture map navigation graphical user interface for a personal digital assistant
US7450114B2 (en) * 2000-04-14 2008-11-11 Picsel (Research) Limited User interface systems and methods for manipulating and viewing digital documents
US7576730B2 (en) 2000-04-14 2009-08-18 Picsel (Research) Limited User interface systems and methods for viewing and manipulating digital documents
JP4686886B2 (en) * 2001-04-06 2011-05-25 Sony Corporation Information processing device
US7730401B2 (en) * 2001-05-16 2010-06-01 Synaptics Incorporated Touch screen with user interface enhancement
US6816154B2 (en) * 2001-05-30 2004-11-09 Palmone, Inc. Optical sensor based user interface for a portable electronic device
US20020180811A1 (en) * 2001-05-31 2002-12-05 Chu Sing Yun Systems, methods, and articles of manufacture for providing a user interface with selection and scrolling
US20030048280A1 (en) * 2001-09-12 2003-03-13 Russell Ryan S. Interactive environment using computer vision and touchscreens
US6690387B2 (en) * 2001-12-28 2004-02-10 Koninklijke Philips Electronics N.V. Touch-screen image scrolling system and method
JP3847641B2 (en) * 2002-02-28 2006-11-22 Sony Computer Entertainment Inc. Information processing apparatus, information processing program, computer-readable recording medium storing information processing program, and information processing method
US7487444B2 (en) 2002-03-19 2009-02-03 Aol Llc Reformatting columns of content for display
US11275405B2 (en) * 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
US8949716B2 (en) * 2002-07-23 2015-02-03 Motorola Mobility Llc Adjusting target size of display images based on input device detection
US20040021698A1 (en) * 2002-08-05 2004-02-05 Baldwin Amanda K. Intuitive touchscreen interface for a multifunction device and method therefor
US7980936B2 (en) * 2002-09-30 2011-07-19 Igt Apparatus and method for player interaction
WO2004059625A1 (en) * 2002-12-30 2004-07-15 Koninklijke Philips Electronics N.V. Device for playing optical discs
US20050014560A1 (en) * 2003-05-19 2005-01-20 Yacob Blumenthal Method and system for simulating interaction with a pictorial representation of a model
WO2004104807A1 (en) * 2003-05-26 2004-12-02 Fujitsu Limited Item selecting method, information processor, and computer-readable storage medium
JP4723799B2 (en) * 2003-07-08 2011-07-13 Sony Computer Entertainment Inc. Control system and control method
US8164573B2 (en) 2003-11-26 2012-04-24 Immersion Corporation Systems and methods for adaptive interpretation of input from a touch-sensitive input device
ATE416415T1 (en) * 2004-03-18 2008-12-15 Koninkl Philips Electronics Nv SCANNING DISPLAY DEVICE
ES2325264T3 (en) * 2004-06-21 2009-08-31 Weike (S) Pte Ltd. Virtual card game system
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
WO2006013520A2 (en) * 2004-08-02 2006-02-09 Koninklijke Philips Electronics N.V. System and method for enabling the modeling of virtual objects
JP4658544B2 (en) * 2004-09-03 2011-03-23 Nintendo Co., Ltd. GAME PROGRAM, GAME DEVICE, AND INPUT DEVICE
US7794324B2 (en) 2004-09-13 2010-09-14 Pokertek, Inc. Electronic player interaction area with player customer interaction features
JP2008521070A (en) * 2004-09-19 2008-06-19 E.B.T. Interactive Ltd. Computer-implemented method and system for giving a tactile feedback impression to a user
US7762945B2 (en) * 2004-10-13 2010-07-27 E.B.T. Interactive Ltd. Computer-implemented method and system for providing feedback during sex play
JP2008527557A (en) * 2005-01-14 2008-07-24 Koninklijke Philips Electronics N.V. Moving an object presented by a touch input display device
EA009976B1 (en) 2005-04-27 2008-04-28 Арузе Корп. Gaming machine
JP4405430B2 (en) * 2005-05-12 2010-01-27 Nintendo Co., Ltd. GAME PROGRAM AND GAME DEVICE
JP3888382B2 (en) * 2005-07-05 2007-02-28 Matsushita Electric Industrial Co., Ltd. Data processing device
JP4832826B2 (en) 2005-07-26 2011-12-07 Nintendo Co., Ltd. Object control program and information processing apparatus
JP4611147B2 (en) * 2005-08-15 2011-01-12 Fujitsu Component Limited Remote operation system and remote operation method
US8225231B2 (en) 2005-08-30 2012-07-17 Microsoft Corporation Aggregation of PC settings
JP4743602B2 (en) 2005-10-04 2011-08-10 Nintendo Co., Ltd. Image processing apparatus, image processing program, game apparatus, and game program
TW200715192A (en) * 2005-10-07 2007-04-16 Elan Microelectronics Corp Method for a window to generate different moving speed
EP1955904B1 (en) * 2005-10-31 2013-03-27 Toyota Jidosha Kabushiki Kaisha Parking support device
US7958456B2 (en) 2005-12-23 2011-06-07 Apple Inc. Scrolling list with floating adjacent index symbols
US20070132789A1 (en) * 2005-12-08 2007-06-14 Bas Ording List scrolling in response to moving contact over list of index symbols
US7786975B2 (en) * 2005-12-23 2010-08-31 Apple Inc. Continuous scrolling list with acceleration
JP2007232776A (en) * 2006-02-27 2007-09-13 Mitsubishi Electric Corp Programmable display apparatus, document display method and program implementing method and recording medium with same recorded thereon, and document creation method and program implementing method and recording medium with same recorded thereon
KR100833862B1 (en) * 2006-03-30 2008-06-02 LG Electronics Inc. Mobile terminal and method for displaying object therein
KR100686165B1 (en) * 2006-04-18 2007-02-26 LG Electronics Inc. Portable terminal having OSD function icon and method of displaying OSD function icon using same
JP2008012199A (en) * 2006-07-10 2008-01-24 Aruze Corp Game system and image display control method thereof
US8564544B2 (en) 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
WO2008070054A2 (en) 2006-12-04 2008-06-12 Deka Product Limited Partnership Medical device including a slider assembly
US20080168478A1 (en) 2007-01-07 2008-07-10 Andrew Platzer Application Programming Interfaces for Scrolling
US20080168402A1 (en) 2007-01-07 2008-07-10 Christopher Blumenberg Application Programming Interfaces for Gesture Operations
US7844915B2 (en) 2007-01-07 2010-11-30 Apple Inc. Application programming interfaces for scrolling operations
US7911453B2 (en) * 2007-06-29 2011-03-22 Microsoft Corporation Creating virtual replicas of physical objects
US7750895B2 (en) * 2007-06-29 2010-07-06 Microsoft Corporation Navigating lists using input motions
DE102007039446A1 (en) * 2007-08-21 2009-02-26 Volkswagen Ag A method of displaying information in a variable scale vehicle and display device
DE102007039444A1 (en) 2007-08-21 2009-02-26 Volkswagen Ag Method for displaying information in a motor vehicle and display device for a motor vehicle
CN101399897B (en) * 2007-09-30 2010-12-29 HTC Corporation Image processing method
TWI405454B (en) * 2007-10-01 2013-08-11 Htc Corp Image process method
KR20090042342A (en) * 2007-10-26 2009-04-30 Medison Co., Ltd. Device having soft buttons and method for changing attributes thereof
EP2060970A1 (en) * 2007-11-12 2009-05-20 Research In Motion Limited User interface for touchscreen device
US20090122018A1 (en) * 2007-11-12 2009-05-14 Leonid Vymenets User Interface for Touchscreen Device
US8405621B2 (en) 2008-01-06 2013-03-26 Apple Inc. Variable rate media playback methods for electronic devices with touch interfaces
US8717305B2 (en) 2008-03-04 2014-05-06 Apple Inc. Touch event model for web pages
US8645827B2 (en) 2008-03-04 2014-02-04 Apple Inc. Touch event model
US8205157B2 (en) 2008-03-04 2012-06-19 Apple Inc. Methods and graphical user interfaces for conducting searches on a portable multifunction device
US8416196B2 (en) 2008-03-04 2013-04-09 Apple Inc. Touch event model programming interface
US8788967B2 (en) * 2008-04-10 2014-07-22 Perceptive Pixel, Inc. Methods of interfacing with multi-input devices and multi-input display systems employing interfacing techniques
US8745514B1 (en) 2008-04-11 2014-06-03 Perceptive Pixel, Inc. Pressure-sensitive layering of displayed objects
US8836646B1 (en) 2008-04-24 2014-09-16 Pixar Methods and apparatus for simultaneous user inputs for three-dimensional animation
US10180714B1 (en) 2008-04-24 2019-01-15 Pixar Two-handed multi-stroke marking menus for multi-touch devices
JP5690726B2 (en) 2008-07-15 2015-03-25 Immersion Corporation System and method for haptic messaging based on physical laws
JP5100556B2 (en) 2008-07-30 2012-12-19 Canon Inc. Information processing method and apparatus
JP5304478B2 (en) * 2008-08-07 2013-10-02 Ricoh Co., Ltd. Image forming apparatus, operation screen updating method and program
JP5858005B2 (en) * 2008-08-07 2016-02-10 Ricoh Co., Ltd. Display system
CA2734987A1 (en) 2008-08-22 2010-02-25 Google Inc. Navigation in a three dimensional environment on a mobile device
US8477103B2 (en) 2008-10-26 2013-07-02 Microsoft Corporation Multi-touch object inertia simulation
US8466879B2 (en) 2008-10-26 2013-06-18 Microsoft Corporation Multi-touch manipulation of application objects
US8302026B2 (en) * 2008-11-28 2012-10-30 Microsoft Corporation Multi-panel user interface
JP5184384B2 (en) * 2009-01-05 2013-04-17 Sony Computer Entertainment Inc. Control system and control method
US8289288B2 (en) * 2009-01-15 2012-10-16 Microsoft Corporation Virtual object adjustment via physical object detection
US8631354B2 (en) * 2009-03-06 2014-01-14 Microsoft Corporation Focal-control user interface
US8689128B2 (en) 2009-03-16 2014-04-01 Apple Inc. Device, method, and graphical user interface for moving a current position in content at a variable scrubbing rate
US8589374B2 (en) 2009-03-16 2013-11-19 Apple Inc. Multifunction device with integrated search and application selection
US8285499B2 (en) 2009-03-16 2012-10-09 Apple Inc. Event recognition
US9684521B2 (en) 2010-01-26 2017-06-20 Apple Inc. Systems having discrete and continuous gesture recognizers
US8566045B2 (en) 2009-03-16 2013-10-22 Apple Inc. Event recognition
US8839155B2 (en) 2009-03-16 2014-09-16 Apple Inc. Accelerated scrolling for a multifunction device
US9311112B2 (en) 2009-03-16 2016-04-12 Apple Inc. Event recognition
EP2425322A4 (en) 2009-04-30 2013-11-13 Synaptics Inc Control circuitry and method
JP2010262555A (en) 2009-05-11 2010-11-18 Sony Corp Information processing apparatus and method
JP5141984B2 (en) 2009-05-11 2013-02-13 Sony Corporation Information processing apparatus and method
US20100295799A1 (en) 2009-05-21 2010-11-25 Sony Computer Entertainment America Inc. Touch screen disambiguation based on prior ancillary touch input
JP5378884B2 (en) * 2009-06-01 2013-12-25 Panasonic Corporation Character input device and character conversion method
KR101451999B1 (en) * 2009-07-28 2014-10-21 Samsung Electronics Co., Ltd. Data scroll method and apparatus
DE102010026291A1 (en) 2009-08-06 2011-02-10 Volkswagen Ag motor vehicle
US8624933B2 (en) 2009-09-25 2014-01-07 Apple Inc. Device, method, and graphical user interface for scrolling a multi-section document
KR101631273B1 (en) * 2009-10-26 2016-06-17 Samsung Electronics Co., Ltd. Method and apparatus for providing UI animation
US20110148436A1 (en) * 2009-12-18 2011-06-23 Synaptics Incorporated System and method for determining a number of objects in a capacitive sensing region using signal grouping
US20110148438A1 (en) * 2009-12-18 2011-06-23 Synaptics Incorporated System and method for determining a number of objects in a capacitive sensing region using a shape factor
JP2011177203A (en) 2010-02-26 2011-09-15 Nintendo Co Ltd Object controlling program and object controlling apparatus
US8448084B2 (en) * 2010-04-08 2013-05-21 Twitter, Inc. User interface mechanics
JP5533254B2 (en) * 2010-05-24 2014-06-25 Aisin AW Co., Ltd. Information display device, information display method, and program
JP5354548B2 (en) * 2010-06-11 2013-11-27 Sammy Corporation Game device
US8552999B2 (en) 2010-06-14 2013-10-08 Apple Inc. Control selection approximation
JP5241038B2 (en) 2010-07-01 2013-07-17 Panasonic Corporation Electronic device, display control method, and program
CN102314297B (en) * 2010-07-07 2016-04-13 Tencent Technology (Shenzhen) Co., Ltd. Window object inertia displacement method and implementing device
JP5711479B2 (en) 2010-08-17 2015-04-30 Canon Inc. Display control apparatus and control method thereof
US8797283B2 (en) 2010-11-22 2014-08-05 Sony Computer Entertainment America Llc Method and apparatus for performing user-defined macros
US20120159395A1 (en) 2010-12-20 2012-06-21 Microsoft Corporation Application-launching interface for multiple modes
US8689123B2 (en) 2010-12-23 2014-04-01 Microsoft Corporation Application reporting in an application-selectable user interface
US8612874B2 (en) 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US9423951B2 (en) 2010-12-31 2016-08-23 Microsoft Technology Licensing, Llc Content-based snap point
US8907903B2 (en) 2011-01-13 2014-12-09 Sony Computer Entertainment America Llc Handing control of an object from one touch input to another touch input
JP5651494B2 (en) 2011-02-09 2015-01-14 Hitachi Maxell, Ltd. Information processing device
US9383917B2 (en) 2011-03-28 2016-07-05 Microsoft Technology Licensing, Llc Predictive tiling
US9298363B2 (en) 2011-04-11 2016-03-29 Apple Inc. Region activation for touch sensitive surface
US10097875B2 (en) 2011-05-25 2018-10-09 Echostar Technologies L.L.C. Apparatus, systems and methods for presentation management of erotica-related media content
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US8893033B2 (en) 2011-05-27 2014-11-18 Microsoft Corporation Application notifications
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US9104440B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
KR101794000B1 (en) * 2011-06-13 2017-11-06 Samsung Electronics Co., Ltd. Apparatus and method for scrolling in portable terminal
US8687023B2 (en) 2011-08-02 2014-04-01 Microsoft Corporation Cross-slide gesture to select and rearrange
JP5694883B2 (en) * 2011-08-23 2015-04-01 Kyocera Corporation Display device
US20130057587A1 (en) 2011-09-01 2013-03-07 Microsoft Corporation Arranging tiles
US20130067390A1 (en) * 2011-09-09 2013-03-14 Paul J. Kwiatkowski Programming Interface for Semantic Zoom
US8922575B2 (en) 2011-09-09 2014-12-30 Microsoft Corporation Tile cache
US9557909B2 (en) 2011-09-09 2017-01-31 Microsoft Technology Licensing, Llc Semantic zoom linguistic helpers
US10353566B2 (en) 2011-09-09 2019-07-16 Microsoft Technology Licensing, Llc Semantic zoom animations
US9244802B2 (en) 2011-09-10 2016-01-26 Microsoft Technology Licensing, Llc Resource user interface
US8933952B2 (en) 2011-09-10 2015-01-13 Microsoft Corporation Pre-rendering new content for an application-selectable user interface
US9146670B2 (en) 2011-09-10 2015-09-29 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
WO2013059488A1 (en) 2011-10-18 2013-04-25 Carnegie Mellon University Method and apparatus for classifying touch events on a touch sensitive surface
US9223761B2 (en) * 2011-11-04 2015-12-29 Microsoft Technology Licensing, Llc Real time visual feedback during move, resize and/or rotate actions in an electronic document
US9223472B2 (en) 2011-12-22 2015-12-29 Microsoft Technology Licensing, Llc Closing applications
JP6196017B2 (en) * 2012-01-13 2017-09-13 Saturn Licensing LLC Information processing apparatus, information processing method, and computer program
JP5994991B2 (en) 2012-01-24 2016-09-21 Panasonic IP Management Co., Ltd. Electronics
JP5936381B2 (en) * 2012-02-09 2016-06-22 Canon Inc. Image processing apparatus, control method therefor, and program
KR102413059B1 (en) 2012-05-11 2022-06-23 Semiconductor Energy Laboratory Co., Ltd. Electronic device, storage medium, program, and displaying method
US8949735B2 (en) 2012-11-02 2015-02-03 Google Inc. Determining scroll direction intent
US9298970B2 (en) * 2012-11-27 2016-03-29 Nokia Technologies Oy Method and apparatus for facilitating interaction with an object viewable via a display
US10691230B2 (en) 2012-12-29 2020-06-23 Apple Inc. Crown input for a wearable electronic device
KR20140114766A (en) 2013-03-19 2014-09-29 Qeexo, Co. Method and device for sensing touch inputs
US9612689B2 (en) 2015-02-02 2017-04-04 Qeexo, Co. Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers and activating a function in the selected interaction layer
US9013452B2 (en) 2013-03-25 2015-04-21 Qeexo, Co. Method and system for activating different interactive functions using different types of finger contacts
US9213403B1 (en) 2013-03-27 2015-12-15 Google Inc. Methods to pan, zoom, crop, and proportionally move on a head mountable display
US9507426B2 (en) 2013-03-27 2016-11-29 Google Inc. Using the Z-axis in user interfaces for head mountable displays
JP5787238B2 (en) 2013-04-10 2015-09-30 Konica Minolta, Inc. Control device, operation control method, and operation control program
JP5686422B2 (en) * 2013-05-29 2015-03-18 NEC Casio Mobile Communications, Ltd. Terminal device and program
JP5511040B2 (en) * 2013-05-29 2014-06-04 NEC Casio Mobile Communications, Ltd. Terminal device and program
US9733716B2 (en) 2013-06-09 2017-08-15 Apple Inc. Proxy gesture recognizer
JP2015035092A (en) 2013-08-08 2015-02-19 Canon Inc. Display controller and method of controlling the same
US10503388B2 (en) 2013-09-03 2019-12-10 Apple Inc. Crown input for a wearable electronic device
JP6200270B2 (en) * 2013-10-11 2017-09-20 Saturn Licensing LLC Information processing apparatus, information processing method, and computer program
CN105247461B (en) * 2014-02-12 2019-05-31 Qeexo, Co. Determining pitch and yaw for touch screen interaction
JP6300604B2 (en) 2014-04-01 2018-03-28 Canon Inc. Touch control device, touch control method, and program
KR102298602B1 (en) 2014-04-04 2021-09-03 Microsoft Technology Licensing, LLC Expandable application representation
EP3129847A4 (en) 2014-04-10 2017-04-19 Microsoft Technology Licensing, LLC Slider cover for computing device
EP3129846A4 (en) 2014-04-10 2017-05-03 Microsoft Technology Licensing, LLC Collapsible shell cover for computing device
JP6482196B2 (en) * 2014-07-09 2019-03-13 Canon Inc. Image processing apparatus, control method therefor, program, and storage medium
JP6478502B2 (en) * 2014-07-11 2019-03-06 Canon Inc. Information processing apparatus, information processing method, and program
US10592080B2 (en) 2014-07-31 2020-03-17 Microsoft Technology Licensing, Llc Assisted presentation of application windows
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
US10678412B2 (en) 2014-07-31 2020-06-09 Microsoft Technology Licensing, Llc Dynamic joint dividers for application windows
US10642365B2 (en) 2014-09-09 2020-05-05 Microsoft Technology Licensing, Llc Parametric inertia and APIs
US9329715B2 (en) 2014-09-11 2016-05-03 Qeexo, Co. Method and apparatus for differentiating touch screen users based on touch event analysis
US11619983B2 (en) 2014-09-15 2023-04-04 Qeexo, Co. Method and apparatus for resolving touch screen ambiguities
US10606417B2 (en) 2014-09-24 2020-03-31 Qeexo, Co. Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns
US10282024B2 (en) 2014-09-25 2019-05-07 Qeexo, Co. Classifying contacts or associations with a touch sensitive device
CN106662891B (en) 2014-10-30 2019-10-11 微软技术许可有限责任公司 Multi-configuration input equipment
US20160202865A1 (en) 2015-01-08 2016-07-14 Apple Inc. Coordination of static backgrounds and rubberbanding
KR20160138806A (en) * 2015-05-26 2016-12-06 LG Electronics Inc. Glass type terminal and method for controlling the same
JP6643776B2 (en) * 2015-06-11 2020-02-12 Bandai Namco Entertainment Inc. Terminal device and program
US10642404B2 (en) 2015-08-24 2020-05-05 Qeexo, Co. Touch sensitive device with multi-sensor stream synchronized data
JP6455489B2 (en) 2016-06-20 2019-01-23 Kyocera Document Solutions Inc. Display device and display control program
US10984179B2 (en) * 2017-12-04 2021-04-20 Microsoft Technology Licensing, Llc Intelligent object movement
US11009989B2 (en) 2018-08-21 2021-05-18 Qeexo, Co. Recognizing and rejecting unintentional touch events associated with a touch sensitive device
US11252322B2 (en) 2019-04-26 2022-02-15 Canon Kabushiki Kaisha Electronic device capable of performing control in accordance with a movement operation of an operating body and control method thereof
US10942603B2 (en) 2019-05-06 2021-03-09 Qeexo, Co. Managing activity states of an application processor in relation to touch or hover interactions with a touch sensitive device
US11231815B2 (en) 2019-06-28 2022-01-25 Qeexo, Co. Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing
US11592423B2 (en) 2020-01-29 2023-02-28 Qeexo, Co. Adaptive ultrasonic sensing techniques and systems to mitigate interference

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62150477A (en) * 1985-12-25 1987-07-04 Canon Inc Display device
EP0279652A2 (en) * 1987-02-17 1988-08-24 Sensor Frame Incorporated Method and apparatus for isolating and manipulating graphic objects on computer video monitor
EP0314395A1 (en) * 1987-10-26 1989-05-03 Crosfield Electronics Limited Interactive image display
EP0448496A2 (en) * 1990-03-12 1991-09-25 International Business Machines Corporation Dynamic selective correlation of graphic entities

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3673327A (en) * 1970-11-02 1972-06-27 Atomic Energy Commission Touch actuable data input panel assembly
JPS6066298A (en) * 1983-09-21 1985-04-16 Canon Inc. Information processor
DE3572878D1 (en) * 1984-05-07 1989-10-12 Siemens Ag Device for moving symbols on the screen of a display unit
JPS63201816A (en) * 1987-02-18 1988-08-19 Hitachi Ltd Cursor display device
JPH083785B2 (en) * 1987-02-24 1996-01-17 Fujitsu Limited Display scroll method
US4914624A (en) * 1988-05-06 1990-04-03 Dunthorn David I Virtual button for touch screen
JP2595045B2 (en) * 1988-06-14 1997-03-26 Hitachi, Ltd. Touch panel input device
US5174759A (en) * 1988-08-04 1992-12-29 Preston Frank S TV animation interactively controlled by the viewer through input above a book page
US5075673A (en) * 1989-06-16 1991-12-24 International Business Machines Corp. Variable speed, image pan method and apparatus
US5347628A (en) * 1990-01-18 1994-09-13 International Business Machines Corporation Method of graphically accessing electronic data

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Virtual Integrated Mouse" IBM TECHNICAL DISCLOSURE BULLETIN., vol. 30, no. 10, March 1988 (1988-03), pages 398-401, XP002129895 IBM CORP. NEW YORK., US ISSN: 0018-8689 *
PATENT ABSTRACTS OF JAPAN vol. 011, no. 383 (P-646), 15 December 1987 (1987-12-15) & JP 62 150477 A (CANON INC), 4 July 1987 (1987-07-04) *

Also Published As

Publication number Publication date
EP0938037A3 (en) 2000-03-22
EP0536715A3 (en) 1993-05-26
DE69233133T2 (en) 2004-02-05
US5844547A (en) 1998-12-01
EP0938039A2 (en) 1999-08-25
DE69233211T2 (en) 2004-04-29
EP0938037B1 (en) 2004-01-07
DE69233211D1 (en) 2003-10-23
JPH05100809A (en) 1993-04-23
KR950012490B1 (en) 1995-10-18
EP0536715B1 (en) 2000-07-19
DE69233600D1 (en) 2006-04-20
EP1645946A3 (en) 2006-06-07
DE69231270D1 (en) 2000-08-24
EP0938039A3 (en) 2000-04-05
JP2827612B2 (en) 1998-11-25
EP0938038A2 (en) 1999-08-25
EP0938037A2 (en) 1999-08-25
EP0938040A3 (en) 2000-03-29
DE69233284D1 (en) 2004-02-12
DE69233600T2 (en) 2006-08-10
EP0938039B1 (en) 2006-02-15
EP0938038B1 (en) 2003-09-17
DE69233133D1 (en) 2003-08-21
DE69231270T2 (en) 2000-11-30
EP0536715A2 (en) 1993-04-14
EP0938040A2 (en) 1999-08-25
KR930008594A (en) 1993-05-21
EP0938040B1 (en) 2003-07-16
DE69233284T2 (en) 2004-06-03
EP1645946B1 (en) 2011-06-08
EP0938038A3 (en) 2000-03-22

Similar Documents

Publication Publication Date Title
EP1645946B1 (en) An apparatus for manipulating an object displayed on a display device
US7345675B1 (en) Apparatus for manipulating an object displayed on a display device by using a touch screen
US5821930A (en) Method and system for generating a working window in a computer system
US5568604A (en) Method and system for generating a working window in a computer system
US6771280B2 (en) Apparatus and method for data-processing
US5471571A (en) Method and apparatus for setting a graphical object's position and orientation with viscous dragging
US7770135B2 (en) Tracking menus, system and method
US6473073B1 (en) Digitizer system with on-screen cue indicative of stylus position
US5250929A (en) Interactive overlay-driven computer display system
US20040263424A1 (en) Display system and method
JPH04276821A (en) Operating method for computer display control system and computer display system, control method in computer display system and computer display control apparatus
CN1160242A (en) Improved method and apparatus in computer systems to selectively map tablet input devices using virtual boundary
KR20140038568A (en) Multi-touch uses, gestures, and implementation
JP2000293280A (en) Information input device
US6342894B1 (en) Icon display method
JP2995719B2 (en) Pen input / menu display device
JPH0782314B2 (en) Display scroll method
JPH0580939A (en) Method and device for coordinate input
JP2953587B2 (en) Industrial control equipment
JPH06149464A (en) Screen scroll method in stylus
JP3103085B2 (en) Display input device
CN116594533A (en) Method, device, equipment and medium for processing movement of software interface mouse icon
JPH04218825A (en) Electronic computer
CN112269512A (en) Single-hand operation method and device for mobile equipment
JPS62150422A (en) Coordinate input device

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AC Divisional application: reference to earlier application

Ref document number: 0938039

Country of ref document: EP

Kind code of ref document: P

Ref document number: 0536715

Country of ref document: EP

Kind code of ref document: P

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): DE FR GB

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): DE FR GB

17P Request for examination filed

Effective date: 20061204

AKX Designation fees paid

Designated state(s): DE FR GB

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: APPLE INC.

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/048 20060101AFI20101104BHEP

RIN1 Information on inventor provided before grant (corrected)

Inventor name: OKUYAMA, SATOSHI

Inventor name: MINAKUCHI, YU

Inventor name: FUKUE, AKIKO

Inventor name: KAMATA, HAJIME

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AC Divisional application: reference to earlier application

Ref document number: 0938039

Country of ref document: EP

Kind code of ref document: P

Ref document number: 0536715

Country of ref document: EP

Kind code of ref document: P

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 69233810

Country of ref document: DE

Effective date: 20110721

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20111028

Year of fee payment: 20

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20120309

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 69233810

Country of ref document: DE

Effective date: 20120309

REG Reference to a national code

Ref country code: DE

Ref legal event code: R071

Ref document number: 69233810

Country of ref document: DE

REG Reference to a national code

Ref country code: DE

Ref legal event code: R071

Ref document number: 69233810

Country of ref document: DE

REG Reference to a national code

Ref country code: GB

Ref legal event code: PE20

Expiry date: 20121006

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20121006