EP2332035A2 - Device for controlling electronic equipment by the manipulation of graphic objects on a multicontact touch screen - Google Patents
- Publication number
- EP2332035A2 (application EP09741332A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- graphic objects
- main
- subordinate
- pointing means
- objects
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates to the field of human-machine interfaces, which covers all devices allowing users to control electronic or computerized equipment. These are, for example, mechanical interfaces such as buttons, a keyboard or a wheel, or pointing interfaces such as a mouse, a touchpad ("trackpad"), a joystick or a graphics tablet.
- this device relates more particularly to a device for controlling electronic equipment by the manipulation of graphic objects, comprising a display screen, a transparent multicontact tactile sensor for acquiring multicontact tactile information produced by a plurality of pointing means, and an electronic control circuit capable of generating control signals based on this tactile information and of generating graphic objects on the display screen, each graphic object being associated with at least one specific processing law, and each item of tactile information being subjected to specific processing determined by its location with respect to the position of these graphic objects.
- the control of an electronic equipment requires at least one human-machine interface device in order to access, by appropriate manipulation, the various functionalities offered.
- the user has two devices: the keyboard and the mouse.
- Each hand is then assigned a set of tasks: the dominant hand points with the mouse (or any other pointing device) - and possibly performs other operations (right click, scroll wheel) - while the other hand performs subordinate tasks (keyboard shortcuts, function keys).
- the combination of both hands provides access to all complex software functions much faster and more efficiently than using the dominant hand alone.
- it may also be possible to press a key with a finger of the non-dominant hand to display a drop-down menu at the location of the cursor corresponding to the position of the mouse. The dominant hand can then select an operation from a list by moving the cursor inside the drop-down menu with the mouse.
- some software offers advanced means of document editing that require synchronous intervention of both hands. For example, to duplicate a graphic object, a file or a block of text, the user presses a key on the keyboard with a finger of the non-dominant hand while simultaneously pressing the mouse button with a finger of the dominant hand. The user then moves the selected object, holding down the keyboard key and mouse button, in order to duplicate it.
- a device comprising a keyboard of reduced size and a touch screen associated with the keyboard.
- This keyboard on the one hand has fewer keys than conventional keyboards and on the other hand allows the user to position a cursor and select objects on the screen.
- the touch screen allows the entry of characters not present on the keyboard and the activation of contextual commands. Piloting means can be actuated by the keys of the keyboard and/or by the keys of the touch screen, making it possible both to display on the touch screen alphanumeric characters not present on the keyboard, and to display on the display screen alphanumeric characters present on the keyboard. Control means of the display screen can also be activated by the keys of the keyboard and/or by the keys of the touch screen, making it possible to trigger the display, on the display screen, of an alphanumeric character present on the keyboard or on the touch screen.
- the touch screen displays a virtual standard keyboard with a set of virtual keys. The user can enter a key by pointing it with a pointing means (finger, stylus, etc.).
- the human-machine interface available to the touch screen allows the user to move a pointing means to easily select a graphic object.
- the touch screen comprises a screen, a transparent multicontact tactile sensor for the acquisition of tactile information, as well as calculation means generating control signals based on this touch information.
- Graphic objects are generated on the touch screen, each of these graphic objects being associated with at least one specific processing law.
- the sensor delivers a plurality of tactile information items on each acquisition.
- Each item of tactile information is subjected to specific processing determined by its location with respect to the position of one of these graphic objects.
- the object of the present invention is to remedy this technical problem, namely the division of tasks between the two hands in order to proceed with fast and complex manipulations.
- it proposes defining a first series of main graphic objects and a second series of subordinate graphic objects, associated respectively with specific processing laws and with main and subordinate functions.
- These two types of graphic objects may interact in a complementary manner to perform predetermined complex tasks.
- the graphic objects are arranged on the display screen so that the first series of objects can be manipulated by a first set of pointing means and the second series of objects can be manipulated by a second set of pointing means.
- the invention relates to a device for controlling an electronic equipment by the manipulation of graphic objects.
- This device comprises a display screen, a transparent multicontact tactile sensor for acquiring multicontact tactile information produced by a plurality of pointing means, and an electronic control circuit capable of generating control signals based on this tactile information and of generating graphic objects on this display screen.
- Each of these graphic objects is associated with at least one specific processing law.
- Each item of tactile information is subjected to specific processing determined by its location with respect to the position of these graphic objects.
- This device is characterized in that it comprises a first series of main graphic objects and a second series of subordinate graphic objects, each of these main graphic objects having a main function and being associated with at least one main specific processing law, each of these subordinate graphic objects having a subordinate function and being associated with at least one subordinate specific processing law.
- the functions of these subordinate graphical objects are complementary to the functions of these main graphic objects.
- These subordinate graphic objects are arranged on the display screen at locations different from those of these main graphic objects.
- These graphic objects are arranged so that this first series is manipulated by a first set of pointing means and this second series by a second set of pointing means distinct from the first set. Manipulating at least one of these subordinate graphic objects causes changes in the properties of at least one of these main graphic objects.
- one of the properties of at least one of the main graphic objects modified by the manipulation of at least one of the subordinate graphic objects is its display.
- one of the properties of at least one of the main graphic objects modified by the manipulation of at least one of the subordinate graphic objects is the associated specific processing law.
- one of the properties of at least one of the main graphic objects modified by the manipulation of at least one of the subordinate graphic objects is its location on the display screen.
- the first set of pointing means is manipulated by one hand.
- the first set of pointing means comprises a stylus manipulated by a hand, which in particular makes it possible to perform fine writing functions on the touch screen with which the electronic equipment is provided.
- the first set of pointing means comprises at least one finger of a hand.
- the second set of pointing means is manipulated by a hand.
- the second set of pointing means comprises at least one finger of a hand. This hand can thus also be used, for example, to support the electronic equipment when the application requires it.
- the hands respectively of the first and second sets of pointing means are two different hands of a user. This optimizes the manipulation of graphic objects by using the complementarity of the two hands of the user.
- the hands respectively of the first and second sets of pointing means comprise a dominant hand and a non-dominant hand.
- the dominant hand preferentially manipulates the main graphic objects.
- the non-dominant hand preferentially manipulates the subordinate graphic objects. This assigns to each hand tasks of an appropriate level of complexity, which provides more precise manipulation of the graphic objects.
- a main function of at least one main graphic object is a write function.
- a main function of at least one main graphic object is a pointing function.
- a subordinate function of at least one subordinate graphic object is a function of selecting the functions assigned to at least one main graphic object manipulated by the dominant hand.
- the main and subordinate graphic objects are preferentially and respectively located so as to be brought closer to the dominant hand and the non-dominant hand.
- the distances to be covered by the pointing means are thus reduced, which optimizes the speed of manipulation.
- the non-dominant hand advantageously supports the electronic equipment, giving it a dual function in addition to carrying out the subordinate tasks.
- the position of at least one subordinate graphic object is modified by the user.
- This operation can be performed by means of an additional graphic object, or by a sliding movement of the corresponding pointing means while maintaining contact with the corresponding detection zone.
- the acquisition properties are modified according to the first and second sets of pointing means and the main and subordinate graphic objects. It is thus possible, for example, to adapt the resolution and the scanning frequency according to the needs of the graphic object under consideration. In the case, for example, of a main graphic object whose function is writing, a very high resolution acquisition can be carried out on this graphic object in order to obtain finer tactile information on what the user has written.
BRIEF DESCRIPTION OF THE DRAWINGS
- FIG. 1 a diagram of a device for controlling an electronic equipment incorporating a multicontact touch screen according to the invention
- FIG. 2 a structural diagram of a device comprising a multicontact touch screen according to the invention
- FIGS. 3A to 3C diagrams illustrating various functions that a main graphic object can exhibit
- FIGS. 4, 5A, 5B, 6, 7, 8A and 8B diagrams illustrating examples of interaction and complementarity between main and subordinate graphic objects.
- the device comprises electronic equipment 1 such as a display of known type.
- This display can be for example a liquid crystal display.
- This display makes it possible to display a plurality of graphic objects 2 and 3.
- a transparent multicontact tactile sensor 4 is disposed above the display. It makes it possible to acquire a set of simultaneous contact points, each contact corresponding to the presence of an object 5b or a finger 5c, 5d or 6b on the surface of the sensor 4, in order to manipulate the main graphic object 2 and the set of subordinate graphic objects 3a, 3b, 3c and 3d.
- a first set 5 consists of the dominant hand 5a, the latter holding a stylet 5b.
- the thumb 5c and the index 5d can also serve as pointing means.
- a second set 6 consists of the non-dominant hand 6a, which is also used to support the electronic device when the latter is portable. In this case, it is the thumb 6b, the only available finger, which performs the pointing.
- a first main graphic object 2 can be manipulated by the dominant hand 5a while a set of subordinate graphic objects 3a, 3b, 3c and 3d can be manipulated by the non-dominant hand 6a.
- These graphic objects are arranged on the screen 4 so that the thumb 6b is located near the subordinate graphic objects 3a to 3d and the stylus 5b manipulated by the dominant hand 5a is located near the main graphic object 2. The travel distances of the pointing means are thus significantly reduced.
- the subordinate graphic objects 3a to 3d are arranged next to each other along a vertical axis to facilitate the passage of the thumb 6b from one object to another.
- Figure 1 shows the case where the dominant hand 5a is the right hand of the user, while the non-dominant hand 6a is his left hand.
- provision may be made for the user to adjust the screen layout to position the graphic objects according to the opposite arrangement, namely by placing the subordinate graphic objects 3a to 3d on the right of the screen and the main graphic object 2 on the left.
- each subordinate graphic object 3a to 3d constitutes an alternative that can affect at least one of the displayed main graphic objects.
- a contact on a subordinate graphic object 3a, 3b, 3c or 3d can thus cause a modification of the properties of the main graphic object 2, or of another main graphic object not previously displayed, such as for example its display, its associated specific processing law or its location on the display screen 4.
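As an illustration only (no code appears in the patent, and all names here are hypothetical), the interaction just described - an item of tactile information routed to the object under it, with subordinate objects modifying the properties of main objects - could be sketched as follows:

```python
# Illustrative sketch (all names hypothetical): each graphic object owns
# a specific processing law; touching a subordinate object can modify a
# main object's properties (display, law, location).

class GraphicObject:
    def __init__(self, bounds, law):
        self.bounds = bounds        # (x0, y0, x1, y1) on the display screen
        self.law = law              # specific processing law (a callable)
        self.displayed = True

    def contains(self, x, y):
        x0, y0, x1, y1 = self.bounds
        return x0 <= x <= x1 and y0 <= y <= y1

def dispatch(touch, main_objs, sub_objs):
    """Route one item of tactile information to the object located under
    it: a subordinate object's law modifies main-object properties, while
    a main object applies its own processing law to the touch."""
    x, y = touch
    for sub in sub_objs:
        if sub.contains(x, y):
            return sub.law(main_objs)   # e.g. change a main object's state
    for obj in main_objs:
        if obj.contains(x, y) and obj.displayed:
            return obj.law(touch)
    return None                         # touch outside every object
```

For instance, a subordinate button whose law hides the main object implements the "display" property modification mentioned above.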
- This device comprises a transparent matrix touch sensor 7, a display screen 4, a capture interface 9, a main processor 10 and a graphics processor 11.
- the first fundamental element of this tactile device is the touch sensor 7, necessary for the acquisition of the multicontact manipulation and read by a capture interface 9.
- This capture interface 9 contains the acquisition and analysis circuits.
- the touch sensor 7 is of the matrix type. It can be optionally divided into several parts in order to accelerate the capture, each part being scanned simultaneously.
- the data coming from the capture interface 9 are transmitted, after filtering, to the main processor 10.
- This executes the local program making it possible to associate the sensor data with the graphic objects displayed on the screen 4 so that they can be manipulated.
- the main processor 10 also transmits to the graphics processor 11 the data to be displayed on the display screen 4. This graphics processor 11 performs the control of the graphical interface.
- the matrix sensor 7 is for example a sensor of the resistive type or of the projected capacitive type. It is composed of two transparent superimposed layers on which rows or columns of conducting wires are arranged. These layers thus form a matrix network of conducting wires.
- the touch sensor 7 thus consists of two superimposed layers, each having a network of transparent electrodes. These are made, for example, of Indium Tin Oxide (ITO).
- the electrodes of the first layer are configured perpendicular to the electrodes of the second layer to form a matrix.
- the two layers are separated by insulating spacers.
- the set of electrodes composing the two layers is connected to a control circuit that sequentially powers and scans the touch sensor in order to deduce the state of each node of the matrix at each acquisition phase.
- the device makes it possible to acquire the data on the entire sensor 7 with a sampling frequency of the order of 100 Hz, by implementing the sensor 7 and the control circuit integrated in the main processor 10.
- the acquisition is performed as follows: the columns are powered and the response is detected on each of the rows of the sensor. Contact zones corresponding to the nodes whose state is modified with respect to the idle state are determined as a function of these responses. One or more sets of adjacent nodes whose state is changed are determined. A set of such adjacent nodes defines a contact zone. From this node set a position information item is computed, referred to as a cursor within the meaning of this patent. In the case of several sets of nodes separated by non-active zones, several independent cursors will be determined during the same scanning phase.
- Cursors are created, tracked or destroyed based on information obtained during successive scans.
- the cursor is for example calculated by a barycentric function of the contact zone.
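Purely as an illustration (none of this code comes from the patent; the frame representation, the activation threshold and the choice of 4-connectivity are assumptions), the steps above - grouping adjacent active nodes into contact zones and reducing each zone to a barycentric cursor - could be sketched as:

```python
# Illustrative sketch: one scanned frame is a 2D list of measured node
# values; nodes above a hypothetical threshold are "active". Adjacent
# active nodes are grouped by flood fill, and each group yields one
# cursor via a value-weighted barycentre.

from collections import deque

def find_contact_areas(frame, threshold=0.5):
    """Group adjacent nodes whose state departs from the idle state
    into contact areas (4-connected flood fill)."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    areas = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                area, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    area.append((y, x, frame[y][x]))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and not seen[ny][nx]
                                and frame[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                areas.append(area)
    return areas

def cursor_position(area):
    """Barycentre of a contact area, weighted by the measured values."""
    total = sum(v for _, _, v in area)
    x = sum(c * v for _, c, v in area) / total
    y = sum(r * v for r, _, v in area) / total
    return (x, y)
```

Two blobs of active nodes separated by an inactive zone would thus yield two independent cursors in the same scanning phase, as the text describes.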
- the general principle is to create as many cursors as there are contact areas determined on the touch sensor and to track their evolution over time. When the user removes his fingers from the sensor, the associated cursors are destroyed. In this way, it is possible to capture the position and the evolution of several fingers on the touch sensor simultaneously.
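The create/track/destroy life cycle of cursors across successive scans could be sketched as below. This is an assumption-laden illustration, not the patent's method: the nearest-neighbour matching and its radius are hypothetical choices.

```python
# Illustrative sketch: cursors from the previous scan are matched to the
# positions detected in the current scan. Matched cursors are tracked,
# unmatched detections create new cursors, and cursors left without a
# contact area are destroyed (the finger was lifted).

import math

def update_cursors(cursors, detected, next_id, max_dist=3.0):
    """cursors: {cursor_id: (x, y)} from the previous scan.
    detected: [(x, y), ...] positions found in the current scan.
    Returns (updated cursors dict, next free cursor id)."""
    remaining = list(detected)
    updated = {}
    for cid, (px, py) in cursors.items():
        if not remaining:
            continue  # no detection left: this cursor is destroyed
        # track: take the closest detection within the matching radius
        best = min(remaining, key=lambda p: math.hypot(p[0] - px, p[1] - py))
        if math.hypot(best[0] - px, best[1] - py) <= max_dist:
            updated[cid] = best
            remaining.remove(best)
    for pos in remaining:           # unmatched detections: new cursors
        updated[next_id] = pos
        next_id += 1
    return updated, next_id
```

Calling this once per acquisition phase yields the per-finger trajectories that the pointing modes below rely on.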
- the main processor 10 executes the program for associating the sensor data with graphic objects that are displayed on the display screen 4 for manipulation.
- the electrical characteristic measured at each acquisition phase and at each node of the matrix is the potential.
- the control circuit may be integrated in an integrated circuit 10. This may be a microcontroller of known type. Alternatively, the integrated circuit may be an FPGA or a microprocessor. Such a microprocessor may be the main processor of the electronic apparatus 1. In order to limit the number of inputs and outputs on the integrated circuit, a multiplexer may be interposed.
- FIG. 3 shows different pointing modes with the dominant hand on a graphic object according to an embodiment of the invention. If only one point of contact 12 - a cursor - is detected on top of a main graphic object 2a, a first pointing mode is activated (FIG. 3A). If two fingers are detected, the control circuit calculates the path, the direction and the distance between the two cursors 13a and 13b created. If these cursors move simultaneously in the same direction without the distance between them substantially changing from one frame to another, then a second pointing mode is activated (FIG. 3B). If the cursors 13a and 13b move in two opposite directions, or the distance between them increases or decreases substantially from one acquisition phase to another, then a third pointing mode is activated (FIG. 3C).
- FIG. 9 represents a functional diagram of the steps for selecting an appropriate control law as a function of the number and displacement of each of the cursors detected above a main object.
- the specific processing law associated with the graphical object displayed will then be applied according to the pointing mode.
- the first mode makes it possible, for example, to move the graphic object 2a according to the positioning of the cursor 12.
- the second mode will, for example, accelerate the displacement based on the trajectory of the two cursors 13a and 13b, while the third mode will, for example, enlarge the object according to the distance between the two cursors 13a and 13b.
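The mode selection of FIGS. 3A to 3C could be sketched as follows; this is an illustration only, and the tolerance on the change of spacing between the two cursors is a hypothetical tuning parameter, not a value from the patent:

```python
# Illustrative sketch: one cursor -> first mode (move); two cursors whose
# spacing stays roughly constant -> second mode (translate together);
# two cursors whose spacing grows or shrinks substantially -> third mode
# (pinch/spread, i.e. resize). Threshold is hypothetical.

import math

def pointing_mode(prev, curr, spacing_tol=0.2):
    """prev, curr: lists of (x, y) cursor positions on two successive
    acquisition phases, in matching order. Returns 1, 2, 3 or None."""
    if len(curr) == 1:
        return 1
    if len(prev) == 2 and len(curr) == 2:
        d_prev = math.dist(prev[0], prev[1])
        d_curr = math.dist(curr[0], curr[1])
        # relative change of the distance between the two cursors
        if abs(d_curr - d_prev) / d_prev > spacing_tol:
            return 3   # spacing changed substantially: enlarge/shrink
        return 2       # both cursors translate together
    return None
```

The specific processing law of the graphic object under the cursors is then chosen according to the returned mode, as the text describes.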
- the pointing modes illustrated in Figure 3 can be applied to other types of graphic objects.
- the graphic object is a text box that allows the user to enter handwritten text with the fingers or a stylus.
- processing laws specific to that object will be applied according to the pointing mode.
- the cursor path will be followed from one acquisition phase to another and a line segment or an interpolated curve will be drawn from one point to the next. This makes it possible to render handwriting, to which a character recognition process can be applied.
- in the second pointing mode, the user can scroll the text box.
- in the third pointing mode, the user spreads apart or brings together both fingers to resize the text box.
- Figure 4 shows a pointing mode with the non-dominant hand 6a.
- the non-dominant hand 6a supports the apparatus. Only the thumb 6b is available to operate. It can nevertheless manipulate graphic objects, such as buttons, arranged on the edge of the screen.
- a privileged pointing mode makes it possible to activate a first button 14 positioned at the lower left corner of the screen 4, which has the effect of unfolding a menu along the edge of the screen, comprising a defined number of buttons 3a to 3d. To access the various buttons, the thumb 6b of the non-dominant hand 6a slides vertically.
- a countdown (TTL, "Time To Live") is activated.
- the delay of the TTL can be, for example, two seconds.
- the drop-down menu proposes submenus arranged perpendicularly to the first drop-down menu.
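As an illustrative sketch only: the text states the TTL delay (for example two seconds) but not the exact behaviour on expiry, so the assumption here is that the menu hides itself once the countdown elapses with no further interaction, and that any interaction restarts it.

```python
# Illustrative sketch (assumed behaviour): a drop-down menu guarded by a
# TTL countdown. Explicit `now` arguments make the timing testable; in a
# real device, time.monotonic() would supply them.

import time

class DropDownMenu:
    def __init__(self, buttons, ttl=2.0):
        self.buttons = buttons
        self.ttl = ttl              # "Time To Live" in seconds
        self.visible = False
        self._deadline = 0.0

    def open(self, now=None):
        self.visible = True
        self._reset(now)

    def touch(self, button, now=None):
        """Any interaction with the menu restarts the countdown."""
        self._reset(now)
        return button in self.buttons

    def tick(self, now=None):
        """Called on each acquisition phase: hide the menu once the
        countdown has elapsed without interaction."""
        now = time.monotonic() if now is None else now
        if self.visible and now >= self._deadline:
            self.visible = False

    def _reset(self, now):
        now = time.monotonic() if now is None else now
        self._deadline = now + self.ttl
```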
- FIGS. 5A and 5B show a mode combining the pointing modes described in FIGS. 3 and 4.
- the activation of a button of the drop-down menu by the thumb 6b of the non-dominant hand 6a modifies the specific processing law applied to the graphic object 2a manipulated by the dominant hand 5a.
- when the "select" button 3d is activated, the first pointing mode (FIG. 5A) of the dominant hand 5a has the effect of cutting out a portion of the image to select, while the third pointing mode (FIG. 5B) widens this selection.
- FIG. 10 is a diagram of the steps for determining a collection of specific control laws using a subordinate object (3a-3d) and then selecting, according to the pointing mode, an appropriate control law from this collection.
- FIGS. 6 and 7 illustrate another mode of combination of the pointing modes described above.
- the activation of a button 3c of the drop-down menu by the thumb 6b of the non-dominant hand 6a has the effect of temporarily displaying another main object superimposed on the first main object 2a.
- This object can be, for example, a drop-down menu with different features in the form of an enumerated list or a list of icons.
- the dominant hand 5a can then select one of the functions.
- the thumb of the non-dominant hand 6a selects the "edit" button 3c, as illustrated in FIG. 6.
- a drop-down menu 15 appears and makes it possible to carry out, with the dominant hand 5a, a plurality of operations: copy 15a, paste 15b, delete 15c. If the thumb 6b of the non-dominant hand 6a selects the "export" button 3a, as illustrated in FIG. 7, a menu 16 appears and makes it possible to select, for example, the format or the target to which the document is to be exported.
- Figure 11 shows a diagram of the steps for displaying a contextual object. It should be noted that the contextual object is displayed at an arbitrary point on the screen, for example, in the center.
- FIG. 12 represents a variant of the method described in FIG. 11.
- the contextual object is not displayed at arbitrarily determined coordinates, but as a function of the last position of the last detected cursor. This embodiment is particularly advantageous for large screens: the contextual object is displayed at the last place where the dominant hand of the user was present, which avoids moving the dominant hand to an arbitrary place on the screen to use the contextual object.
- the stylus 5b is manipulated by the dominant hand 5a in order to write on the graphic object 2a of the display screen 4 (FIG. 8A).
- provision may be made to perform the acquisition of the matrix of the multicontact tactile sensor with a higher resolution in the zone where writing is detected by the presence of a contact zone 17 (FIG. 8B).
- the resolution is increased in a small area around the last detected point of contact 17, which makes it possible to anticipate, at the next acquisition, the movement of the stylus 5b.
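Computing that small high-resolution zone could be sketched as follows; this is an illustration only, and the window radius is a hypothetical parameter (the patent does not give one):

```python
# Illustrative sketch: restrict the next high-resolution scan to a small
# window around the last detected contact point, clamped to the sensor
# bounds, as suggested for the writing zone of FIG. 8B.

def high_res_window(last_contact, sensor_cols, sensor_rows, radius=4):
    """Return the (col0, col1, row0, row1) inclusive bounds of the zone
    to rescan at higher resolution around the last contact point."""
    cx, cy = last_contact
    col0 = max(0, cx - radius)
    col1 = min(sensor_cols - 1, cx + radius)
    row0 = max(0, cy - radius)
    row1 = min(sensor_rows - 1, cy + radius)
    return col0, col1, row0, row1
```

Scanning only this window between two full-matrix acquisitions is one way to obtain finer tactile information on the stylus trajectory without raising the scanning cost of the whole sensor.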
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR0805183A FR2936326B1 (fr) | 2008-09-22 | 2008-09-22 | Dispositif pour le controle d'appareil electronique par la manipulation d'objets graphiques sur un ecran tactile multicontacts |
PCT/FR2009/001121 WO2010103195A2 (fr) | 2008-09-22 | 2009-09-22 | Dispositif pour le controle d'appareil electronique par la manipulation d'objets graphiques sur un ecran tactile multicontacts |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2332035A2 true EP2332035A2 (fr) | 2011-06-15 |
Family
ID=40578023
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP09741332A Withdrawn EP2332035A2 (fr) | 2008-09-22 | 2009-09-22 | Dispositif pour le controle d'appareil electronique par la manipulation d'objets graphiques sur un ecran tactile multicontacts |
Country Status (7)
Country | Link |
---|---|
US (1) | US20110169760A1 (fr) |
EP (1) | EP2332035A2 (fr) |
JP (1) | JP2012503241A (fr) |
KR (1) | KR20110063561A (fr) |
CN (1) | CN102160025A (fr) |
FR (1) | FR2936326B1 (fr) |
WO (1) | WO2010103195A2 (fr) |
Families Citing this family (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090237374A1 (en) * | 2008-03-20 | 2009-09-24 | Motorola, Inc. | Transparent pressure sensor and method for using |
US9018030B2 (en) | 2008-03-20 | 2015-04-28 | Symbol Technologies, Inc. | Transparent force sensor and method of fabrication |
US8988191B2 (en) | 2009-08-27 | 2015-03-24 | Symbol Technologies, Inc. | Systems and methods for pressure-based authentication of an input on a touch screen |
EP3882750A1 (fr) | 2010-01-20 | 2021-09-22 | Nokia Technologies Oy | Entrée d'utilisateur |
US8963874B2 (en) | 2010-07-31 | 2015-02-24 | Symbol Technologies, Inc. | Touch screen rendering system and method of operation thereof |
JP5640680B2 (ja) * | 2010-11-11 | 2014-12-17 | ソニー株式会社 | 情報処理装置、立体視表示方法及びプログラム |
US8593421B2 (en) * | 2011-03-22 | 2013-11-26 | Adobe Systems Incorporated | Local coordinate frame user interface for multitouch-enabled devices |
US8553001B2 (en) | 2011-03-22 | 2013-10-08 | Adobe Systems Incorporated | Methods and apparatus for determining local coordinate frames for a human hand |
CN102819380A (zh) * | 2011-06-09 | 2012-12-12 | 英业达股份有限公司 | 电子装置及其操控方法 |
CN102855076B (zh) * | 2011-07-01 | 2016-06-15 | 上海博泰悦臻电子设备制造有限公司 | 触摸屏的控制方法及装置、移动终端设备 |
CN108694012B (zh) * | 2011-11-28 | 2022-04-22 | 联想(北京)有限公司 | 在屏幕上显示对象的方法和系统 |
TW201327273A (zh) * | 2011-12-23 | 2013-07-01 | Wistron Corp | 觸控按鍵模組及其模式切換方法 |
US8863042B2 (en) | 2012-01-24 | 2014-10-14 | Charles J. Kulas | Handheld device with touch controls that reconfigure in response to the way a user operates the device |
AU2013262488A1 (en) * | 2012-05-18 | 2014-12-18 | Apple Inc. | Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs |
US9684398B1 (en) * | 2012-08-06 | 2017-06-20 | Google Inc. | Executing a default action on a touchscreen device |
JP6016555B2 (ja) * | 2012-09-25 | 2016-10-26 | キヤノン株式会社 | 情報処理装置及びその制御方法、並びにプログラムと記憶媒体 |
US10620775B2 (en) * | 2013-05-17 | 2020-04-14 | Ultrahaptics IP Two Limited | Dynamic interactive objects |
US9436288B2 (en) | 2013-05-17 | 2016-09-06 | Leap Motion, Inc. | Cursor mode switching |
US9261991B2 (en) * | 2013-05-28 | 2016-02-16 | Google Technology Holdings LLC | Multi-layered sensing with multiple resolutions |
EP2816460A1 (fr) * | 2013-06-21 | 2014-12-24 | BlackBerry Limited | Clavier et système de geste à écran tactile |
US20160180813A1 (en) * | 2013-07-25 | 2016-06-23 | Wei Zhou | Method and device for displaying objects |
US9841821B2 (en) * | 2013-11-06 | 2017-12-12 | Zspace, Inc. | Methods for automatically assessing user handedness in computer systems and the utilization of such information |
KR20150127989A (ko) * | 2014-05-08 | 2015-11-18 | 삼성전자주식회사 | 사용자 인터페이스 제공 방법 및 장치 |
CN104076986B (zh) * | 2014-07-25 | 2015-12-09 | 上海逗屋网络科技有限公司 | 一种用于多点触摸终端的触摸控制方法与设备 |
JP2016095716A (ja) * | 2014-11-14 | 2016-05-26 | 株式会社コーエーテクモゲームス | 情報処理装置、情報処理方法、ならびに、プログラム |
JP6757140B2 (ja) * | 2016-01-08 | 2020-09-16 | キヤノン株式会社 | 表示制御装置及びその制御方法、プログラム、並びに記憶媒体 |
JP6500830B2 (ja) * | 2016-04-27 | 2019-04-17 | 京セラドキュメントソリューションズ株式会社 | 手書き文字入力装置、画像形成装置及び手書き文字入力方法 |
US11409410B2 (en) | 2020-09-14 | 2022-08-09 | Apple Inc. | User input interfaces |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US9292111B2 (en) * | 1998-01-26 | 2016-03-22 | Apple Inc. | Gesturing with a multipoint sensing device |
US7030861B1 (en) * | 2001-02-10 | 2006-04-18 | Wayne Carl Westerman | System and method for packing multi-touch gestures onto a hand |
US20050052427A1 (en) * | 2003-09-10 | 2005-03-10 | Wu Michael Chi Hung | Hand gesture interaction with touch surface |
FR2866726B1 (fr) * | 2004-02-23 | 2006-05-26 | Jazzmutant | Controleur par manipulation d'objets virtuels sur un ecran tactile multi-contact |
US20060022953A1 (en) * | 2004-07-30 | 2006-02-02 | Nokia Corporation | Left-hand originated user interface control for a device |
EP2000894B1 (fr) * | 2004-07-30 | 2016-10-19 | Apple Inc. | Interfaces d'utilisateur graphique à base de mode pour dispositifs d'entrée tactiles |
CN101133385B (zh) * | 2005-03-04 | 2014-05-07 | 苹果公司 | 手持电子设备、手持设备及其操作方法 |
CN102169415A (zh) * | 2005-12-30 | 2011-08-31 | 苹果公司 | 具有多重触摸输入的便携式电子设备 |
ATE423344T1 (de) * | 2006-06-30 | 2009-03-15 | Christian Iten | Verfahren zur positionierung eines cursors auf einem berührungsempfindlichen bildschirm |
DE102006051967A1 (de) * | 2006-11-03 | 2008-05-08 | Ludwig-Maximilians-Universität | Digitales Informationsverarbeitungssystem mit Benutzerinteraktionselement |
US7956847B2 (en) * | 2007-01-05 | 2011-06-07 | Apple Inc. | Gestures for controlling, manipulating, and editing of media files using touch sensitive devices |
KR101377949B1 (ko) * | 2007-04-13 | 2014-04-01 | 엘지전자 주식회사 | 오브젝트 검색 방법 및 오브젝트 검색 기능을 갖는 단말기 |
- 2008
  - 2008-09-22 FR FR0805183A patent/FR2936326B1/fr not_active Expired - Fee Related
- 2009
  - 2009-09-22 KR KR1020117009144A patent/KR20110063561A/ko not_active Application Discontinuation
  - 2009-09-22 CN CN2009801370554A patent/CN102160025A/zh active Pending
  - 2009-09-22 EP EP09741332A patent/EP2332035A2/fr not_active Withdrawn
  - 2009-09-22 WO PCT/FR2009/001121 patent/WO2010103195A2/fr active Application Filing
  - 2009-09-22 US US13/062,883 patent/US20110169760A1/en not_active Abandoned
  - 2009-09-22 JP JP2011527370A patent/JP2012503241A/ja active Pending
Non-Patent Citations (1)
Title |
---|
See references of WO2010103195A3 * |
Also Published As
Publication number | Publication date |
---|---|
FR2936326A1 (fr) | 2010-03-26 |
KR20110063561A (ko) | 2011-06-10 |
US20110169760A1 (en) | 2011-07-14 |
WO2010103195A3 (fr) | 2011-04-07 |
FR2936326B1 (fr) | 2011-04-29 |
CN102160025A (zh) | 2011-08-17 |
WO2010103195A2 (fr) | 2010-09-16 |
JP2012503241A (ja) | 2012-02-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2010103195A2 (fr) | Dispositif pour le controle d'appareil electronique par la manipulation d'objets graphiques sur un ecran tactile multicontacts | |
US20210390252A1 (en) | Natural quick function gestures | |
US8413075B2 (en) | Gesture movies | |
FR2917516A1 (fr) | Transpositions mode vitesse / mode positionnel | |
EP2524294B1 (fr) | Procede de selection d'un element d'une interface utilisateur et dispositif mettant en oeuvre un tel procede. | |
EP2235615B1 (fr) | Circuit electronique d'analyse a modulation de caracteristiques de balayage pour capteur tactile multicontacts a matrice passive | |
US7777732B2 (en) | Multi-event input system | |
EP2310932B1 (fr) | Procédé d'acquisition et d'analyse d'un capteur tactile multicontacts suivant un principe dichotomique, circuit électronique et capteur tactile multicontacts mettant en oeuvre un tel procédé | |
US20120030566A1 (en) | System with touch-based selection of data items | |
EP2956846B1 (fr) | Procédé, appareil et support de stockage pour naviguer dans un écran d'affichage | |
EP2321833A1 (fr) | Capteur tactile multicontacts a moyens d'espacement de taille et impedance variables | |
FR2980004A1 (fr) | Dispositif de pointage temporaire pour terminal mobile equipe d'un ecran de visualisation tactile principal et d'un ecran de visualisation auxiliaire | |
EP2898391B1 (fr) | Methode de selection de mode d'interactivite | |
US10970476B2 (en) | Augmenting digital ink strokes | |
FR3079048A1 (fr) | Procede d’interaction entre d’une part au moins un utilisateur et/ou un premier dispositif electronique et d’autre part un second dispositif electronique | |
US20170228128A1 (en) | Device comprising touchscreen and camera | |
FR2751442A1 (fr) | Dispositif interface homme/machine de poche | |
WO2015082817A1 (fr) | Procédé de commande de l'interaction avec un écran tactile et équipement mettant en oeuvre ce procédé | |
WO2021122410A1 (fr) | Procédé et système de visualisation d'un contenu numérique affiché par un appareil électronique | |
FR2966312A1 (fr) | Procede de navigation au sein d'une interface de terminal mobile, et terminal mobile correspondant | |
FR3112628A1 (fr) | Dispositif de pointage informatique | |
FR3017470A1 (fr) | Procede de saisie sur un clavier numerique, interface homme machine et appareil mettant en œuvre un tel procede | |
WO2019058036A1 (fr) | Procédé d'exploitation d'un dispositif informatique et dispositif informatique mettant en œuvre celui-ci | |
CN117707732A (zh) | 应用切换方法、装置、电子设备和可读存储介质 | |
TWI427495B (zh) | 作業平台系統、作業方法與主機裝置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
20110330 | 17P | Request for examination filed | |
| AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: AL BA RS |
| DAX | Request for extension of the european patent (deleted) | |
20150422 | 19U | Interruption of proceedings before grant | |
20211001 | 19W | Proceedings resumed before grant after interruption of proceedings | |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| PUAJ | Public notification under rule 129 epc | Free format text: ORIGINAL CODE: 0009425 |
| 32PN | Public notification | Free format text: CONSTATATION DE LA PERTE D'UN DROIT CONFORMEMENT A LA REGLE 112(1) CBE (OEB FORM 2524 EN DATE DU 07.10.2021) |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
20150401 | 18D | Application deemed to be withdrawn | |