EP2671146A1 - Interface homme-machine tridimensionnelle - Google Patents

Interface homme-machine tridimensionnelle

Info

Publication number
EP2671146A1
Authority
EP
European Patent Office
Prior art keywords
distance
symbol
control object
display
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP12708880.5A
Other languages
German (de)
English (en)
French (fr)
Inventor
Didier Roziere
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Quickstep Technologies LLC
Original Assignee
Nanotec Solution SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanotec Solution SAS filed Critical Nanotec Solution SAS
Publication of EP2671146A1 (fr)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044: Digitisers, e.g. for touch screens or touch pads, characterised by capacitive transducing means
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F 3/04817: Interaction techniques using icons
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486: Drag-and-drop
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques for inputting data by handwriting, e.g. gesture or text
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/041: Indexing scheme relating to G06F 3/041 - G06F 3/045
    • G06F 2203/04101: 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch but is proximate to the digitiser's interaction surface, and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup

Definitions

  • the present invention relates to a method for selecting commands that can be implemented in a three-dimensional man-machine interface. It also relates to a device implementing the method.
  • the field of the invention is more particularly, but in a non-limiting manner, that of contactless man-machine interfaces.
  • Touch interfaces, or touch screens, are currently widely used for controlling devices as varied as computers, mobile phones, etc.
  • they comprise a display screen and sensors that make it possible to determine the point(s) of contact between the surface of the screen and one or more control objects such as fingers or a stylus.
  • the screen may be for example covered with a mesh of capacitive electrodes, and the position of the object is detected from its interactions, in the form of capacitive couplings, with the electrodes.
  • the touch interfaces also include a software part for interpreting the user's commands.
  • the display is modified according to the position of the detected control object(s), which allows the user to have visual control of his actions and to select commands.
  • Gesture interfaces, or 3D interfaces, are also known, in which a third dimension is added with the possibility of detecting objects at a distance before they touch the surface of the screen. These interfaces are equipped with sensors for measuring the position in space, relative to the interface, of one or more control objects.
  • Capacitive measurement technologies are also well suited to the realization of this type of interfaces.
  • the Rozière document FR 2 844 349 discloses a capacitive proximity sensor comprising a plurality of independent electrodes, which makes it possible to measure the capacitance and the distance between the electrodes and a nearby object, up to distances of several tens or even hundreds of millimeters.
  • the electrodes can be made transparent using, for example, ITO (indium tin oxide), and deposited on the display screen.
  • the object of the present invention is to propose a method for selecting commands (or computer objects) in a human-machine interface (HMI) with three-dimensional measurement capabilities, which makes full use of the three-dimensional nature of the measurements.
  • This objective is achieved with a method for selecting commands, implementing a control interface, a display and at least one sensor capable of detecting at least one control object, comprising the steps of:
  • the method according to the invention may furthermore comprise steps of: obtaining position information of at least one control object with respect to the control interface by means of the sensor(s),
  • the display mode of a symbol may comprise differentiated graphic representations of this symbol making it possible to visualize a state such as highlighting with a view to a selection, a selection, the execution of a command, a displacement, a rotation, a modification, etc.
  • the display mode may correspond, for example, to a highlighting, a graphical differentiation with respect to other displayed symbols by means of a change of color or size, or a re-display of the symbol differently and shifted so as to be visible, for example, beyond a control object.
  • the method according to the invention may further comprise a step of using at least one of the following sets of information: distance information, distance and position information, to determine the displayed symbol(s).
  • the determination of the displayed symbol may include a selection of the symbols displayed on the display, and thus of the commands and/or groups of commands accessible on the interface, depending on the distance and/or position information.
  • Distance and position information can include:
  • the information provided by the sensors, for example the physical quantities measured by these sensors, depending preferably monotonically on the distance and/or the position of the control object(s) relative to the control interface;
  • quantities representative of speeds and/or accelerations of the control object, corresponding to quantities derived from distances and/or positions; information relating to trajectories, that is to say time sequences of distances and/or positions (a finite-difference sketch of such derived quantities is given below).
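
By way of illustration only (this sketch is not part of the patent text; the sampling scheme and names are assumptions), the derived quantities mentioned above can be obtained from time sequences of distance 4 and position 5 measurements by finite differences:

```python
from dataclasses import dataclass

@dataclass
class Sample:
    t: float  # timestamp, in seconds
    d: float  # distance 4 (equivalent distance derived from capacitance)
    x: float  # position 5, X coordinate
    y: float  # position 5, Y coordinate

def velocity(s0: Sample, s1: Sample) -> tuple:
    """First finite difference: rates of change of the distance 4 and of
    the position 5 between two successive samples."""
    dt = s1.t - s0.t
    return ((s1.d - s0.d) / dt, (s1.x - s0.x) / dt, (s1.y - s0.y) / dt)

def approach_acceleration(s0: Sample, s1: Sample, s2: Sample) -> float:
    """Second finite difference of the distance 4 over three samples."""
    v01 = (s1.d - s0.d) / (s1.t - s0.t)
    v12 = (s2.d - s1.d) / (s2.t - s1.t)
    return (v12 - v01) / ((s2.t - s0.t) / 2)
```
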
  • the method according to the invention can implement at least one of the following types of measurements:
  • These measurements may in particular make it possible to obtain distance and / or position information.
  • the measured capacitive interactions can include:
  • the variations in measured light intensities can be generated for example by the interruption of light beams by control objects, or shadow effects due to the presence of control objects.
  • the method according to the invention may further comprise steps:
  • This inclusion can be defined logically, such as in a tree of hierarchical commands, or a stack of commands or sets of commands.
  • the second symbols may be displayed at positions substantially different from that of the first symbol on the display, for example so as not to mask the first symbol.
  • the method according to the invention may further comprise steps: displaying a first symbol representing a first command or a first set of commands,
  • the second symbol may be displayed at a position substantially identical to that of the first symbol on the display, for example to illustrate a movement in the direction of depth in a stack of symbols from which elements would be removed as the control object moves.
  • the method according to the invention may further comprise a step of selecting a command, comprising a step of verifying at least one selection condition based on a set of information among: distance information, distance and position information.
  • the distance of the control object is less than a predetermined selection distance,
  • the distance of the control object is less than a predetermined selection distance for a predetermined minimum duration,
  • the control object is in contact with the surface of the control interface,
  • the control object makes a fast back-and-forth movement, that is to say for example a round trip over a distance less than a predetermined distance and within a duration shorter than a predetermined duration,
  • at least two control objects perform a convergent movement in position towards a predetermined position, such as a gripping or pinching motion (or any other relative motion).
  • the selection of a command can take place when the control object(s) are in the neighborhood of, or converge towards, a position defined for this command (a minimal dwell-based check is sketched below).
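
As an illustrative sketch (not part of the patent; the threshold and duration values are assumptions), the second selection condition above, a distance held below a predetermined selection distance for a minimum duration, could be checked as follows:

```python
import time
from typing import Optional

SELECTION_DISTANCE = 10.0  # assumed threshold on the distance 4
MIN_DWELL = 0.3            # assumed minimum duration, in seconds

class DwellSelector:
    """Selection condition: the control object stays closer than
    SELECTION_DISTANCE for at least MIN_DWELL seconds."""

    def __init__(self) -> None:
        self._below_since: Optional[float] = None

    def update(self, distance: float, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        if distance >= SELECTION_DISTANCE:
            self._below_since = None   # object moved away: reset the timer
            return False
        if self._below_since is None:
            self._below_since = now    # object just crossed the threshold
        return now - self._below_since >= MIN_DWELL
```
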
  • the method according to the invention may furthermore comprise a step of executing a (previously selected) command of one of the following types: execution of a computer program, execution of an application, display of the content of a file stored on computer storage means, display of an image, playing of a sound, playing of multimedia content, or any other command.
  • the method according to the invention may further comprise a step of executing a command for moving a symbol, which comprises:
  • a step of validating the movement of said symbol comprising a step of verifying at least one validation condition based on a set of information among: distance information, distance and position information.
  • the method according to the invention may furthermore comprise a step of verifying at least one validation condition among the following validation conditions:
  • the distance of the control object is less than a predetermined selection distance
  • the distance of the control object is less than a predetermined selection distance for a predetermined minimum duration
  • the control object is in contact with the surface of the control interface,
  • at least two control objects perform a divergent movement in position around a predetermined position (or any other relative movement).
  • the method according to the invention may further comprise steps:
  • activation of the display when the distance of the control object is less than a predetermined activation distance, for example in order to turn off the screen when not in use and save energy.
  • a device for selecting commands comprising:
  • a display,
  • at least one sensor capable of detecting a control object,
  • display management means capable of producing a display of at least one symbol representing a command or a set of commands,
  • which device further comprises calculation means able to process said distance information to determine the displayed symbol(s).
  • the display may be a display screen, or any other display means, for example in relief (3D display).
  • the control interface, the sensors and the display can be in any arrangement, such as for example:
  • on the one hand, a control interface provided with sensors, and on the other hand, a separate display.
  • the control interface may include a pad connected to a computer, and the display may be the computer screen, separate from the pad.
  • the device according to the invention may furthermore comprise:
  • calculation means able to process said position information to determine the displayed symbol(s).
  • the device according to the invention may further comprise sensors of at least one of the following types:
  • the optical sensors may comprise, for example, optical barriers with light sources emitting light beams and photodetectors arranged so as to be illuminated by these light beams when they are not interrupted by control objects. They may also include photodetectors sensitive to illumination variations such as shadow or reflection effects due to the presence of control objects, for example integrated in a screen based on TFT or OLED technology.
  • the device according to the invention may further comprise all types of sensors capable of producing distance and / or position information. It may include ultrasound acoustic sensors, arranged for example so as to allow a localization of the control objects by echo measurements and triangulation.
  • a device of one of the following types: computer, telephone, smartphone, tablet, display screen, terminal, characterized in that it comprises a device for selecting commands implementing the method according to the invention.
  • FIG. 1 shows a diagram of a three-dimensional human-machine interface according to the invention
  • FIG. 2 presents a convention of designation of positions on the detection surface
  • FIG. 3 presents a command set structure with the position and distance conditions for accessing it, and an example of a route in this structure
  • FIGS. 4 (a) to 4 (d) illustrate a temporal sequence of symbols of commands or sets of commands as they are displayed on the display screen when the structure of FIG. 3 is traversed
  • FIGS. 5 (a), 5 (b) and 5 (c) illustrate a first time sequence variant of command symbols or set of commands as they are displayed when traversing a stack
  • FIGS. 5 (a), 5 (b), 5 (d) and 5 (e) illustrate a second variant of this time sequence
  • FIG. 6 presents a stack of commands or sets of commands with the position and distance conditions allowing access to its elements, according to a first variant illustrated in FIG. 6 (a) and corresponding to FIGS. 5 (a), 5 (b) and 5 (c), and according to a second variant illustrated in FIG. 6 (b) and corresponding to FIGS. 5 (a), 5 (b), 5 (d) and 5 (e), respectively,
  • FIG. 7 shows a movement command sequence, with FIG. 7 (a) command selection, FIG. 7 (b) command validation, and FIG. 7 (c) rotation of the symbol.
  • an embodiment of the invention is described, implementing a human-machine interface (HMI) which includes capacitive sensors.
  • this embodiment is a non-limiting example of implementation of the invention.
  • Such an interface is for example well suited to the realization of a human-machine interface (HMI) for a host system such as a mobile phone, a smartphone, a tablet or a computer.
  • the interface comprises:
  • a display screen 2 based, in a non-limiting manner, on liquid crystal (LCD), thin-film transistor (TFT) or organic light-emitting diode (OLED) technology;
  • a control interface 1 comprising a substantially transparent detection surface equipped with capacitive sensors 6, which are also substantially transparent and able to detect the presence of at least one control object 3 such as a hand, a finger or a stylus.
  • the sensors 6 provide information relating to the distance 4 along the Z axis between the object 3 and the detection surface of the interface 1, and information relating to the position 5 in the plane (X, Y) of the control interface 1. They are also able to detect a contact between the control object 3 and the detection surface of the interface 1.
  • the information relating to the distance 4 and the position 5 comprises equivalent distance and position measurements. These measurements, not necessarily expressed in units of length, are translations of capacitance measurements or capacitance variations. In particular, physical characteristics of the control object 3 may affect the measured capacitances and hence their translation in terms of equivalent distances and/or positions.
  • the information relating to the distance 4 and to the position 5 may also comprise trajectories, defined as time sequences of distances 4 and / or of positions 5, and derived quantities such as speeds and accelerations.
  • the sensors 6 comprise capacitive electrodes based on ITO
  • the capacitive electrodes of the sensors 6 are connected to measurement electronics 7 which make it possible to calculate the distance 4 and the position 5.
  • the sensors 6 and the electronics 7 are made according to a mode described in FR 2 844 349 by Rozière. They comprise a plurality of independent electrodes 6 distributed over the surface of the interface 1. These electrodes 6 are connected to floating detection electronics 7, in other words electronics referenced to a floating electrical potential. A guard electrode, also at the floating reference potential, is placed along the rear face of the measurement electrodes 6, between them and the display screen 2, so as to eliminate any parasitic capacitance. All the electrodes are at the same potential, and there is thus no coupling capacitance between the electrodes likely to degrade the measurement of the capacitance.
  • This detection electronics 7 and its modes of implementation that can be used in the context of the present invention are also described in detail in document FR 2756048 of Rozière, to which the reader is invited to refer.
  • Scanners make it possible to sequentially measure the capacitance, and therefore the distance, between the electrodes 6 and the control object 3.
  • the electrodes 6 that are not “interrogated” are also kept at the potential of the guard, again to eliminate parasitic capacitances.
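
By way of illustration only, the scanning scheme described above can be summarised as follows; `measure_capacitance` and `set_to_guard` are hypothetical stand-ins for the detection electronics 7, and the parallel-plate conversion d = k / C is a simplifying assumption:

```python
def scan(electrodes, measure_capacitance, set_to_guard, k):
    """Sequentially interrogate each electrode while every other
    electrode is held at the guard potential (suppressing parasitic
    couplings), then convert the capacitance to an equivalent distance
    with a parallel-plate model d = k / C (a simplifying assumption)."""
    distances = []
    for e in electrodes:
        for other in electrodes:
            if other is not e:
                set_to_guard(other)     # non-interrogated: guard potential
        c = measure_capacitance(e)      # capacitance electrode <-> object 3
        distances.append(k / c if c > 0 else float("inf"))
    return distances
```
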
  • the host system also comprises computing means 8.
  • These calculation means 8 usually include a microprocessor (CPU) associated with components such as random-access memory (RAM) and mass storage means (hard disk, flash memory, ...), and make it possible to execute one or more computer programs or software.
  • This software interface contributes to carrying out the steps of the method according to the invention, which comprise:
  • the human-machine software interface corresponds to what the user sees on the display 2. The user interacts with this software HMI by using one or more control objects 3 such as his fingers or a stylus.
  • the software comprises a graphical, symbolic representation of the host system and/or of the possible actions:
  • the commands can be organized according to hierarchical structures of a three-dimensional nature, which represent sets of commands and among which we distinguish in particular: tree structures or folder trees, in which each folder includes a set of commands and/or subfolders,
  • the prior art HMIs are essentially based on a two-dimensional type of navigation, which only takes into account the position 5 of the control object 3 to select the commands, whether via the cursor of a mouse (hover or click), a physical contact between an object 3 and the detection surface of the interface 1 (tapping), or even a hover over the detection surface of the interface 1.
  • the navigation in structures of a three-dimensional nature is actually reduced to a series of actions in the plane: for example, one has to tap an icon to open a folder and view its contents or the stacked commands, that is to say to access a different hierarchical (or topological) level.
  • the method according to the invention makes it possible to navigate in a truly three-dimensional manner in an HMI by exploiting the distance measurements 4.
  • it makes it possible to access the different hierarchical (or topological) layers of a set of commands arranged according to a structure of three-dimensional nature by varying the distance 4 between the control object 3 and the detection surface of the interface 1.
  • This "access” is displayed on the display 2 by displaying the symbols (or icons) representing a command or a set of hierarchical (or topological) level commands selected based on distance 4.
  • Navigation is said to be three-dimensional insofar as it is possible, by using the distance information 4, to browse the hierarchical or topological levels of a command structure and/or command group, which levels can be represented on the display 2 by one or a plurality of symbols (a minimal sketch of this mapping is given below).
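
As a minimal sketch of this three-dimensional navigation (the thresholds, tree structure and names are illustrative assumptions, not the patent's implementation), the measured distance 4 can be mapped to a hierarchical level whose symbols are then displayed:

```python
# Assumed decreasing thresholds H1, H2, H3: a distance 4 below H1 gives
# access to the first hierarchical level, below H2 to the second, etc.
THRESHOLDS = [100.0, 60.0, 30.0]

def hierarchical_level(distance: float) -> int:
    """0 = no level selected; 1..n = successive hierarchical levels."""
    return sum(1 for h in THRESHOLDS if distance < h)

def symbols_to_display(tree: dict, path: list, distance: float) -> list:
    """`tree` is a nested dict: keys are symbol names, values are
    sub-trees (an empty dict stands for a command 11). `path` records
    the groups already chosen via the position 5; the distance 4
    decides how deep along `path` the display may go."""
    node = tree
    depth = max(hierarchical_level(distance) - 1, 0)
    for key in path[:depth]:
        node = node.get(key, {})
    return list(node.keys())

# e.g. with a structure like that of FIG. 3:
tree = {"D11": {}, "D12": {"D21": {}, "D22": {}, "D23": {}, "C24": {}},
        "D13": {}, "D14": {}}
print(symbols_to_display(tree, ["D12"], 50.0))  # distance between H2 and H3
```
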
  • a command can be selected by selecting its representative symbol on the HMI. This selection can be made in particular with a view to its execution, or to move the representative symbol on the display 2 (in which case the command in question comprises moving the symbol on the display).
  • the selection of a command includes the verification of at least one selection condition, or in other words, the selection of a command is validated when one or more selection conditions (or time sequences of selection conditions) are satisfied.
  • Various selection conditions can be implemented, including within the same HMI.
  • Different selection conditions can be implemented to allow the execution of different commands possibly attached to or represented by the same symbol on the display 2. These commands may for example relate to the execution of an application represented by an icon, and to the moving of this icon.
  • Among the selection conditions applicable in the context of the invention are in particular the following:
  • the distance 4 of the control object 3 is less than a predetermined selection distance
  • the control object 3 is in contact with the surface of the control interface 1.
  • selection conditions based on detection of a minimum distance 4, or a distance 4 below a threshold, can be used without generating ambiguity with respect to the command selection tasks, because a command has no lower hierarchical or topological level (at least in the application in which this command is selected).
  • These selection conditions can be implemented by adding a condition on the duration (a predetermined minimum duration) to limit the risk of false commands.
  • the control object 3 performs a fast back-and-forth movement in distance 4, that is to say for example a round trip within a range (or difference) of distances 4 less than a predetermined distance, within a duration less than a predetermined duration.
  • a condition of this type corresponds to a virtual "click", since it is performed without contact (a minimal detector is sketched below).
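
A minimal sketch of such a contactless "click" detector, assuming regularly sampled distance measurements (all constants are illustrative assumptions):

```python
from collections import deque

class VirtualClick:
    """Contactless 'click': a fast dip-and-return of the distance 4."""
    MAX_SPAN = 20.0      # maximum depth of the round trip, distance units
    MAX_DURATION = 0.25  # maximum duration of the round trip, seconds
    MIN_DIP = 2.0        # noise margin: how marked the dip must be

    def __init__(self) -> None:
        self._samples = deque()  # (timestamp, distance) pairs

    def update(self, t: float, d: float) -> bool:
        self._samples.append((t, d))
        # keep only the last MAX_DURATION seconds of samples
        while self._samples and t - self._samples[0][0] > self.MAX_DURATION:
            self._samples.popleft()
        ds = [dist for _, dist in self._samples]
        if len(ds) < 3:
            return False
        dip = min(ds)
        # a click: window starts and ends above the dip, span within limits
        return (ds[0] - dip >= self.MIN_DIP and ds[-1] - dip >= self.MIN_DIP
                and max(ds) - dip <= self.MAX_SPAN)
```
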
  • the position measurement 5 is used to determine the selected command.
  • At least two control objects 3 perform a convergent movement in position towards a position corresponding to the symbol of the command, according to a gripping or pinching movement.
  • a selection condition can also be used as a deselection or validation condition, in particular for "releasing" an object when the command concerns a manipulation or movement of a symbol on the display 2.
  • the position 5 and the distance 4 of the control object 3 can be visualized on the display screen 2 by means of a circular pattern centered on the position 5 and with a diameter dependent on the distance 4, or any other pattern.
  • This tree includes groups of commands or folders represented by the symbol or icon 10, and commands represented by the symbol or icon 11 on the display screen 2.
  • zones P1, P2, P3, P4 corresponding to four positions 5 are defined on the detection surface of the interface 1 and on the display screen 2 placed below.
  • the control object 3 is also likened to the finger 3 of a user.
  • control structures are defined whose first hierarchical level, namely the command sets D11, D12, D13 and D14 respectively, is accessible when the control object 3 is located at distances 4 between H1 and H2.
  • Command sets D21, D22, D23 and command C24 are also defined, which are included in D12 and belong to a second hierarchical level accessible when the control object 3 is at distances 4 between H2 and H3.
  • Arrows 12 illustrate the path of finger 3 in the space of distances 4 (H1, H2, H3) and positions 5 (P1, ... P4) corresponding to the example below.
  • the finger 3 is at a distance 4 greater than H1 and no command or set of commands is selected on the display screen 2.
  • the display of a new hierarchical level can replace that of the previous level to maintain good readability, for example on a small screen 2. It is also possible to display the contents of a lower hierarchical level near the symbol of the selected group of commands of higher hierarchical level.
  • the representative symbol of a command group may comprise a representation of the symbols of the elements or commands that it includes (thus a representation of their reduced icons), and the display of the content icons can be made in such a way that the user has the impression of zooming into the content when he or she accesses the hierarchical level of this content.
  • commands can be gathered on the display screen 2 in the form of a stack 21 which groups together commands 11 and / or sets of commands 10.
  • FIG. 5 illustrates a case where there is only one stack 21 of controls 11 initially visible on the display 2.
  • the finger 3 is at a distance 4 greater than a distance H1 and no command or set of commands is selected.
  • the arrows 22 in FIG. 6 (a) illustrate the path of the finger 3 in the space of the distances 4 (H1, ... H4) and the positions 5 (P1, P2) corresponding to this variant.
  • This variant is well suited for example to the visualization of images, in which case the symbol is the image and simply controls its visualization.
  • In FIGS. 5 (d), 5 (e) and 6 (b): by lowering his finger 3 above the position P1 to a distance 4 between H2 and H3, the user views the first command C1 of the stack, whose symbol is displayed at a position different from P1, for example P2. The display of FIG. 5 (d) is thus obtained.
  • the user can highlight a displayed command with a view to selecting it by moving his finger 3 to position P2.
  • the situation is illustrated in Figure 5 (e) with the selection of C2.
  • the arrows 23 in FIG. 6 (b) illustrate the path of the finger 3 in the space of the distances 4 (H1, ... H4) and the positions 5 (P1, P2) corresponding to this variant.
  • the stack 21 can comprise commands 11 and/or sets of commands 10. Once a command 11 or a command set 10 is highlighted, it can be selected, or its tree navigated, in the same manner as previously described in connection with FIGS. 3 and 4.
  • the distance thresholds 4 can be managed in the following manner, it being understood that several management modes of these thresholds can be implemented according to the command structures considered, and/or choices that the user can make via a configuration menu (a sketch of the interval computation is given after this list):
  • a first distance threshold H1 making it possible to select a particular control structure is defined as corresponding to a predetermined distance 4,
  • the distance intervals between successive thresholds H1, H2, H3, ... are calculated taking into account the number of hierarchical levels of the structure or the number of elements 10, 11 of the stack 21, so as to allow the whole to be explored by varying the height 4 of the finger 3 down to contact with the detection surface of the interface 1,
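
For illustration (a sketch under the assumption of even spacing, not the patent's prescribed rule), the thresholds can be computed from the number of levels to traverse:

```python
def compute_thresholds(h1: float, n_levels: int) -> list:
    """Space the thresholds H1, H2, ... Hn evenly between the entry
    distance h1 and contact (distance 0), so that every level of an
    n-level structure can be reached before the finger touches the
    detection surface."""
    step = h1 / n_levels
    return [h1 - i * step for i in range(n_levels)]

# A structure with 4 levels entered at H1 = 80 distance units:
print(compute_thresholds(80.0, 4))  # [80.0, 60.0, 40.0, 20.0]
```
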
  • a particular command symbol 11 may represent several possible commands (execution of an application for example), or only a movement command (for example if the symbol represents a piece of a game displayed on the display screen 2).
  • the user approaches two fingers 3 (or more) to the surface of the interface 1, up to a distance at which the sensors 6 are able to "distinguish" the fingers.
  • the fingers 3 are detected, and if their positions 5 substantially correspond to that of the symbol 11 on the display 2, the symbol 11 is marked out (for example highlighted).
  • it may be necessary for the fingers 3 to come into contact with the surface of the interface 1.
  • the user can also browse a structure or stack of commands as explained above to reach the step of highlighting the symbol 11.
  • the user selects the command for moving the symbol 11 by effecting a pinching movement 30, bringing the fingers 3 together, as shown in FIG. 7 (a). This movement corresponds, from the point of view of the measurement, to a convergence of the positions 5, which is the chosen selection condition.
  • the symbol 11 can be moved by moving the fingers 3, whose position it follows.
  • the validation of the command, and therefore the positioning of the symbol 11 at an arrival position, is performed by moving the fingers 3 apart as shown in FIG. 7 (b).
  • This spreading movement 31 corresponds, from the point of view of the measurement, to a divergence of the positions 5, which is the chosen validation condition (a sketch of this gesture pair follows).
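
As an illustrative sketch of this gesture pair (thresholds and names are assumptions), the pinch and spread movements reduce to monitoring the separation between two detected positions 5:

```python
import math

PINCH_THRESHOLD = 15.0   # assumed separation that counts as a pinch
SPREAD_THRESHOLD = 40.0  # assumed separation that validates the drop

def separation(p1, p2) -> float:
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

class MoveGesture:
    """Pinch (convergence of the positions 5) selects the movement
    command; spread (divergence) validates the drop."""

    def __init__(self) -> None:
        self.holding = False

    def update(self, p1, p2) -> str:
        gap = separation(p1, p2)
        if not self.holding and gap < PINCH_THRESHOLD:
            self.holding = True
            return "select"    # symbol 11 now follows the fingers
        if self.holding and gap > SPREAD_THRESHOLD:
            self.holding = False
            return "validate"  # symbol 11 dropped at the arrival position
        return "move" if self.holding else "idle"
```
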
  • If the distance 4 of the fingers 3 is increased beyond a certain limit during the movement, it can be provided, depending on the application, that the symbol 11 freezes, changes appearance, disappears or returns to its starting position. Increasing the distance 4 beyond a certain limit can also be used as a condition for deselecting the movement command without validating it, with a return of the symbol 11 to its starting position.
  • This mode of controlling displacement commands can make it possible, for example, to implement board games (chess, checkers, ...).
  • Capacitive sensors and their associated detection electronics can be made according to all arrangements.
  • they may comprise crossed electrodes (for example arranged in rows and columns), and allow direct capacitance measurements (i.e. measurements of capacitances between the electrodes 6 and the object 3), and/or coupling capacitance measurements (i.e. measurements of capacitances between transmitting electrodes and receiving electrodes, disturbed by the presence of the object 3).
  • Such embodiments are, for example, well suited to large interfaces 1 covering computer display screens 2;
  • the method according to the invention is applicable to the selection of commands in all command or command-group structures for which it is possible, by using the distance information, to traverse hierarchical or topological levels, and to represent these levels on the display 2 by means of one or more symbols or icons;
  • the navigation in the hierarchical or topological levels of the control structures may depend on distance(s) 4 and/or position(s) 5, as well as on any information relating to these quantities, such as speeds and accelerations.
  • the speed of movement in distance 4 of a control object 3 can be taken into account to browse a stack or other control structure more quickly, viewing only one element out of n;
  • the intervals between the threshold distances 4 (H1, ...) can be determined in any manner. In particular, they can be fixed, predefined, adjusted according to the number of levels of a displayed command structure so that the totality of a structure can always be traversed in the same overall range of distances 4, variable within a limited range, etc.;
  • a delay can be provided which turns off the screen 2 after a period of inactivity.
  • the screen 2 is then reactivated for example when a control object 3 appears at a distance 4 less than an activation distance, or simply when an object is detected by the sensors 6.
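
A minimal sketch of this power-saving behaviour, under assumed timeout and distance values:

```python
from typing import Optional

ACTIVATION_DISTANCE = 150.0  # assumed wake-up distance
INACTIVITY_TIMEOUT = 30.0    # assumed idle delay, in seconds

class ScreenPower:
    """Turns the screen 2 off after a period of inactivity and wakes it
    when a control object 3 comes closer than the activation distance."""

    def __init__(self, now: float) -> None:
        self.on = True
        self._last_activity = now

    def update(self, now: float, distance: Optional[float]) -> bool:
        """`distance` is None when no control object is detected."""
        if distance is not None and distance < ACTIVATION_DISTANCE:
            self._last_activity = now
            self.on = True
        elif now - self._last_activity > INACTIVITY_TIMEOUT:
            self.on = False
        return self.on
```
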

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
EP12708880.5A 2011-01-31 2012-01-30 Interface homme-machine tridimensionnelle Withdrawn EP2671146A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1150726A FR2971066B1 (fr) 2011-01-31 2011-01-31 Interface homme-machine tridimensionnelle.
PCT/FR2012/050183 WO2012104529A1 (fr) 2011-01-31 2012-01-30 Interface homme-machine tridimensionnelle

Publications (1)

Publication Number Publication Date
EP2671146A1 true EP2671146A1 (fr) 2013-12-11

Family

ID=44501813

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12708880.5A Withdrawn EP2671146A1 (fr) 2011-01-31 2012-01-30 Interface homme-machine tridimensionnelle

Country Status (5)

Country Link
US (2) US10303266B2 (zh)
EP (1) EP2671146A1 (zh)
CN (2) CN103460175B (zh)
FR (1) FR2971066B1 (zh)
WO (1) WO2012104529A1 (zh)

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US7920129B2 (en) 2007-01-03 2011-04-05 Apple Inc. Double-sided touch-sensitive panel with shield and drive combined layer
FR2949007B1 (fr) 2009-08-07 2012-06-08 Nanotec Solution Dispositif et procede d'interface de commande sensible a un mouvement d'un corps ou d'un objet et equipement de commande integrant ce dispositif.
FR2971066B1 (fr) 2011-01-31 2013-08-23 Nanotec Solution Interface homme-machine tridimensionnelle.
FR2976688B1 (fr) 2011-06-16 2021-04-23 Nanotec Solution Dispositif et procede pour generer une alimentation electrique dans un systeme electronique avec un potentiel de reference variable.
US9259904B2 (en) 2011-10-20 2016-02-16 Apple Inc. Opaque thin film passivation
FR2985048B1 (fr) 2011-12-21 2014-08-15 Nanotec Solution Dispositif et procede de mesure capacitive sensible a la pression pour interfaces tactiles et sans contact
FR2985049B1 (fr) 2011-12-22 2014-01-31 Nanotec Solution Dispositif de mesure capacitive a electrodes commutees pour interfaces tactiles et sans contact
FR2988176B1 (fr) 2012-03-13 2014-11-21 Nanotec Solution Procede de mesure capacitive entre un objet et un plan d’electrodes par demodulation synchrone partielle
FR2988175B1 (fr) 2012-03-13 2014-04-11 Nanotec Solution Procede de mesure capacitive par des electrodes non-regulieres, et appareil mettant en œuvre un tel procede
FR3002052B1 (fr) 2013-02-14 2016-12-09 Fogale Nanotech Procede et dispositif pour naviguer dans un ecran d'affichage et appareil comprenant une telle navigation
CN104077013B (zh) * 2013-03-28 2019-02-05 联想(北京)有限公司 指令识别方法和电子设备
FR3003964B1 (fr) 2013-04-02 2016-08-26 Fogale Nanotech Dispositif pour interagir, sans contact, avec un appareil electronique et/ou informatique, et appareil equipe d'un tel dispositif
FR3004551A1 (fr) 2013-04-15 2014-10-17 Fogale Nanotech Procede de detection capacitif multizone, dispositif et appareil mettant en oeuvre le procede
FR3005763B1 (fr) 2013-05-17 2016-10-14 Fogale Nanotech Dispositif et procede d'interface de commande capacitive adapte a la mise en œuvre d'electrodes de mesures fortement resistives
FR3008809B1 (fr) 2013-07-18 2017-07-07 Fogale Nanotech Dispositif accessoire garde pour un appareil electronique et/ou informatique, et appareil equipe d'un tel dispositif accessoire
FR3013472B1 (fr) 2013-11-19 2016-07-08 Fogale Nanotech Dispositif accessoire couvrant pour un appareil portable electronique et/ou informatique, et appareil equipe d'un tel dispositif accessoire
FR3017470B1 (fr) 2014-02-12 2017-06-23 Fogale Nanotech Procede de saisie sur un clavier numerique, interface homme machine et appareil mettant en œuvre un tel procede
FR3017723B1 (fr) 2014-02-19 2017-07-21 Fogale Nanotech Procede d'interaction homme-machine par combinaison de commandes tactiles et sans contact
KR101628246B1 (ko) * 2014-02-24 2016-06-08 삼성전자주식회사 컨텐츠 표시 방법 및 장치
FR3019320B1 (fr) 2014-03-28 2017-12-15 Fogale Nanotech Dispositif electronique de type montre-bracelet avec interface de commande sans contact et procede de controle d'un tel dispositif
US10719132B2 (en) * 2014-06-19 2020-07-21 Samsung Electronics Co., Ltd. Device and method of controlling device
FR3025623B1 (fr) 2014-09-05 2017-12-15 Fogale Nanotech Dispositif d'interface de commande et capteur d'empreintes digitales
FR3028061B1 (fr) 2014-10-29 2016-12-30 Fogale Nanotech Dispositif capteur capacitif comprenant des electrodes ajourees
FR3032287B1 (fr) 2015-02-04 2018-03-09 Quickstep Technologies Llc Dispositif de detection capacitif multicouches, et appareil comprenant le dispositif
FR3033203B1 (fr) 2015-02-27 2018-03-23 Quickstep Technologies Llc Procede pour interagir avec un appareil electronique et/ou informatique mettant en œuvre une surface de commande capacitive et une surface peripherique, interface et appareil mettant en œuvre ce procede
US9996222B2 (en) * 2015-09-18 2018-06-12 Samsung Electronics Co., Ltd. Automatic deep view card stacking
US20180004385A1 (en) 2016-06-30 2018-01-04 Futurewei Technologies, Inc. Software defined icon interactions with multiple and expandable layers
WO2018057969A1 (en) 2016-09-23 2018-03-29 Apple Inc. Touch sensor panel with top and/or bottom shielding
US10372282B2 (en) 2016-12-01 2019-08-06 Apple Inc. Capacitive coupling reduction in touch sensor panels
US10798397B2 (en) * 2019-01-02 2020-10-06 Tencent America LLC Method and apparatus for video coding
JP7365429B2 (ja) * 2019-12-13 2023-10-19 アルプスアルパイン株式会社 入力装置
EP3846003A1 (en) 2019-12-30 2021-07-07 Dassault Systèmes Selection of a face with an immersive gesture in 3d modeling
EP3846004A1 (en) 2019-12-30 2021-07-07 Dassault Systèmes Selection of an edge with an immersive gesture in 3d modeling
EP3846064A1 (en) * 2019-12-30 2021-07-07 Dassault Systèmes Selection of a vertex with an immersive gesture in 3d modeling
CN113238788B (zh) * 2021-05-14 2024-03-29 山东云海国创云计算装备产业创新中心有限公司 一种bios升级方法及相关装置
US11789561B2 (en) 2021-09-24 2023-10-17 Apple Inc. Architecture for differential drive and sense touch technology

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080309632A1 (en) * 2007-06-13 2008-12-18 Apple Inc. Pinch-throw and translation gestures
WO2009028892A2 (en) * 2007-08-30 2009-03-05 Lg Electronics Inc. A user interface for a mobile device using a user's gesture in the proximity of an electronic device
US20090128498A1 (en) * 2004-06-29 2009-05-21 Koninklijke Philips Electronics, N.V. Multi-layered display of a graphical user interface
US20090327969A1 (en) * 2008-06-27 2009-12-31 Microsoft Corporation Semantic zoom in a virtual three-dimensional graphical user interface

Family Cites Families (218)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4549279A (en) 1983-01-21 1985-10-22 The Laitram Corporation Single hand, single finger stroke alphameric data processing keyboard system
JPS63167923A (ja) 1987-01-05 1988-07-12 Pfu Ltd イメ−ジデ−タ入力装置
GB2204131B (en) 1987-04-28 1991-04-17 Ibm Graphics input tablet
CA2011517C (en) 1989-05-15 1998-04-21 Gordon W. Arbeitman Flat touch screen workpad for a data processing system
JP3301079B2 (ja) 1990-06-18 2002-07-15 ソニー株式会社 情報入力装置、情報入力方法、情報処理装置及び情報処理方法
GB2245708A (en) 1990-06-29 1992-01-08 Philips Electronic Associated Touch sensor array systems
US5103085A (en) 1990-09-05 1992-04-07 Zimmerman Thomas G Photoelectric proximity detector and switch
US5347295A (en) 1990-10-31 1994-09-13 Go Corporation Control of a computer through a position-sensed stylus
US5483261A (en) 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
US5880411A (en) 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US5488204A (en) 1992-06-08 1996-01-30 Synaptics, Incorporated Paintbrush stylus for capacitive touch sensor pad
US5270818A (en) 1992-09-17 1993-12-14 Alliedsignal Inc. Arrangement for automatically controlling brightness of cockpit displays
JP3469912B2 (ja) 1992-11-18 2003-11-25 株式会社デジタル 複数の同時入力が可能なタッチパネル入力装置及び入力方法
US5363051A (en) 1992-11-23 1994-11-08 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Steering capaciflector sensor
US5345550A (en) * 1992-12-23 1994-09-06 International Business Machines Corporation User-modifiable popup menus for object oriented behavior
JP2752309B2 (ja) 1993-01-19 1998-05-18 松下電器産業株式会社 表示装置
US5572573A (en) 1994-01-25 1996-11-05 U S West Advanced Technologies, Inc. Removable user interface for use with interactive electronic devices
US6947571B1 (en) 1999-05-19 2005-09-20 Digimarc Corporation Cell phones with optical capabilities, and related applications
GB9406702D0 (en) 1994-04-05 1994-05-25 Binstead Ronald P Multiple input proximity detector and touchpad system
US9513744B2 (en) 1994-08-15 2016-12-06 Apple Inc. Control systems employing novel physical controls and touch screens
JP2739445B2 (ja) * 1995-03-22 1998-04-15 株式会社エイ・ティ・アール通信システム研究所 把持目標物体推測装置およびそれを具備する人工現実感装置
US5760760A (en) 1995-07-17 1998-06-02 Dell Usa, L.P. Intelligent LCD brightness control system
WO1997018547A1 (en) 1995-11-16 1997-05-22 Ure Michael J Multi-touch input device, method and system that minimize the need for memorization
US5730165A (en) 1995-12-26 1998-03-24 Philipp; Harald Time domain capacitive field detector
US5825352A (en) 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
US5835079A (en) 1996-06-13 1998-11-10 International Business Machines Corporation Virtual pointing device for touchscreens
US6308144B1 (en) 1996-09-26 2001-10-23 Computervision Corporation Method and apparatus for providing three-dimensional model associativity
GB9620464D0 (en) 1996-10-01 1996-11-20 Philips Electronics Nv Hand held image display device
US5684294A (en) 1996-10-17 1997-11-04 Northern Telecom Ltd Proximity and ambient light monitor
FR2756048B1 (fr) 1996-11-15 1999-02-12 Nanotec Ingenierie Pont de mesure capacitif flottant et systeme de mesure multi-capacitif associe
US6253218B1 (en) * 1996-12-26 2001-06-26 Atsushi Aoki Three dimensional data display method utilizing view point tracing and reduced document images
US6105419A (en) 1997-03-25 2000-08-22 Recot, Inc. Apparatus and process for inspecting sealed packages
US6073036A (en) 1997-04-28 2000-06-06 Nokia Mobile Phones Limited Mobile station with touch input having automatic symbol magnification function
US6034803A (en) 1997-04-30 2000-03-07 K2 T, Inc. Method and apparatus for directing energy based range detection sensor
US6920619B1 (en) 1997-08-28 2005-07-19 Slavoljub Milekic User interface for removing an object from a display
GB2330670B (en) 1997-10-24 2002-09-11 Sony Uk Ltd Data processing
GB9722766D0 (en) 1997-10-28 1997-12-24 British Telecomm Portable computers
US6037937A (en) 1997-12-04 2000-03-14 Nortel Networks Corporation Navigation tool for graphical user interface
US6310610B1 (en) 1997-12-04 2001-10-30 Nortel Networks Limited Intelligent touch display
US7663607B2 (en) 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
US6323846B1 (en) 1998-01-26 2001-11-27 University Of Delaware Method and apparatus for integrating manual input
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US6380853B1 (en) 1998-02-23 2002-04-30 Marconi Commerce Systems Inc. Customer-sensitive dispenser using proximity sensing devices
US6313853B1 (en) 1998-04-16 2001-11-06 Nortel Networks Limited Multi-service user interface
US5956291A (en) 1998-04-17 1999-09-21 Ductech, Llc Underwater diving assistant apparatus
JP3646848B2 (ja) 1998-04-28 2005-05-11 日本精機株式会社 表示装置
JP3792920B2 (ja) 1998-12-25 2006-07-05 株式会社東海理化電機製作所 タッチ操作入力装置
US6188391B1 (en) 1998-07-09 2001-02-13 Synaptics, Inc. Two-layer capacitive touchpad and method of making same
US7256770B2 (en) 1998-09-14 2007-08-14 Microsoft Corporation Method for displaying information responsive to sensing a physical presence proximate to a computer input device
JP4542637B2 (ja) 1998-11-25 2010-09-15 セイコーエプソン株式会社 携帯情報機器及び情報記憶媒体
US6259436B1 (en) 1998-12-22 2001-07-10 Ericsson Inc. Apparatus and method for determining selection of touchable items on a computer touchscreen by an imprecise touch
US7293231B1 (en) 1999-03-18 2007-11-06 British Columbia Ltd. Data entry for personal computing devices
US7151528B2 (en) 1999-06-22 2006-12-19 Cirque Corporation System for disposing a proximity sensitive touchpad behind a mobile phone keypad
US20010031633A1 (en) 1999-12-01 2001-10-18 Nokia Mobile Phones Ltd. Method and apparatus for providing context-based call transfer operation
US6414674B1 (en) 1999-12-17 2002-07-02 International Business Machines Corporation Data processing system and method including an I/O touch pad having dynamically alterable location indicators
US7434177B1 (en) 1999-12-20 2008-10-07 Apple Inc. User interface for providing consolidation and access
US7171221B1 (en) 1999-12-20 2007-01-30 Cingular Wirelesss Ii, Llc System and method for automatically transferring a call from a first telephone to a designated telephone in close proximity
US6661920B1 (en) 2000-01-19 2003-12-09 Palm Inc. Method and apparatus for multiple simultaneously active data entry mechanisms on a computer system
US6822635B2 (en) 2000-01-19 2004-11-23 Immersion Corporation Haptic interface for laptop computers and other portable devices
US7138983B2 (en) * 2000-01-31 2006-11-21 Canon Kabushiki Kaisha Method and apparatus for detecting and interpreting path of designated position
US20020140633A1 (en) 2000-02-03 2002-10-03 Canesta, Inc. Method and system to present immersion virtual simulations using three-dimensional measurement
US6601012B1 (en) 2000-03-16 2003-07-29 Microsoft Corporation Contextual models and methods for inferring attention and location
US7417650B1 (en) 2000-03-16 2008-08-26 Microsoft Corporation Display and human-computer interaction for a notification platform
US7444383B2 (en) 2000-06-17 2008-10-28 Microsoft Corporation Bounded-deferral policies for guiding the timing of alerting, interaction and communications using local sensory information
US6847354B2 (en) 2000-03-23 2005-01-25 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Three dimensional interactive display
US20010035858A1 (en) 2000-04-13 2001-11-01 Blumberg J. Seth Keyboard input device
US7302280B2 (en) 2000-07-17 2007-11-27 Microsoft Corporation Mobile phone operation based upon context sensing
DE10042300A1 (de) 2000-08-29 2002-03-28 Axel C Burgbacher Elektronisches Musikinstrument
CN1340976A (zh) 2000-08-31 2002-03-20 微软公司 在移动通信设备上用于检测用户接通度的方法和装置
US6480188B1 (en) 2000-09-06 2002-11-12 Digital On-Demand Thumbwheel selection system
US6520013B1 (en) 2000-10-02 2003-02-18 Apple Computer, Inc. Method and apparatus for detecting free fall
US7688306B2 (en) 2000-10-02 2010-03-30 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US6680677B1 (en) 2000-10-06 2004-01-20 Logitech Europe S.A. Proximity detector to indicate function of a key
US7319454B2 (en) 2000-11-10 2008-01-15 Microsoft Corporation Two-button mouse input using a stylus
US6903730B2 (en) 2000-11-10 2005-06-07 Microsoft Corporation In-air gestures for electromagnetic coordinate digitizers
DE10059906A1 (de) 2000-12-01 2002-06-06 Bs Biometric Systems Gmbh Druckempfindliche Fläche eines Bildschirms oder Displays
JP3800984B2 (ja) 2001-05-21 2006-07-26 ソニー株式会社 ユーザ入力装置
US6904570B2 (en) 2001-06-07 2005-06-07 Synaptics, Inc. Method and apparatus for controlling a display of data on a display screen
US6583676B2 (en) 2001-06-20 2003-06-24 Apple Computer, Inc. Proximity/touch detector and calibration circuit
US20030001899A1 (en) 2001-06-29 2003-01-02 Nokia Corporation Semi-transparent handwriting recognition UI
US20050134578A1 (en) 2001-07-13 2005-06-23 Universal Electronics Inc. System and methods for interacting with a control environment
US6961912B2 (en) 2001-07-18 2005-11-01 Xerox Corporation Feedback mechanism for use with visual selection methods
JP2003173237A (ja) 2001-09-28 2003-06-20 Ricoh Co Ltd 情報入出力システム、プログラム及び記憶媒体
US8117565B2 (en) 2001-10-18 2012-02-14 Viaclix, Inc. Digital image magnification for internet appliance
US7345671B2 (en) 2001-10-22 2008-03-18 Apple Inc. Method and apparatus for use of rotational user inputs
US6938221B2 (en) 2001-11-30 2005-08-30 Microsoft Corporation User interface for stylus-based user input
DE10251296A1 (de) 2002-11-03 2004-05-19 Trachte, Ralf, Dr. flexibles Engabesystem / Mehrfinger-System
US6690387B2 (en) 2001-12-28 2004-02-10 Koninklijke Philips Electronics N.V. Touch-screen image scrolling system and method
US20030132922A1 (en) 2002-01-17 2003-07-17 Harald Philipp Touch screen detection apparatus
US6720942B2 (en) 2002-02-12 2004-04-13 Eastman Kodak Company Flat-panel light emitting pixel with luminance feedback
US7120872B2 (en) 2002-03-25 2006-10-10 Microsoft Corporation Organizing, editing, and rendering digital ink
US6664744B2 (en) 2002-04-03 2003-12-16 Mitsubishi Electric Research Laboratories, Inc. Automatic backlight for handheld devices
US7016705B2 (en) 2002-04-17 2006-03-21 Microsoft Corporation Reducing power consumption in a networked battery-operated device using sensors
US11275405B2 (en) 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
US7058902B2 (en) 2002-07-30 2006-06-06 Microsoft Corporation Enhanced on-object context menus
FR2844048B1 (fr) 2002-08-30 2005-09-30 Nanotec Solution System and method for contactless capacitive measurement of a relative displacement or positioning of two adjacent objects, and application to mirror control
FR2844349B1 (fr) 2002-09-06 2005-06-24 Nanotec Solution Proximity detector using a capacitive sensor
US6812466B2 (en) 2002-09-25 2004-11-02 Prospects, Corp. Infrared obstacle detection in the presence of sunlight
US6954197B2 (en) 2002-11-15 2005-10-11 Smart Technologies Inc. Size/scale and orientation determination of a pointer in a camera-based touch system
WO2004051392A2 (en) 2002-11-29 2004-06-17 Koninklijke Philips Electronics N.V. User interface with displaced representation of touch area
US7194699B2 (en) 2003-01-14 2007-03-20 Microsoft Corporation Animating images to reflect user selection
US20040145601A1 (en) 2003-01-29 2004-07-29 International Business Machines Corporation Method and a device for providing additional functionality to a separate application
US20040150669A1 (en) 2003-01-31 2004-08-05 Sabiers Mark L. Graphical user interface for describing the state of a managed system
US7158123B2 (en) 2003-01-31 2007-01-02 Xerox Corporation Secondary touch contextual sub-menu navigation for touch screen interface
DE602004006190T8 (de) 2003-03-31 2008-04-10 Honda Motor Co., Ltd. Apparatus, method, and program for gesture recognition
US7786983B2 (en) 2003-04-08 2010-08-31 Poa Sana Liquidating Trust Apparatus and method for a data input device using a light lamina screen
US7627343B2 (en) 2003-04-25 2009-12-01 Apple Inc. Media player system
US20040233153A1 (en) 2003-05-22 2004-11-25 Heber Robinson Communication device with automatic display and lighting activation and method therefore
US7362320B2 (en) 2003-06-05 2008-04-22 Hewlett-Packard Development Company, L.P. Electronic device having a light emitting/detecting display screen
US20050015731A1 (en) 2003-07-15 2005-01-20 Microsoft Corporation Handling data across different portions or regions of a desktop
GB2404819A (en) 2003-08-05 2005-02-09 Research In Motion Ltd Mobile communications device with integral optical navigation
DE10337743A1 (de) 2003-08-13 2005-03-10 Ego Elektro Geraetebau Gmbh Method and circuit arrangement for determining the actuation state of at least one optical sensor element
US7411575B2 (en) 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
US7468722B2 (en) 2004-02-09 2008-12-23 Microsemi Corporation Method and apparatus to control display brightness with ambient light correction
EP2254026A1 (en) 2004-02-27 2010-11-24 Research In Motion Limited Text input system for a mobile electronic device and methods thereof
US7715790B1 (en) 2004-03-19 2010-05-11 Apple Inc. Methods and apparatuses for configuration automation
US20050219228A1 (en) 2004-03-31 2005-10-06 Motorola, Inc. Intuitive user interface and method
US20050219223A1 (en) 2004-03-31 2005-10-06 Kotzin Michael D Method and apparatus for determining the context of a device
US20050221791A1 (en) 2004-04-05 2005-10-06 Sony Ericsson Mobile Communications Ab Sensor screen saver
US20050219394A1 (en) 2004-04-06 2005-10-06 Sterling Du Digital camera capable of brightness and contrast control
US7019622B2 (en) 2004-05-27 2006-03-28 Research In Motion Limited Handheld electronic device including vibrator having different vibration intensities and method for vibrating a handheld electronic device
US7519223B2 (en) 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US7786980B2 (en) 2004-06-29 2010-08-31 Koninklijke Philips Electronics N.V. Method and device for preventing staining of a display device
US7743348B2 (en) * 2004-06-30 2010-06-22 Microsoft Corporation Using physical objects to adjust attributes of an interactive display application
US20060012577A1 (en) 2004-07-16 2006-01-19 Nokia Corporation Active keypad lock for devices equipped with touch screen
US7653883B2 (en) 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device
US7599044B2 (en) 2005-06-23 2009-10-06 Apple Inc. Method and apparatus for remotely detecting presence
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
JP2008511045A (ja) 2004-08-16 2008-04-10 FingerWorks, Inc. Method for improving the spatial resolution of a touch-sensing device
US20060044280A1 (en) 2004-08-31 2006-03-02 Huddleston Wyatt A Interface
US7292875B2 (en) 2004-09-30 2007-11-06 Avago Technologies Ecbu Ip (Singapore) Pte Ltd Electronic device with ambient light sensor
US7522065B2 (en) 2004-10-15 2009-04-21 Microsoft Corporation Method and apparatus for proximity sensing in a portable electronic device
WO2006054207A1 (en) * 2004-11-16 2006-05-26 Koninklijke Philips Electronics N.V. Touchless manipulation of images for regional enhancement
US7489305B2 (en) 2004-12-01 2009-02-10 Thermoteknix Systems Limited Touch screen control
US20060146012A1 (en) 2005-01-04 2006-07-06 Arneson Theodore R System and method for automatic display switching
US7151460B2 (en) 2005-01-10 2006-12-19 Nokia Corporation Electronic device having a proximity detector
EP1696414A1 (en) 2005-02-28 2006-08-30 Research In Motion Limited Backlight control for a portable computing device
FR2884349B1 (fr) 2005-04-06 2007-05-18 Moving Magnet Tech Mmt Bistable polarized electromagnetic actuator with fast actuation
US7605804B2 (en) 2005-04-29 2009-10-20 Microsoft Corporation System and method for fine cursor positioning using a low resolution imaging touch screen
US9019209B2 (en) * 2005-06-08 2015-04-28 3M Innovative Properties Company Touch location determination involving multiple touch location processes
CN101542533A (zh) * 2005-07-06 2009-09-23 Gemini Mobile Technologies Three-dimensional graphical user interface
FR2888319B1 (fr) 2005-07-07 2008-02-15 Nanotec Solution Method for contactless measurement of a relative displacement or relative positioning of a first object with respect to a second object, by inductive means
US7728316B2 (en) 2005-09-30 2010-06-01 Apple Inc. Integrated proximity sensor and light sensor
US7714265B2 (en) 2005-09-30 2010-05-11 Apple Inc. Integrated proximity sensor and light sensor
US7633076B2 (en) 2005-09-30 2009-12-15 Apple Inc. Automated response to and sensing of user activity in portable devices
TWI293000B (en) 2005-11-03 2008-01-21 Benq Corp Electronic device capable of operating a function according to detection of environmental light
FR2893711B1 (fr) 2005-11-24 2008-01-25 Nanotec Solution Device and method for capacitive measurement using a floating bridge
US7663620B2 (en) * 2005-12-05 2010-02-16 Microsoft Corporation Accessing 2D graphic content using axonometric layer views
US10521022B2 (en) 2006-03-17 2019-12-31 Conversant Wireless Licensing S.a.r.l. Mobile communication terminal and method therefor
KR20070113022A (ko) * 2006-05-24 2007-11-28 LG Electronics Inc. Touch screen device responsive to user input and operating method thereof
US7747293B2 (en) 2006-10-17 2010-06-29 Marvell World Trade Ltd. Display control for cellular phone
US20080113618A1 (en) 2006-11-09 2008-05-15 Sony Ericsson Mobile Communications Ab Pairing system and method for mobile devices
US7956847B2 (en) 2007-01-05 2011-06-07 Apple Inc. Gestures for controlling, manipulating, and editing of media files using touch sensitive devices
CN201266371Y (zh) * 2007-01-05 2009-07-01 Apple Inc. Handheld mobile communication device
US8745535B2 (en) 2007-06-08 2014-06-03 Apple Inc. Multi-dimensional desktop
US8010900B2 (en) * 2007-06-08 2011-08-30 Apple Inc. User interface for electronic backup
US8074178B2 (en) * 2007-06-12 2011-12-06 Microsoft Corporation Visual feedback display
US8432365B2 (en) 2007-08-30 2013-04-30 Lg Electronics Inc. Apparatus and method for providing feedback for three-dimensional touchscreen
US8933892B2 (en) * 2007-11-19 2015-01-13 Cirque Corporation Touchpad combined with a display and having proximity and touch sensing capabilities to enable different functions or interfaces to be displayed
KR101237640B1 (ko) 2008-01-29 2013-02-27 Melfas Inc. Touch screen device with a structure for preventing parasitic capacitance
US8650507B2 (en) * 2008-03-04 2014-02-11 Apple Inc. Selecting of text using gestures
EP2104024B1 (en) 2008-03-20 2018-05-02 LG Electronics Inc. Portable terminal capable of sensing proximity touch and method for controlling screen using the same
US8723811B2 (en) * 2008-03-21 2014-05-13 Lg Electronics Inc. Mobile terminal and screen displaying method thereof
KR101486345B1 (ko) * 2008-03-21 2015-01-26 LG Electronics Inc. Mobile terminal and screen display method thereof
KR101012379B1 (ko) 2008-03-25 2011-02-09 LG Electronics Inc. Terminal and information display method thereof
KR101513023B1 (ko) 2008-03-25 2015-04-22 LG Electronics Inc. Terminal and information display method thereof
KR101481557B1 (ko) * 2008-03-26 2015-01-13 LG Electronics Inc. Terminal and control method thereof
KR101495164B1 (ko) 2008-04-10 2015-02-24 LG Electronics Inc. Mobile terminal and screen processing method thereof
US8296670B2 (en) * 2008-05-19 2012-10-23 Microsoft Corporation Accessing a menu utilizing a drag-operation
KR101502002B1 (ko) * 2008-05-21 2015-03-12 LG Electronics Inc. Mobile terminal using proximity touch and screen display control method thereof
US8576181B2 (en) 2008-05-20 2013-11-05 Lg Electronics Inc. Mobile terminal using proximity touch and wallpaper controlling method thereof
EP2131272A3 (en) 2008-06-02 2014-05-07 LG Electronics Inc. Mobile communication terminal having proximity sensor and display controlling method therein
JP5133791B2 (ja) * 2008-06-19 2013-01-30 Japan Display East Inc. Display device with touch panel
KR101504201B1 (ko) * 2008-07-02 2015-03-19 LG Electronics Inc. Mobile terminal and keypad display method thereof
JP2010061405A (ja) 2008-09-03 2010-03-18 Rohm Co Ltd Capacitance sensor, detection circuit therefor, input device, and method for controlling a capacitance sensor
KR101570116B1 (ко) * 2008-09-09 2015-11-19 Samsung Electronics Co., Ltd. Method for browsing and executing content using a touch screen, and device using the same
JP4775669B2 (ja) * 2008-10-10 2011-09-21 Sony Corporation Information processing apparatus, information processing method, information processing system, and information processing program
KR20100041006A (ко) * 2008-10-13 2010-04-22 User interface control method using three-dimensional multi-touch
KR101021440B1 (ко) * 2008-11-14 2011-03-15 Korea Research Institute of Standards and Science Touch input device, portable device using the same, and control method thereof
US8963849B2 (en) 2008-12-04 2015-02-24 Mitsubishi Electric Corporation Display input device
DE112009003521T5 (de) 2008-12-04 2013-10-10 Mitsubishi Electric Corp. Display input device
GB2466497B (en) * 2008-12-24 2011-09-14 Light Blue Optics Ltd Touch sensitive holographic displays
JP4683126B2 (ja) * 2008-12-26 2011-05-11 Brother Industries, Ltd. Input device
US9141275B2 (en) * 2009-02-17 2015-09-22 Hewlett-Packard Development Company, L.P. Rendering object icons associated with a first object icon upon detecting fingers moving apart
KR101582686B1 (ко) * 2009-03-02 2016-01-05 LG Electronics Inc. Item display method and mobile communication terminal applying the same
US20100265204A1 (en) * 2009-04-21 2010-10-21 Sony Ericsson Mobile Communications Ab Finger recognition for authentication and graphical user interface input
US8669945B2 (en) * 2009-05-07 2014-03-11 Microsoft Corporation Changing of list views on mobile device
US20100289740A1 (en) * 2009-05-18 2010-11-18 Bong Soo Kim Touchless control of an electronic device
US8373669B2 (en) 2009-07-21 2013-02-12 Cisco Technology, Inc. Gradual proximity touch screen
GB0913734D0 (en) 2009-08-06 2009-09-16 Binstead Ronald P Masked touch sensors
FR2948997B1 (fr) 2009-08-07 2012-04-20 Nanotec Solution Capacitive pressure sensor incorporating a temperature measurement compatible with hot environments
FR2949008B1 (fr) 2009-08-07 2011-09-16 Nanotec Solution Capacitive detection device with integrated functions
FR2949007B1 (fr) 2009-08-07 2012-06-08 Nanotec Solution Device and method for a control interface sensitive to a movement of a body or an object, and control equipment incorporating this device
US9152317B2 (en) * 2009-08-14 2015-10-06 Microsoft Technology Licensing, Llc Manipulation of graphical elements via gestures
US8405677B2 (en) 2009-09-09 2013-03-26 Mitac International Corp. Method of improving the accuracy of selecting a soft button displayed on a touch-sensitive screen and related portable electronic device
KR101629711B1 (ко) 2009-09-16 2016-06-13 LG Electronics Inc. Mobile terminal
KR20110061285A (ко) * 2009-12-01 2011-06-09 Samsung Electronics Co., Ltd. Portable device and touch panel operating method thereof
TW201124766A (en) 2010-01-08 2011-07-16 Wintek Corp Display device with touch panel
US10007393B2 (en) * 2010-01-19 2018-06-26 Apple Inc. 3D view of file structure
JP5407950B2 (ja) 2010-03-11 2014-02-05 Sony Corporation Information processing apparatus, information processing method, and program
US20110296351A1 (en) * 2010-05-26 2011-12-01 T-Mobile Usa, Inc. User Interface with Z-axis Interaction and Multiple Stacks
JP2012032852A (ja) 2010-07-28 2012-02-16 Sony Corp Information processing apparatus, information processing method, and computer program
US8923014B2 (en) 2010-08-19 2014-12-30 Lg Display Co., Ltd. Display device having touch panel
JP5732784B2 (ja) 2010-09-07 2015-06-10 Sony Corporation Information processing apparatus, information processing method, and computer program
FR2971066B1 (fr) 2011-01-31 2013-08-23 Nanotec Solution Three-dimensional human-machine interface
US8736583B2 (en) * 2011-03-29 2014-05-27 Intel Corporation Virtual links between different displays to present a single virtual object
CN102752435A (zh) 2011-04-19 2012-10-24 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Mobile phone with a cursor and cursor control method thereof
KR20130057637A (ко) 2011-11-24 2013-06-03 Samsung Electro-Mechanics Co., Ltd. Touch sensing apparatus
US9342169B2 (en) 2012-02-28 2016-05-17 Sony Corporation Terminal device
EP2634680A1 (en) 2012-02-29 2013-09-04 BlackBerry Limited Graphical user interface interaction on a touch-sensitive device
FR2990020B1 (fr) 2012-04-25 2014-05-16 Fogale Nanotech Capacitive detection device with an arrangement of connection tracks, and method implementing such a device
US8935625B2 (en) 2012-08-15 2015-01-13 Conductor, Inc. User interface for task management
US20140062875A1 (en) 2012-09-06 2014-03-06 Panasonic Corporation Mobile device with an inertial measurement unit to adjust state of graphical user interface or a natural language processing unit, and including a hover sensing function
US9411474B2 (en) 2012-11-15 2016-08-09 Nokia Technologies Oy Shield electrode overlying portions of capacitive sensor electrodes
FR3002052B1 (fr) * 2013-02-14 2016-12-09 Fogale Nanotech Method and device for navigating a display screen, and apparatus comprising such navigation
EP2778859B1 (en) 2013-03-15 2015-07-08 BlackBerry Limited Method and apparatus for word prediction using the position of a non-typing digit
FR3008809B1 (fr) 2013-07-18 2017-07-07 Fogale Nanotech Guard accessory device for an electronic and/or computing apparatus, and apparatus equipped with such an accessory device
US10042509B2 (en) 2014-04-22 2018-08-07 International Business Machines Corporation Dynamic hover grace period

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090128498A1 (en) * 2004-06-29 2009-05-21 Koninklijke Philips Electronics, N.V. Multi-layered display of a graphical user interface
US20080309632A1 (en) * 2007-06-13 2008-12-18 Apple Inc. Pinch-throw and translation gestures
WO2009028892A2 (en) * 2007-08-30 2009-03-05 Lg Electronics Inc. A user interface for a mobile device using a user's gesture in the proximity of an electronic device
US20090327969A1 (en) * 2008-06-27 2009-12-31 Microsoft Corporation Semantic zoom in a virtual three-dimensional graphical user interface

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2012104529A1 *

Also Published As

Publication number Publication date
US20190361539A1 (en) 2019-11-28
US10303266B2 (en) 2019-05-28
CN103460175A (zh) 2013-12-18
US11175749B2 (en) 2021-11-16
CN109240587B (zh) 2022-01-25
WO2012104529A1 (fr) 2012-08-09
CN103460175B (zh) 2018-08-17
FR2971066A1 (fr) 2012-08-03
US20130307776A1 (en) 2013-11-21
FR2971066B1 (fr) 2013-08-23
CN109240587A (zh) 2019-01-18

Similar Documents

Publication Title
EP2671146A1 (fr) Three-dimensional human-machine interface
EP2956846B1 (fr) Method, apparatus and storage medium for navigating a display screen
EP2842019B1 (fr) Method for interacting with an apparatus implementing a capacitive control surface, and interface and apparatus implementing this method
US8631354B2 (en) Focal-control user interface
EP2602706A2 (en) User interactions
FR2917516A1 (fr) Speed mode / positional mode transpositions
EP2332035A2 (fr) Device for controlling electronic equipment by manipulating graphical objects on a multi-contact touch screen
FR3003364A1 (fr) Method for processing a compound gesture, and associated device and user terminal
EP2981879B1 (fr) Device for contactless interaction with an electronic and/or computing apparatus, and apparatus equipped with such a device
FR2898197A1 (fr) Touch screen with an interaction point distinct from the contact point
WO2014193453A1 (en) Graphical user interface with dial control for a parameter
WO2015124564A1 (fr) Method for human-machine interaction combining touch and contactless controls
FR2980004A1 (fr) Temporary pointing device for a mobile terminal equipped with a main touch display screen and an auxiliary display screen
EP3221780B1 (fr) Graphical interface and method for managing the graphical interface during touch selection of a displayed element
WO2015082817A1 (fr) Method for controlling interaction with a touch screen, and equipment implementing this method
EP2943949A2 (fr) Portable interactive reading device and method for displaying a digital document on this device
FR3017470A1 (fr) Method for input on a digital keypad, and human-machine interface and apparatus implementing such a method
FR2946768A1 (fr) Method for touch input of control instructions for a computer program, and system for implementing this method
FR3112628A1 (fr) Computer pointing device
FR3108998A1 (fr) Method and device for managing "multitouch" presses on a touch surface
FR3017471A1 (fr) Method for interacting with a touch-sensitive electronic/computing apparatus, and apparatus implementing such a method

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20130731

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: QUICKSTEP TECHNOLOGIES LLC

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20180228

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20210119