EP1015960A1 - Computer control system - Google Patents

Computer control system

Info

Publication number
EP1015960A1
Authority
EP
European Patent Office
Prior art keywords
computer
cursor
function
control according
sensor area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP97953649A
Other languages
German (de)
English (en)
Inventor
Detlef Günther
Andreas Bohn
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Krahl Martin
Original Assignee
Krahl Martin
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from DE1996153682 external-priority patent/DE19653682C2/de
Application filed by Krahl Martin filed Critical Krahl Martin
Publication of EP1015960A1 publication Critical patent/EP1015960A1/fr
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry

Definitions

  • the invention relates to a controller for a computer.
  • Controls for multifunctional systems, e.g. computers, have been known in simple forms for some time.
  • computers are known in which a user triggers or influences functions of the computer through a trigger, e.g. a mouse-controlled cursor on a screen or a data glove for manipulating spatial objects.
  • Multimedia systems are understood to mean systems in which the senses of a person are addressed through different media, e.g. influenced by texts, images, videos, sounds, noises or music.
  • multimedia systems also include touch (tactile) or smell (olfactory) stimuli that affect a person.
  • the present invention is based on the object of providing a controller for a computer and a method for computer control with which the functions of the computer can be controlled in a particularly differentiated manner and the functionality of the computer is increased.
  • the control of a computer according to the invention has sensor means with which the position of at least one cursor on a display of the computer is detected. If the position of the cursor lies within a certain partial area of the display, the sensor area, the dwell time and the position of the cursor in the sensor area are measured. The information about the dwell time and the position is used to influence at least one function of the computer. The dwell time of the cursor is not measured outside the sensor area.
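  • As an illustration only (not taken from the patent; all names are invented), a dwell-time measurement that is active only while the cursor is inside a sensor area might be sketched like this:

```python
import time
from dataclasses import dataclass

@dataclass
class SensorArea:
    """Rectangular partial area of the display (illustrative; the patent also allows other shapes)."""
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height

class DwellTracker:
    """Measures the dwell time of the cursor inside one sensor area; outside, nothing is measured."""
    def __init__(self, area: SensorArea):
        self.area = area
        self._entered_at = None  # timestamp of entry, None while the cursor is outside

    def update(self, px: float, py: float, now: float | None = None) -> float:
        """Return the current dwell time in seconds (0.0 while the cursor is outside)."""
        now = time.monotonic() if now is None else now
        if self.area.contains(px, py):
            if self._entered_at is None:
                self._entered_at = now          # the cursor has just entered the sensor area
            return now - self._entered_at       # dwell time so far
        self._entered_at = None                 # leaving the area resets the measurement
        return 0.0
```

A controller would call `update()` with every cursor position event and pass the returned dwell time, together with the position inside the area, on to the function of the computer that is to be influenced.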
  • a computer is understood to mean any data processing device which is equipped, inter alia, with a screen and a device for controlling a cursor (eg mouse, digitizer).
  • the control of the computer according to the invention can be designed, for example, in the form of a processor or a program. In general, the functional units described below can be implemented either as software or hardware.
  • An advantageous embodiment of the control according to the invention detects the speed and / or acceleration of at least one cursor and uses this information to influence at least one function of the computer.
  • the functionality of the computer is increased by these additional kinematic parameters of the trigger.
  • a controller according to the invention for a computer can e.g. react differently to a fast or slow movement of a cursor.
  • the trajectory described by at least one cursor on the computer screen is recorded.
  • at least one function of the computer is influenced.
  • the path curves of the cursor on the computer display are characteristic of certain situations when operating the computer or for certain users. This information can be used to improve the adaptation of the computer to a user.
  • the control according to the invention has means with which the kinematic (dynamic) behavior of the cursor is quantified.
  • the kinematic behavior of the cursor here is generally understood to mean the space-time behavior of the cursor on the computer screen, which in particular includes the dwell time, the position, the speed and the acceleration of the cursor.
  • Quantification means that the kinematic behavior of the cursor is determined by parameters or functions which e.g. describe the dwell time or the shape of the trajectory. These parameters and functions form the input values for functional relationships that directly link the kinematic behavior of the cursor with a function of the computer. These functional relationships can be permanently stored in a database or can be changed over time.
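  • One possible data structure for such a functional relationship (later called an interaction graph) is a sampled curve that maps a kinematic parameter, e.g. the dwell time, to a function value of the computer. The following sketch is an assumption about how this could look, not the patented implementation:

```python
import bisect

class InteractionGraph:
    """Piecewise-linear mapping from a kinematic parameter (e.g. dwell time) to a function value."""
    def __init__(self, points):
        # points: list of (parameter, value) pairs, sorted by parameter
        self.xs = [p for p, _ in points]
        self.ys = [v for _, v in points]

    def __call__(self, x: float) -> float:
        if x <= self.xs[0]:
            return self.ys[0]
        if x >= self.xs[-1]:
            return self.ys[-1]
        i = bisect.bisect_right(self.xs, x)
        x0, x1 = self.xs[i - 1], self.xs[i]
        y0, y1 = self.ys[i - 1], self.ys[i]
        return y0 + (y1 - y0) * (x - x0) / (x1 - x0)  # linear interpolation between samples

# Example: the volume of a sound clip grows with the dwell time and then saturates.
volume_graph = InteractionGraph([(0.0, 0.0), (2.0, 0.5), (5.0, 1.0)])
print(volume_graph(3.0))  # roughly 0.67 of full volume after 3 s of dwell time
```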
  • the quantification is used with particular advantage in combination with a random generator, so that novel effects can be achieved again and again, in particular in computers with multimedia applications or games.
  • the control according to the invention advantageously has means of changing the position, shape and / or the function of at least one sensor area in the computer in a predeterminable or randomly controlled manner. This allows the sensor areas to be adapted to changing situations, which increases the flexibility of the control and the computer.
  • a database is used to store the kinematic behavior of at least one cursor. It is also advantageous to record and store the spatial, temporal and / or functional changes in at least one sensor area in a database. In this way, for example, certain movements or movement patterns of the cursor can be stored and used in a particularly advantageous manner for influencing functions of the computer and / or at least one sensor area.
  • In an advantageous embodiment of the control according to the invention, there is a continuous transition (fading) between at least two different functions of the computer.
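  • Such a fade could, for example, be realized as a pair of time-dependent weights applied to the two functions (e.g. the volumes of two pieces of music or the opacities of two images); the sketch below is purely illustrative:

```python
def crossfade_weights(t: float, duration: float) -> tuple[float, float]:
    """Return (weight_a, weight_b): function A fades out while function B fades in.

    The weights could scale, e.g., the volumes of two pieces of music or the
    opacities of two superimposed images during the transition.
    """
    w = min(max(t / duration, 0.0), 1.0)   # progress of the fade, clamped to [0, 1]
    return 1.0 - w, w

# Halfway through a 4-second fade both functions contribute equally:
print(crossfade_weights(2.0, 4.0))   # (0.5, 0.5)
```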
  • a particularly advantageous embodiment of the control system according to the invention has a database in which objects for influencing at least one function of the computer are stored. At least one of these objects has an attribute that describes a property of the object. This attribute can e.g. describe the type of object (e.g. text) or the content of the object (e.g. poem). By using attributes, the controller can easily establish relationships between different objects.
  • At least one object stored in the database and / or an attribute of the object has a modifier.
  • This modifier is a measure by which the control can compare different objects or attributes with each other.
  • a modifier can be stored in a database in a predeterminable manner or changed by the controller over time.
  • the control according to the invention advantageously has means with which at least one function of the computer can be controlled by the kinematic behavior of the cursor in connection with attributes and / or modifiers of at least one object. This makes it possible for the kinematic behavior of the cursor and the properties of the objects to influence the function of the computer, which enables very flexible control of the computer.
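  • The interplay of objects, attributes and modifiers could be modelled roughly as follows; this is a sketch under the assumption of a simple in-memory database, and none of these names appear in the patent:

```python
from dataclasses import dataclass, field

@dataclass
class MediaObject:
    name: str
    kind: str                                       # e.g. "image", "text", "music"
    attributes: set = field(default_factory=set)    # e.g. {"Greece", "antiquity"}
    modifier: int = 100                             # measure in the range 1..100, e.g. opacity or volume

def select_by_interest(objects, interest_attributes, min_modifier=0):
    """Return objects whose attributes overlap the user's current interest,
    ordered by how many attributes match and then by their modifier."""
    matches = []
    for obj in objects:
        overlap = len(obj.attributes & set(interest_attributes))
        if overlap > 0 and obj.modifier >= min_modifier:
            matches.append((overlap, obj.modifier, obj))
    matches.sort(key=lambda m: (m[0], m[1]), reverse=True)
    return [obj for _, _, obj in matches]
```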
  • the controller according to the invention also advantageously has means with which objects, in particular media, can be automatically stored in the database, sorted according to their type. This considerably speeds up the acquisition of objects (e.g. texts or images) that are to be used as functions of a computer.
  • the controller can e.g. automatically assign certain attributes to the objects.
  • At least one of the objects advantageously has information about a sensor area, an image, a text, a sound, a piece of music, control data for external devices, data for a three-dimensional representation, a modifier, an attribute or a group of objects.
  • this information can be used in a uniform manner to influence the computer.
  • At least one sensor area for a cursor is advantageously stored invisibly on the display of the computer.
  • the display appears to the user in the usual form.
  • the cursor on the display of the computer can be controlled by eye movements of a user of the computer.
  • the eye movements can be recorded, for example, by video monitoring of the pupils or by deriving action potentials from facial muscles. Operating the cursor using eye movements is particularly useful for people who cannot use their hands when working on the computer.
  • the position of at least one cursor on a display of the computer is first detected by sensor means.
  • Computer functions, such as audio-visual signals, can thereby be activated.
  • the information recorded by the sensor means is then transmitted to the control.
  • the control determines whether the at least one cursor is located in a 1- or 2-dimensional partial area (sensor area) of the computer display. If the position lies within the sensor area, the control then determines the dwell time and the position of the at least one cursor within the sensor area. Depending on the dwell time and the position of the at least one cursor in the sensor area, the control finally influences the at least one function of the computer.
  • FIG. 1 shows a schematic representation of a display of a computer, wherein functions of the computer can be influenced by a cursor and sensor areas for the cursor;
  • FIG. 2 shows a schematic representation of a sensor area on a display of a computer;
  • FIG. 3 shows a schematic representation of a functional relationship between the position of a cursor on the display of the computer and a function of the computer (interaction graph);
  • FIG. 4 shows a schematic representation of a functional relationship between the temporal behavior of a cursor and a function of the computer
  • FIG. 6 shows a schematic illustration of the temporal sequence of positioning a cursor
  • FIG. 1 shows, as an example, a schematic view of a display of a computer 1 which is equipped with a control according to the invention.
  • a cursor 3 serves as a trigger for functions 10 of the computer 1.
  • Functions 10 are, for example, the volume of a sound clip or the provision of program menus.
  • the cursor 3 is moved with the aid of a mouse or another handling device over the display of the computer 1.
  • the control of the computer 1 defines certain sub-areas at some points on the display, in which it is registered when a position 8 of the cursor 3 lies inside this sub-area. These subareas are called sensor areas 2 below.
  • the control detects and stores not only the position 8 of the cursor 3 but also the dwell time of the cursor 3 in a sensor area 2.
  • the shape of a sensor area 2 is not rigid, but can be adjusted as required in terms of position, shape and/or function on the display of the computer 1. It is also possible for the entire display of the computer 1 to be covered with sensor areas 2, so that the dwell time of the cursor 3 is measured at every point on the screen, two different functions 10 of the computer 1 being triggered depending on the sensor area 2.
  • An overlap of sensor areas 2 is also possible, with the control of the computer 1 then determining how the dwell times are processed (e.g. weighting, addition of the dwell times).
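  • For overlapping sensor areas, the dwell times recorded per area have to be combined; a simple weighted addition, as hinted at above, might look like this (illustrative only):

```python
def combined_dwell(dwell_times: dict[str, float], weights: dict[str, float] | None = None) -> float:
    """Combine the dwell times of overlapping sensor areas by weighting and addition."""
    weights = weights or {}
    return sum(t * weights.get(area, 1.0) for area, t in dwell_times.items())

# Example: the cursor sits in the overlap of the areas "menu" and "help".
print(combined_dwell({"menu": 2.0, "help": 2.0}, {"help": 0.5}))  # 3.0
```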
  • the sensor areas 2 are invisible on the display, i.e. the user sees, for example, the usual display of a multimedia application or a word processing system.
  • the sensor areas 2 can, however, be made visible when a program is being created or when a program is being debugged, in order to check the function 10.
  • a multimedia lexicon according to the invention shows e.g. texts, images and videos on a screen, sensor areas 2 being stored at certain points on the screen.
  • a user of the multimedia lexicon guides the cursor 3 into the area of the display that is of particular interest to him. If the cursor 3 is guided into a sensor area 2, the control system detects the position 8 of the cursor 3 and its dwell time in this sensor area 2. For this purpose, the control system has timer functions.
  • the control interprets the dwell time of the cursor 3 in a sensor area 2 as the interest of the viewer and quantifies this interest as the so-called energy value. In this way, the perception of a user can be described by a measure.
  • the energy value is stored in a database and thus serves as a memory for the interest of a viewer.
  • the controller ensures that the energy value is changed after some time, so that forgetting or a waning interest is simulated.
  • the control system maintains an "energy budget" with which it can always be determined in which sensor areas 2 which energy was consumed.
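  • The energy value and the "energy budget" could be kept per sensor area, with a decay term that simulates forgetting; the following sketch makes assumptions about bookkeeping details the text leaves open:

```python
class EnergyBudget:
    """Keeps one energy value per sensor area and lets it decay over time ("forgetting")."""
    def __init__(self, decay_per_second: float = 0.05):
        self.energy: dict[str, float] = {}
        self.decay = decay_per_second

    def add(self, area: str, dwell_time: float, rate: float = 1.0) -> None:
        """Dwell time in a sensor area is converted into energy (interest)."""
        self.energy[area] = self.energy.get(area, 0.0) + dwell_time * rate

    def tick(self, dt: float) -> None:
        """Let the stored interest fade, so that waning interest is simulated."""
        for area in self.energy:
            self.energy[area] = max(0.0, self.energy[area] - self.decay * dt)

    def exceeded(self, area: str, threshold: float) -> bool:
        """True when the energy in an area has reached the threshold that triggers a function."""
        return self.energy.get(area, 0.0) >= threshold
```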
  • the state of a cursor 3 is detected at all times by the current position, the current speed and the dwell time at its current position 8. In a two-dimensional display of a computer 1, the state can therefore be described by five values.
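  • Assuming the five values are two position coordinates, two velocity components and the dwell time, the cursor state could be bundled into a single record, for example:

```python
from dataclasses import dataclass

@dataclass
class CursorState:
    x: float        # position on the display
    y: float
    vx: float       # current speed components
    vy: float
    dwell: float    # dwell time at the current position / in the current sensor area
```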
  • the control determines the further behavior of the computer 1 (see also FIGS. 2 to 4). After a certain time (when a threshold value for energy is reached), cross-references to related topics are displayed, for example, or a piece of music that matches the context is played. It is possible that the newly displayed pictures or recorded music overlap each other, thus creating a continuous transition between the scenes (fading).
  • the controller can control the behavior of the computer 1 not only in a deterministic dependence on the kinematic behavior of the cursor 3. Rather, multimedia content can also be selected and presented via a random generator. In the case of an electronic lexicon, for example, this gives the possibility of "browsing". Through a combination of deterministic and random selection of content, certain associations of the user can be taken into account.
  • random control e.g. create images and atmospheres in an artistic multimedia program that are not repeatable and that challenge the creativity of a user.
  • randomly controlled images and texts can be used in games, which always unfold new aspects.
  • a sensor area 2 can e.g. also be a menu item of an operating system of the computer 1. If the cursor 3 remains on this sensitive menu item for a longer period of time, this is interpreted as increased interest by the control and an auxiliary text for this menu item is displayed. Additional functions can then be addressed via the position of the cursor 3 in the sensor area 2.
  • the control of the computer 1 can detect and use the kinematic or dynamic behavior of the cursor 3 in another way.
  • the control of the computer 1 not only registers the position 8 of the cursor 3, but also measures the speed, the acceleration and the trajectory of the cursor 3 on the display of the computer 1. Furthermore, the regions which a cursor 3 frames by opening a window are also detected.
  • By detecting the trajectory of the cursor 3, the controller recognizes the order in which the cursor 3 was in certain sensor areas 2. The controller triggers different functions of the computer 1 depending on the sequence that has been run through.
  • the control of the computer 1 can also carry out numerical differentiations at certain points on the trajectory, by means of which the speeds and the accelerations at the points of the trajectory are calculated.
  • the kinematic behavior of the cursor 3 is thus completely captured. These measurements of the kinematic behavior of the cursor are also quantified as energy values.
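  • The numerical differentiation of the trajectory can be done with simple finite differences over successive, time-stamped cursor positions; a minimal sketch:

```python
def differentiate(samples):
    """samples: list of (t, x, y) cursor positions along the trajectory.
    Returns a list of (t, vx, vy, ax, ay) estimated by finite differences
    (at least three samples are needed for the acceleration estimates)."""
    velocities = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        velocities.append((t1, (x1 - x0) / dt, (y1 - y0) / dt))

    result = []
    for (t0, vx0, vy0), (t1, vx1, vy1) in zip(velocities, velocities[1:]):
        dt = t1 - t0
        result.append((t1, vx1, vy1, (vx1 - vx0) / dt, (vy1 - vy0) / dt))
    return result
```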
  • If the cursor 3 is moved quickly over a text, the control system evaluates this as a small release of energy, i.e. the interest of the user is rated as low and only text is displayed. If, however, the cursor 3 moves slowly over a text, more energy is consumed. The interest is rated higher, which leads to a different behavior of the computer 1, e.g. the playing of a video.
  • the kinematic behavior of the cursor 3 depends crucially on the person of the user of the computer.
  • the kinematic behavior of a user is stored in a database.
  • the controller can thus adapt functions 10 of the computer 1 to a specific user (e.g. using an expert system or a neural network). It is also possible that it recognizes from the kinematic behavior of the cursor 3 that a particular behavior of a user is not efficient, and it adjusts a function 10 of the computer 1 accordingly or indicates the inefficiency to the user. This can lead to a considerable improvement in learning progress, especially with learning software.
  • the cursor 3 of the computer 1 can be controlled by the control according to the invention via the eye movements of a user.
  • the eye movements can e.g. via video surveillance of the pupils.
  • By controlling the kinematic behavior of the cursor 3 through the eye movements and the sensor areas 2, in particular people who cannot fully use their hands (e.g. physically disabled) can operate computers in an efficient and flexible manner.
  • FIG. 2 shows a circular sensor area 2 on the display of a computer 1 with a radius 7. If a cursor 3 is located within the sensor area 2, as shown in FIG. 2, the kinematic behavior of the cursor 3 and its dwell time in the sensor area 2 are detected by the control of the computer 1 according to the invention.
  • the position 8 of the cursor 3 is represented in a polar coordinate system with the center as the reference point 6 of the sensor area 2.
  • the position 8 of the cursor 3 is determined from the distance of the cursor 3 from the reference point 6 and an angle (not shown here) to a reference line.
  • a corner of the sensor area 2 or the center of gravity of the sensor area 2 serves as a reference point 6.
  • the position 8 of the cursor 3 can also be represented in an absolute coordinate system of the display of the computer 1, i.e. the coordinates are counted from a corner of the display.
  • the control according to the invention additionally evaluates the angular coordinate and the dwell time at different points in the sensor area 2 and determines at least one function 10 of the computer 1 therefrom.
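  • Converting the cursor position into polar coordinates relative to the reference point of a circular sensor area, and checking whether the cursor lies inside it, is straightforward; the helper below is only a sketch:

```python
import math

def polar_position(cursor_x, cursor_y, ref_x, ref_y, radius):
    """Return (distance, angle_in_radians) of the cursor relative to the reference
    point of a circular sensor area, or None if the cursor is outside the area."""
    dx, dy = cursor_x - ref_x, cursor_y - ref_y
    distance = math.hypot(dx, dy)
    if distance > radius:
        return None                      # the cursor is outside the sensor area
    angle = math.atan2(dy, dx)           # angle to the (horizontal) reference line
    return distance, angle
```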
  • The relationship between the kinematic behavior of the cursor 3 and a function 10 of the computer 1 is shown in FIGS. 3 and 4.
  • the functional relationship 9 is part of the control according to the invention.
  • the functional relationships 9 between a function 10 of the computer 1 and the position 8 of a cursor 3 can be both linear and non-linear.
  • the controller typically uses the following input variables: key presses, mouse movements, trackball movements, data glove actions, sensor information, camera information.
  • the input variables are linked by the controller via interaction graphs to the functions 10 of the computer 1.
  • the output variables are typically: visual 2D and 3D representations, video information, slide projections, sound, and tactile information via active sensors in data gloves.
  • function 10′ is, for example, the opacity of an image in a multimedia application.
  • the start time 11 is defined by a specific action (e.g. pressing a key, exceeding a specific dwell time of the cursor 3 in a sensor area 2). From this point in time, the opacity of an image is determined by the functional relationship 9 ', i.e. the opacity increases and decreases again after a while. If the cursor 3 is removed from the sensor area 2 at any point in time 13, the opacity 10 ′ of the image assigned at this point in time 13 remains.
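  • The temporal interaction graph for the opacity described above (rising after the start time, falling again, and frozen at the value reached when the cursor leaves the sensor area) could be sketched as follows; the exact shape of the curve is an assumption:

```python
def opacity_at(t_since_start: float, rise: float = 1.0, hold: float = 2.0, fall: float = 3.0) -> float:
    """Opacity in [0, 1] as a function of the time elapsed since the start time 11."""
    if t_since_start < rise:                       # opacity increases
        return t_since_start / rise
    if t_since_start < rise + hold:                # full opacity for a while
        return 1.0
    t = t_since_start - rise - hold
    return max(0.0, 1.0 - t / fall)                # opacity decreases again

# If the cursor is removed from the sensor area at time 13, the opacity reached
# at that moment simply stays assigned to the image:
frozen_opacity = opacity_at(4.0)   # e.g. the cursor leaves 4 s after the start time
```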
  • Both the spatial (see FIG. 3) and the temporal evaluation of interaction graphs (see FIG. 4) can be used in combination.
  • Several functions 10 can be influenced as a function of them or independently of one another.
  • FIG. 5 shows an example of how the control of the computer 1 according to the invention influences functions 10 ′′, 10 ′′ ′′ of a multimedia system via interaction graphs 9 ′′, 9 ′′ ′′.
  • a database is essential for the function of the control according to the invention, in which all signals measured by the control and output by the control are stored.
  • the database contains objects 14, such as images, texts, music, sounds, videos, programs and control commands for external devices, which are made accessible to the user of the computer 1.
  • information about sensor areas 2 is also treated as objects 14.
  • Media are stored in the database as objects 14 of various types.
  • in terms of program technology, the objects 14 are combined in a container 15, the contents of the objects 14 stored in a container 15 belonging together (e.g. images, texts and music on one subject).
  • a container 15 is also an object 14 from a program point of view.
  • An object 14 can be a member of different containers 15.
  • the selection of an object 14 or a specific number of objects 14 takes place as a function of the position 8 ′′, 8 ′′ ′′ of the cursor 3 via the interaction graphs 9 ′′, 9 ′′ ′′.
  • a measure is determined from the positions 8 ′′, 8 ′′ ′′ and / or another kinematic parameter of the cursor 3 via the interaction graphs 9 ′′, 9 ′′ ′′ that are valid at the respective points and / or at the respective time.
  • the control according to the invention determines which object 14 or which group of objects 14 from the suitable container 15 is displayed or played.
  • Each object 14 has attributes 16 that describe the properties of the object 14. Using these attributes 16, the controller determines which objects 14 are displayed, among other things.
  • the image of a Greek temple is stored, which has the attributes 16 "building", "Greece", "religion" and "antiquity".
  • the control system displays the image of the temple. If the control has determined, for example, that a user requests information about Greece, it determines, depending on the kinematic behavior of the cursor 3 in sensor areas 2, whether the image of the temple is displayed in addition to travel information about Greece, for example. If a user informs himself about antiquity on the computer 1, the image of the temple can again be displayed depending on the kinematic behavior of the cursor 3.
  • the attributes 16 thus establish cross-connections between different objects 14 stored in a database. Since all information is stored in the database as objects 14 in terms of program technology, diverse interactions between the information and the kinematic behavior of the cursor 3 can be established.
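  • Cross-connections via shared attributes amount to a simple query over the object database; a sketch, reusing the illustrative MediaObject class from the earlier sketch:

```python
def related_objects(database, obj):
    """All other objects that share at least one attribute with `obj`,
    together with the shared attributes that establish the cross-connection."""
    related = []
    for other in database:
        if other is obj:
            continue
        shared = obj.attributes & other.attributes
        if shared:
            related.append((other, shared))
    return related
```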
  • the control according to the invention does not specify a rigid information hierarchy in which, e.g., under the generic term "Greece" only the sub-terms "travel information" and "pictures" can be called up. Rather, the range of information on the display is determined dynamically by the control as a function of the kinematic behavior of the cursor 3. Simply by the cursor 3 lingering at a certain point in a sensor area 2, the focus (see FIG. 6), different information can be displayed or played back gradually; the controller interprets the stay in the sensor area 2 as increased interest and controls the display of the computer 1 on the basis of the respective energy values.
  • each object 14 has a modifier 17 which assigns a measure (for example in the range 1 to 100) to the object 14.
  • the modifier 17 can be used, for example, to determine the transparency with which an image is displayed. With a modifier 17 of value 100, the control system displays the image with full opacity; the background of the display is completely covered. With a value of 10, the image is only faintly visible on the screen, so that elements behind the image shine through it.
  • a modifier 17 can also be used, for example, to influence the volume of a noise, the frequency with which images are displayed or music is played, the selection of an image from a container or the sensitivity of the energy output or the energy consumption.
  • Both the attributes 16 and the modifiers 17 can be changed in a predeterminable manner by the control. Likewise, it is possible for attributes 16 or modifiers 17 to be changed by the kinematic behavior of the cursor 3 and thus to be influenced directly by the behavior of the user.
  • An example is an audio system that controls the playback of pieces of music depending on the movement of a cursor 3. If the cursor 3 interacts with different sensor areas 2 one after the other, a sensor area 2 will not necessarily play the same pieces of music when it is visited again as it did the first time. Rather, it is possible to play thematically related pieces of music. Under certain circumstances, the interaction between the cursor 3 and the controller has signaled that the interest of the user has changed. After evaluating the information about the energy, the attributes 16 and the modifiers 17, the container content is therefore recompiled and the pieces of music it then contains are played.
  • the control coordinates the cooperation of the database and the evaluation of the kinematic behavior of the cursor 3 so that new information is always displayed. This creates a knowledge browser with completely new properties, namely the creation and viewing of data spaces and the possibility of interacting with them via a cursor.
  • In Fig. 6, the influencing of a function 10 of a computer 1 by the dwell time of a cursor (not shown here) at position 8 is shown schematically.
  • position 8 is in a sensor area 2, which is assigned to the image of a church
  • different views of the church are shown after a certain time, i.e. information that is directly related to the selected object 14.
  • pictures of churches are shown that can be assigned to the same style.
  • Church music from the corresponding epoch is played even later.
  • the control according to the invention offers the user the possibility at any time of influencing the information provided by movements of the cursor 3 (i.e. by means of dwell time and position within the sensor area).
  • the direction of the time axis 18 and the orientation of a so-called focal funnel thus indicate the “direction of interest”, that is, the focus of the user of the computer 1.
  • the increasing interest is therefore shown in FIG. 6 by an expanding focal funnel 19; more and more objects 14 are being detected.
  • a shift of the position 8 into another sensor area 2 therefore corresponds to a changed orientation of the focal funnel 19.
  • the embodiment of the invention is not limited to the preferred exemplary embodiments specified above. Rather, a number of variants are conceivable which make use of the computer control according to the invention even in fundamentally different designs.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)
  • Toys (AREA)

Abstract

The invention relates to a control system for at least one function (10) of a computer (1). The control system has sensor means (5) for detecting a position (8) of at least one cursor (3) on a display of the computer (1) and means for determining a dwell time of the cursor (3) at the position (8). The control system further has means which influence the function(s) (10) of the computer (1) as a function (9) of the position (8) of the cursor (3) and of the dwell time of the cursor (3) in at least one predetermined one- or two-dimensional partial area (sensor area (2)) of the display of the computer (1). In this way, a control system can be created which allows the computer (1) to be controlled flexibly and in a highly differentiated manner via the behavior of a cursor (3), and which executes particular functions (10) of the computer (1) depending on the behavior of the cursor (3).
EP97953649A 1996-12-13 1997-12-15 Systeme de commande d'ordinateur Withdrawn EP1015960A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE1996153682 DE19653682C2 (de) 1996-12-13 1996-12-13 Steuervorrichtung und -verfahren für mindestens eine Einrichtung eines Raumes, und Raum mit Steuervorrichtung
DE19653682 1996-12-13
PCT/DE1997/002970 WO1998026346A1 (fr) 1996-12-13 1997-12-15 Systeme de commande d'ordinateur

Publications (1)

Publication Number Publication Date
EP1015960A1 true EP1015960A1 (fr) 2000-07-05

Family

ID=7815787

Family Applications (2)

Application Number Title Priority Date Filing Date
EP97953648A Withdrawn EP1015959A1 (fr) 1996-12-13 1997-12-15 Dispositif de commande pour locaux
EP97953649A Withdrawn EP1015960A1 (fr) 1996-12-13 1997-12-15 Systeme de commande d'ordinateur

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP97953648A Withdrawn EP1015959A1 (fr) 1996-12-13 1997-12-15 Dispositif de commande pour locaux

Country Status (5)

Country Link
EP (2) EP1015959A1 (fr)
JP (2) JP2000512467A (fr)
CA (2) CA2274702A1 (fr)
DE (1) DE19654944A1 (fr)
WO (2) WO1998026345A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6393407B1 (en) * 1997-09-11 2002-05-21 Enliven, Inc. Tracking user micro-interactions with web page advertising
DE10125309C1 (de) * 2001-05-21 2002-12-12 Humatic Gmbh Verfahren und Anordnung zum Steuern von audiovisuellen medialen Inhalten
KR100575906B1 (ko) 2002-10-25 2006-05-02 미츠비시 후소 트럭 앤드 버스 코포레이션 핸드 패턴 스위치 장치
JP2005242694A (ja) 2004-02-26 2005-09-08 Mitsubishi Fuso Truck & Bus Corp ハンドパターンスイッチ装置
DE102007057799A1 (de) * 2007-11-30 2009-06-10 Tvinfo Internet Gmbh Grafische Benutzerschnittstelle
DE102011102038A1 (de) * 2011-05-19 2012-11-22 Rwe Effizienz Gmbh Heimautomatisierungssteuerungssystem sowie Verfahren zum Steuern einer Einrichtung eines Heimautomatisierungssteuerungssystems

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4109145A (en) * 1974-05-20 1978-08-22 Honeywell Inc. Apparatus being controlled by movement of the eye
WO1987002168A1 (fr) * 1985-10-07 1987-04-09 Hagai Sigalov Signaux de commande a rayons lumineux pour instruments musicaux
US4896291A (en) * 1988-05-20 1990-01-23 International Business Machines Corporation Valuator menu for use as a graphical user interface tool
CA2012796C (fr) * 1989-06-16 1996-05-14 Bradley James Beitel Selecteur d'affichage de zones de declenchement
US5107746A (en) * 1990-02-26 1992-04-28 Will Bauer Synthesizer for sounds in response to three dimensional displacement of a body
JP3138058B2 (ja) * 1992-05-25 2001-02-26 東芝キヤリア株式会社 換気扇の制御装置
US5196838A (en) * 1990-12-28 1993-03-23 Apple Computer, Inc. Intelligent scrolling
US5534917A (en) * 1991-05-09 1996-07-09 Very Vivid, Inc. Video image based control system
JPH05108258A (ja) * 1991-10-14 1993-04-30 Nintendo Co Ltd 座標データ発生装置
US5326028A (en) * 1992-08-24 1994-07-05 Sanyo Electric Co., Ltd. System for detecting indoor conditions and air conditioner incorporating same
US5448693A (en) * 1992-12-29 1995-09-05 International Business Machines Corporation Method and system for visually displaying information on user interaction with an object within a data processing system
DE4406668C2 (de) * 1993-04-27 1996-09-12 Hewlett Packard Co Verfahren und Vorrichtung zum Betreiben eines berührungsempfindlichen Anzeigegeräts
US5452240A (en) * 1993-11-23 1995-09-19 Roca Productions, Inc. Electronically simulated rotary-type cardfile
JP3546337B2 (ja) * 1993-12-21 2004-07-28 ゼロックス コーポレイション 計算システム用ユーザ・インタフェース装置及びグラフィック・キーボード使用方法
US5704836A (en) * 1995-03-23 1998-01-06 Perception Systems, Inc. Motion-based command generation technology

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO9826346A1 *

Also Published As

Publication number Publication date
DE19654944A1 (de) 1998-06-25
JP2000512415A (ja) 2000-09-19
WO1998026345A1 (fr) 1998-06-18
CA2274786A1 (fr) 1998-06-18
CA2274702A1 (fr) 1998-06-18
EP1015959A1 (fr) 2000-07-05
WO1998026346A1 (fr) 1998-06-18
JP2000512467A (ja) 2000-09-19

Similar Documents

Publication Publication Date Title
DE69635902T2 (de) Verfahren und einrichtung zur kraftrückkopplung für eine graphische benutzerschnittstelle
DE69231270T2 (de) Gerät zur auf einem Anzeigeschirm angezeigten Manipulation eines Objektes
DE102018100809A1 (de) Verfahren, vorrichtung und endgerät zum anzeigen einer virtuellen tastatur
DE60024655T2 (de) Verfahren zur benutzung von mit einem anzeigegerät verbundenen tasten für den zugriff und die ausführung von damit verbundenen funktionen
DE112006002954B4 (de) Virtuelles Schnittstellensystem
DE112014000441T5 (de) Dynamische Benutzerinteraktionen für Displaysteuerung und Angepaßte Gesten Interpretation
DE60028894T2 (de) Präsentationssystem mit einer interaktiven Darstellung
DE69715367T2 (de) Navigation in einer virtuellen umgebung
DE102009014555A1 (de) Verfahren zum Unterstützen der Steuerung der Bewegung eines Positionsanzeigers mittels eines Tastfelds
DE202005021427U1 (de) Elektronische Vorrichtung mit berührungsempfindlicher Eingabeeinrichtung
EP2350799A1 (fr) Procédé et dispositif d'affichage d'informations ordonnées sous forme de liste
CN108495166B (zh) 弹幕播放控制方法、终端及弹幕播放控制系统
DE202011109296U1 (de) Vorrichtung zur Bereitstellung eines visuellen Übergangs zwischen Bildschirmen
DE202014011483U1 (de) Endgerät
DE112016005818T5 (de) Umstellung erweiterter realitätsobjekte in physischen und digitalen umgebungen
EP3040817A1 (fr) Dispositif et procede de saisie d'un texte au moyen d'elements de commande virtuels avec reaction haptique destinee a simuler une haptique tactile, en particulier dans un vehicule automobile
DE102015116477A1 (de) Datenverarbeitungsverfahren und Elektronikgerät
EP1015960A1 (fr) Systeme de commande d'ordinateur
DE19653682A1 (de) Systemsteuerung
DE102016010920A1 (de) Smart touch
EP1345110A2 (fr) Système pour adapter une interface homme-machine en fonction du profil psychologique et de la sensibilité momentanée d'un utilisateur
WO2014094699A1 (fr) Procédé d'utilisation d'un dispositif qui présente une interface utilisateur comportant un détecteur de contact, et dispositif correspondant
DE102009031158A1 (de) Vorrichtung und Verfahren zur Erkennung einer Zeigegeste eines Nutzers zur Interaktion mit einer Eingabefläche
DE102014019648A1 (de) Datenverarbeitungsverfahren und elektronische Vorrichtung
DE102019113133A1 (de) Systeme und verfahren zur beeinflussung von spotlight-effekten_

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 19990702

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FI FR GB IE IT LI LU MC NL PT

17Q First examination report despatched

Effective date: 20000817

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20020425