WO1998015964A1 - Procede et dispositif pour commander au moins un appareil - Google Patents

Procede et dispositif pour commander au moins un appareil Download PDF

Info

Publication number
WO1998015964A1
Authority
WO
WIPO (PCT)
Prior art keywords
operating
output unit
display
input
unit
Prior art date
Application number
PCT/DE1997/002304
Other languages
German (de)
English (en)
Inventor
Peter Kleinschmidt
Luc De Vos
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft filed Critical Siemens Aktiengesellschaft
Publication of WO1998015964A1 publication Critical patent/WO1998015964A1/fr

Classifications

    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01HELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H9/00Details of switching devices, not covered by groups H01H1/00 - H01H7/00
    • H01H9/02Bases, casings, or covers
    • H01H9/0214Hand-held casings
    • H01H9/0235Hand-held casings specially adapted for remote control, e.g. of audio or video apparatus
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01HELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H13/00Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch
    • H01H13/70Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch having a plurality of operating members associated with different sets of contacts, e.g. keyboard
    • H01H13/84Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch having a plurality of operating members associated with different sets of contacts, e.g. keyboard characterised by ergonomic functions, e.g. for miniature keyboards; characterised by operational sensory functions, e.g. sound feedback
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • the invention relates to a method and an arrangement for operating at least one device by means of a computer using an operating unit which comprises a plurality of input fields, each of which has a cognitive peculiarity.
  • Information display systems for people or users have a wide range of applications.
  • One example is the provision of information for the driver of a vehicle, for example a motor vehicle.
  • In a vehicle, electronic control and information systems can be mounted on the dashboard; installation positions as high as possible, i.e. close to the windshield, are preferred.
  • Other devices for displaying information can be attached to the dashboard, as high as possible, with a bracket or a swivel arm to the right of the steering wheel in left-hand-drive vehicles.
  • All of these installation measures require that the driver directs his gaze away from the road to the information display system. He also has to change the focus of his eyes from infinity to the short distance to the display system. A presbyopic driver may even need different glasses.
  • The displays used are mostly small. Concept studies for future vehicles show a large display attached to the dashboard near the windshield. The problem of differing viewing distances is then reduced, but not yet completely eliminated. Large displays are also expensive.
  • The problem underlying the invention is to provide an information display system that can be placed so that the operator's attention is not unnecessarily distracted and whose operation is very simple.
  • Another object of the invention is to specify a method and an arrangement by which at least one device can be operated by means of an operating unit and/or using voice input.
  • A method for operating at least one device by means of a computer is specified; an operating unit comprising several input fields is used for this purpose.
  • The input fields each have a cognitive peculiarity, so that an input field can be recognized by feeling it.
  • The cognitive peculiarity can be ensured by the shape of the individual input field or by at least one tactile marking, as on typewriter keyboards.
  • Functional assignments that correspond to key labels in conventional input fields are displayed on an output unit. This ensures a high level of flexibility for the individual input fields, since each input field can take on any number of functional assignments, and these versatile options can be realized economically simply by reprogramming the display on the output unit.
  • The operating unit thus represents an input tool that can be used flexibly for different applications (devices).
  • The device is operated by actuating the input field that corresponds to the desired predefinable functional assignment for this device in the current menu display.
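The soft-key principle described above — fixed, tactilely distinguishable input fields whose functional assignments are reprogrammed per device and mirrored on the output unit — can be sketched as follows. This is an illustrative sketch, not code from the patent; the class, method names, and example assignments are all assumptions.

```python
FIELDS = ["TA_O", "TA_U", "TA_L", "TA_R", "TA_Z"]

class OperatingUnit:
    def __init__(self):
        # current functional assignment: input field -> function name
        self.assignments = {}

    def load_device(self, device_name, assignments):
        """Reprogram the field assignments for the selected device and
        return the labels the output unit would show instead of printed
        key caps."""
        self.assignments = dict(assignments)
        return [f"{field}: {func}" for field, func in self.assignments.items()]

    def actuate(self, field):
        """Operating the device = actuating the field whose displayed
        assignment matches the desired function."""
        return self.assignments.get(field, "<unassigned>")

unit = OperatingUnit()
display = unit.load_device("TV", {"TA_O": "channel up", "TA_U": "channel down",
                                  "TA_Z": "confirm"})
print(display)
print(unit.actuate("TA_O"))   # prints "channel up"
```

The point of the design is that the hardware never changes: only the assignment table and its on-screen labels do.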
  • In a development of the invention, several devices are operated with the operating unit; the selection between the devices is also made by means of the operating unit.
  • Another development consists in outputting visual and / or auditory information by means of the output unit.
  • operation can be carried out via voice input.
  • options for performing the voice input are shown on the output unit.
  • Setting options of the device are also shown on the output unit. It is operated by voice input of the corresponding option or setting option.
  • A further development of the invention is to design the output unit as a display or as a head-up display.
  • Voice input can be simple and intuitive, but it requires active knowledge of the type of input and of the required service.
  • The operating unit enables simple selection among terms and/or services that do not have to be actively known, since they are presented by the display unit.
  • Services designate the functions (of devices) that can be accessed via the operating unit.
  • An example of a service is "Establish a telephone connection”.
  • the information to be communicated to the person is generated on an output display at a location which is not critical for the use.
  • the representation of the information generated by the output display can then be reflected in the direction of the user's gaze and at the same time a virtual image of the representation can be generated at a distance from the user.
  • an operating arrangement is expediently provided, by means of which the information to be displayed can easily be called up.
  • the second optical means is a pane which is arranged in the direction of view of the user and which, for example, is present anyway, as is the case with a windshield in a motor vehicle.
  • It is also possible to make the output display adjustable, or to implement the first optical means in such a way that it can be pivoted.
  • A concave mirror, for example, can be used as the first optical means; it is arranged in the viewing direction of the user, reflects the display into the user's line of sight, and can be designed to be pivotable in order to make the location at which the virtual image is created adjustable. The concave mirror can then also be arranged so that the output display can be closed with it.
  • The output display can be operated in such a way that it generates a representation of the information consisting of at least two partial areas, of which one partial area displays information that requires no interaction with the user, while the other partial area indicates the units or devices accessible with the operating arrangement.
  • the output display can then be controlled by the user via the operating arrangement and the information displayed there can be selected.
  • the subareas can represent maps, sections of maps or possible settings of the operating arrangements.
  • A partial area can represent a map, and the virtual image can show the map together with a selectable enlarged section of it.
  • The second optical means can be designed in such a way that the virtual images can be recognized by more than one person, the persons being able to call up the information to be displayed independently of one another with the operating arrangement.
  • It is also possible that only one person uses the operating arrangement to call up the information to be displayed, so that, for example, the information that the driver of a motor vehicle currently needs can be passed on to him without the driver having to operate the operating arrangement himself.
  • A video camera aimed at the person can be provided, which determines and tracks the location of the person's eyes and controls the first optical means, for example the concave mirror, accordingly.
  • An operating device with a navigator that can be operated blind can be provided, e.g. with controls and/or with voice input means, together with a feedback arrangement that acknowledges the operations and gives advice on further executable operating steps.
  • It can be useful to design the speech input device as a directional microphone. It is then expedient to make the directional lobe pivotable so that, when a command is entered, it is always aimed at the mouth of the user, e.g. of the driver.
  • Such an operating arrangement can furthermore be designed in such a way that a video camera directed at the person and image recognition software are provided, by means of which the position of the user's mouth is determined; the microphone is controlled accordingly so that it is always aimed at the user's mouth.
  • If the operating arrangement is provided in a motor vehicle, it is expedient to arrange the video camera in the region of the windshield and to align it with the driver's mouth, so that the sensitivity maximum of the directional characteristic of the microphone is adjusted towards the mouth.
  • A feedback arrangement is provided which advantageously has an acoustic voice output. Furthermore, it is advantageous that, in the case of a plurality of operable devices, for example a motor vehicle and a telephone, a virtual image of the respectively controlled device is provided on a display. The feedback arrangement may also include a visual display device in addition to the speech output; the visual device can be arranged in the operating device, but also in the operated device. The operable devices can thus be represented with the visual display device, as can symbolized operating elements of the operating device which can be operated via the navigator.
  • The operating device containing the navigator can be a handheld control device, e.g. a hand-held device for remote control, which can additionally hold a chip card for legitimation or similar functions.
  • Such a handheld device that can be used for remote control of the operable devices is particularly expedient when the software required for operation, which is normally contained in the operable device, is automatically loaded into the handheld device when it is used for the first time to control the device.
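The first-use loading idea in the bullet above could work roughly like the following cache-on-first-contact sketch. All class and method names are invented for illustration; the patent does not specify a transfer mechanism.

```python
class Handheld:
    def __init__(self):
        self.cache = {}          # device id -> operating software (here: a dict)

    def control(self, device):
        if device.device_id not in self.cache:
            # first use: load the operating software from the device itself
            self.cache[device.device_id] = device.export_software()
        return self.cache[device.device_id]

class Device:
    def __init__(self, device_id, menu):
        self.device_id = device_id
        self._menu = menu

    def export_software(self):
        # stand-in for the device's operating software / menu description
        return {"menu": list(self._menu)}

hg = Handheld()
tv = Device("TV", ["ARD", "ZDF"])
sw1 = hg.control(tv)     # first call triggers the download
sw2 = hg.control(tv)     # second call is served from the cache
print(sw1 == sw2)        # prints True
```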
  • The navigator of the operating device can also be implemented as a control panel, which is conveniently located near the hand position of the user. This solution is particularly useful when used in a motor vehicle, since here the control panel can, e.g., be integrated into the steering wheel.
  • the operating device can have a microphone with speech recognizer, and thus enable operation using speech. It is useful if the speech recognizer has different speech models adapted to the device to be operated.
  • The control elements contained in the navigator advantageously allow movement in three dimensions and can be distinguished blind by the operator. Buttons, rollers, wheels and rotary knobs can be used as operating elements.
  • Such an operating device with a navigator is particularly advantageous if it uses buttons as the operating elements, namely one button for voice and five binary buttons, four of which form two mutually perpendicular pairs of directional buttons, while the fifth button is used to confirm a selection.
  • The operating processes have a simple logic and can be understandably reduced to, and mapped onto, operation with a few operating elements.
  • Such a general input structure can be operated blind after a short learning phase, e.g. if buttons are used and these have distinctive tactile shapes or an easily memorized arrangement.
  • The required success monitoring is achieved via the feedback, which is primarily a speech output but can be supported by a visual display device.
  • the operating concept thus combines an intuitive and blind-operated navigator for manipulating the functions of the devices, consisting, for example, of a minimal keyboard or a voice input and the feedback arrangement, implemented as acoustic voice output or additionally as a visual display device.
  • Operating or usage scopes of one or more devices are combined as virtual devices and shown symbolically or textually on a display. This division into virtual devices allows physically separate devices to be operated one after the other.
  • Fig. 1 an embodiment of the information display system when used in a motor vehicle as an example, Fig. 1a showing the closed information display system and Fig. 1b the information display system in the opened state,
  • Fig. 2 a schematic representation of the
  • Fig. 3 a schematic diagram of a motor vehicle
  • Fig. 4 a side view of the information display system when the virtual images of the representations are generated at different locations on the windshield of a motor vehicle,
  • Figs. 5 and 6 an example in the event that several people can observe the virtual images simultaneously, for example when used in a motor vehicle,
  • Fig. 8 an exemplary embodiment of the information display system in which the
  • Fig. 9 a possible representation on the output display and thus the virtual image, which consists of several
  • Fig. 10 a possible representation of a map as a virtual image,
  • Fig. 11 a basic representation of the operating arrangement,
  • Fig. 12 an embodiment of the operating device as a hand-held device,
  • Fig. 13 a second embodiment of the operating device as a hand-held device,
  • Fig. 14 a picture that shows how the different
  • Fig. 15 a second representation of the multi-level concept,
  • Fig. 16 a first version of the navigator,
  • Figs. 17 and 18 further versions of the navigator,
  • Fig. 19 a block diagram showing an operating concept for at least one device,
  • Fig. 20 an image showing an operating concept for at least one
  • The information display system, shown being used in a motor vehicle as an example, has an output display A-DIS, on which a representation of the information to be displayed is generated; a first optical means OP1, for example a concave mirror, with which the light rays emitted by the output display A-DIS are reflected into the driver's line of sight; a second optical means OP2, implemented, for example, as a pane WS, in which the virtual image of the representation is generated; and operating arrangements in the steering wheel LR, designed for example as a zapper cross ZA or as a handheld device HG.
  • A video camera CA and a microphone MIK are arranged next to the rearview mirror RS. With the help of the video camera CA, the position of the driver's mouth or eyes can be determined, for example to control the reflection of the images into the driver's line of sight or to keep the microphone MIK aimed directly at the driver's mouth.
  • The output display A-DIS is arranged in front of the person PE, the driver, and generates representations of the information to be displayed. These representations are projected by the first optical means OP1, e.g. a concave mirror, into the driver's line of sight and appear on the second optical means OP2 as a virtual image of the representation.
  • the second optical means is implemented as a windshield WS.
  • Fig. 3 shows more clearly how e.g. the output display can be realized.
  • This can consist of a display DP, e.g. a multi mirror chip that is illuminated with light that comes from a lamp LA and that is directed by a condenser KO onto the display DP.
  • the position of the lamp LA can be changed using servomotors MO.
  • The light coming from the display DP is bundled with the aid of an objective OB, passes through an illuminating lens LI and a focusing screen MA to the first optical means OP1, e.g. a mirror ocular, and from there into the line of sight of the person PE, who sees the image as a virtual image VIR, e.g. in the windshield.
  • The location at which the virtual images VIR are generated with the aid of the second optical means can, according to Figs. 4 and 5, be at the upper edge or at the lower edge of the windshield WS; the different location of the virtual images is determined by a different position of the first optical means OP1, for example the concave mirror. According to Fig. 6, it is also possible to generate the virtual image at the side edge of the windshield, again by means of the position and arrangement of the first optical means OP1.
  • Fig. 7 shows an example in which virtual images are generated for two people, e.g. in the windshield of a motor vehicle. It is possible to generate a separate virtual image of a representation of information for each person or, under the control of one person, a virtual image for both persons. For example, the front passenger can call up the information for the driver with an operating arrangement, so that the driver is not burdened with the operation of the information display system.
  • A further exemplary embodiment, with which different locations of the virtual image can be generated, is shown in Fig. 8.
  • Here the output display DIS is pivoted, with the result that virtual images can be created at different locations, e.g. on the windshield WS.
  • The first optical means can be, e.g., a mirror or a concave mirror, but it can also be part of the windshield WS, which must then be mirrored accordingly.
  • The first optical means must in any case be a means by which the light rays emanating from the output display are reflected in the direction of the person's gaze, and it must also be partially transparent so that a virtual image can be created for the person.
  • Fig. 9 shows what representations generated by the output display can look like.
  • Such a display of information can consist of different sub-areas TB.
  • A first partial area TB1 can represent a section of a map, a second partial area TB2 can show several operable devices, and a third partial area TB3 can represent a section of a street with directional indicators and signposts.
  • The map can be generated as a virtual image, for example, as shown in Fig. 10. There the map is first shown as a spherical map and, with the aid of a magnifying glass LP, a central area of this map is shown enlarged.
  • Fig. 11 generally shows an arrangement for operating several devices G(1) to G(n), e.g. a motor vehicle and a telephone, or other devices, aggregates of devices or their functions.
  • the operating arrangement for these devices G is designated BEA.
  • the BEA arrangement enables the virtual
  • One of the devices G(1) to G(n) can be selected by operating the operating device BD accordingly. If a device has been activated, e.g. the device G(1), it is shown on the display DIS and the selection of the device is communicated to the user via the feedback device RE.
  • A possible embodiment of the operating arrangement BEA can be seen in Fig. 12.
  • the operating device is implemented as a handheld device HG.
  • The display DIS, as an example, enables the selection of various television channels, which are shown there virtually as text.
  • An operating device BD is provided which consists of buttons TA and a so-called zapper cross ZA.
  • The buttons TA, and in particular the zapper cross ZA, then form the navigator NV, which can be used to select among the devices.
  • the connection to the devices can be established using infrared rays or other remote control methods.
  • the GSM / DECT method is exemplified.
  • the handheld device of FIG. 12 also enables input using a MI microphone, so that the devices can be operated even without keys.
  • the feedback arrangement, a handset, is labeled H_RE. If the corresponding device has been activated, this can be shown on the DIS display and the activated device can also be acoustically reported using the feedback arrangement RE.
  • Another embodiment of a handheld device HG is shown in Fig. 13.
  • a display DIS is also provided here, as a navigator of the operating device a zapper cross ZA and feedback arrangements RE, which in addition to a handset H_RE can also contain a visual display device V_RE.
  • a microphone MI can also be used to enter the operating procedures, so that the zapper could also be dispensed with.
  • Fig. 16 shows a possible structure of a zapper cross ZA. It can be seen here that the devices can be operated with five keys: two mutually perpendicular key pairs TA_R, TA_L and TA_O, TA_U, and a key TA_Z lying centrally between the two key pairs.
  • With the key pair TA_O, TA_U, one of the devices shown on the display DIS in Fig. 13 can be selected, for example; with the keys TA_R, TA_L, functions can be selected. The entry is confirmed with the central key TA_Z.
  • Fig. 16 also shows that a further key TA_M can be provided next to the zapper cross, with which, for example, a voice input via the microphone MI can be switched on.
  • The further key TA_M can also be arranged elsewhere, see Fig. 13.
  • Fig. 14 shows that various settings are possible with the hand-held device, it being possible to change from an upper level, which is assigned to the individual devices in Fig. 13, to lower levels of functions and sub-functions of a selected device.
  • the example in FIG. 14 assumes, for example, that a photographic device is operated with the operating arrangement. In this case, the handheld device can show in the DIS display what is to be done with the photographic device, for example a photograph is to be reproduced.
  • The handheld device is set using the zapper cross ZA, for example using the TA_U key.
  • The photo to be displayed is shown on the display DIS.
  • Additional settings can now be triggered with the zapper cross.
  • The brightness can be set using the TA_R key on the cross; the setting is then shown on the display V_RE.
  • A special function can be set with another button on the zapper cross, namely the copying of a photo.
  • Fig. 14 shows three settings I to III and how these can be reached.
  • the individual functions can also be selected by voice using the MI microphone.
  • Another example of the multi-level concept can be seen in Fig. 15.
  • the operating arrangement is shown in four different settings.
  • the DIS display shows which devices can be selected.
  • The "television" device can be selected by pressing the central key TA_Z.
  • Once the "TV" device has been selected, a list of the addressable stations appears on the display DIS and the selected device appears on the display V_RE.
  • With the keys TA_R, TA_L, one of the stations, for example ZDF, can then be selected; this is shown in setting II.
  • A special function can be selected by pressing the cross or an additional button. Such special functions are shown in setting III.
  • Using the zapper cross, a function can be selected from the functions shown on the display DIS, which results in setting IV.
  • The multi-level concept according to Fig. 15 thus shows the way from the selection of a device to the selection of a special function that is to be carried out with the selected device.
  • The selection is made using the zapper cross ZA with its key pairs and is confirmed by the central key TA_Z. All selection processes can of course also be carried out with the aid of a voice input via the microphone MI.
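A minimal sketch of the multi-level concept of Fig. 15, under the assumption that the menu hierarchy is a simple tree: the vertical key pair moves the cursor within a level, the central key TA_Z descends into the selected entry, and a long press of TA_Z goes back up. The tree contents and class names are illustrative, not from the patent.

```python
MENU = {
    "TV": {"ARD": {}, "ZDF": {"record": {}, "teletext": {}}},
    "Telephone": {"directory": {}, "redial": {}},
}

class Navigator:
    def __init__(self, tree):
        self.tree = tree
        self.path = []           # selected entries so far, e.g. ["TV", "ZDF"]
        self.index = 0           # cursor position within the current level

    def _level(self):
        node = self.tree
        for name in self.path:
            node = node[name]
        return list(node)

    def press(self, key):
        entries = self._level()
        if key == "TA_U":                      # step down the list
            self.index = (self.index + 1) % len(entries)
        elif key == "TA_O":                    # step up the list
            self.index = (self.index - 1) % len(entries)
        elif key == "TA_Z":                    # confirm -> descend one level
            self.path.append(entries[self.index])
            self.index = 0
        elif key == "TA_Z_long" and self.path: # long press -> back up
            self.path.pop()
            self.index = 0
        return self.path, self._level()

nav = Navigator(MENU)
nav.press("TA_Z")            # select "TV"
nav.press("TA_U")            # move cursor to "ZDF"
path, level = nav.press("TA_Z")
print(path)                  # prints ['TV', 'ZDF']
print(level)                 # prints ['record', 'teletext']
```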
  • For success monitoring, an acoustic feedback H_RE is used, which can be supplemented by a visual feedback V_RE. It is also possible to use the microphone MI to implement operator input by voice.
  • the Navigator comes with a minimal concept of easy-to-remember haptic input elements, namely buttons. It is sufficient to use six easy-to-learn controls that can be distinguished by the sense of touch.
  • the figures show that the entirety of the devices with their functions can be operated more easily in that the operating effort is broken down into subtasks.
  • For the subtasks, clear visual operating scenes are shown on the display DIS, which only show what is required at the current moment of operation.
  • The navigator NA manages with three cognitive dimensions, which can be selected with the control elements, the buttons.
  • The visual dimension, the x-y plane, can be assigned to the key pairs TA_R, TA_L and TA_O, TA_U; penetration into the depth of detail is achieved with the central key TA_Z; the way back is reached by pressing this central key TA_Z for a correspondingly long time.
  • the depth indexing or detailing is based on the multi-level concept shown in Fig. 15.
  • The chronological sequence of use or the sequence of recall can also be navigated, for example, using the horizontal key pair TA_L, TA_R.
  • According to Figs. 12 to 15, several different real or virtual devices can be listed on the display DIS at the top or primary level, for example as text or symbolically as an icon; they can be scanned, for example, by means of the vertical key pair TA_O, TA_U and selected by means of the central key TA_Z. With the central key, a new operating level, the device level, is reached.
  • At the device level, the respective device with its controls can be illustrated on the display DIS; the horizontal key pair can be used to operate an important specific function, e.g. volume, the vertical key pair mainly to switch text fields such as TV channels or telephone subscribers, and the central key TA_Z to select the corresponding function.
  • Special functions that are required less often are accessible, e.g., via a single additional operating step, see e.g. Fig. 12, left column SPI.
  • With the central key TA_Z, the user moves, e.g., to a new device level each time; the return is made, e.g., with a long press on the central key TA_Z, see column SPI. A double click on the central key TA_Z can, for example, lead directly back to the primary selection level, see column SPI.
  • Column SPI shows how, with a different duration of actuation of a key, or by repeatedly pressing a key, one can get from one level to another and back, or address different functions within a level.
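The actuation distinctions of column SPI (short click, long press, double click) could be classified from key down/up timestamps roughly as follows. The thresholds are invented for illustration, since the patent leaves the "predefinable period" open.

```python
LONG_PRESS_S = 0.8        # a press at least this long counts as "long"
DOUBLE_CLICK_S = 0.4      # a second click within this window -> double click

def classify(presses):
    """presses: list of (t_down, t_up) timestamps for one key.
    Returns a list of gesture names."""
    gestures = []
    i = 0
    while i < len(presses):
        down, up = presses[i]
        if up - down >= LONG_PRESS_S:
            gestures.append("long_press")      # e.g. back to previous level
            i += 1
        elif (i + 1 < len(presses)
              and presses[i + 1][0] - up <= DOUBLE_CLICK_S
              and presses[i + 1][1] - presses[i + 1][0] < LONG_PRESS_S):
            gestures.append("double_click")    # e.g. jump to primary level
            i += 2
        else:
            gestures.append("single_click")    # e.g. confirm selection
            i += 1
    return gestures

print(classify([(0.0, 0.1), (0.3, 0.4), (1.0, 2.0), (3.0, 3.1)]))
# prints ['double_click', 'long_press', 'single_click']
```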
  • The respectively addressed level, and within it the respectively addressed function, can be shown on the display DIS; an additional display V_RE can also be used for the feedback.
  • The use of buttons and wheels allows, for example, parallel functions and multiple functions to be assigned to an input element.
  • In particular, buttons such as the zapper cross and additional buttons can be used, it being possible to proceed according to the multi-level concept of Fig. 15.
  • The relationships between key actuations and key functions specified in the columns SPI, SPII, SPIII in Fig. 12 are only given as examples; a different assignment of key functions to key actuations is of course possible.
  • FIG. 19 shows an operating unit BE which has several input fields TA_O, TA_R, TA_U, TA_L, TA_Z and TA_M (see also FIG. 16).
  • A computer RE processes the input from the operating unit BE and shows a corresponding functional assignment FB1 or FB2 on the output unit DISP.
  • A high degree of flexibility is guaranteed, since the input fields of the operating unit BE can be distinguished by feel and no device-specific labeling of these input fields is required. This also eliminates the drawback of the frequently used multiple assignments of input fields, which otherwise require plenty of practice for efficient handling and always require visual contact with the respective input field.
  • The multiple assignments of the input fields TA_O, TA_R, TA_U, TA_L, TA_Z and TA_M are displayed on the output unit DISP.
  • Fig. 20 shows on the output unit DISP a virtual device "electronic mailbox". The functional assignments of the input fields are shown on the output unit. The user can carry out a corresponding interaction without looking at the input fields.
  • Each input field can be identified by feel, and, based on knowledge of the arrangement, operation is possible without looking at the operating unit.
  • a virtual device is understood to mean a device that is only displayed electronically and is not necessarily designed as a single device with operating options.
  • Examples of virtual devices are a keyboard of a mobile phone that is displayed on the output unit, the electronic mailbox mentioned above or a navigation system.
  • the virtual devices are presented to the user uniformly via the output unit, and a selection of the individual device is also possible via the control unit.
  • The input field TA_Z can perform different functions on single actuation (single click) or double actuation within a predefinable period (double click).
  • The input fields TA_R, TA_L, TA_O and TA_U are used to mark the next object to the right, left, top or bottom, respectively. With such a marking, the current position of the user's input mark (cursor) is displayed. For this purpose, the currently marked switch (or the corresponding object) is preferably displayed in a different color, so that the user can see the position of his cursor at a glance.
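The directional marking just described can be sketched as a nearest-object search in the pressed direction. The object layout and names are invented for illustration; the patent only specifies the behavior (mark the next object to the right, left, top or bottom).

```python
OBJECTS = {                 # name -> (x, y) screen position, y grows downwards
    "mailbox": (0, 0), "phone": (1, 0),
    "radio":   (0, 1), "navi":  (1, 1),
}

DIRECTIONS = {"TA_R": (1, 0), "TA_L": (-1, 0),
              "TA_U": (0, 1), "TA_O": (0, -1)}

def move_cursor(current, key):
    """Return the nearest object strictly in the requested direction,
    or the current object if nothing lies that way."""
    cx, cy = OBJECTS[current]
    dx, dy = DIRECTIONS[key]
    best, best_dist = current, None
    for name, (x, y) in OBJECTS.items():
        # candidate must lie strictly in the requested direction
        if (x - cx) * dx + (y - cy) * dy <= 0:
            continue
        dist = abs(x - cx) + abs(y - cy)
        if best_dist is None or dist < best_dist:
            best, best_dist = name, dist
    return best

cursor = "mailbox"
cursor = move_cursor(cursor, "TA_R")   # -> "phone"
cursor = move_cursor(cursor, "TA_U")   # -> "navi"
print(cursor)                          # prints "navi"
```

In a real arrangement the returned object would be redrawn in a different color, as the bullet above describes.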
  • The input field TA_Z corresponds to a simple selection.
  • the TA_M key is used for direct selection by voice input.
  • Here too, the input mark is positioned by actuating the input fields TA_R, TA_L, TA_O or TA_U; the input field TA_Z provides a zoom function.
  • Direct navigation is also possible via voice input, by speaking the desired name while pressing the key TA_M.
  • HTML: Hypertext Markup Language
  • a collection of HTML links corresponds to entries as in a telephone book, in which the input fields TA_L and TA_R can be used to page back and forth one entry at a time, and the input fields TA_O and TA_U can be used to skip a predeterminable number of HTML links (in both directions, forward and backward).
  • the input field TA_Z is again used for selection with a single actuation, or to jump to the next higher menu level with a double actuation within a predefinable period. Also, while the input field TA_M is pressed, an HTML link can be called directly by voice; if the spoken name has been recognized by the speech recognizer, the corresponding link is started.
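The navigation scheme described in the bullets above (directional marking, page skipping, and single/double actuation of TA_Z) can be sketched in code. This is a minimal illustration, not the patent's implementation; the class name, the 0.4-second double-click window, and the default skip width are assumptions introduced here.

```python
import time

class VirtualDeviceNavigator:
    """Sketch of the blind-operation scheme: four direction fields move a
    cursor over the displayed objects, TA_O/TA_U skip several entries at
    once, and TA_Z selects on a single actuation or jumps one menu level
    up on a double actuation within a predefinable period."""

    DOUBLE_CLICK_WINDOW = 0.4  # the "predefinable period"; value assumed, in seconds

    def __init__(self, objects, page_step=5):
        self.objects = list(objects)   # e.g. HTML links or switches shown on DISP
        self.cursor = 0                # index of the currently marked object
        self.menu_level = 1            # current depth in the menu hierarchy
        self.page_step = page_step     # predeterminable skip width for TA_O/TA_U
        self._last_z = None            # time of the previous TA_Z actuation

    def press(self, field, now=None):
        """Process one actuation of an input field; `now` can be injected for tests."""
        now = time.monotonic() if now is None else now
        last = len(self.objects) - 1
        if field == "TA_R":                      # mark the next object to the right
            self.cursor = min(self.cursor + 1, last)
        elif field == "TA_L":                    # mark the next object to the left
            self.cursor = max(self.cursor - 1, 0)
        elif field == "TA_O":                    # skip forward several entries
            self.cursor = min(self.cursor + self.page_step, last)
        elif field == "TA_U":                    # skip backward several entries
            self.cursor = max(self.cursor - self.page_step, 0)
        elif field == "TA_Z":
            if self._last_z is not None and now - self._last_z <= self.DOUBLE_CLICK_WINDOW:
                self._last_z = None              # double actuation: one menu level up
                self.menu_level = max(self.menu_level - 1, 0)
                return ("menu_up", self.menu_level)
            self._last_z = now                   # single actuation: select marked object
            return ("select", self.objects[self.cursor])
        return ("mark", self.objects[self.cursor])
```

Note one simplification: this sketch fires "select" on the first TA_Z press immediately; a real device would likely delay the selection by the double-click window so that a double actuation does not also trigger a selection.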

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a control unit, consisting of input fields that have a tactually recognizable characteristic, with which several devices can be controlled without it being necessary to look at it. Using suitable optical means, the functional elements of the control unit are represented visually and/or audibly by an output unit. Any number of devices and functions can thus be controlled with the control unit. By combining a speech recognition device with this control unit, a device, and possibly a function of that device, can be selected directly by entering a voice command. The control unit is also suitable for making more extensive inputs.
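The abstract's direct selection of a device or function by voice command can be sketched as a simple lookup from a recognized word to a target. The speech recognizer itself is outside this sketch, and all names below are illustrative, not from the patent.

```python
def dispatch_voice_command(recognized_word, targets):
    """Map a word produced by a speech recognizer (not modeled here)
    onto a virtual device or a function of that device."""
    normalized = recognized_word.strip().lower()
    return targets.get(normalized)   # None when the recognized word is unknown

# While the TA_M field is held, the recognized word selects a target directly.
# These target names are hypothetical examples.
targets = {
    "mailbox": "open the electronic mailbox",
    "navigation": "start the navigation system",
}
```

A recognized name is thus dispatched in one step, without navigating through menu levels with the direction fields.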
PCT/DE1997/002304 1996-10-10 1997-10-08 Procede et dispositif pour commander au moins un appareil WO1998015964A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DE19641883.6 1996-10-10
DE19641883 1996-10-10
DE19641884.4 1996-10-10
DE19641884 1996-10-10

Publications (1)

Publication Number Publication Date
WO1998015964A1 true WO1998015964A1 (fr) 1998-04-16

Family

ID=26030245

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DE1997/002304 WO1998015964A1 (fr) 1996-10-10 1997-10-08 Procede et dispositif pour commander au moins un appareil

Country Status (1)

Country Link
WO (1) WO1998015964A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0246472A1 (fr) * 1986-05-12 1987-11-25 Siemens Aktiengesellschaft Station de commande à distance
US4818048A (en) * 1987-01-06 1989-04-04 Hughes Aircraft Company Holographic head-up control panel
US4827520A (en) * 1987-01-16 1989-05-02 Prince Corporation Voice actuated control system for use in a vehicle
EP0366132A2 (fr) * 1988-10-27 1990-05-02 Bayerische Motoren Werke Aktiengesellschaft Dispositif de commande multifunctions
US5088070A (en) * 1991-05-06 1992-02-11 Timex Corporation Selecting apparatus for a multimode electronic wrist instrument
WO1995003664A1 (fr) * 1993-07-26 1995-02-02 Motorola Inc. Emetteur-recepteur radio avec appareil d'interface a affichage visuel d'informations et procede d'utilisation dudit appareil
US5555172A (en) * 1994-08-22 1996-09-10 Prince Corporation User interface for controlling accessories and entering data in a vehicle

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19904519A1 (de) * 1999-02-04 2000-08-10 Volkswagen Ag Bedieneinheit, insbesondere multifunktionale Bedieneinheit
EP1043192A1 (fr) * 1999-04-09 2000-10-11 Eaton Corporation Système de communication sans fils entre composants d'un véhicule
US6177867B1 (en) 1999-04-09 2001-01-23 Eaton Corporation System for wireless communication between components of a vehicle
EP1043179A3 (fr) * 1999-04-09 2003-05-21 Delphi Technologies, Inc. Système de contrôle de la pression des pneumatiques d'un véhicule
WO2001094156A3 (fr) * 2000-06-08 2002-05-10 A V B A Engineers & Services 9 Systemes de securite pour vehicules a moteur
US6697721B2 (en) 2000-06-08 2004-02-24 A.V.B.A. Engineers And Services (93) Ltd. Safety devices for use in motor vehicles
FR2827065A1 (fr) * 2001-07-09 2003-01-10 Louis Jobert Aide a la conduite automobile par assistance radio/radar, pour eviter les accidents, simplifier le guidage routier, respect du code de la route
US6980114B2 (en) 2002-10-09 2005-12-27 Siemens Aktiengesellschaft Remote activity controller for people
EP1526364B2 (fr) 2003-10-21 2009-09-30 Mettler-Toledo AG Procédé pour faire fonctionner une balance et balance
US7633018B2 (en) 2003-10-21 2009-12-15 Mettler-Toledo Ag Method of operating a balance, and balance
US9075449B2 (en) 2004-06-02 2015-07-07 Blackberry Limited Handheld electronic device and associated method employing a multiple-axis input device and selectively disabling disambiguation
US7893850B2 (en) 2004-06-02 2011-02-22 Research In Motion Limited Handheld electronic device with text disambiguation
DE102009034069A1 (de) 2008-11-10 2010-05-12 Volkswagen Ag Bedienvorrichtung für ein Kraftfahrzeug
US8700332B2 (en) 2008-11-10 2014-04-15 Volkswagen Ag Operating device for a motor vehicle
DE102009034068A1 (de) 2008-11-10 2010-05-12 Volkswagen Ag Bedienvorrichtung für ein Kraftfahrzeug
US9108513B2 (en) 2008-11-10 2015-08-18 Volkswagen Ag Viewing direction and acoustic command based operating device for a motor vehicle
EP2822857B1 (fr) 2012-03-07 2016-09-14 GEA Food Solutions Germany GmbH Ligne d'emballage
DE102012005866A1 (de) * 2012-03-22 2013-09-26 Audi Ag Verfahren zur Wiedergabe von möglichen oder bereits erfolgten, von einer Eingabeeinheit zu erfassenden oder bereits erfolgten Bedieneingaben in einem Kraftfahrzeug
DE102018200498A1 (de) * 2018-01-12 2019-07-18 Audi Ag Anzeigevorrichtung und Verfahren zur Projizierung eines Bildes auf eine virtuelle Position während einer Fahrt eines Fahrzeuges
DE102023200302B3 (de) 2023-01-16 2024-06-06 Volkswagen Aktiengesellschaft Verfahren zur Interaktion mit einem Benutzer, Computerprogrammprodukt sowie Fahrzeug

Similar Documents

Publication Publication Date Title
WO1998028649A1 (fr) Systeme d'affichage pour au moins une personne
DE60124539T2 (de) Auf einem head-up anzeigegerät basierende sicherheitseinrichtung für kraftfahrzeuge
EP2328783B1 (fr) Élément de commande pour dispositif d'affichage dans un moyen de transport
DE69928220T2 (de) Graphische schnittstelle-bauelemente für kraftfahrzeug-armaturenbretterzubehör
WO1998015964A1 (fr) Procede et dispositif pour commander au moins un appareil
DE10212173B4 (de) Digitale Kamera und Verfahren zum Erhalten von Informationen betreffend eine Funktion derselben
DE69418908T2 (de) Verfahren und Gerät zum Informationsanschauen in einer Rechnerdatenbank
EP0540570A1 (fr) Procede pour la realisation d'un dialogue variable avec des appareils techniques.
EP0927925A2 (fr) Ecran plat d'affichage sensible au toucher pour véhicule
EP1562102A2 (fr) Automobile à actuation de fonctions par détection de mouvements oculaires
DE3836555A1 (de) Multifunktions-bedieneinrichtung
DE10349673A1 (de) Vorrichtung und Verfahren zur Dateneingabe in einem Kraftfahrzeug
DE10393781T5 (de) Duales haptisches Fahrzeugsteuerungs- und -anzeigesystem
DE102018205664A1 (de) Vorrichtung zur Assistenz eines Insassen im Innenraum eines Kraftfahrzeugs
EP1214640A1 (fr) Procede et dispositif de commande assistee par menu
EP2990251A2 (fr) Dispositif et procede d'utilisation de contenus multimedia dans un moyen de transport
DE10328200B4 (de) Navigationsgerät für ein Kraftfahrzeug
DE102006052897A1 (de) Informationseinrichtung, vorzugsweise in einem Kraftfahrzeug und Verfahren zur Information über Fahrzeugdaten, insbesondere Fahrzeugfunktionen und deren Bedienung
DE19830968A1 (de) Gerät
EP1284432A1 (fr) Système et procédé actionnant un dispositif d'assistance au conducteur
DE19744941C2 (de) Verfahren zur Fernbedienung einer Präsentationseinrichtung
EP2925552B1 (fr) Procédé de commande et système de commande dans un véhicule automobile
DE19941945C2 (de) Verfahren und Vorrichtung zum Darstellen aktivierter Bedien- und Anzeigebereiche
DE10147738A1 (de) Sprachführungs-Umschalteinrichtung
DE102008037060A1 (de) Anzeigesystem

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP KR US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase